
Extending multimedia languages to support multimodal user interactions

Authors :
Simone Diniz Junqueira Barbosa
Álan L. V. Guedes
Roberto Gerson de Albuquerque Azevedo
Source :
Multimedia Tools and Applications. 76:5691-5720
Publication Year :
2016
Publisher :
Springer Science and Business Media LLC, 2016.

Abstract

Historically, research in the Multimedia community has focused on output modalities, through studies on timing and multimedia processing. The Multimodal Interaction community, in turn, has focused on user-generated modalities, through studies on Multimodal User Interfaces (MUIs). In this paper, aiming to assist the development of multimedia applications with MUIs, we propose integrating concepts from these two communities into a single high-level programming framework. The framework integrates user modalities, both user-generated (e.g., speech, gestures) and user-consumed (e.g., audiovisual, haptic), into declarative programming languages for specifying interactive multimedia applications. To illustrate our approach, we instantiate the framework in the NCL (Nested Context Language) multimedia language. NCL is the declarative language for developing interactive applications for Brazilian Digital TV and an ITU-T Recommendation for IPTV services. To help evaluate our approach, we discuss a usage scenario and implement it as an NCL application extended with the proposed multimodal features. We also compare the expressiveness of the multimodal NCL against existing multimedia and multimodal languages, for both input and output modalities.
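To give a flavor of the kind of specification the abstract describes, the sketch below is a minimal NCL 3.0-style document in which a spoken command starts a video. The speech-grammar media type (`application/srgs+xml`), the `onRecognize` connector role, and the file names are illustrative assumptions for this sketch only; they are not the actual syntax proposed in the paper.

```xml
<!-- Hypothetical sketch: a video that starts when a spoken command is
     recognized. The recognition media type and the onRecognize role are
     assumptions, not the paper's actual multimodal extensions. -->
<ncl id="multimodalExample" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
  <head>
    <connectorBase>
      <!-- Causal connector: when the condition role fires on one media
           object, the action role is applied to another. -->
      <causalConnector id="onRecognizeStart">
        <simpleCondition role="onRecognize"/> <!-- assumed input-event role -->
        <simpleAction role="start"/>
      </causalConnector>
    </connectorBase>
  </head>
  <body>
    <!-- Entry point: the speech recognizer is active from the start. -->
    <port id="entry" component="speech"/>
    <!-- Assumed user-generated modality: a speech-grammar media object. -->
    <media id="speech" src="commands.srgs" type="application/srgs+xml"/>
    <!-- User-consumed modality: ordinary audiovisual media. -->
    <media id="video" src="movie.mp4"/>
    <!-- Link: a recognition event on "speech" starts "video". -->
    <link xconnector="onRecognizeStart">
      <bind role="onRecognize" component="speech"/>
      <bind role="start" component="video"/>
    </link>
  </body>
</ncl>
```

The sketch reuses NCL's standard connector/link mechanism, which is what makes the approach attractive: an input modality is modeled as just another media object whose events can trigger presentation actions.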

Details

ISSN :
1573-7721 and 1380-7501
Volume :
76
Database :
OpenAIRE
Journal :
Multimedia Tools and Applications
Accession number :
edsair.doi...........25c197c36a16e82b9b3276fc3149c33a
Full Text :
https://doi.org/10.1007/s11042-016-3846-8