XPL and the synchronization of multimodal user interfaces based on design pattern

Cited by: 0
Authors:
Liotta, M. [1 ]
Santangelo, A. [1 ]
Giuffrida, F. [1 ]
Gentile, A. [1 ]
Vella, G. [2 ]
Ingraffia, N. [2 ]
Affiliations:
[1] Dipartimento di Ingegneria Informatica, Palermo, Italy
[2] Engineering Ingegneria Informatica SpA, Research & Development Lab, Rome, Italy
Source:
CISIS 2008: THE SECOND INTERNATIONAL CONFERENCE ON COMPLEX, INTELLIGENT AND SOFTWARE INTENSIVE SYSTEMS, PROCEEDINGS | 2008
DOI:
10.1109/CISIS.2008.127
Chinese Library Classification (CLC):
TP18 (Artificial Intelligence Theory)
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
The great diversity of presentation layers in software applications requires building many kinds of user interfaces with different programming languages. Furthermore, the growing interest in multimodal applications means that their user interfaces must support multiple access channels within a single development framework. User Interface Design Patterns (UIDPs) help define interaction schemas between user and computer, and they provide valuable tools for the design and reuse of software components. This paper describes the eXtensible Presentation architecture and Language (XPL), a framework aimed at streamlining the multichannel interface design process and enabling full component reuse. To illustrate the benefits of using XPL in a large software design project, a case study of an XPL application for a questionnaire form-filling system is described, highlighting the use of visual and verbal design patterns and their synchronization during multimodal interface development.
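As a minimal, purely illustrative sketch (this record does not reproduce XPL's actual syntax or API), the following hypothetical Python fragment shows the idea the abstract describes: a single abstract form-field pattern rendered through a visual and a verbal channel that are kept synchronized via an observer-style design pattern. All class and method names here are invented for illustration.

```python
# Hypothetical illustration only: XPL's real syntax and runtime are not shown in this record.
# One abstract "form field" pattern is rendered on two channels (visual and verbal),
# which stay synchronized through a simple observer pattern.

class FormFieldPattern:
    """Abstract UI design pattern: a labelled field with a value."""

    def __init__(self, label: str):
        self.label = label
        self.value = ""
        self._channels = []          # registered presentation channels (observers)

    def attach(self, channel) -> None:
        self._channels.append(channel)

    def set_value(self, value: str) -> None:
        # Input may arrive from any channel (e.g., typing or speech);
        # every registered channel is notified so the presentations stay in sync.
        self.value = value
        for channel in self._channels:
            channel.render(self)


class VisualChannel:
    def render(self, field: FormFieldPattern) -> None:
        print(f"[GUI]   {field.label}: {field.value}")


class VerbalChannel:
    def render(self, field: FormFieldPattern) -> None:
        print(f"[VOICE] Say: '{field.label} set to {field.value}'")


if __name__ == "__main__":
    age = FormFieldPattern("Age")
    age.attach(VisualChannel())
    age.attach(VerbalChannel())
    age.set_value("42")              # one update, two synchronized presentations
```

In the setting the abstract describes, one would expect XPL itself, rather than hand-written observer code, to generate and coordinate such channel-specific presentations from a single declarative description.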
Pages: 723+
Number of pages: 2