- 1. Bolt, R. "Put-That-There": Voice and gesture at the graphics interface, Computer Graphics, 1980, 14(3), 262-270.
- 2. Cohen, P., Dalrymple, M., Moran, D. & Pereira, F. Synergistic use of direct manipulation and natural language, CHI '89 Conf. Proc., ACM: Addison Wesley, New York, 1989, 227-234.
- 3. Kendon, A. Gesticulation and speech: Two aspects of the process of utterance, The Relationship of Verbal and Nonverbal Communication (ed. by M. Key), The Hague: Mouton, 1980, 207-227.
- 4. Koons, D., Sparrell, C. & Thorisson, K. Integrating simultaneous input from speech, gaze, and hand gestures, Intelligent Multimedia Interfaces, ed. by M. Maybury, MIT Press: Cambridge, MA, 1993, 257-276.
- 5. Levelt, W., Richardson, G. & La Heij, W. Pointing and voicing in deictic expressions, Jour. of Memory and Language, 1985, 24, 133-164.
- 6. McNeill, D. Hand and Mind: What Gestures Reveal about Thought, Univ. of Chicago Press: Chicago, Ill., 1992.
- 7. McNeill, D. Language as gesture (Gesture as language), Proc. of the Workshop on the Integration of Gesture in Language & Speech, ed. by L. Messing, Univ. of Delaware, Oct. 1996, 1-20.
- 8. Naughton, K. Spontaneous gesture and sign: A study of ASL signs co-occurring with speech, Proc. of the Workshop on the Integration of Gesture in Language & Speech, ed. by L. Messing, Univ. of Delaware, Oct. 1996, 125-134.
- 9. Neal, J. & Shapiro, S. Intelligent multi-media interface technology, in Intelligent User Interfaces (J. Sullivan & S. Tyler, eds.), ACM: Addison Wesley, New York, 1991, ch. 3, 45-68.
- 10. Oviatt, S.L. Multimodal interfaces for dynamic interactive maps, CHI '96 Conf. Proc., ACM Press, New York, 1996, 95-102.
- 11. Oviatt, S., Cohen, P., Fong, M. & Frank, M. A rapid semi-automatic simulation technique for investigating interactive speech and handwriting, Proc. of the Intl. Conf. on Spoken Language Processing, 1992, 2, 1351-1354.
- 12. Oviatt, S., Cohen, P., Johnston, M. & Kuhn, K. Multimodal language: Linguistic features and processing requirements, forthcoming.
- 13. Oviatt, S., Cohen, P. & Wang, M. Toward interface design for human language technology: Modality and structure as determinants of linguistic complexity, Speech Communication, 1994, 15(3-4), 283-300.
- 14. Oviatt, S. & Olsen, E. Integration themes in multimodal human-computer interaction, Proc. of the Intl. Conf. on Spoken Language Processing, 1994, 2, 551-554.
- 15. Oviatt, S.L. & van Gent, R. Error resolution during multimodal human-computer interaction, Proc. of the Intl. Conf. on Spoken Language Processing, 1996.
- 16. Pittman, J., Cohen, P., Smith, I., Yang, T. & Oviatt, S. QuickSet: A multimodal interface for distributed interactive simulations, Proc. of the 6th Conf. on Computer-Generated Forces & Behavior Representation, Univ. of Central Florida, Orlando, FL, 1996, 217-224.