Abstract
The continued advancement of computer interfaces that support 3D tasks requires a better understanding of how users will interact with 3D user interfaces in a virtual workspace. This article presents two studies that investigated how the visual, auditory, and haptic sensory feedback modalities presented by a virtual button in a 3D environment affect task performance (time on task and task errors) and user ratings. Although we expected task performance to improve when two or three feedback modalities were combined over a single modality, we instead found a significant emergent behavior that decreased performance in the trimodal condition. Specifically, we found a significant increase in the number of presses in which a user released the button before closing the virtual switch, suggesting that the combined visual, auditory, and haptic feedback led participants to prematurely believe they had actuated the button. This suggests that, in the design of virtual buttons, considering the effect of each feedback modality independently is not sufficient to predict performance, and unexpected effects may emerge when feedback modalities are combined.
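The error pattern described above can be made concrete with a small model. The sketch below (a hypothetical illustration, not the authors' apparatus; the class name, field names, and threshold value are all assumptions) shows a virtual button that registers an actuation only when the plunger travels past a switch-closure depth, and logs a release before that depth as a premature press:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualButton:
    """Toy virtual button: a press counts only if the finger travels
    past the switch-closure depth before releasing."""
    closure_depth_mm: float = 3.0   # travel required to close the switch (assumed value)
    presses: int = 0                # completed actuations
    premature_releases: int = 0     # released before the switch closed
    _max_depth: float = field(default=0.0, repr=False)
    _in_contact: bool = field(default=False, repr=False)

    def update(self, depth_mm: float) -> None:
        """Feed one sample of finger/stylus penetration depth."""
        if depth_mm > 0.0:
            self._in_contact = True
            self._max_depth = max(self._max_depth, depth_mm)
        elif self._in_contact:
            # Finger lifted off the button: classify the completed press.
            if self._max_depth >= self.closure_depth_mm:
                self.presses += 1
            else:
                self.premature_releases += 1
            self._in_contact = False
            self._max_depth = 0.0

btn = VirtualButton()
depth_samples = [
    0.0, 1.0, 2.0, 1.5, 0.0,   # released early: no actuation
    0.0, 1.0, 2.5, 3.2, 0.0,   # passed closure depth: one press
]
for depth in depth_samples:
    btn.update(depth)
# btn.presses == 1, btn.premature_releases == 1
```

In this framing, the trimodal condition would show a higher `premature_releases` count: rich feedback at initial contact convinces users the switch has already closed, so they lift off before reaching the closure depth.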
Supplemental Material
Available for Download
Supplemental movie, appendix, image, and software files for "Emergent effects in multimodal feedback from virtual buttons"