ABSTRACT
Eye gaze is an important conversational resource that, until now, could only be supported across a distance if people were rooted to the spot. We introduce EyeCVE, the world's first tele-presence system that allows people in different physical locations not only to see what one another is doing but also to follow one another's eyes, even while walking about. Avatar representations of remote participants are projected into each space, reproducing not only body, head and hand movements but also those of the eyes. Spatial and temporal alignment of the remote spaces allows the focus of gaze, as well as activity and gesture, to serve as resources for non-verbal communication. The temporal challenge was to reproduce eye movements quickly and frequently enough for their focus to be interpreted during multi-way interaction, alongside other verbal and non-verbal communication. The spatial challenge was to maintain communicational eye gaze while allowing participants to move freely within a shared virtual frame of reference. This paper reports on the technical, and especially the temporal, characteristics of the system.
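The spatial alignment described above requires mapping a locally tracked gaze direction into the shared virtual frame so that a remote avatar's eyes point at the correct target. The following is a minimal illustrative sketch of that idea, not the EyeCVE implementation; the function name, frame conventions, and use of pure rotation matrices are all assumptions made for the example.

```python
import numpy as np

def gaze_to_shared_frame(gaze_dir_head, head_to_local, local_to_shared):
    """Illustrative only: compose an eye-tracker gaze direction (unit
    vector in the head frame) with the tracked head orientation and a
    site-to-shared calibration rotation to get a direction that can be
    applied to a remote avatar's eyes."""
    v = np.asarray(gaze_dir_head, dtype=float)
    v = v / np.linalg.norm(v)                     # normalise defensively
    return local_to_shared @ (head_to_local @ v)  # compose the rotations

# Example: head yawed 90 degrees about +y; sites already aligned
# (identity calibration). A straight-ahead gaze (-z in the head frame)
# becomes -x in the shared frame.
yaw90 = np.array([[0.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0],
                  [-1.0, 0.0, 0.0]])
shared = gaze_to_shared_frame([0.0, 0.0, -1.0], yaw90, np.eye(3))
# shared → [-1.0, 0.0, 0.0]
```

In a walking scenario the `head_to_local` rotation changes every frame, which is why the paper's temporal challenge (sampling and transmitting eye movements often enough) is inseparable from this spatial one.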
TITLE
Communicating Eye Gaze across a Distance without Rooting Participants to the Spot