ABSTRACT
We present a computer vision technique to detect when the user brings their thumb and forefinger together (a pinch gesture) under close-range and relatively controlled viewing conditions. The technique avoids complex and fragile hand tracking algorithms by detecting the hole formed when the thumb and forefinger are touching; this hole is found by simple connected-component analysis of the background segmented against the hand. Our Thumb and Fore-Finger Interface (TAFFI) demonstrates the technique for cursor control as well as map navigation using one- and two-handed interactions.
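The hole-detection idea in the abstract can be sketched in a few lines: once the hand is segmented from the background, a pinch shows up as a background component that is fully enclosed by the hand, i.e. one that never reaches the image border. The sketch below is a minimal illustration under that assumption, operating on a toy binary mask (1 = hand, 0 = background) rather than camera images; `detect_pinch` is a hypothetical name, and the paper's actual system of course works on live segmented video.

```python
from collections import deque

def detect_pinch(mask):
    """Return True if the binary hand mask contains an enclosed hole.

    A pinch encloses a region of background inside the hand. We flood-fill
    each background component (4-connectivity); any component that never
    touches the image border is such a hole.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x] == 0 and not seen[y][x]:
                # BFS over this background component.
                queue = deque([(y, x)])
                seen[y][x] = True
                touches_border = False
                while queue:
                    cy, cx = queue.popleft()
                    if cy in (0, h - 1) or cx in (0, w - 1):
                        touches_border = True
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] == 0 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if not touches_border:
                    return True  # enclosed background found: pinch
    return False

# A closed thumb-forefinger loop encloses a background pixel:
pinched = [[0, 0, 0, 0, 0],
           [0, 1, 1, 1, 0],
           [0, 1, 0, 1, 0],
           [0, 1, 1, 1, 0],
           [0, 0, 0, 0, 0]]

# An open hand leaves the interior connected to the border:
open_hand = [[0, 0, 0, 0, 0],
             [0, 1, 1, 1, 0],
             [0, 1, 0, 1, 0],
             [0, 0, 0, 0, 0]]
```

On these examples, `detect_pinch(pinched)` is True and `detect_pinch(open_hand)` is False, mirroring the paper's observation that the enclosed hole appears only when thumb and forefinger touch.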
Supplemental Material
- Slides from the presentation
- Supplemental material for "Robust computer vision-based detection of pinching for one and two-handed gesture input"