DOI: 10.5555/1375714.1375754 · research-article · Free Access

SurfaceFusion: unobtrusive tracking of everyday objects in tangible user interfaces

Published: 28 May 2008

ABSTRACT

Interactive surfaces and related tangible user interfaces often involve everyday objects that are identified, tracked, and augmented with digital information. Traditional approaches to recognizing these objects typically rely on complex pattern recognition techniques, or on the addition of active electronics or fiducials that alter the visual qualities of the objects, making them less practical for real-world use. Radio Frequency Identification (RFID) technology provides an unobtrusive way to sense the presence and identity of nearby tagged objects, but has no inherent means of determining their position. Computer vision, on the other hand, is an established approach to tracking objects with a camera. While shapes and movement on an interactive surface can be determined with classic image processing techniques, object recognition tends to be complex, computationally expensive, and sensitive to environmental conditions. We present a set of techniques in which movement and shape information from the computer vision system is fused with RFID events that identify which objects are in the image. By synchronizing these two complementary sensing modalities, we can associate changes in the image with events in the RFID data, in order to recover the position, shape, and identity of the objects on the surface while avoiding complex computer vision processes and exotic RFID solutions.
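The core fusion idea described above can be illustrated with a minimal sketch: timestamped RFID arrival events are matched against the temporally closest unmatched blob-appearance events from the vision system, which yields a tag identity for each tracked blob. All names here (`RfidEvent`, `BlobEvent`, `fuse`, the 0.5 s window) are hypothetical illustrations, not the paper's actual pipeline.

```python
from dataclasses import dataclass

# Hypothetical event types: an RFID reader reports tag arrivals,
# while a vision system reports blob (connected-component) changes.
@dataclass
class RfidEvent:
    t: float          # timestamp in seconds
    tag_id: str       # tag identifier read by the RFID antenna
    appeared: bool    # True = tag entered the read field

@dataclass
class BlobEvent:
    t: float              # timestamp in seconds
    blob_id: int          # vision-side tracking id
    pos: tuple            # (x, y) blob centroid on the surface
    appeared: bool        # True = blob appeared in the image

def fuse(rfid_events, blob_events, window=0.5):
    """Associate each RFID arrival with the temporally closest
    unmatched blob appearance within `window` seconds.
    Returns {blob_id: tag_id} for the matched pairs."""
    assignments = {}
    used = set()
    for r in rfid_events:
        if not r.appeared:
            continue  # tag removals would be handled symmetrically
        best, best_dt = None, window
        for b in blob_events:
            if not b.appeared or b.blob_id in used:
                continue
            dt = abs(b.t - r.t)
            if dt <= best_dt:
                best, best_dt = b, dt
        if best is not None:
            assignments[best.blob_id] = r.tag_id
            used.add(best.blob_id)
    return assignments

# Example: a tag read at t=0.10 s matches the blob that appeared at t=0.15 s.
rfid = [RfidEvent(0.10, "tag-A", True)]
blobs = [BlobEvent(0.15, 1, (100, 200), True)]
print(fuse(rfid, blobs))  # → {1: 'tag-A'}
```

Because identity comes from the RFID channel and position from the vision channel, neither sensor needs to solve the full recognition problem on its own.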


Supplemental Material

  - p235-olwal-msmpeg4v2.avi (AVI, 15.2 MB)
  - p235-olwal-cinepak.mov (MOV, 118.5 MB)


Published in

  GI '08: Proceedings of Graphics Interface 2008
  May 2008, 301 pages
  ISBN: 9781568814230
  Publisher: Canadian Information Processing Society, Canada

  Acceptance Rates

  GI '08 paper acceptance rate: 34 of 85 submissions (40%). Overall acceptance rate: 206 of 508 submissions (41%).
