DOI: 10.1145/2207676.2208296
research-article

Unlocking the expressivity of point lights

Published: 05 May 2012

ABSTRACT

Small point lights (e.g., LEDs) are used as indicators in a wide variety of devices today, from digital watches and toasters, to washing machines and desktop computers. Although exceedingly simple in their output - varying light intensity over time - their design space can be rich. Unfortunately, a survey of contemporary uses revealed that the vocabulary of lighting expression in popular use today is small, fairly unimaginative, and generally ambiguous in meaning. In this paper, we work through a structured design process that points the way towards a much richer set of expressive forms and more effective communication for this very simple medium. In this process, we make use of five different data gathering and evaluation components to leverage the knowledge, opinions and expertise of people outside our team. Our work starts by considering what information is typically conveyed in this medium. We go on to consider potential expressive forms -- how information might be conveyed. We iteratively refine and expand these sets, concluding with ideas gathered from a panel of designers. Our final step was to make use of thousands of human judgments, gathered in a crowd-sourced fashion (265 participants), to measure the suitability of different expressive forms for conveying different information content. This results in a set of recommended light behaviors that mobile devices, such as smartphones, could readily employ.
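The expressive forms the abstract describes are, at bottom, intensity-over-time patterns that a single LED can render. As a rough illustration only (the behavior names and parameters here are assumptions for the sketch, not the paper's actual recommended vocabulary), such behaviors can be modeled as functions from time to intensity and sampled to drive a PWM-dimmable light:

```python
import math

def blink(t, period=1.0, duty=0.5):
    """On/off square wave: full intensity for `duty` fraction of each period."""
    return 1.0 if (t % period) < duty * period else 0.0

def breathe(t, period=4.0):
    """Smooth sinusoidal rise and fall, like a 'breathing' status light."""
    return 0.5 - 0.5 * math.cos(2 * math.pi * t / period)

def sample(behavior, duration=1.0, rate=10):
    """Sample a behavior at `rate` Hz, yielding intensities in [0, 1]
    suitable for a PWM duty-cycle schedule."""
    n = int(duration * rate)
    return [behavior(i / rate) for i in range(n)]
```

On Android, for example, simple two-level versions of such patterns map onto the platform's notification-light timing fields (on/off durations in milliseconds), which is what makes behaviors like these readily deployable on smartphones.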


Supplemental Material

paperfile689-3.mov (QuickTime video, 22 MB)


Published in

CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2012
3276 pages
ISBN: 9781450310154
DOI: 10.1145/2207676

Copyright © 2012 ACM


        Publisher

        Association for Computing Machinery

        New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%

