
EngageMon: Multi-Modal Engagement Sensing for Mobile Games

Published: 26 March 2018

Abstract

Understanding the level of engagement players have with a game is a useful proxy for evaluating the game's design and user experience. This is particularly important for mobile games, as an alternative game is always just an easy download away. However, engagement is a subjective concept that usually requires fine-grained, highly disruptive interviews or surveys to determine accurately. In this paper, we present EngageMon, a first-of-its-kind system that uses a combination of sensors from the smartphone (touch events), a wristband (photoplethysmography and electrodermal activity readings), and an external depth camera (skeletal motion information) to accurately determine the engagement level of a mobile game player. Our design was guided by feedback from interviews with 22 mobile game developers, testers, and designers. We evaluated EngageMon using data collected from 64 participants (54 in a lab-setting study and another 10 in a more natural setting) playing six games from three different categories: endless runner, 3D motorcycle racing, and casual puzzle. Using all three sets of sensors, EngageMon achieved an average accuracy of 85% and 77% under cross-sample and cross-subject evaluations, respectively. Overall, EngageMon can accurately determine the engagement level of mobile users while they are actively playing a game.
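The abstract describes the multi-modal pipeline only at a high level. As a rough illustration of how per-window features from the three sensor streams could be fused and evaluated cross-subject, the sketch below concatenates simple summary statistics from touch, wristband (PPG/EDA), and skeleton data and trains a generic classifier; the feature choices, windowing, and random-forest model are assumptions for illustration, not the authors' published pipeline.

    # Hypothetical sketch of multi-modal fusion for engagement classification.
    # Feature names and the classifier are assumptions, not the paper's exact method.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    def extract_features(touch, ppg, eda, skeleton):
        """Concatenate simple summary statistics from each modality.

        touch, ppg, eda: 1-D arrays of samples for one gameplay window;
        skeleton: 2-D array (frames x joint coordinates).
        """
        return np.concatenate([
            [touch.mean(), touch.std(), float(len(touch))],  # touch intensity / rate proxies
            [ppg.mean(), ppg.std()],                          # heart-rate proxy from PPG
            [eda.mean(), eda.std(), np.ptp(eda)],             # arousal proxy from EDA
            skeleton.mean(axis=0), skeleton.std(axis=0),      # body-motion statistics
        ])

    def evaluate_cross_subject(X, y, groups):
        """Leave-one-participant-out evaluation, mirroring the cross-subject condition."""
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
        return scores.mean()

Here X holds one fused feature vector per gameplay window, y the engagement label, and groups the participant ID; a cross-sample figure would instead come from an ordinary k-fold split that ignores participant identity.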



      • Published in

        Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 2, Issue 1
        March 2018
        1370 pages
        EISSN: 2474-9567
        DOI: 10.1145/3200905

        Copyright © 2018 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 26 March 2018
        • Accepted: 1 January 2018
        • Revised: 1 December 2017
        • Received: 1 May 2017
        Published in imwut Volume 2, Issue 1

        Qualifiers

        • research-article
        • Research
        • Refereed
