DOI: 10.1145/2909824.3020216
Research Article | Public Access

Using Facially Expressive Robots to Calibrate Clinical Pain Perception

Published: 06 March 2017

ABSTRACT

In this paper, we introduce a novel application of social robotics in healthcare: high-fidelity, facially expressive robotic patient simulators (RPSs), and explore their use in a clinical experimental context. Current commercially available RPSs, the most commonly used humanoid robots worldwide, are substantially limited in usability and fidelity because they lack one of the most important clinical interaction and diagnostic tools: an expressive face. Using autonomous facial synthesis techniques, we synthesized pain on both a humanoid robot and a comparable virtual avatar. We conducted an experiment with 51 clinicians and 51 laypersons (n = 102) to explore differences in pain perception between the two groups, and to explore the effects of embodiment (robot or avatar) on pain perception. Our results suggest that clinicians are less accurate overall at detecting synthesized pain than lay participants. We also found that all participants were less accurate at detecting pain on a humanoid robot than on a comparable virtual avatar, lending support to other recent findings in the HRI community. This research ultimately reveals new insights into the use of RPSs as a training tool for calibrating clinicians' pain detection skills.
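
Although the abstract does not detail the synthesis pipeline, the general idea of driving a face from pain-related facial action units (AUs) can be sketched concretely. The following minimal Python sketch is an illustration under stated assumptions, not the authors' implementation: it scores a frame of AU intensities with the Prkachin & Solomon Pain Intensity (PSPI) metric and maps the same AUs to hypothetical blendshape channels (`AU_TO_SHAPE` and `to_blendshapes` are invented names here) of the kind a virtual avatar rig or robot face controller might expose.

```python
# Illustrative sketch (assumed, not from the paper): score pain from facial
# action unit (AU) intensities with the Prkachin & Solomon Pain Intensity
# (PSPI) metric, then map AUs to hypothetical blendshape weights.

def pspi(aus: dict[str, float]) -> float:
    """PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43.

    AU intensities use the 0-5 FACS scale; AU43 (eye closure) is 0 or 1.
    """
    return (aus.get("AU4", 0.0)
            + max(aus.get("AU6", 0.0), aus.get("AU7", 0.0))
            + max(aus.get("AU9", 0.0), aus.get("AU10", 0.0))
            + aus.get("AU43", 0.0))

# Hypothetical AU-to-blendshape channel names; a real system would
# calibrate each channel against the avatar rig or robot actuator limits.
AU_TO_SHAPE = {
    "AU4": "brow_lowerer",
    "AU6": "cheek_raiser",
    "AU7": "lid_tightener",
    "AU9": "nose_wrinkler",
    "AU10": "upper_lip_raiser",
    "AU43": "eyes_closed",
}

def to_blendshapes(aus: dict[str, float]) -> dict[str, float]:
    """Normalize AU intensities (0-5) to blendshape weights (0-1)."""
    return {shape: min(aus.get(au, 0.0) / 5.0, 1.0)
            for au, shape in AU_TO_SHAPE.items()}

if __name__ == "__main__":
    frame = {"AU4": 3.0, "AU6": 2.0, "AU7": 4.0, "AU9": 1.0, "AU43": 1.0}
    print(f"PSPI = {pspi(frame):.1f}")  # 3 + 4 + 1 + 1 = 9.0
    print(to_blendshapes(frame))
```

On a physical RPS head, the same normalized weights would drive servo positions rather than a rendering rig, typically with tighter range limits; differences of this kind are one plausible source of the robot-versus-avatar accuracy gap the study measures.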

Published in

HRI '17: Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction
March 2017, 510 pages
ISBN: 9781450343367
DOI: 10.1145/2909824
Copyright © 2017 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

HRI '17 paper acceptance rate: 51 of 211 submissions (24%). Overall acceptance rate: 242 of 1,000 submissions (24%).
