Foundations for a theory of mind for a humanoid robot
Publisher:
  • Massachusetts Institute of Technology
  • 201 Vassar Street, W59-200, Cambridge, MA
  • United States
Order Number: AAI0803464
Pages: 1
Abstract

Human social dynamics rely upon the ability to correctly attribute beliefs, goals, and percepts to other people. The set of abilities that allow an individual to infer these hidden mental states based on observed actions and behavior has been called a “theory of mind” (Premack & Woodruff, 1978). Existing models of theory of mind have sought to identify a developmental progression of social skills that serve as the basis for more complex cognitive abilities. These skills include detecting eye contact, identifying self-propelled stimuli, and attributing intent to moving objects.

If we are to build machines that interact naturally with people, our machines must both interpret the behavior of others according to these social rules and display the social cues that will allow people to naturally interpret the machine's behavior.

Drawing from the models of Baron-Cohen (1995) and Leslie (1994), a novel architecture called embodied theory of mind was developed to link high-level cognitive skills to the low-level perceptual abilities of a humanoid robot. The implemented system determines visual saliency based on inherent object attributes, high-level task constraints, and the attentional states of others. Objects of interest are tracked in real-time to produce motion trajectories which are analyzed by a set of naive physical laws designed to discriminate animate from inanimate movement. Animate objects can be the source of attentional states (detected by finding faces and head orientation) as well as intentional states (determined by motion trajectories between objects). Individual components are evaluated by comparisons to human performance on similar tasks, and the complete system is evaluated in the context of a basic social learning mechanism that allows the robot to mimic observed movements. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
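The animate/inanimate discrimination described above can be illustrated with a toy heuristic. The sketch below is not the thesis implementation: the function names (`speeds`, `is_animate`) and the `gain` threshold are invented for illustration, and it tests only one naive physical law, that an object which gains speed with no visible external push is self-propelled.

```python
import math

def speeds(traj, dt=1.0):
    """Per-step speeds along a 2-D trajectory given as (x, y) points."""
    return [math.dist(a, b) / dt for a, b in zip(traj, traj[1:])]

def is_animate(traj, dt=1.0, gain=1.2):
    """Crude self-propulsion test: flag a trajectory as animate if the
    object's speed ever increases by more than `gain`x between steps,
    i.e. it gains kinetic energy with no apparent external cause.
    (`gain` is an illustrative threshold, not a value from the thesis.)"""
    v = speeds(traj, dt)
    return any(b > gain * a for a, b in zip(v, v[1:]) if a > 0)

# A ball rolling to a stop only loses speed: inanimate.
ball = [(0, 0), (4, 0), (7, 0), (9, 0), (10, 0)]
# Something that speeds up and turns on its own: animate.
bug = [(0, 0), (1, 0), (3, 1), (6, 3), (10, 6)]
```

A real system would combine several such laws (straight-line motion, elastic rebound, energy loss) and vote among them, as the thesis architecture does over tracked motion trajectories.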

Cited By

  1. Perez-Osorio J, Wiese E and Wykowska A, Theory of Mind and Joint Attention, The Handbook on Socially Interactive Agents, (311-348)
  2. Liu P, Glas D, Kanda T and Ishiguro H (2016). Data-Driven HRI: Learning Social Behaviors by Example From Human–Human Interaction, IEEE Transactions on Robotics, 32:4, (988-1008), Online publication date: 1-Aug-2016.
  3. Hoque M, Kobayashi Y and Kuno Y (2014). A proactive approach of robotic framework for making eye contact with humans, Advances in Human-Computer Interaction, 2014, (8-8), Online publication date: 1-Jan-2014.
  4. Kuiper D and Wenkstern R, Virtual agent perception combination in multi agent based systems, Proceedings of the 2013 international conference on Autonomous agents and multi-agent systems, (611-618)
  5. Kim K and Lipson H, Towards a simple robotic theory of mind, Proceedings of the 9th Workshop on Performance Metrics for Intelligent Systems, (131-138)
  6. Kim K and Lipson H, Towards a "theory of mind" in simulated robots, Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers, (2071-2076)
  7. Breazeal C and Berlin M, Spatial scaffolding for sociable robot learning, Proceedings of the 23rd national conference on Artificial intelligence - Volume 3, (1268-1273)
  8. Berg-Cross G, Applying developmental-inspired principles to the field of developmental robotics, Proceedings of the 8th Workshop on Performance Metrics for Intelligent Systems, (288-292)
  9. Déniz O, Castrillón M, Lorenzo J and Antón-Canalís L, Natural Interaction with a Robotic Head, Proceedings of the 2nd international work-conference on The Interplay Between Natural and Artificial Computation, Part I: Bio-inspired Modeling of Cognitive Tasks, (71-80)
  10. Fonooni B, Moshiri B and Lucas C, Applying data fusion in a rational decision making with emotional regulation, 50 years of artificial intelligence, (320-331)
  11. Movellan J, Tanaka F, Fasel I, Taylor C, Ruvolo P and Eckhardt M, The RUBI project, Proceedings of the ACM/IEEE international conference on Human-robot interaction, (333-339)
  12. Raidt S, Bailly G and Elisei F, Basic components of a face-to-face interaction with a conversational agent, Proceedings of the 2005 joint conference on Smart objects and ambient intelligence: innovative context-aware services: usages and technologies, (247-252)
  13. Steels L, The evolution of communication systems by adaptive agents, Adaptive agents and multi-agent systems, (125-140)
  14. Breazeal C (2003). Emotion and sociable humanoid robots, International Journal of Human-Computer Studies, 59:1-2, (119-155), Online publication date: 1-Jul-2003.
  15. Brooks R (2002). Humanoid robots, Communications of the ACM, 45:3, (33-38), Online publication date: 1-Mar-2002.