Estimating self-assessed personality from body movements and proximity in crowded mingling scenarios

ICMI '16 short paper · DOI: 10.1145/2993148.2993170 · Published: 31 October 2016

ABSTRACT

This paper focuses on the automatic classification of self-assessed personality traits from the HEXACO inventory during crowded mingle scenarios. We exploit acceleration and proximity data from a wearable device hung around the neck. Unlike most state-of-the-art studies, we address personality estimation during mingle scenarios, a challenging social context in which people interact dynamically and freely in a face-to-face setting. While many prior studies use audio to extract speech-related features, we present a novel method of extracting an individual's speaking status from a single body-worn triaxial accelerometer, which scales easily to large populations. Moreover, by fusing speech- and movement-energy-related cues derived from acceleration alone, our experimental results show improvements in the estimation of Humility over features extracted from a single behavioral modality. We validated our method on 71 participants, obtaining an accuracy of 69% for Honesty, Conscientiousness, and Openness to Experience. To our knowledge, this is the largest validation of personality estimation carried out in such a social context with simple wearable sensors.
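The speaking-status and movement-energy cues described above are derived from a single triaxial accelerometer worn around the neck. As a rough illustration of what such a feature pipeline can look like (not the authors' actual feature set or classifier), the sketch below computes simple per-window movement-energy statistics from raw triaxial samples; the 20 Hz sampling rate, 3-second windows, and mean/variance features are assumptions made for the example.

    import numpy as np

    def movement_energy_features(acc, fs=20.0, win_s=3.0):
        """Per-window movement-energy statistics from triaxial accelerometer data.

        acc   : (N, 3) array of raw x, y, z samples.
        fs    : sampling rate in Hz (assumed; the badge hardware may differ).
        win_s : window length in seconds (assumed).
        Returns an (n_windows, 2) array of [mean magnitude, magnitude variance].
        """
        # The magnitude of the acceleration vector removes dependence on how the
        # badge hangs, which matters when the sensor swings freely on a lanyard.
        mag = np.linalg.norm(acc, axis=1)
        win = int(fs * win_s)
        n_windows = len(mag) // win
        feats = []
        for i in range(n_windows):
            seg = mag[i * win:(i + 1) * win]
            # Variance of the magnitude acts as a crude proxy for body-movement
            # energy; such windowed statistics could feed a speaking-status or
            # personality-trait classifier.
            feats.append([seg.mean(), seg.var()])
        return np.array(feats)

    # Hypothetical usage: five minutes of simulated 20 Hz data.
    rng = np.random.default_rng(0)
    acc = rng.normal(0.0, 0.1, size=(6000, 3)) + np.array([0.0, 0.0, 9.81])
    print(movement_energy_features(acc).shape)  # -> (100, 2): one row per 3-second window

Note that this sketch covers only the movement side; the paper additionally infers speaking status from the same acceleration signal and fuses both cue types for trait estimation.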


Published in

ICMI '16: Proceedings of the 18th ACM International Conference on Multimodal Interaction
October 2016, 605 pages
ISBN: 9781450345569
DOI: 10.1145/2993148
Copyright © 2016 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

Overall acceptance rate: 453 of 1,080 submissions, 42%
