Capturing social cues with imaging glasses

Lauren Murray, Philip Hands, Ross Goucher, Juan Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Capturing visual social cues in social conversations can prove a difficult task for visually impaired people. Being unable to see the facial expressions and body postures of their conversation partners can lead them to misunderstand or misjudge social situations. This paper presents a system that infers social cues from streaming video recorded by a pair of imaging glasses and feeds back the inferred cues to the user. We have implemented a prototype and evaluated the effectiveness and usefulness of the system in real-world conversation situations.
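The abstract describes a pipeline of capturing a video stream from the glasses, inferring social cues, and relaying them to the wearer. The sketch below is purely illustrative and not the authors' implementation: it assumes the glasses expose their camera as a standard video device, uses OpenCV face detection as a stand-in for social-cue inference, and prints a simple feedback message; the device index and detector choice are assumptions.

# Illustrative sketch only: read frames from a camera-equipped device, detect faces
# as a placeholder for social-cue inference, and relay a simple cue to the wearer.
# OpenCV usage, device index, and thresholds are assumptions, not the paper's method.
import cv2

def run_cue_feedback(camera_index: int = 0) -> None:
    # Assume the imaging glasses appear as an ordinary video capture device.
    cap = cv2.VideoCapture(camera_index)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            if len(faces) > 0:
                # Placeholder feedback channel; the paper's system reports richer
                # cues such as facial expressions and body postures.
                print(f"Conversation partner detected ({len(faces)} face(s) in view)")
    finally:
        cap.release()

if __name__ == "__main__":
    run_cue_feedback()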
Original language: English
Title of host publication: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct
Place of publication: New York, NY
Publisher: ACM
Pages: 968-972
Number of pages: 5
ISBN (Print): 9781450344623
DOIs
Publication status: Published - 12 Sept 2016
Event: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16) - Kongresshaus Stadthalle Heidelberg, Heidelberg, Germany
Duration: 12 Sept 2016 - 16 Sept 2016
http://ubicomp.org/ubicomp2016/

Conference

Conference: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16)
Abbreviated title: UbiComp
Country/Territory: Germany
City: Heidelberg
Period: 12/09/16 - 16/09/16
Internet address: http://ubicomp.org/ubicomp2016/

Keywords

  • Affective Computing
  • Imaging glasses
  • Emotion
  • Recognition
