BreathIn: a breath pattern sensing approach for user computer interaction

Rohan Hundia, Aaron Quigley

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)


New interaction modalities in human-computer interaction often build on common sensory inputs such as touch, voice, gesture, or motion. However, these modalities are not inclusive of the entire population and cannot be used by people with an impairment affecting that sensory input. Here we propose BreathIn: an interface tool for interacting with computer applications using discreet exhalation patterns. The intent is that such patterns can be issued by anyone who can breathe. Our concept is based on detecting a user's forced exhalation patterns over a time window using a MEMS microphone placed below the user's nose. We break the signal down into its FFT components, identify peak frequencies for forced voluntary "breath events", and use these in real time to distinguish "exhalation events" from noise. We show two major applications of such an interaction tool: a) controlling computer applications with breath, and b) using the breath interface as a discreet emergency signal for prospective victims of crime.
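The detection step described in the abstract (FFT the microphone window, find the peak frequency, and decide in real time whether it is a forced exhalation or noise) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sample rate, frequency band, and threshold are assumed values, since the paper's abstract does not specify them.

```python
import numpy as np

def detect_breath_event(window, sample_rate=16000,
                        band=(100.0, 1200.0), threshold=5.0):
    """Classify one microphone window as a forced-exhalation event or noise.

    The band limits and threshold are illustrative assumptions; the
    published abstract does not state the exact parameters used.
    Returns (is_event, peak_frequency_hz).
    """
    # Magnitude spectrum of the Hann-windowed real-valued signal
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate)

    # Restrict attention to the band where exhalation energy is assumed
    # to concentrate
    mask = (freqs >= band[0]) & (freqs <= band[1])
    if not mask.any():
        return False, 0.0

    peak_idx = np.argmax(spectrum[mask])
    peak_freq = float(freqs[mask][peak_idx])
    peak_mag = float(spectrum[mask][peak_idx])

    # Compare the in-band peak against the broadband background level;
    # a strong, narrow in-band peak is treated as a "breath event"
    background = float(np.median(spectrum)) + 1e-12
    is_event = (peak_mag / background) > threshold
    return is_event, peak_freq
```

In a real-time loop, consecutive windows of microphone samples would be fed to this function, and sequences of detected events over a time window would then be matched against the user's predefined exhalation patterns.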
Original language: English
Title of host publication: OZCHI'19
Subtitle of host publication: Proceedings of the 31st Australian Conference on Human-Computer Interaction
Place of Publication: New York
ISBN (Electronic): 9781450376969
Publication status: Published - 2 Dec 2019
Event: OZCHI'19: 31st Australian Conference on Human-Computer Interaction - Perth/Fremantle, Australia
Duration: 2 Dec 2019 – 5 Dec 2019
Conference number: 31


Abbreviated title: OZCHI'19


Keywords:
  • Breath
  • BreathIn
  • Breath sensing
  • Exhale


