Abstract
New interaction modalities in human–computer interaction often rely on common sensory inputs such as touch, voice, gesture, or motion. However, these modalities are not inclusive of the entire population: they cannot be used by people with impairments of the corresponding sensory channel. Here we propose BreathIn: an interface tool for interacting with computer applications using discreet exhalation patterns. The intent is that such patterns can be issued by anyone who can breathe. Our concept is based on detecting a user's forced exhalation patterns over a time window using a MEMS microphone placed below the user's nose. We break the signal down into FFT components, identify peak frequencies for forced voluntary "breath events", and use these in real time to distinguish "exhalation events" from noise. We show two major applications of such an interaction tool: a) adapting computer applications to breath input, and b) using the breath interface as a discreet emergency signal for prospective victims of crime.
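The abstract's detection step (FFT of a microphone frame, then separating forced exhalations from background noise by where the spectral energy sits) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the frequency band, threshold, frame length, and sample rate are all hypothetical values chosen for the demo.

```python
import numpy as np

def detect_breath_event(frame, sample_rate, band=(200.0, 2000.0), threshold=0.6):
    """Classify one audio frame as a forced-exhalation event.

    Takes the FFT of a windowed frame and checks whether most of the
    spectral energy falls inside an assumed exhalation band.  The band
    and threshold are illustrative guesses, not values from the paper.
    """
    windowed = frame * np.hanning(len(frame))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))           # magnitude spectrum
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    total = spectrum.sum() + 1e-12                     # avoid division by zero
    in_band = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
    return bool(in_band / total >= threshold)

# Synthetic demo: a tone burst inside the band vs. low-frequency hum (noise).
sr = 8000
t = np.arange(1024) / sr
exhale_like = np.sin(2 * np.pi * 500 * t) + np.sin(2 * np.pi * 900 * t)
hum = np.sin(2 * np.pi * 60 * t)

print(detect_breath_event(exhale_like, sr))  # energy inside the band
print(detect_breath_event(hum, sr))          # energy below the band
```

In a real-time version, frames would be read continuously from the microphone and the per-frame decisions smoothed over time so that brief noise spikes are not reported as exhalation events.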
Original language | English |
---|---|
Title of host publication | OZCHI'19 |
Subtitle of host publication | Proceedings of the 31st Australian Conference on Human-Computer-Interaction |
Place of Publication | New York |
Publisher | ACM |
Pages | 581-584 |
ISBN (Electronic) | 9781450376969 |
DOIs | |
Publication status | Published - 2 Dec 2019 |
Event | OZCHI'19: 31st Australian Conference on Human-Computer Interaction, Perth/Freemantle, Australia, 2 Dec 2019 → 5 Dec 2019 (Conference number: 31), http://ozchi2019.visemex.org/wp/ |
Conference
Conference | OZCHI'19 |
---|---|
Abbreviated title | OZCHI'19 |
Country/Territory | Australia |
City | Perth/Freemantle |
Period | 2/12/19 → 5/12/19 |
Internet address | http://ozchi2019.visemex.org/wp/ |
Keywords
- Breath
- BreathIn
- Breath sensing
- Exhale