Abstract
Radar signals penetrate, scatter, absorb, and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system that combines a monostatic radar (Google Soli) with supervised machine learning to support object- and material-classification-based UIs. Building on RadarCat techniques, we explore the development of tangible user interfaces without modification of the objects or complex infrastructure. This affords new forms of interaction with digital devices, proximate objects, and micro-gestures.
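The abstract describes the pipeline only at a high level: per-object radar signal features are fed to a supervised classifier that predicts the object or material. The sketch below is a minimal, illustrative reconstruction of that idea, not the authors' implementation; the random-forest choice, the feature count, the material labels, and the synthetic feature vectors are all assumptions standing in for features derived from the Soli sensor.

```python
# Minimal sketch of a RadarCat-style pipeline: supervised classification of
# objects/materials from fixed-length radar feature vectors.
# All data below is synthetic; in the real system each vector would be
# derived from the Soli sensor's receive channels (an assumption here).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

MATERIALS = ["air", "wood", "glass", "steel"]  # hypothetical class labels
N_FEATURES = 64  # e.g. per-channel amplitude/spectral statistics (assumed)

# Synthetic stand-in: one cluster of feature vectors per material class.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(200, N_FEATURES))
               for i in range(len(MATERIALS))])
y = np.repeat(np.arange(len(MATERIALS)), 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Train the classifier and report held-out accuracy.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print(f"held-out accuracy: {accuracy_score(y_test, pred):.3f}")
print("predicted material:", MATERIALS[clf.predict(X_test[:1])[0]])
```

In an interactive UI, the same `predict` call would run on each incoming feature vector so that placing a new object on the sensor immediately switches the interface to that object's mode.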
Original language | English
---|---
Title of host publication | SA '17 SIGGRAPH Asia 2017 Emerging Technologies
Place of Publication | New York
Publisher | ACM
Number of pages | 2
ISBN (Electronic) | 9781450354042
DOIs |
Publication status | Published - 27 Nov 2017
Event | 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia - Bangkok, Thailand<br>Duration: 27 Nov 2017 → 30 Nov 2017<br>Conference number: 10<br>https://sa2017.siggraph.org/
Conference

Conference | 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia
---|---
Country/Territory | Thailand
City | Bangkok
Period | 27/11/17 → 30/11/17
Internet address | https://sa2017.siggraph.org/
Keywords
- Radar sensing
- Tangible interaction
- Object recognition