Tangible UI by object and material classification with radar

Hui Shyong Yeo, Barrett Ens, Aaron John Quigley

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Radar signals penetrate, scatter, absorb and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system that combines a monostatic radar (Google Soli) with supervised machine learning to support UIs based on object and material classification. Building on RadarCat techniques, we explore the development of tangible user interfaces without modification of the objects or complex infrastructure. This affords new forms of interaction with digital devices, proximate objects and micro-gestures.
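The pipeline the abstract describes (radar frames in, supervised classifier out) can be sketched minimally. RadarCat-style systems train a random forest on per-channel radar signal features; the sketch below stands in synthetic feature vectors for real Soli frames, and the feature generator, material names and dimensions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical setup: each radar "frame" is a 64-dimensional vector of
# per-channel amplitudes, and each material leaves a distinct signature.
N_FEATURES = 64
MATERIALS = ["wood", "glass", "steel"]  # illustrative labels only

def synthetic_frames(material_idx: int, n: int = 200) -> np.ndarray:
    """Toy signature: a material-specific mean profile plus sensor noise."""
    base = np.sin(np.linspace(0, np.pi * (material_idx + 1), N_FEATURES))
    return base + rng.normal(scale=0.3, size=(n, N_FEATURES))

# Build a labelled dataset of frames for each material class.
X = np.vstack([synthetic_frames(i) for i in range(len(MATERIALS))])
y = np.repeat(np.arange(len(MATERIALS)), 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Supervised classification, as in RadarCat's random-forest approach.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

In a tangible-UI loop, `clf.predict` would run on each incoming frame so that placing the device on (or near) an object switches the interface to that object's mode, with no tag or modification on the object itself.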
Original language: English
Title of host publication: SA '17 SIGGRAPH Asia 2017 Emerging Technologies
Place of Publication: New York
Publisher: ACM
Number of pages: 2
ISBN (Electronic): 9781450354042
DOIs
Publication status: Published - 27 Nov 2017
Event: 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia - Bangkok, Thailand
Duration: 27 Nov 2017 - 30 Nov 2017
Conference number: 10
https://sa2017.siggraph.org/

Conference

Conference: 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia
Country/Territory: Thailand
City: Bangkok
Period: 27/11/17 - 30/11/17

Keywords

  • Radar sensing
  • Tangible interaction
  • Object recognition
