Robust audio sensing with multi-sound classification

Juan Ye, Peter Haubrick

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)
1 Downloads (Pure)


Audio data is a highly rich form of information, often containing patterns with unique acoustic signatures. In pervasive sensing environments, enabled by increasingly capable smart devices, there has been growing research interest in sound sensing to detect the ambient environment, recognise users' daily activities, and infer their health conditions. The main challenge, however, is that real-world environments often contain multiple sound sources, which can significantly compromise the robustness of such environment, event, and activity detection applications. In this paper, we explore different approaches to multi-sound classification and propose a stacked classifier based on recent advances in deep learning. We evaluate our proposed approach in a comprehensive set of experiments on both sound-effect and real-world datasets. The results demonstrate that our approach can robustly identify each sound category among mixed acoustic signals, without the need for any a priori knowledge about the number and signature of sounds in the mixed signals.
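The paper's stacked deep-learning classifier is not reproduced here, but the core multi-label idea the abstract describes — scoring each sound class independently so a mixed signal can yield several labels at once, with no prior knowledge of how many sources are present — can be loosely illustrated. In the sketch below, the class names, the one-hot spectral templates, and the cosine-similarity scorer are all invented for illustration and are not from the paper:

```python
import numpy as np

CLASSES = ["dog_bark", "siren", "speech"]  # illustrative labels only

def class_scores(spectrum, templates):
    """Cosine similarity between a (possibly mixed) spectrum and each class template."""
    spectrum = spectrum / (np.linalg.norm(spectrum) + 1e-9)
    return np.array([float(spectrum @ (t / np.linalg.norm(t))) for t in templates])

def detect(spectrum, templates, threshold=0.4):
    """Multi-label detection: return every class whose score clears the threshold,
    so the number of sounds in the mix never has to be known in advance."""
    scores = class_scores(spectrum, templates)
    return [c for c, s in zip(CLASSES, scores) if s >= threshold]

# Toy templates: each class dominates a different block of frequency "bins".
templates = np.eye(3).repeat(4, axis=1)   # 3 classes x 12 bins
mixed = templates[0] + templates[2]       # dog_bark and speech mixed together
print(detect(mixed, templates))           # -> ['dog_bark', 'speech']
```

Because each class is thresholded independently rather than forced through a single softmax, the mixed signal correctly yields two labels; the paper's contribution is a learned, stacked version of this per-class decision.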
Original language: English
Title of host publication: 2019 IEEE International Conference on Pervasive Computing and Communications (PerCom)
Publisher: IEEE Computer Society
Number of pages: 7
ISBN (Electronic): 9781538691489
Publication status: Published - 22 Jul 2019
Event: IEEE International Conference on Pervasive Computing and Communications (PerCom 2019) - Kyoto, Japan
Duration: 12 Mar 2019 - 14 Mar 2019
Conference number: 17

Publication series

Name: Pervasive Computing and Communications (PerCom)
ISSN (Print): 2474-2503
ISSN (Electronic): 2474-249X


Conference: IEEE International Conference on Pervasive Computing and Communications (PerCom 2019)
Abbreviated title: PerCom 2019

