KCAR: a knowledge-driven approach for concurrent activity recognition

Juan Ye, Graeme Turnbull Stevenson, Simon Andrew Dobson

Research output: Contribution to journal › Article › peer-review

63 Citations (Scopus)
205 Downloads (Pure)


Recognising human activities from sensors embedded in an environment or worn on the body is an important and challenging research topic in pervasive computing. Existing work on activity recognition is mainly concerned with identifying single-user, sequential activities from well-scripted or pre-segmented sequences of sensor events. However, a real-world environment often contains multiple users, each performing activities simultaneously, in their own way and with no explicit instructions to follow. Recognising multi-user concurrent activities is challenging, but essential for designing applications for real environments. This paper presents a novel Knowledge-driven approach for Concurrent Activity Recognition (KCAR). Within KCAR, we explore the semantics underlying each sensor event and use semantic dissimilarity to segment a continuous sensor sequence into fragments, each of which corresponds to one ongoing activity. We exploit the Pyramid Match Kernel, whose strength lies in approximate matching on hierarchical concepts, to recognise activities with constraints of varying granularity from a potentially noisy sensor sequence. We conduct an empirical evaluation on a large-scale real-world data set collected over one year, consisting of 2.8 million sensor events. Our results demonstrate that KCAR achieves an average recognition accuracy of 91%.
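The Pyramid Match Kernel mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the toy concept hierarchy, function names, and weighting scheme below are assumptions chosen to show the general idea, namely that sensor events are mapped to paths in an ontology, event histograms are intersected level by level, and matches first found at more specific levels contribute more to the similarity score.

```python
from collections import Counter

def histogram_intersection(a, b):
    """Count matched items between two multisets (Counters)."""
    return sum(min(a[k], b[k]) for k in a.keys() & b.keys())

def pyramid_match(paths_x, paths_y, depth):
    """Sketch of a pyramid match kernel over concept paths.

    Each sensor event is represented as a tuple path from the
    ontology root towards a leaf, e.g. ('Kitchen', 'Kettle').
    Histograms are intersected from the finest level to the
    coarsest; matches first made at a finer (more specific)
    level receive a higher weight, halving at each coarser level.
    """
    score = 0.0
    prev_matches = 0  # matches already counted at a finer level
    for level in range(depth, 0, -1):  # fine -> coarse
        hx = Counter(p[:level] for p in paths_x)
        hy = Counter(p[:level] for p in paths_y)
        matches = histogram_intersection(hx, hy)
        new = matches - prev_matches  # matches new at this level
        score += new / (2 ** (depth - level))
        prev_matches = matches
    return score

# Toy example: two event sequences sharing one exact concept
# ('Kitchen', 'Kettle') and one concept that only matches at the
# coarser 'room' level.
x = [('Kitchen', 'Kettle'), ('Kitchen', 'Tap')]
y = [('Kitchen', 'Kettle'), ('Bathroom', 'Tap')]
print(pyramid_match(x, y, depth=2))  # exact match scores 1.0; the
                                     # room-level-only match adds 0
```

An exact leaf-level match contributes its full weight, while a pair of events that agree only at a coarser ancestor contributes less, which is how approximate matching on hierarchical concepts tolerates noisy or substituted sensor events.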

Original language: English
Pages (from-to): 47-70
Number of pages: 24
Journal: Pervasive and Mobile Computing
Early online date: 22 Feb 2014
Publication status: Published - May 2015


Keywords

  • Ontologies
  • Smart home
  • Concurrent activity recognition
  • Semantics
  • Domain knowledge
  • Pyramid match kernel

