ContrasGAN: unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning

Andrea Rosales Sanabria*, Franco Zambonelli, Simon Andrew Dobson, Juan Ye

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

21 Citations (Scopus)
26 Downloads (Pure)

Abstract

Human Activity Recognition (HAR) makes it possible to drive applications directly from embedded and wearable sensors. Machine learning, and especially deep learning, has made significant progress in learning sensor features from raw sensing signals with high recognition accuracy. However, most techniques need to be trained on a large labelled dataset, which is often difficult to acquire. In this paper, we present ContrasGAN, an unsupervised domain adaptation technique that addresses this labelling challenge by transferring an activity model from one labelled domain to other unlabelled domains. ContrasGAN uses bi-directional generative adversarial networks for heterogeneous feature transfer and contrastive learning to capture distinctive features between classes. We evaluate ContrasGAN on three commonly-used HAR datasets under conditions of cross-body, cross-user, and cross-sensor transfer learning. Experimental results show that ContrasGAN outperforms a number of state-of-the-art techniques on all these tasks, at relatively low computational cost.
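To illustrate the contrastive-learning component mentioned in the abstract, the sketch below implements a generic supervised contrastive loss (in the style of Khosla et al.) in NumPy: same-class embeddings are pulled together and different-class embeddings pushed apart. This is a minimal illustration only, not the exact loss formulation used in ContrasGAN; the function name, temperature value, and batch layout are assumptions for the example.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss (hypothetical helper, not
    ContrasGAN's exact objective). For each anchor, maximises the softmax
    probability of its same-class samples relative to all other samples."""
    # L2-normalise embeddings so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                       # pairwise logits
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    # exclude each sample from its own softmax denominator
    logits = np.where(self_mask, -np.inf, sim)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    positives = (labels[:, None] == labels[None, :]) & ~self_mask
    # mean negative log-probability of positives, averaged over anchors
    per_anchor = [-log_prob[i, positives[i]].mean()
                  for i in range(n) if positives[i].any()]
    return float(np.mean(per_anchor))
```

With well-separated clusters and correct labels the loss is small; assigning the wrong labels to the same embeddings drives it up, which is the property the paper exploits to sharpen class boundaries in the unlabelled target domain.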
Original language: English
Article number: 101477
Pages (from-to): 1-34
Number of pages: 34
Journal: Pervasive and Mobile Computing
Volume: In Press
Early online date: 6 Nov 2021
DOIs
Publication status: E-pub ahead of print - 6 Nov 2021

Keywords

  • Human activity recognition
  • Unsupervised domain adaptation
  • GAN
  • Contrastive loss

