Deep transfer learning for time series data based on sensor modality classification

Frédéric Li*, Kimiaki Shirahama, Muhammad Adeel Nisar, Xinyu Huang, Marcin Grzegorzek

*Corresponding author for this work


The scarcity of labelled time-series data can hinder the proper training of deep learning models. This is especially relevant for the growing field of ubiquitous computing, where data coming from wearable devices have to be analysed using pattern recognition techniques to provide meaningful applications. To address this problem, we propose a transfer learning method based on attributing sensor modality labels to a large amount of time-series data collected from various application fields. Using these data, our method first trains a Deep Neural Network (DNN) that can learn general characteristics of time-series data, then transfers it to another DNN designed to solve a specific target problem. In addition, we propose a general architecture that can adapt the transferred DNN regardless of the sensors used in the target field, making our approach particularly suitable for multichannel data. We test our method on two ubiquitous computing problems, Human Activity Recognition (HAR) and Emotion Recognition (ER), and compare it to a baseline that trains the DNN without transfer learning. For HAR, we also introduce a new dataset, Cognitive Village-MSBand (CogAge), which contains data for 61 atomic activities acquired from three wearable devices (smartphone, smartwatch, and smartglasses). Our results show that our transfer learning approach outperforms the baseline for both HAR and ER.
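The core idea of the abstract, pretraining a network on sensor modality classification and then transferring its feature-extraction layers to a target task with a freshly initialised output head, can be illustrated with a minimal sketch. All names, layer sizes, and the NumPy weight representation below are hypothetical illustrations, not the authors' actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "source" network, pretrained on sensor-modality classification:
# a feature-extraction layer followed by a modality-classification head.
source = {
    "features": rng.standard_normal((16, 8)),  # 16 input channels -> 8 features
    "head":     rng.standard_normal((8, 5)),   # e.g. 5 sensor-modality classes
}

def transfer(source_net, n_target_classes, rng):
    """Copy the pretrained feature layer; re-initialise a task-specific head."""
    return {
        "features": source_net["features"].copy(),           # transferred weights
        "head": rng.standard_normal((8, n_target_classes)),  # new head, trained from scratch
    }

# Target problem, e.g. HAR over the 61 atomic activities of the CogAge dataset.
target = transfer(source, 61, rng)
```

In an actual deep learning framework the same step would copy the pretrained convolutional/recurrent layers into the target model and attach a new classification layer sized for the target label set.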

Original language: English
Article number: 4271
Journal: Sensors (Switzerland)
Issue number: 15
Pages (from-to): 1-25
Number of pages: 25
Publication status: Published - 01.08.2020

Research Areas and Centers

  • Centers: Center for Artificial Intelligence Luebeck (ZKIL)

