TY - JOUR
T1 - Deep transfer learning for time series data based on sensor modality classification
AU - Li, Frédéric
AU - Shirahama, Kimiaki
AU - Nisar, Muhammad Adeel
AU - Huang, Xinyu
AU - Grzegorzek, Marcin
N1 - Funding Information:
Funding: Research and development activities leading to this article have been supported by the German Research Foundation (DFG) as part of the research training group GRK 1564 "Imaging New Modalities", and the German Federal Ministry of Education and Research (BMBF) within the projects CognitiveVillage (Grant No. 16SV7223K) and ELISE (Grant No. 16SV7512, www.elise-lernen.de).
Publisher Copyright:
© 2020 by the authors. Licensee MDPI, Basel, Switzerland.
PY - 2020/8/1
Y1 - 2020/8/1
N2 - The scarcity of labelled time-series data can hinder the proper training of deep learning models. This is especially relevant for the growing field of ubiquitous computing, where data coming from wearable devices have to be analysed using pattern recognition techniques to provide meaningful applications. To address this problem, we propose a transfer learning method based on attributing sensor modality labels to a large amount of time-series data collected from various application fields. Using these data, our method first trains a Deep Neural Network (DNN) that can learn general characteristics of time-series data, then transfers it to another DNN designed to solve a specific target problem. In addition, we propose a general architecture that can adapt the transferred DNN regardless of the sensors used in the target field, making our approach particularly suitable for multichannel data. We test our method on two ubiquitous computing problems—Human Activity Recognition (HAR) and Emotion Recognition (ER)—and compare it to a baseline that trains the DNN without using transfer learning. For HAR, we also introduce a new dataset, Cognitive Village-MSBand (CogAge), which contains data for 61 atomic activities acquired from three wearable devices (smartphone, smartwatch, and smartglasses). Our results show that our transfer learning approach outperforms the baseline for both HAR and ER.
UR - http://www.scopus.com/inward/record.url?scp=85088959664&partnerID=8YFLogxK
U2 - 10.3390/s20154271
DO - 10.3390/s20154271
M3 - Journal articles
C2 - 32751855
AN - SCOPUS:85088959664
SN - 1424-8220
VL - 20
SP - 1
EP - 25
JO - Sensors (Switzerland)
JF - Sensors (Switzerland)
IS - 15
M1 - 4271
ER -