Pervasive Self-Learning with Multi-modal Distributed Sensors

Nicola Bicocchi, Marco Mamei, Andrea Prati, Rita Cucchiara, Franco Zambonelli, Matteo Lasagni

Abstract

Truly ubiquitous computing poses new and significant challenges. One of the key aspects that will condition the impact of these new technologies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This would make devices able to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. The framework allows different sensors to exchange the knowledge needed to build a model for classifying events. In particular, as a case study, we developed a multi-modal multi-sensor classification system that combines data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer learns a model of the user's behavior by exploiting the information coming from the camera, and later uses it to classify the user's motion autonomously. Experiments demonstrate the accuracy of the proposed approach in different situations.
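The training flow described in the abstract (the camera supplies motion-state labels while it can observe the user; the body-worn accelerometer learns from those labels and then classifies on its own) can be made concrete with a minimal sketch. The code below is not from the paper: the feature set (mean and standard deviation of acceleration magnitude), the nearest-centroid model, and all function names are illustrative assumptions.

```python
# Illustrative sketch of camera-to-accelerometer self-training.
# All names and the feature/model choices are hypothetical, not the authors'.
import numpy as np

def accel_features(window):
    """Per-window features from a (n_samples, 3) accelerometer array:
    mean and standard deviation of the acceleration magnitude."""
    mag = np.linalg.norm(window, axis=1)
    return np.array([mag.mean(), mag.std()])

class NearestCentroid:
    """Minimal stand-in for the learned model: one centroid per motion state."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array([X[y == c].mean(axis=0)
                                    for c in self.labels_])
        return self

    def predict(self, X):
        # Distance from each sample to each centroid; pick the closest state.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

def train_from_camera(accel_windows, camera_labels):
    """Training phase: each accelerometer window is paired with the motion
    state (e.g. 'still', 'walking', 'running') that the vision system produced
    for the same time interval."""
    X = np.stack([accel_features(w) for w in accel_windows])
    return NearestCentroid().fit(X, camera_labels)

def classify(model, accel_window):
    """Autonomous phase: the body-worn sensor classifies without the camera."""
    return model.predict(accel_features(accel_window)[None, :])[0]
```

In the setting the abstract describes, the training phase would run only while the user is in the camera's field of view; once the model is learned, the wearable no longer needs the camera to label the user's motion.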

Original language: English
Title of host publication: 2008 Second IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops
Number of pages: 6
Publisher: IEEE
Publication date: 01.12.2008
Pages: 61-66
Article number: 4800654
ISBN (Print): 978-0-7695-3553-1
Publication status: Published - 01.12.2008
Event: 2nd IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops - Venice, Italy
Duration: 20.10.2008 - 24.10.2008
Conference number: 75868
