Continuous robust sound event classification using time-frequency features and deep learning

Ian McLoughlin*, Haomin Zhang, Zhipeng Xie, Yan Song, Wei Xiao, Huy Phan

*Corresponding author for this work

Abstract

The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human-computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for the classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds, based upon a common existing method for evaluating isolated sound classification. It then adapts several high-performing isolated sound classifiers to operate on continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system on the new task, providing the first analysis of their performance for continuous sound event detection. In addition, it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of events in continuous sound recordings prior to classification.
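The abstract describes an energy-based event detection front end that segments a continuous, noisy recording into candidate events, each of which is then converted to a time-frequency representation for a deep classifier. The following is a minimal sketch of that general pipeline, not the authors' implementation: the frame length, hop, threshold factor and minimum-duration parameters are illustrative assumptions, and a plain log-magnitude STFT stands in for whatever time-frequency feature a given classifier expects.

```python
# Hypothetical sketch of an energy-based detection front end followed by
# time-frequency feature extraction; parameter values are assumptions,
# not taken from the paper.
import numpy as np

def frame_signal(x, frame_len=1024, hop=512):
    """Split a 1-D signal into overlapping frames (one frame per row)."""
    n_frames = 1 + max(0, (len(x) - frame_len) // hop)
    idx = np.arange(frame_len)[None, :] + hop * np.arange(n_frames)[:, None]
    return x[idx]

def energy_segments(x, frame_len=1024, hop=512, threshold_factor=3.0, min_frames=5):
    """Return (start, end) sample indices of runs of frames whose short-time
    energy exceeds threshold_factor times the median frame energy."""
    frames = frame_signal(x, frame_len, hop)
    energy = np.sum(frames ** 2, axis=1)
    active = energy > threshold_factor * np.median(energy)
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            if i - start >= min_frames:
                segments.append((start * hop, i * hop + frame_len))
            start = None
    if start is not None and len(active) - start >= min_frames:
        segments.append((start * hop, len(x)))
    return segments

def log_spectrogram(x, frame_len=1024, hop=512, eps=1e-10):
    """Simple time-frequency feature: log-magnitude STFT of a segment."""
    frames = frame_signal(x, frame_len, hop) * np.hanning(frame_len)
    return np.log(np.abs(np.fft.rfft(frames, axis=1)) + eps)

if __name__ == "__main__":
    # Toy continuous recording: low-level noise with one louder "event".
    sr = 16000
    x = np.random.randn(10 * sr) * 0.01
    x[3 * sr:4 * sr] += np.sin(2 * np.pi * 880 * np.arange(sr) / sr)
    for s, e in energy_segments(x):
        feats = log_spectrogram(x[s:e])
        print(f"event {s / sr:.2f}-{e / sr:.2f} s, feature shape {feats.shape}")
```

In such a pipeline, each detected segment's feature matrix would be passed to a classifier trained on isolated sound events; the paper's Bayesian-inspired front end replaces the simple energy threshold used above for segmentation.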

Original language: English
Article number: e0182309
Journal: PLoS ONE
Volume: 12
Issue number: 9
ISSN: 1553-7390
DOIs
Publication status: Published - 01.09.2017
