Soft-competitive Learning of Sparse Codes and its Application to Image Reconstruction

Abstract

We propose a new algorithm for the design of overcomplete dictionaries for sparse coding, neural gas for dictionary learning (NGDL), which uses a set of solutions for the sparse coefficients in each update step of the dictionary. In order to obtain such a set of solutions, we additionally propose the bag of pursuits (BOP) method for sparse approximation. Using BOP to determine the coefficients, we show in an image encoding experiment that, in the case of limited training data and limited computation time, the NGDL update of the dictionary performs better than the standard gradient approach used, for instance, in the Sparsenet algorithm, as well as other state-of-the-art methods for dictionary learning such as the method of optimal directions (MOD) and the widely used K-SVD algorithm. In an application to image reconstruction, dictionaries trained with this algorithm outperform not only overcomplete Haar wavelets and overcomplete discrete cosine transforms, but also dictionaries obtained with widely used algorithms like K-SVD.
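The following is a minimal, illustrative sketch of soft-competitive dictionary learning for sparse coding, not the paper's exact NGDL/BOP procedure: for each training sample, several candidate sparse solutions are generated with a simple greedy pursuit (a crude stand-in for the "bag of pursuits"), ranked by reconstruction error, and used to update the dictionary with exponentially decaying, neural-gas-style weights. All function names, parameters, and the pursuit variant used here are assumptions made for illustration.

```python
import numpy as np

def greedy_pursuit(D, x, k, first_atom=None):
    """Greedy pursuit of x with at most k atoms of dictionary D.
    Forcing different choices of `first_atom` yields different candidate
    solutions (a crude stand-in for a 'bag' of pursuits)."""
    coeffs = np.zeros(D.shape[1])
    support, residual = [], x.copy()
    for i in range(k):
        j = first_atom if (i == 0 and first_atom is not None) \
            else int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit coefficients on the current support by least squares.
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coeffs[:] = 0.0
        coeffs[support] = sol
        residual = x - D @ coeffs
    return coeffs, residual

def soft_competitive_pass(D, X, k=3, n_candidates=4, lr=0.1, lam=1.0):
    """One pass over the training data X (columns are samples)."""
    for x in X.T:
        # Several candidate sparse solutions, obtained by forcing different
        # starting atoms, ranked by residual norm (best first).
        starts = np.argsort(-np.abs(D.T @ x))[:n_candidates]
        cands = [greedy_pursuit(D, x, k, first_atom=int(j)) for j in starts]
        cands.sort(key=lambda c: np.linalg.norm(c[1]))
        # Soft-competitive update: better-ranked solutions get larger weights.
        for rank, (a, r) in enumerate(cands):
            w = lr * np.exp(-rank / lam)
            D += w * np.outer(r, a)          # gradient step on ||x - D a||^2
        D /= np.linalg.norm(D, axis=0, keepdims=True)  # keep atoms unit norm
    return D

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D = rng.standard_normal((16, 32))
    D /= np.linalg.norm(D, axis=0)
    X = rng.standard_normal((16, 200))
    D = soft_competitive_pass(D, X)
```

The point of the soft-competitive weighting is that the dictionary update does not commit to a single, possibly suboptimal pursuit solution: all candidates contribute, with influence decaying with their rank.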
Original language: English
Journal: Neurocomputing
Volume: 74
Issue number: 9
Pages (from-to): 1418-1428
Number of pages: 11
ISSN: 0925-2312
DOIs
Publication status: Published - 01.04.2011
