Robust and Fast Learning of Sparse Codes With Stochastic Gradient Descent

Abstract

Particular classes of signals, such as natural images, can be encoded sparsely if appropriate dictionaries are used. Finding such dictionaries from data samples, however, is a difficult optimization task. In this paper, it is shown that simple stochastic gradient descent, besides being much faster, leads to dictionaries superior to those obtained with the Method of Optimal Directions (MOD) and the K-SVD algorithm. The gain is most significant in the difficult but relevant case of highly overlapping subspaces, i.e., when the data samples are jointly represented by a restricted set of dictionary elements. Moreover, the so-called Bag of Pursuits method is introduced as an extension of Orthogonal Matching Pursuit; it provides an improved approximation of the optimal sparse coefficients and thereby significantly improves the performance not only of the gradient descent proposed here but also of the MOD and K-SVD approaches. Finally, it is shown how the Bag of Pursuits and a generalized version of the Neural Gas algorithm can be combined to derive an even more powerful method for sparse coding. Performance is analyzed on both synthetic data and the practical problem of image deconvolution. In the latter case, two different dictionaries are learned for sample images of buildings and flowers, respectively. It is demonstrated that the learned dictionaries indeed adapt to the image class and therefore yield superior reconstruction results.
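To make the basic setting concrete, the following is a minimal sketch of sparse dictionary learning by plain stochastic gradient descent, with coefficients obtained by standard Orthogonal Matching Pursuit. It is not the authors' implementation: the learning rate, the per-sample atom renormalization, and the use of plain OMP instead of the Bag of Pursuits are simplifying assumptions made here for illustration.

```python
import numpy as np

def omp(D, x, k):
    """Greedy Orthogonal Matching Pursuit: approximate x using at most k atoms of D."""
    residual = x.copy()
    support = []
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit the coefficients on the selected support by least squares.
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    a = np.zeros(D.shape[1])
    a[support] = coeffs
    return a

def sgd_dictionary_learning(X, n_atoms, k, n_epochs=10, lr=0.05, seed=0):
    """Learn a dictionary D (columns = unit-norm atoms) by stochastic gradient
    descent on the reconstruction error ||x - D a||^2, one sample at a time."""
    rng = np.random.default_rng(seed)
    dim, n_samples = X.shape
    D = rng.standard_normal((dim, n_atoms))
    D /= np.linalg.norm(D, axis=0)              # start from unit-norm atoms
    for _ in range(n_epochs):
        for i in rng.permutation(n_samples):
            x = X[:, i]
            a = omp(D, x, k)                    # sparse coefficients for this sample
            r = x - D @ a                       # reconstruction residual
            D += lr * np.outer(r, a)            # gradient step on ||x - D a||^2
            D /= np.linalg.norm(D, axis=0)      # renormalize atoms after each update
    return D
```

In this sketch, replacing the single OMP call with several distinct pursuits per sample and keeping the best one would approximate the role that the Bag of Pursuits plays in the paper.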
Original language: English
Journal: IEEE Journal of Selected Topics in Signal Processing
Volume: 5
Issue number: 5
Pages (from-to): 1048-1060
Number of pages: 13
ISSN: 1932-4553
Publication status: Published - September 2011
