Learning Orthogonal Bases for k-Sparse Representations

Abstract

Sparse coding aims at finding a dictionary for a given data set such that each sample can be represented by a linear combination of only a few dictionary atoms. Generally, sparse coding dictionaries are overcomplete and not orthogonal. Thus, the processing substep of determining the optimal k-sparse representation of a given sample under the current dictionary is NP-hard. Usually, the solution is approximated by a greedy algorithm or by ℓ1 convex relaxation. With an orthogonal dictionary, however, an optimal k-sparse representation can be computed not only efficiently but exactly, because a corresponding k-sparse coefficient vector is given by the k largest absolute projections.
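
A minimal sketch of this exact step, assuming NumPy and an orthonormal dictionary W whose columns are the atoms (all names here are illustrative, not the authors' implementation):

import numpy as np

def k_sparse_coefficients(W, x, k):
    """Exact k-sparse representation of x under an orthogonal
    dictionary W (columns are orthonormal atoms): keep the k
    projections of largest absolute value, zero out the rest."""
    a = W.T @ x                          # all projections <w_i, x>
    keep = np.argsort(np.abs(a))[-k:]    # indices of the k largest |a_i|
    a_sparse = np.zeros_like(a)
    a_sparse[keep] = a[keep]
    return a_sparse

# Usage with a random orthonormal basis obtained via QR decomposition:
rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.standard_normal((8, 8)))
x = rng.standard_normal(8)
a = k_sparse_coefficients(W, x, k=3)
print(np.count_nonzero(a))           # 3
print(np.linalg.norm(x - W @ a))     # reconstruction error of the k-sparse code

Because the atoms are orthonormal, no greedy search or relaxation is needed; the hard thresholding of the projection coefficients is provably optimal.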
Original language: English
Title of host publication: Workshop New Challenges in Neural Computation 2013
Editors: Barbara Hammer, Thomas Martinetz, Thomas Villmann
Number of pages: 2
Volume: 02
Publication date: 2013
Pages: 119-120
Publication status: Published - 2013
Event: 35th German Conference on Pattern Recognition 2013 - Saarbrücken, Germany
Duration: 03.09.2013 - 06.09.2013
https://www.techfak.uni-bielefeld.de/~bhammer/GINN/NC2_2013/call.html