Learning Orthogonal Bases for k-Sparse Representations

Abstract

Sparse coding aims at finding a dictionary for a given data set such that each sample can be represented by a linear combination of only a few dictionary atoms. Generally, sparse coding dictionaries are overcomplete and not orthogonal. Thus, the processing substep of determining the optimal k-sparse representation of a given sample with the current dictionary is NP-hard. Usually, the solution is approximated by a greedy algorithm or by ℓ1 convex relaxation. With an orthogonal dictionary, however, an optimal k-sparse representation can be computed not only efficiently but exactly, because a corresponding k-sparse coefficient vector is given by the k largest absolute projections.
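A minimal NumPy sketch of the closed-form step the abstract describes: project the sample onto all atoms of an orthonormal dictionary and keep only the k projections of largest magnitude. The function name k_sparse_code and the randomly generated dictionary are illustrative assumptions, not part of the paper.

import numpy as np

def k_sparse_code(x, W, k):
    """Exact k-sparse coefficients of x under a dictionary W with
    orthonormal columns (W.T @ W = I): keep the k largest |projections|."""
    a = W.T @ x                       # projections of x onto all atoms
    idx = np.argsort(np.abs(a))[:-k]  # indices of all but the k largest
    a[idx] = 0.0                      # zero out the smaller coefficients
    return a

# Usage with an assumed random orthonormal basis (via QR decomposition):
rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.standard_normal((16, 16)))
x = rng.standard_normal(16)
a = k_sparse_code(x, W, k=3)
x_hat = W @ a                         # best k-sparse approximation of x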
Original language: English
Title: Workshop New Challenges in Neural Computation 2013
Editors: Barbara Hammer, Thomas Martinetz, Thomas Villmann
Number of pages: 2
Volume: 02
Publication date: 2013
Pages: 119-120
Publication status: Published - 2013
Event: 35th German Conference on Pattern Recognition 2013 - Saarbrücken, Germany
Duration: 03.09.2013 - 06.09.2013
https://www.techfak.uni-bielefeld.de/~bhammer/GINN/NC2_2013/call.html