Sparse Coding aims at finding a dictionary for a given data set, such that each sample can be represented by a linear combination of only a few dictionary atoms. Generally, sparse coding dictionaries are overcomplete and not orthogonal. Thus, the processing substep that determines the optimal k-sparse representation of a given sample by the current dictionary is NP-hard. Usually, the solution is approximated by a greedy algorithm or by l1 convex relaxation. With an orthogonal dictionary, however, an optimal k-sparse representation can be computed not only efficiently but exactly, because a corresponding k-sparse coefficient vector is given by the k largest absolute projections.
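The closing claim of the abstract can be illustrated with a minimal NumPy sketch (the function name `k_sparse_code` and the random orthonormal dictionary are illustrative assumptions, not taken from the paper): for an orthonormal dictionary, the exact optimal k-sparse coefficient vector is obtained by projecting the sample onto all atoms and keeping the k largest-magnitude projections.

```python
import numpy as np

def k_sparse_code(D, x, k):
    """Exact k-sparse code of sample x under an orthonormal dictionary D.

    Columns of D are the atoms. Because D is orthonormal, the optimal
    k-sparse coefficient vector keeps the k largest absolute projections
    D.T @ x and zeroes out the rest.
    """
    a = D.T @ x                      # projections onto every atom
    idx = np.argsort(np.abs(a))[:-k] # indices of all but the k largest |a_i|
    a[idx] = 0.0                     # hard-threshold to a k-sparse vector
    return a

# Illustrative usage: a random orthonormal basis via QR decomposition.
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((8, 8)))
x = rng.standard_normal(8)
a = k_sparse_code(D, x, k=3)
# By orthonormality, the approximation error equals the norm of the
# coefficients that were zeroed out.
err = np.linalg.norm(x - D @ a)
```

For an overcomplete, non-orthogonal dictionary this hard-thresholding step is no longer optimal, which is exactly why greedy pursuit or l1 relaxation is needed there.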
|Title of host publication||Workshop New Challenges in Neural Computation 2013|
|Editors||Barbara Hammer, Thomas Martinetz, Thomas Villmann|
|Number of pages||2|
|Publication status||Published - 2013|
|Event||35th German Conference on Pattern Recognition 2013 - Saarbrücken, Germany|
Duration: 03.09.2013 → 06.09.2013