Abstract
Sparse Coding aims at finding a dictionary for a given data set, such that each sample can be represented by a linear combination of only a few dictionary atoms. Generally, sparse coding dictionaries are overcomplete and not orthogonal. Thus, the processing substep of determining the optimal k-sparse representation of a given sample by the current dictionary is NP-hard. Usually, the solution is approximated by a greedy algorithm or by ℓ1 convex relaxation. With an orthogonal dictionary, however, an optimal k-sparse representation can be computed not only efficiently but exactly, because a corresponding k-sparse coefficient vector is given by the k largest absolute projections.
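The following is a minimal sketch (not from the paper) of this closed-form step: for a dictionary `D` with orthonormal atoms, the optimal k-sparse coefficients are obtained by hard-thresholding the projections `D.T @ x` to their k largest absolute values. The random dictionary and sample below are placeholders for illustration only.

```python
import numpy as np


def k_sparse_coefficients(D, x, k):
    """Optimal k-sparse coefficient vector for an orthonormal dictionary D.

    For a dictionary with orthonormal atoms (columns of D), the best
    k-term approximation of x is obtained by projecting x onto all atoms
    and keeping only the k coefficients with largest absolute value.
    """
    coeffs = D.T @ x                          # projections of x onto the atoms
    a = np.zeros_like(coeffs)
    keep = np.argsort(np.abs(coeffs))[-k:]    # indices of the k largest |projections|
    a[keep] = coeffs[keep]
    return a


# Illustration with a random orthonormal dictionary (placeholder data)
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((64, 64)))   # orthonormal columns
x = rng.standard_normal(64)
a = k_sparse_coefficients(D, x, k=5)
print(np.count_nonzero(a))                            # -> 5
```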
| Original language | English |
| --- | --- |
| Title of host publication | Workshop New Challenges in Neural Computation 2013 |
| Editors | Barbara Hammer, Thomas Martinetz, Thomas Villmann |
| Number of pages | 2 |
| Volume | 02 |
| Publication date | 2013 |
| Pages | 119-120 |
| Publication status | Published - 2013 |
| Event | 35th German Conference on Pattern Recognition 2013 - Saarbrücken, Germany; Duration: 03.09.2013 → 06.09.2013; https://www.techfak.uni-bielefeld.de/~bhammer/GINN/NC2_2013/call.html |
Projects
- SPP 1527, Subproject: Learning Efficient Sensing for Active Vision (Esensing) (Finished)
  Martinetz, T. (Speaker, Coordinator) & Barth, E. (Project Staff)
  01.10.11 → 30.09.16
  Project: DFG Projects › DFG Joint Research: Priority Programs