Abstract
Sparse coding aims at finding a dictionary for a given data set such that each sample can be represented by a linear combination of only a few dictionary atoms. Generally, sparse coding dictionaries are overcomplete and not orthogonal. Thus, the processing substep of determining the optimal k-sparse representation of a given sample under the current dictionary is NP-hard. Usually, the solution is approximated by a greedy algorithm or by ℓ1 convex relaxation. With an orthogonal dictionary, however, an optimal k-sparse representation can be computed not only efficiently but exactly, because a corresponding k-sparse coefficient vector is given by the k largest absolute projections.
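A minimal NumPy sketch of this observation, assuming a dictionary D with orthonormal columns (atoms); the function name k_sparse_code and all parameters are illustrative, not from the paper:

```python
import numpy as np

def k_sparse_code(x, D, k):
    """Exact k-sparse representation of x under an orthogonal dictionary D.

    With orthonormal atoms, the optimal k-sparse coefficient vector keeps
    the k projections onto the atoms with the largest absolute value and
    sets all other coefficients to zero.
    """
    coeffs = D.T @ x                       # projections of x onto the atoms
    drop = np.argsort(np.abs(coeffs))[:-k] # indices of all but the k largest magnitudes
    a = coeffs.copy()
    a[drop] = 0.0                          # keep only the k largest absolute projections
    return a

# Usage: random orthonormal dictionary via QR decomposition
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((16, 16)))
x = rng.standard_normal(16)
a = k_sparse_code(x, D, k=3)
x_hat = D @ a                              # best 3-sparse approximation of x
```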
| Original language | English |
|---|---|
| Title | Workshop New Challenges in Neural Computation 2013 |
| Editors | Barbara Hammer, Thomas Martinetz, Thomas Villmann |
| Number of pages | 2 |
| Volume | 02 |
| Publication date | 2013 |
| Pages | 119-120 |
| Publication status | Published - 2013 |
| Event | 35th German Conference on Pattern Recognition 2013, Saarbrücken, Germany; Duration: 03.09.2013 → 06.09.2013; https://www.techfak.uni-bielefeld.de/~bhammer/GINN/NC2_2013/call.html |
Projects
- 1 Completed
- SPP 1527, subproject: Learning efficient sampling for active vision
  01.10.11 → 30.09.16
  Project: DFG Projects › DFG Joint Research: Priority Programmes