Abstract
Sparse coding aims at finding a dictionary for a given data set, such that each sample can be represented by a linear combination of only a few dictionary atoms. Generally, sparse coding dictionaries are overcomplete and not orthogonal. Thus, the processing substep of determining the optimal k-sparse representation of a given sample by the current dictionary is NP-hard. Usually, the solution is approximated by a greedy algorithm or by l1 convex relaxation. With an orthogonal dictionary, however, an optimal k-sparse representation can be computed not only efficiently, but exactly, because a corresponding k-sparse coefficient vector is given by the k largest absolute projections.
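The exact coding step for an orthogonal dictionary can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation: `k_sparse_code` is a hypothetical helper name, and the dictionary here is just a random orthonormal basis obtained via a QR decomposition. It keeps the k projections with the largest absolute value and zeroes out the rest, which is exactly the optimal k-sparse coefficient vector described in the abstract.

```python
import numpy as np

def k_sparse_code(D, x, k):
    """Exact k-sparse coding for an orthonormal dictionary D (columns = atoms).

    With orthonormal atoms, hard-thresholding the projections D.T @ x to
    the k largest absolute entries yields the optimal k-sparse coefficients.
    """
    c = D.T @ x                        # projections of x onto all atoms
    small = np.argsort(np.abs(c))[:-k] # indices of all but the k largest |c_i|
    c[small] = 0.0                     # zero out everything else
    return c

# Example: a random orthonormal basis from a QR decomposition (illustrative)
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((8, 8)))
x = rng.standard_normal(8)
c = k_sparse_code(D, x, k=3)
x_hat = D @ c                          # best 3-sparse approximation of x in D
```

No greedy pursuit or l1 relaxation is needed: a single matrix-vector product and a partial sort give the exact solution, which is what makes the orthogonal setting computationally attractive.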
Original language: English

Title of host publication: Workshop New Challenges in Neural Computation 2013
Editors: Barbara Hammer, Thomas Martinetz, Thomas Villmann
Number of pages: 2
Volume: 02
Publication date: 2013
Pages: 119–120
Publication status: Published, 2013
Event: 35th German Conference on Pattern Recognition 2013, Saarbrücken, Germany. Duration: 03.09.2013 → 06.09.2013. https://www.techfak.uni-bielefeld.de/~bhammer/GINN/NC2_2013/call.html
Title: Learning Orthogonal Bases for k-Sparse Representations

Projects

SPP 1527, Subproject: Learning Efficient Sensing for Active Vision (Esensing)
01.10.11 → 30.09.16
Project: DFG Projects › DFG Joint Research: Priority Programs