On the Problem of Finding the Least Number of Features by L1-norm Minimisation

Sascha Klement, Thomas Martinetz


Recently, the so-called Support Feature Machine (SFM) was proposed as a novel approach to feature selection for classification. It approximates the zero-norm-minimising weight vector of a separating hyperplane by optimising its one-norm. In contrast to the L1-SVM, it uses an additional constraint based on the average of the data points. In experiments on artificial datasets we observe that the SFM is highly superior in returning a smaller number of features and a larger percentage of truly relevant features. Here, we derive a necessary condition for the zero-norm and one-norm solutions to coincide. Based on this condition, the superiority can be made plausible.
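The one-norm minimisation described in the abstract can be posed as a linear programme. The sketch below assumes a plausible SFM-style formulation — minimise ||w||_1 subject to correct classification, y_i(w·x_i + b) ≥ 0, and an average-margin constraint, mean_i y_i(w·x_i + b) = 1 — which is an illustration consistent with the abstract, not necessarily the exact formulation of the paper. The function name `sfm_l1` and the toy data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def sfm_l1(X, y):
    """L1-norm sketch of an SFM-style feature selector (assumed formulation):
    minimise ||w||_1  s.t.  y_i (w.x_i + b) >= 0  and  mean_i y_i (w.x_i + b) = 1.
    w is split into nonnegative parts w = wp - wm; the bias b is unbounded."""
    n, d = X.shape
    Y = y[:, None] * X                              # rows are y_i * x_i
    # objective: sum(wp) + sum(wm); the bias carries zero cost
    c = np.concatenate([np.ones(2 * d), [0.0]])
    # inequality constraints: -y_i (w.x_i + b) <= 0
    A_ub = np.hstack([-Y, Y, -y[:, None]])
    b_ub = np.zeros(n)
    # equality constraint: average margin equals one
    A_eq = np.hstack([Y.mean(0), -Y.mean(0), [y.mean()]])[None, :]
    b_eq = np.array([1.0])
    bounds = [(0, None)] * (2 * d) + [(None, None)]
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds)
    w = res.x[:d] - res.x[d:2 * d]
    return w, res.x[-1]

# toy data: only feature 0 separates the classes
X = np.array([[1.0, 0.3], [2.0, -0.5], [-1.0, 0.4], [-2.0, -0.2]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = sfm_l1(X, y)
```

Because the one-norm objective drives irrelevant coordinates of w to zero, the returned weight vector is expected to concentrate on the truly relevant feature in this toy example.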
Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2011
Editors: Timo Honkela, Włodzisław Duch, Mark Girolami, Samuel Kaski
Number of pages: 8
Publisher: Springer Berlin Heidelberg
Publication date: 2011
ISBN (Print): 978-3-642-21734-0
ISBN (Electronic): 978-3-642-21735-7
Publication status: Published - 2011
Event: 21st International Conference on Artificial Neural Networks - Espoo, Finland
Duration: 14.06.2011 – 17.06.2011

