Feature Weighting by Maximum Distance Minimization

Jens Hocke, Thomas Martinetz

Abstract

The k-NN algorithm is still very popular due to its simplicity and the easy interpretability of its results. However, the commonly used Euclidean distance is an arbitrary choice for many datasets, because the data are often described by measurements from different domains. The Euclidean distance can therefore lead to a poor k-NN classification rate. Feature weighting adapts the scaling of the individual dimensions and can significantly improve classification performance. We present a simple linear-programming-based method for feature weighting which, in contrast to other feature weighting methods, is robust to the initial scaling of the data dimensions. We evaluate it on real-world datasets from the UCI repository, comparing it to other feature weighting algorithms and to Large Margin Nearest Neighbor Classification (LMNN) as a metric learning algorithm.
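Feature weighting amounts to rescaling each input dimension before the k-NN distance computation. A minimal sketch of this idea (the weight vector `w` is chosen by hand here purely for illustration; the paper learns it via linear programming, whose formulation is not given in this abstract):

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, w, k=3):
    """Classify x by majority vote among the k nearest training points
    under a per-dimension weighted Euclidean distance."""
    diff = (X_train - x) * w                 # scale each feature by its weight
    dists = np.sqrt((diff ** 2).sum(axis=1))
    nearest = np.argsort(dists)[:k]          # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Two features on very different scales; only the second carries class
# information, but unweighted Euclidean distance is dominated by the first.
X = np.array([[100.0, 0.1], [102.0, 0.2], [98.0, 0.9], [101.0, 1.0]])
y = np.array([0, 0, 1, 1])
w = np.array([0.01, 1.0])  # hypothetical weights down-weighting feature 1
pred = weighted_knn_predict(X, y, np.array([99.0, 0.95]), w, k=3)
```

With these weights the informative second feature decides the vote, illustrating why an appropriate rescaling of dimensions can improve k-NN accuracy.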


Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2013: 23rd International Conference on Artificial Neural Networks, Sofia, Bulgaria, September 10-13, 2013, Proceedings
Editors: Valeri Mladenov, Petia Koprinkova-Hristova, Günther Palm, Alessandro E.P. Villa, Bruno Appollini, Nikola Kasabov
Number of pages: 6
Volume: 8131
Publisher: Springer Berlin Heidelberg
Publication date: 08.10.2013
Pages: 420-425
ISBN (Print): 978-3-642-40727-7
ISBN (Electronic): 978-3-642-40728-4
Publication status: Published - 08.10.2013
Event: International Conference on Artificial Neural Networks 2013, Sofia, Bulgaria
Duration: 10.09.2013 - 13.09.2013
https://link.springer.com/conference/icann
