The k-NN algorithm remains very popular due to its simplicity and the easy interpretability of its results. However, the commonly used Euclidean distance is an arbitrary choice for many datasets, because the data are often described by measurements from different domains with incomparable scales. As a consequence, the Euclidean distance frequently leads to a poor k-NN classification rate. Feature weighting adapts the scaling of the individual dimensions and can significantly improve classification performance. We present a simple linear-programming-based method for feature weighting which, in contrast to other feature weighting methods, is robust to the initial scaling of the data dimensions. An evaluation is performed on real-world datasets from the UCI repository, with comparison to other feature weighting algorithms and to Large Margin Nearest Neighbor Classification (LMNN) as a metric learning algorithm.
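To make the role of feature weights concrete, the following is a minimal sketch of feature-weighted k-NN classification with the weighted Euclidean distance d_w(x, y) = sqrt(Σ_i w_i (x_i − y_i)²). It illustrates only how weights rescale the dimensions before the nearest-neighbor vote; it is not the paper's linear-programming method for choosing the weights, and all names and the toy data are illustrative assumptions.

```python
import math
from collections import Counter

def weighted_knn_predict(X_train, y_train, x, w, k=1):
    """Classify x by majority vote among its k nearest training points
    under the weighted Euclidean distance
    d_w(x, y) = sqrt(sum_i w_i * (x_i - y_i)**2).
    (Illustrative sketch; the weight vector w would come from a
    feature weighting method such as the one described above.)"""
    dists = [
        (math.sqrt(sum(wi * (pi - xi) ** 2 for wi, pi, xi in zip(w, p, x))), label)
        for p, label in zip(X_train, y_train)
    ]
    dists.sort(key=lambda t: t[0])  # nearest first
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: feature 0 is informative, feature 1 is large-scale noise.
X = [[0.0, 1000.0], [0.1, -800.0], [1.0, 900.0], [1.1, -950.0]]
y = [0, 0, 1, 1]
query = [0.05, 920.0]

# Uniform weights: the noisy dimension dominates the distance.
print(weighted_knn_predict(X, y, query, [1.0, 1.0], k=1))  # predicts 1
# Down-weighting the noisy dimension recovers the informative structure.
print(weighted_knn_predict(X, y, query, [1.0, 0.0], k=1))  # predicts 0
```

The toy example shows the failure mode described in the abstract: with unadapted (uniform) weights, the dimension with the largest scale dominates the Euclidean distance and the query is misclassified, whereas suitable weights restore the correct neighborhood.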
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2013: 23rd International Conference on Artificial Neural Networks, Sofia, Bulgaria, September 10–13, 2013. Proceedings
Editors: Valeri Mladenov, Petia Koprinkova-Hristova, Günther Palm, Alessandro E.P. Villa, Bruno Appollini, Nikola Kasabov
Number of pages: 6
Publisher: Springer Berlin Heidelberg
Publication status: Published – 08.10.2013
Event: International Conference on Artificial Neural Networks 2013, Sofia, Bulgaria
Duration: 10.09.2013 – 13.09.2013