Abstract
The k-NN classifier can be very competitive if an appropriate distance measure is used. It is often used in applications because its classification decisions are easy to interpret. Here, we demonstrate how to find a good Mahalanobis distance for k-NN classification by simple gradient descent, without any constraints. The cost term uses global distances and, unlike other methods, provides a soft transition in the influence of data points. The method is evaluated and compared to other metric learning and feature weighting methods on datasets from the UCI repository, where the described gradient method also shows high robustness. The comparison demonstrates the advantages of global approaches.
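The abstract describes the recipe only at a high level. As a rough illustration of the general idea, namely unconstrained gradient descent on a factor A of the Mahalanobis matrix M = AᵀA (so M stays positive semi-definite without explicit constraints), with softmax-style weights giving a soft transition in each point's influence, the sketch below uses the NCA-style soft-assignment objective of Goldberger et al. (2004). It is not the paper's exact cost term, and the function names, step size, and initialization are invented for the example.

```python
# Hedged sketch: Mahalanobis metric learning for k-NN by plain gradient
# descent. The soft-assignment cost is the NCA objective, used here only
# to illustrate the unconstrained-descent recipe, not the paper's cost.
import numpy as np

def soft_knn_loss_grad(A, X, y):
    """NCA-style soft-assignment loss and its gradient w.r.t. A.

    Parameterizing the metric as M = A^T A keeps M positive
    semi-definite automatically, so the descent needs no constraints.
    """
    n = len(X)
    Z = X @ A.T                                        # projected points z_i = A x_i
    d2 = np.square(Z[:, None, :] - Z[None, :, :]).sum(-1)
    np.fill_diagonal(d2, np.inf)                       # a point never votes for itself
    P = np.exp(-(d2 - d2.min(axis=1, keepdims=True)))  # row-wise shift for stability
    P /= P.sum(axis=1, keepdims=True)                  # soft neighbour weights p_ij
    same = (y[:, None] == y[None, :]).astype(float)
    p_i = (P * same).sum(axis=1)                       # prob. of a same-class neighbour
    loss = -p_i.mean()

    # Standard NCA gradient:
    # dLoss/dA = -(2/n) * A * sum_ij W_ij (x_i - x_j)(x_i - x_j)^T
    W = P * (p_i[:, None] - same)
    diff = X[:, None, :] - X[None, :, :]
    S = np.einsum('ij,ijk,ijl->kl', W, diff, diff)
    grad = -2.0 * A @ S / n
    return loss, grad

def learn_metric(X, y, dim=None, n_iter=200, lr=0.5, seed=0):
    """Plain, unconstrained gradient descent on the factor A."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = 0.1 * rng.standard_normal((dim or d, d))       # unconstrained parameters
    for _ in range(n_iter):
        _, grad = soft_knn_loss_grad(A, X, y)
        A -= lr * grad                                 # gradient descent step
    return A                                           # learned metric: M = A.T @ A
```

After learning, ordinary Euclidean k-NN in the projected space Z = X @ A.T is equivalent to k-NN under the learned Mahalanobis distance, which is why the classification decisions remain as easy to interpret as for plain k-NN.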
Original language | English |
---|---|
Title of host publication | Artificial Neural Networks and Machine Learning - ICANN 2014 |
Editors | Stefan Wermter, Cornelius Weber, Wlodzislaw Duch, Timo Honkela, Petia Koprinkova-Hristova, Sven Magg, Günther Palm, Alessandro E.P. Villa |
Number of pages | 7 |
Publisher | Springer Verlag |
Publication date | 01.01.2014 |
Edition | 1 |
Pages | 129-135 |
ISBN (Print) | 978-3-319-11178-0 |
ISBN (Electronic) | 978-3-319-11179-7 |
DOIs | |
Publication status | Published - 01.01.2014 |
Event | 24th International Conference on Artificial Neural Networks - Dept. of Informatics, Knowledge Technology, University of Hamburg, Hamburg, Germany Duration: 15.09.2014 → 19.09.2014 http://icann2014.org |