Global Metric Learning by Gradient Descent

Jens Hocke, Thomas Martinetz

Abstract

The k-NN classifier can be very competitive if an appropriate distance measure is used. It is often used in applications because its classification decisions are easy to interpret. Here, we demonstrate how to find a good Mahalanobis distance for k-NN classification by a simple gradient descent without any constraints. The cost term uses global distances and, unlike other methods, has a soft transition in the influence of data points. The method is evaluated and compared to other metric learning and feature weighting methods on datasets from the UCI repository, where the described gradient method also shows high robustness. The comparison demonstrates the advantages of global approaches.
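The record gives only this high-level description of the method. As an illustration of how an unconstrained gradient descent on a Mahalanobis metric with a smooth, global pairwise cost can look, here is a minimal Python sketch. The parameterization M = A^T A (which keeps M positive semidefinite without explicit constraints), the logistic pairwise cost, and all function and parameter names are assumptions for illustration; the paper's actual cost term is not reproduced in this record.

```python
import numpy as np

def learn_mahalanobis(X, y, n_iter=200, lr=0.01):
    """Sketch: learn a Mahalanobis metric M = A^T A by plain gradient descent.

    Parameterizing M through A keeps M positive semidefinite automatically,
    so the descent needs no constraints. The smooth logistic pairwise cost
    below (pull same-class pairs together, push different-class pairs apart,
    with a soft transition around a unit margin) is an assumption for
    illustration, not the paper's actual cost term.
    """
    n, d = X.shape
    A = np.eye(d)                                  # start from the Euclidean metric
    same = (y[:, None] == y[None, :])              # label agreement for all pairs
    sign = np.where(same, 1.0, -1.0)               # +1: pull together, -1: push apart
    udiff = X[:, None, :] - X[None, :, :]          # raw pairwise differences (n, n, d)
    for _ in range(n_iter):
        diff = udiff @ A.T                         # projected differences A(x_i - x_j)
        dist2 = (diff ** 2).sum(-1)                # squared Mahalanobis distances
        s = 1.0 / (1.0 + np.exp(-sign * (dist2 - 1.0)))   # logistic cost values
        w = sign * s * (1.0 - s)                   # d(cost)/d(dist2) for each pair
        # Chain rule: d(dist2)/dA = 2 * outer(A(x_i - x_j), x_i - x_j)
        G = np.einsum('ij,ijk,ijl->kl', w, diff, udiff)
        A -= lr * 2.0 * G / (n * n)                # averaged gradient step
    return A.T @ A                                 # the Mahalanobis matrix M
```

With a learned M, k-NN replaces the Euclidean distance by d^2(x, z) = (x - z)^T M (x - z), which is equivalent to running standard k-NN on the transformed data X A^T.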

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning - ICANN 2014
Editors: Stefan Wermter, Cornelius Weber, Wlodzislaw Duch, Timo Honkela, Petia Koprinkova-Hristova, Sven Magg, Günther Palm, Alessandro E. P. Villa
Number of pages: 7
Publisher: Springer Verlag
Publication date: 01.01.2014
Edition: 1
Pages: 129-135
ISBN (Print): 978-3-319-11178-0
ISBN (Electronic): 978-3-319-11179-7
Publication status: Published - 01.01.2014
Event: 24th International Conference on Artificial Neural Networks - Dept. of Informatics, Knowledge Technology, University of Hamburg, Hamburg, Germany
Duration: 15.09.2014 - 19.09.2014
http://icann2014.org
