Abstract
CNNs are characterized in particular by their ability to learn suitable features from a given data set on their own. However, the resulting latent space is optimized for the given training data. Especially for tasks that require a high generalization ability, such as the segmentation of single cells in microscopic images across various experiments, these specialized solutions may not yield optimal results. In this work, we improve generalization with an additional unsupervised training step that operates in the latent space. First experiments on the data of the Kaggle cell segmentation competition show a strong improvement in the generalization of acquired knowledge when a soft- and hard-competitive Neural-Gas algorithm is used for deep clustering with a standard CNN architecture.
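The abstract only names the soft- and hard-competitive Neural-Gas clustering step; the following is a minimal sketch, not the authors' implementation, of how such a prototype update could be applied to latent feature vectors assumed to come from a CNN encoder. All names and hyperparameters (`n_prototypes`, `epsilon`, `lam`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def neural_gas_step(prototypes, x, epsilon=0.05, lam=2.0, hard=False):
    """One online prototype update for a single latent vector x.

    Soft-competitive: every prototype moves toward x, weighted by exp(-rank / lam).
    Hard-competitive: only the closest prototype (rank 0) is updated (winner-take-all).
    """
    dists = np.linalg.norm(prototypes - x, axis=1)
    ranks = np.argsort(np.argsort(dists))          # rank 0 = closest prototype
    if hard:
        h = (ranks == 0).astype(float)             # winner-take-all update
    else:
        h = np.exp(-ranks / lam)                   # neighborhood-weighted update
    prototypes += epsilon * h[:, None] * (x - prototypes)
    return prototypes

# Usage on hypothetical latent codes of shape (n_samples, latent_dim):
latent = rng.normal(size=(1000, 64))
prototypes = latent[rng.choice(len(latent), size=16, replace=False)].copy()
for x in latent:
    prototypes = neural_gas_step(prototypes, x, hard=False)
```

In the soft-competitive variant the exponential rank weighting lets all prototypes adapt, which avoids dead units; the hard-competitive variant reduces to an online k-means-style update.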
Original language | English |
---|---|
Title | 2020 International Joint Conference on Neural Networks (IJCNN) |
Publisher | IEEE |
Publication date | 07.2020 |
Article number | 9207602 |
ISBN (print) | 978-1-7281-6926-2 |
ISBN (electronic) | 978-1-7281-6927-9 |
DOIs | |
Publication status | Published - 07.2020 |
Event | 2020 International Joint Conference on Neural Networks - Virtual, Glasgow, United Kingdom; Duration: 19.07.2020 → 24.07.2020; Conference number: 163566 |
Strategic research areas and centers
- Cross-sectional area: Intelligent Systems
- Centers: Zentrum für Künstliche Intelligenz Lübeck (ZKIL)
DFG subject classification
- 4.43-05 Image and Language Processing, Computer Graphics and Visualization, Human Computer Interaction, Ubiquitous and Wearable Computing