TY - JOUR
T1 - ANHIR: Automatic Non-Rigid Histological Image Registration Challenge
AU - Borovec, Jiri
AU - Kybic, Jan
AU - Arganda-Carreras, Ignacio
AU - Sorokin, Dmitry V.
AU - Bueno, Gloria
AU - Khvostikov, Alexander V.
AU - Bakas, Spyridon
AU - Chang, Eric I-Chao
AU - Heldmann, Stefan
AU - Kartasalo, Kimmo
AU - Latonen, Leena
AU - Lotz, Johannes
AU - Noga, Michelle
AU - Pati, Sarthak
AU - Punithakumar, Kumaradevan
AU - Ruusuvuori, Pekka
AU - Skalski, Andrzej
AU - Tahmasebi, Nazanin
AU - Valkonen, Masi
AU - Venet, Ludovic
AU - Wang, Yizhe
AU - Weiss, Nick
AU - Wodzinski, Marek
AU - Xiang, Yu
AU - Xu, Yan
AU - Yan, Yan
AU - Yushkevich, Paul
AU - Zhao, Shengyu
AU - Muñoz-Barrutia, Arrate
N1 - Copyright:
This record is sourced from MEDLINE/PubMed, a database of the U.S. National Library of Medicine
PY - 2020/10/1
Y1 - 2020/10/1
N2 - The Automatic Non-rigid Histological Image Registration (ANHIR) challenge was organized to compare the performance of image registration algorithms on several kinds of microscopy histology images in a fair and independent manner. We assembled 8 datasets containing 355 images with 18 different stains, resulting in 481 image pairs to be registered. Registration accuracy was evaluated using manually placed landmarks. In total, 256 teams registered for the challenge, 10 submitted results, and 6 participated in the workshop. Here, we present the results of 7 well-performing methods from the challenge together with 6 well-known existing methods. The best methods combined a coarse but robust initial alignment with subsequent non-rigid registration, used a multiresolution scheme, and were carefully tuned for the data at hand. They outperformed off-the-shelf methods, mostly by being more robust. The best methods successfully registered over 98% of all landmarks, and their mean landmark registration error (TRE) was 0.44% of the image diagonal. The challenge remains open to submissions, and all images are available for download.
UR - http://www.scopus.com/inward/record.url?scp=85087014178&partnerID=8YFLogxK
U2 - 10.1109/TMI.2020.2986331
DO - 10.1109/TMI.2020.2986331
M3 - Journal article
C2 - 32275587
AN - SCOPUS:85087014178
SN - 0278-0062
VL - 39
SP - 3042
EP - 3052
JO - IEEE Transactions on Medical Imaging
JF - IEEE Transactions on Medical Imaging
IS - 10
ER -