Abstract
An Augmented Reality Microscope (ARM) displays additional information about the tissues being analyzed by a pathologist. The image analysis methods used by these microscopes must be robust to changes in magnification level in order to track the displacements of the glass slides. In this paper, we propose to take advantage of features present at certain key magnification levels to improve the results at other magnification levels. We propose a real-time method robust to changes in magnification level, obtained by training deep neural networks on certain key levels of Whole Slide Images (WSI) and testing them on the levels of a microscope. We show that our approach outperforms or equals naive methods on a breast cancer dataset with the Inception-ResNet-v2 deep learning architecture.

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright (c) 2022 Robin Heckenauer, Jonathan Weber, Cédric Wemmert, Michel Hassenforder, Pierre-Alain Muller, Germain Forestier
