"False"
Skip to content
printicon
Main menu hidden.

OOD in Deep Learning and its Applications in Medical Imaging

Time: Friday 26 November 2021, 12:00-13:00
Place: Zoom

Deep Neural Networks (DNNs) are extensively deployed in today’s safety-critical systems, from autonomous driving to medical imaging, thanks to their excellent performance. However, they are known to make mistakes unpredictably, which raises significant concerns about their application in such scenarios. One common cause of such mistakes is Out-of-Distribution (OOD) samples: inputs that fall outside the distribution of the model’s training data. A well-trained DNN that is highly accurate on In-Distribution samples often gives incorrect yet over-confident predictions for OOD samples. For example, a DNN binary classifier trained on a dataset of images of cats and dogs will misclassify an image of a horse as either a cat or a dog, since it simply does not know about horses. Ideally, DNNs should “fail loudly” for such OOD images and ask for human intervention. Examples in medical imaging include: images of a disease not represented in the training dataset; images acquired with a different device model at a different hospital; and images that are incorrectly prepared, e.g., blurry images, poor contrast, or an incorrect anatomical view (lateral views processed by a model trained on frontal views). I will review the current state of the art in OOD detection as an outsider to the medical imaging field, and I am seeking potential collaborators on this topic.
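As a concrete illustration of “failing loudly”, the sketch below shows one widely used OOD-detection baseline: flagging an input when the classifier’s maximum softmax probability falls below a threshold (Hendrycks & Gimpel, 2017). The two-class setup, the logit values, and the threshold are illustrative assumptions, not material from the talk.

```python
# Minimal sketch of a maximum-softmax-probability OOD baseline.
# All numbers below are hypothetical, chosen only to illustrate the idea.
import numpy as np

def softmax(logits):
    z = logits - logits.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def is_ood(logits, threshold=0.90):
    """Return True if the prediction should be deferred to a human."""
    confidence = softmax(logits).max()
    return confidence < threshold

# In-distribution input (cat vs. dog): the model is confidently correct.
print(is_ood(np.array([4.0, -1.0])))   # False -> accept the prediction

# OOD input (a horse): low confidence here triggers human review; note that
# over-confident models can still score high, which is why this simple
# baseline is only a starting point for the methods reviewed in the talk.
print(is_ood(np.array([2.0, 1.4])))    # True -> "fail loudly"
```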

Join on Zoom: https://umu.zoom.us/j/66665668694

Organizer: Faculty of Medicine
Event type: Seminar
Contact: Anna Lundberg