26.08.2024
When: Thursday, August 29th, from 10:15 to 11:45 am
Who: Dr. Simone Schaub-Meyer from TU Darmstadt
Where: Joseph-von-Fraunhofer-Straße 25, Raum 303 & Zoom
Recent developments in deep learning have led to significant advances in many areas of computer vision. However, especially in safety-critical scenarios, we are not only interested in task-specific performance; there is also a critical need to explain the decision process of a deep neural network despite its complexity. Visual explanations can help demystify the inner workings of these models, providing insights into their decision-making processes. In my talk, I will first discuss how we can obtain visual explanations efficiently and effectively in the case of image classification. In the second part, I will discuss potential metrics and frameworks for assessing the quality of visual explanations, a challenging task due to the difficulty of obtaining ground-truth explanations for evaluation.
Simone Schaub-Meyer is an independent research group leader at the Technical University of Darmstadt and is also affiliated with the Hessian Center for Artificial Intelligence. She was recently awarded the renowned Emmy Noether Programme (ENP) grant of the German Research Foundation (DFG), supporting her research on interpretable neural networks for dense image and video analysis. Her research focuses on developing efficient, robust, and understandable methods and algorithms for image and video analysis. Prior to joining TU Darmstadt, she was a postdoctoral researcher at the Media Technology Lab at ETH Zurich, working on augmented reality. She obtained her doctoral degree from ETH Zurich in 2018, where she developed novel methods for motion representation and video frame interpolation in collaboration with Disney Research Zurich.