## Valery Vishnevskiy: Catalogue data in Spring Semester 2021

Name | Dr. Valery Vishnevskiy |

Address | Professur für Biomed. Bildgebung, ETH Zürich, ETZ F 93, Gloriastrasse 35, 8092 Zürich, SWITZERLAND |

E-mail | vishnevskiy@biomed.ee.ethz.ch |

Department | Information Technology and Electrical Engineering |

Relationship | Lecturer |

Number | Title | ECTS | Hours | Lecturers
---|---|---|---|---
227-0424-00L | Model- and Learning-Based Inverse Problems in Imaging | 4 credits | 2V + 1P | V. Vishnevskiy

Abstract | Reconstruction is an inverse problem that estimates images from noisy measurements. Model-based reconstruction uses analytical models of the imaging process together with prior knowledge, while data-based methods directly approximate the inversion using training data. Combining the two approaches yields physics-aware neural networks and state-of-the-art imaging accuracy (MRI, ultrasound, CT, microscopy, non-destructive imaging). |

Objective | The goal of this course is to introduce the mathematical models of imaging experiments and to practice implementing numerical methods that solve the corresponding inverse problems. Students will learn how to improve reconstruction accuracy by introducing prior knowledge in the form of regularization models and training data. Furthermore, students will practice incorporating imaging-model knowledge into deep neural networks. |

Content | The course builds on the following fundamental fields: (i) numerical linear algebra, (ii) mathematical statistics and learning theory, (iii) convex optimization, and (iv) signal processing. The first part of the course introduces classical linear and nonlinear methods for image reconstruction. The second part considers data-based regularization and covers modern deep learning approaches to inverse problems in imaging. Finally, we introduce advances in the actively developing field of experimental design in biomedical imaging (i.e. how to conduct an experiment in a way that enables the most accurate reconstruction).

1. Introduction: examples of inverse problems, general introduction; refresh of prerequisites.
2. Linear algebra in imaging: refresh of prerequisites; properties of the operators employed in imaging.
3. Linear inverse problems and regularization: classical theory of inverse problems; the notions of ill-posedness and regularization.
4. Compressed sensing: sparsity, basis-CS, TV-CS; analysis and synthesis forms of reconstruction problems; application of PGD and ADMM to reconstruction.
5. Advanced priors and model selection: total generalized variation, GMM priors, vectorial TV, low-rank and tensor models; Stein's unbiased risk estimator.
6. Dictionary and prior learning: classical dictionary learning; a gentle introduction to machine learning; technical details of patch models.
7. Deep learning in image reconstruction: generic convolutional-NN models (AUTOMAP, residual filtering, U-nets); the data-generation process; differences between model- and data-based reconstruction methods; mode averaging.
8. Loop unrolling and physics-aware networks for reconstruction: autograd, variational networks, with many examples and intuition; how to use them efficiently, e.g. by adding preconditioners, attention, etc.
9. Generative models and uncertainty quantification: amortized posteriors, variational autoencoders, adversarial learning; quantification of estimation uncertainty.
10. Invertible networks for estimation: gradient flows in networks; invertible neural networks for estimation problems.
11. Experimental design in imaging: acquisition optimization for continuous models; how far can we exploit autograd? Signal sampling optimization in MRI.
12. Reinforcement learning: acquisition optimization for discrete models; REINFORCE and policy gradients; variance minimization for discrete variables (RELAX, REBAR); Cartesian under-sampling pattern design.
13. Summary and exam preparation.
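As a concrete illustration of the classical linear-inverse-problem material above, here is a minimal Python/NumPy sketch of Tikhonov-regularized least squares on a toy problem (the problem sizes and regularization weight are arbitrary choices for illustration, not taken from the course):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: y = A x + noise
n, m = 64, 32                      # number of measurements, number of unknowns
A = rng.standard_normal((n, m))    # forward operator (stand-in for an imaging model)
x_true = rng.standard_normal(m)
y = A @ x_true + 0.1 * rng.standard_normal(n)

# Tikhonov regularization: argmin_x ||A x - y||^2 + lam * ||x||^2,
# solved in closed form via the regularized normal equations.
lam = 0.1
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

The regularizer trades a small bias for stability; for imaging-scale operators the normal equations would be solved iteratively (e.g. conjugate gradients) rather than by a dense factorization.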
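The proximal gradient descent (PGD) approach mentioned for compressed sensing can be sketched as ISTA applied to the synthesis-form l1-regularized problem; again a toy illustration with arbitrary sizes and regularization weight:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy compressed-sensing setup: recover a k-sparse x from n < m measurements.
n, m, k = 40, 100, 5
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[rng.choice(m, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

def ista(A, y, lam=0.01, n_iter=3000):
    """Proximal gradient descent (ISTA) for the synthesis-form LASSO:
    argmin_x 0.5 * ||A x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-term gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L      # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding prox
    return x

x_hat = ista(A, y)
```

The soft-thresholding step is exactly the proximal operator of the l1 norm; an ADMM variant would instead split the problem and alternate between a least-squares update and the same shrinkage.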
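Loop unrolling, as described above, turns a fixed number of such iterations into network layers with learnable parameters. The sketch below shows only the forward pass, with a scalar per-layer soft-threshold standing in for a trained regularizer; this is a simplification for illustration, not the course's variational-network architecture:

```python
import numpy as np

def unrolled_recon(A, y, thresholds):
    """Forward pass of a loop-unrolled reconstruction network: each
    'layer' is a gradient step on the data term ||A x - y||^2 followed
    by a shrinkage step. In a trained network the thresholds would be
    learned by backpropagation, and a CNN regularizer would replace
    the simple soft-thresholding used here."""
    L = np.linalg.norm(A, 2) ** 2          # step size from the spectral norm
    x = np.zeros(A.shape[1])
    for t in thresholds:                   # one entry == one network layer
        x = x - A.T @ (A @ x - y) / L      # data-consistency (physics) step
        x = np.sign(x) * np.maximum(np.abs(x) - t, 0.0)  # "learned" prior step
    return x

# Tiny demo with hand-picked thresholds (in a real unrolled network
# these would be trained parameters).
rng = np.random.default_rng(2)
n, m = 40, 100
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[:5] = 1.0
y = A @ x_true
x_hat = unrolled_recon(A, y, thresholds=np.full(12, 0.02))
```

Because the imaging model A enters every layer explicitly, the network stays physics-aware: only the prior part has to be learned, which is what makes these architectures data-efficient compared to generic image-to-image networks.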

Lecture notes | Lecture slides with references will be provided during the course. |

Prerequisites / Notice | Students are expected to know the basics of (i) numerical linear algebra, (ii) applied methods of convex optimization, (iii) computational statistics, and (iv) MATLAB and Python. |