# Search result: Catalogue data in Spring Semester 2021

## Neural Systems and Computation Master

### Core Courses

#### Elective Core Courses

##### Neural Computation and Theoretical Neurosciences

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 227-0395-00L | Neural Systems | W | 6 credits | 2V + 1U + 1A | R. Hahnloser, M. F. Yanik, B. Grewe |

**Abstract**: This course introduces principles of information processing in neural systems. It covers basic neuroscience for engineering students, experimental techniques used in animal research, and methods for inferring neural mechanisms. Students learn about neural information processing and about basic principles of natural intelligence and their impact on artificially intelligent systems.

**Objective**: This course introduces:

- Basic neurophysiology and mathematical descriptions of neurons
- Methods for dissecting animal behavior
- Neural recordings in intact nervous systems and information decoding principles
- Methods for manipulating the state and activity in selective neuron types
- Neuromodulatory systems and their computational roles
- Reward circuits and reinforcement learning
- Imaging methods for reconstructing the synaptic networks among neurons
- Birdsong and language
- Neurobiological principles for machine learning

**Content**: From active membranes to propagation of action potentials. From synaptic physiology to synaptic learning rules. From receptive fields to neural population decoding. From fluorescence imaging to connectomics. Methods for reading and manipulating neural ensembles. From classical conditioning to reinforcement learning. From the visual system to deep convolutional networks. Brain architectures for learning and memory. From birdsong to computational linguistics.
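As a rough illustration of the "mathematical descriptions of neurons" this course covers, the following is a minimal leaky integrate-and-fire sketch. All parameter values are hypothetical choices for the example, not material from the course:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: integrate the membrane
# voltage toward rest plus the input drive, spike and reset at threshold.
# Parameter values below are illustrative assumptions only.

def simulate_lif(i_input, dt=1e-4, tau=0.02, r=1e7,
                 v_rest=-0.07, v_thresh=-0.05, v_reset=-0.07):
    """Integrate dV/dt = (-(V - v_rest) + r*I) / tau over the input samples;
    return the number of threshold crossings (spikes)."""
    v = v_rest
    spikes = 0
    for i in i_input:
        v += dt * (-(v - v_rest) + r * i) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

# 100 ms of constant 3 nA input (hypothetical stimulus)
n_steps = 1000
spike_count = simulate_lif([3e-9] * n_steps)
```

With this drive the steady-state voltage sits above threshold, so the neuron fires periodically at a rate set by the membrane time constant.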

**Prerequisites / Notice**: Before taking this course, students are encouraged to complete "Bioelectronics and Biosensors" (227-0393-10L). As part of the exercises for this class, students are expected to complete a programming or literature-review project to be defined at the beginning of the semester.

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 227-0973-00L | Translational Neuromodeling | W | 8 credits | 3V + 2U + 1A | K. Stephan |

**Abstract**: This course provides a systematic introduction to Translational Neuromodeling (the development of mathematical models for diagnostics of brain diseases) and their application to concrete clinical questions (Computational Psychiatry/Psychosomatics). It focuses on a generative modeling strategy and teaches (hierarchical) Bayesian models of neuroimaging data and behaviour, including exercises.

**Objective**: To obtain an understanding of the goals, concepts, and methods of Translational Neuromodeling and Computational Psychiatry/Psychosomatics, particularly with regard to Bayesian models of neuroimaging (fMRI, EEG) and behavioural data.

**Content**: This course provides a systematic introduction to Translational Neuromodeling (the development of mathematical models for inferring mechanisms of brain diseases from neuroimaging and behavioural data) and their application to concrete clinical questions (Computational Psychiatry/Psychosomatics). The first part of the course introduces disease concepts from psychiatry and psychosomatics, their history, and clinical priority problems. The second part concerns computational modeling of neuronal and cognitive processes for clinical applications. A particular focus is on Bayesian methods and generative models, for example dynamic causal models for inferring neuronal processes from neuroimaging data, and hierarchical Bayesian models for inference on cognitive processes from behavioural data. The course discusses the mathematical and statistical principles behind these models, illustrates their application to various psychiatric diseases, and outlines a general research strategy based on generative models.

Lecture topics include:

1. Introduction to Translational Neuromodeling and Computational Psychiatry/Psychosomatics
2. Psychiatric nosology
3. Pathophysiology of psychiatric disease mechanisms
4. Principles of Bayesian inference and generative modeling
5. Variational Bayes (VB)
6. Bayesian model selection
7. Markov chain Monte Carlo (MCMC) techniques
8. Bayesian frameworks for understanding psychiatric and psychosomatic diseases
9. Generative models of fMRI data
10. Generative models of electrophysiological data
11. Generative models of behavioural data
12. Computational concepts of schizophrenia, depression and autism
13. Model-based predictions about individual patients

Practical exercises include mathematical derivations and the implementation of specific models and inference methods. In additional project work, students are required to use one of the examples discussed in the course as a basis for developing their own generative model and to use it for simulations and/or inference in application to a clinical question. Group work (up to 3 students) is required.
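The flavour of Bayesian model selection in this course can be illustrated with a deliberately tiny toy problem (not a course exercise): comparing a fair-coin model against a flat-prior biased-coin model via their marginal likelihoods. The data and model choices below are hypothetical:

```python
# Toy Bayesian model selection: compare two models of coin-flip data by
# their (log) marginal likelihoods. Data and priors are hypothetical.
from math import comb, lgamma, log

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_evidence_fair(k, n):
    """Log marginal likelihood of k heads in n flips with theta fixed at 0.5."""
    return log(comb(n, k)) + n * log(0.5)

def log_evidence_biased(k, n):
    """Log marginal likelihood with a uniform Beta(1, 1) prior on theta,
    integrated out analytically (Beta-Binomial)."""
    return log(comb(n, k)) + log_beta(k + 1, n - k + 1)

# Hypothetical data: 9 heads in 10 flips
k, n = 9, 10
log_bayes_factor = log_evidence_biased(k, n) - log_evidence_fair(k, n)
```

A positive log Bayes factor favours the biased-coin model; for strongly lopsided data it is clearly positive. Real applications in the course use far richer generative models, but the comparison principle is the same.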

**Literature**: See the TNU website: https://www.tnu.ethz.ch/en/teaching

**Prerequisites / Notice**: Good knowledge of principles of statistics and good programming skills (MATLAB, Julia, or Python).

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 252-1424-00L | Models of Computation | W | 6 credits | 2V + 2U + 1A | M. Cook |

**Abstract**: This course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more.

**Objective**: The goal of this course is to become acquainted with a wide variety of models of computation, to understand how models help us to understand the modeled systems, and to be able to develop and analyze models appropriate for new systems.

**Content**: This course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more.
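Of the models listed, cellular automata are among the easiest to demonstrate in a few lines. Below is a sketch of an elementary (one-dimensional, two-state) cellular automaton using Wolfram's rule 110, which is known to be Turing-complete; the grid size and initial condition are arbitrary choices for illustration:

```python
# Elementary cellular automaton: each cell's next state is read off from
# the rule number, indexed by the 3-bit (left, centre, right) neighbourhood.
# Rule 110 is used here; grid width and start state are arbitrary.

def step(cells, rule=110):
    """Apply an elementary CA rule to a row of 0/1 cells with periodic boundary."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right
        nxt.append((rule >> neighbourhood) & 1)
    return nxt

# Evolve a single live cell for a few steps
row = [0] * 16
row[8] = 1
history = [row]
for _ in range(5):
    history.append(step(history[-1]))
```

The same `step` function runs any of the 256 elementary rules just by changing the `rule` argument, which is exactly the kind of compact parameterization that makes these models convenient objects of study.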
