Catalogue data in Autumn Semester 2022

DAS in Data Science
Specialisation Track
Neural Information Processing
Number | Title | Type | ECTS | Hours | Lecturers
227-0421-00L | Deep Learning in Artificial and Biological Neuronal Networks | W | 4 credits | 3G | B. Grewe
Abstract: Deep Learning (DL), a brain-inspired, weak form of AI, allows training of large artificial neuronal networks (ANNs) that, like humans, can learn real-world tasks such as recognizing objects in images. However, DL is far from being understood, and investigating learning in biological networks might once again serve as a compelling inspiration to think differently about state-of-the-art ANN training methods.
Objective: The main goal of this lecture is to provide a comprehensive overview of the learning principles of neuronal networks and to introduce the diverse skill set (e.g. simulating a spiking neuronal network) required to understand learning in large, hierarchical neuronal networks. To achieve this, the lectures and exercises will merge ideas, concepts and methods from machine learning and neuroscience. These include training basic ANNs, simulating spiking neuronal networks, and reading and understanding the main ideas presented in today's neuroscience papers.
After this course students will be able to:
- read and understand the main ideas and methods that are presented in today’s neuroscience papers
- explain the basic ideas and concepts of plasticity in the mammalian brain
- implement learning algorithms that are alternatives to 'error backpropagation' in order to train deep neuronal networks.
- use a diverse set of ANN regularization methods to improve learning
- simulate spiking neuronal networks that learn simple (e.g. digit classification) tasks in a supervised manner (a minimal single-neuron simulation sketch follows this list).
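To make the last objective concrete, here is a minimal, self-contained Python/NumPy sketch of the kind of neuron model such simulations build on: a leaky integrate-and-fire (LIF) unit driven by a Poisson input spike train. This is an illustrative assumption rather than course material, and all parameter values (membrane time constant, threshold, input rate, input weight) are arbitrary.

```python
# Illustrative sketch only (not course material): a single leaky integrate-and-fire
# (LIF) neuron driven by a Poisson input spike train. All parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

dt      = 1e-3    # simulation time step: 1 ms
T       = 0.5     # total simulated time: 500 ms
tau_m   = 20e-3   # membrane time constant: 20 ms
v_rest  = 0.0     # resting potential (arbitrary units)
v_th    = 1.0     # spike threshold
v_reset = 0.0     # reset value after a spike
w_in    = 0.3     # voltage kick per presynaptic spike

n_steps = int(T / dt)
rate_hz = 100.0                                # presynaptic Poisson firing rate
pre_spk = rng.random(n_steps) < rate_hz * dt   # Boolean input spike train

v = v_rest
out_spikes = []
for step in range(n_steps):
    v += (dt / tau_m) * (v_rest - v)   # passive leak toward rest
    if pre_spk[step]:
        v += w_in                      # instantaneous synaptic input
    if v >= v_th:                      # threshold crossed: emit spike, reset
        out_spikes.append(step * dt)
        v = v_reset

print(f"{len(out_spikes)} output spikes in {int(T * 1000)} ms")
```

A supervised learning rule (e.g. a surrogate-gradient or spike-timing-dependent update, as discussed in the course) would then adjust weights such as w_in based on the mismatch between the emitted and desired output spikes.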
Content: Deep Learning, a brain-inspired, weak form of AI, allows training of large artificial neuronal networks (ANNs) that, like humans, can learn real-world tasks such as recognizing objects in images. The origins of deep hierarchical learning can be traced back to early neuroscience research by Hubel and Wiesel in the 1960s, who first described the neuronal processing of visual inputs in the mammalian neocortex. Similar to their neocortical counterparts, ANNs seem to learn by interpreting and structuring the data provided by the external world.

However, while deep ANNs outperform humans on specific tasks such as playing (video) games (Mnih et al., 2015; Silver et al., 2018), they still do not perform on par when it comes to recognizing actions in movie data, and their ability to act as generalizable problem solvers remains far behind what the human brain seems to achieve effortlessly. Moreover, biological neuronal networks learn far more effectively from fewer training examples, achieve much higher performance in recognizing complex patterns in time-series data (e.g. recognizing actions in movies), dynamically adapt to new tasks without losing performance, and achieve unmatched performance in detecting and integrating out-of-domain data (data they have not been trained on). In other words, many of the big challenges and unknowns that have emerged in the field of deep learning over the last years are already mastered exceptionally well by the biological neuronal networks in our brain.

On the other hand, many facets of typical ANN design and training algorithms seem biologically implausible, such as non-local weight updates, discrete processing of time, and scalar communication between neurons. Recent evidence suggests that learning in biological systems is the result of a complex interplay of diverse error-feedback signaling processes acting at multiple scales, ranging from single synapses to entire networks.
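As a side note on what "non-local weight updates" means in practice: in error backpropagation the error signal delivered to a hidden unit depends on the downstream forward weights, whereas biologically motivated alternatives such as feedback alignment (Lillicrap et al., 2016) replace them with fixed random feedback weights. The NumPy sketch below contrasts the two hidden-layer error signals for a toy two-layer network; the layer sizes and the choice of feedback alignment as the example are assumptions of this illustration, not a description of the course's methods.

```python
# Illustrative sketch only (assumed example, not the lecture's code): the hidden-layer
# error signal under backpropagation vs. feedback alignment for a tiny two-layer net.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 16, 4           # assumed layer sizes

x = rng.standard_normal((n_in, 1))       # one input example
t = rng.standard_normal((n_out, 1))      # its regression target

W1 = 0.1 * rng.standard_normal((n_hid, n_in))
W2 = 0.1 * rng.standard_normal((n_out, n_hid))
B  = 0.1 * rng.standard_normal((n_hid, n_out))   # fixed random feedback weights

# Forward pass: tanh hidden layer, linear output, squared-error loss
a1 = W1 @ x
h  = np.tanh(a1)
y  = W2 @ h
err = y - t                              # dL/dy for L = 0.5 * ||y - t||^2

# Backpropagation: the hidden error uses W2.T, i.e. exact knowledge of the
# downstream forward weights ("non-local" from a single synapse's point of view).
delta_bp = (W2.T @ err) * (1.0 - h**2)

# Feedback alignment: W2.T is replaced by the fixed random matrix B, so no
# downstream weight information is needed to form the hidden error.
delta_fa = (B @ err) * (1.0 - h**2)

grad_W1_bp = delta_bp @ x.T              # the weight update would be -lr * grad
grad_W1_fa = delta_fa @ x.T

cos = float(np.dot(delta_bp.ravel(), delta_fa.ravel())
            / (np.linalg.norm(delta_bp) * np.linalg.norm(delta_fa)))
print(f"cosine similarity between backprop and FA hidden errors: {cos:.3f}")
```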
Lecture notes: The lecture slides will be provided as a PDF after each lecture.
Prerequisites / Notice: This advanced-level lecture requires some basic background in machine/deep learning. Students are therefore expected to have a basic mathematical foundation, including linear algebra, multivariate calculus, and probability. The course is not meant to be an extended tutorial on how to train deep networks in PyTorch or TensorFlow, although these tools are used.
Participation in the course is subject to the following conditions:

1) The number of participants is limited to 120 students (MSc and PhD).

2) Students must have taken the exam in Deep Learning (263-3210-00L) or have acquired equivalent knowledge.
227-1033-00L | Neuromorphic Engineering I (Restricted registration) | W | 6 credits | 2V + 3U | T. Delbrück, G. Indiveri, S.-C. Liu

Registration in this class requires the permission of the instructors. Class size will be limited to available lab spots. Preference is given to students who require this class as part of their major.

Information for UZH students:
Enrolment to this course unit is only possible at ETH. No enrolment to module INI404 at UZH.
Please mind the ETH enrolment deadlines for UZH students: Link
Abstract: This course covers analog circuits with emphasis on neuromorphic engineering: MOS transistors in CMOS technology, static circuits, dynamic circuits, systems (silicon neuron, silicon retina, silicon cochlea) with an introduction to multi-chip systems. The lectures are accompanied by weekly laboratory sessions.
Objective: Understanding of the characteristics of neuromorphic circuit elements.
Content: Neuromorphic circuits are inspired by the organizing principles of biological neural circuits. Their computational primitives are based on the physics of semiconductor devices. Neuromorphic architectures often rely on collective computation in parallel networks. Adaptation, learning and memory are implemented locally within the individual computational elements. Transistors are often operated in weak inversion (below threshold), where they exhibit exponential I-V characteristics and low currents. These properties make high-density, low-power implementations feasible for functions that are computationally intensive in other paradigms. Application domains of neuromorphic circuits include silicon retinas and cochleas for machine vision and audition, real-time emulations of networks of biological neurons, and the development of autonomous robotic systems.

This course covers devices in CMOS technology (the MOS transistor below and above threshold, the floating-gate MOS transistor, phototransducers), static circuits (differential pair, current mirror, transconductance amplifiers, etc.), dynamic circuits (linear and nonlinear filters, adaptive circuits), systems (silicon neuron, silicon retina and cochlea), and an introduction to multi-chip systems that communicate events analogous to spikes. The lectures are accompanied by weekly laboratory sessions on the characterization of neuromorphic circuits, from elementary devices to systems.
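For reference, the "exponential I-V characteristics" mentioned above are usually captured by a first-order subthreshold model of the MOS transistor, for example in the form used in the analog-VLSI literature (e.g. Liu et al.); the exact notation in the lecture may differ. A sketch of the standard expression for an nFET, with all voltages referenced to the bulk:

```latex
% First-order weak-inversion (subthreshold) nFET model; notation assumed, not
% taken from the lecture. I_0 is the off-current scale and \kappa the
% subthreshold slope factor (typically about 0.6-0.9).
I_{ds} = I_0 \, e^{\kappa V_g / U_T}
         \left( e^{-V_s / U_T} - e^{-V_d / U_T} \right),
\qquad U_T = \frac{kT}{q} \approx 26\,\mathrm{mV} \ \text{at room temperature.}
```

In saturation (roughly V_d - V_s greater than a few U_T) the drain term becomes negligible and the current depends exponentially on the gate voltage, which is what makes the dense, low-power analog implementations described above feasible.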
Literature: S.-C. Liu et al.: Analog VLSI Circuits and Principles; various publications.
Prerequisites / Notice: Particular: The course is highly recommended for those who intend to take the spring-semester course 'Neuromorphic Engineering II', which teaches the conception, simulation, and physical layout of such circuits with chip-design tools.

Prerequisites: A background in the basics of semiconductor physics is helpful, but not required.