**Lecturer: Giacomo Indiveri**
**What is this course about?**
- Information processing in the brain: neurons, synapses, nervous system organization
- Analytical descriptions of neural computations
- Learning and Plasticity
- Encoding information in the brain
- Theoretical neural network models
- Engineering brain-like computers
**The Human Brain**
The average human brain weighs about 1.5 kg and has a volume of 1.1-1.2 l.
We have a brain because we can move and interact with the world.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image2.png]]
For example, the sea squirt to the right has a brain during the early stages (first few hours) of its life, but once it finds a rock to attach to, it digests its own brain: it is no longer needed for movement and would otherwise be an energy-expensive organ to maintain.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image1.png]]
**Producing Behaviour**
Our brain is fundamentally needed to interpret sensory data, make decisions, and take actions. It exists to act in the environment.
**Neurons and Brains**
Human brains are large, but by far not the largest (elephants and whales have larger ones). The cells (neurons) that make up brains are very similar across species.
|![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image3.png]]|![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image4.png]]|
|---|---|
**Brain Evolution**
The size and neuron density of the brain have steadily increased over the course of evolution.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image5.png]]
**A very brief History of Neuroscience**
- **1839** - Theodor Schwann proposes that the biological tissue of organisms is composed of cells.
- **1873** - Camillo Golgi develops a method to stain nervous tissue ("La Reazione Nera").
- **1887** - Santiago Ramón y Cajal reports individual nerve cells in bird brains.
- **1936** - Sir Henry Dale and Otto Loewi are awarded the Nobel prize for discovering the principles of synaptic transmission.
- **1949** - Donald Hebb postulates his theory of "Hebbian" learning.
- **1952** - Hodgkin and Huxley propose their model of action potential generation.
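Hebb's 1949 postulate is often summarized as "neurons that fire together, wire together". A minimal rate-based sketch of the rule (the learning-rate value and array shapes are illustrative choices, not from the lecture):

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: each weight grows in proportion to the
    product of its pre- and post-synaptic activity (dw = lr * post * pre)."""
    return w + lr * np.outer(post, pre)

pre = np.array([1.0, 0.0, 1.0])   # pre-synaptic firing rates
post = np.array([1.0, 0.5])       # post-synaptic firing rates
w = np.zeros((2, 3))              # 2 post-neurons, 3 pre-neurons
w = hebbian_update(w, pre, post)
# Only synapses where both pre and post are active have grown;
# the synapse from the silent pre-neuron (index 1) stays at zero.
```

Note this plain rule only strengthens weights; practical variants (e.g. with normalization or decay) are needed to keep weights bounded.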
**Synapses**
In 1897, **Charles Sherrington** introduced the term **synapse** to describe the specialized structure at the zone of contact between neurons: the point at which one neuron communicates with another.
**EPSC and EPSP**
- Superimposed excitatory post-synaptic currents (EPSCs) recorded in a neuron at different membrane potentials.
- Excitatory post-synaptic potential (EPSP) in response to multiple pre-synaptic spikes.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image6.png]]
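The temporal summation of EPSPs can be sketched by assuming each pre-synaptic spike evokes a potential deflection that jumps by `amp` and decays exponentially with time constant `tau` (a common first-order approximation; the amplitude and time constant here are illustrative, not taken from the figure):

```python
import numpy as np

def epsp_trace(spike_times, t, amp=1.0, tau=10.0):
    """Summed EPSP deflection at times t (ms): each spike adds an
    instantaneous jump of `amp` that decays with time constant `tau`."""
    v = np.zeros_like(t)
    for ts in spike_times:
        v += amp * np.exp(-(t - ts) / tau) * (t >= ts)  # causal decay
    return v

t = np.arange(0.0, 100.0, 1.0)          # time axis in ms
v = epsp_trace([10.0, 15.0, 20.0], t)   # three closely spaced spikes
# Closely spaced spikes sum temporally: after the third spike the
# deflection exceeds the amplitude of a single EPSP.
```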
**Spike Generating Mechanism**
If the membrane voltage increases above a certain threshold, a spike-generating mechanism is activated and an action potential is initiated.
**Spike Properties and the F-I Curve**
|![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image7.png]]|![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image8.png]]|
|---|---|
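The threshold mechanism and the resulting F-I curve can be illustrated with a leaky integrate-and-fire neuron, a standard simplification (parameter values here are illustrative, not taken from the lecture figures):

```python
def lif_rate(I, tau=20.0, R=1.0, v_th=1.0, v_reset=0.0, dt=0.1, T=1000.0):
    """Simulate a leaky integrate-and-fire neuron for T ms with constant
    input current I and return its firing rate in Hz."""
    v, spikes = v_reset, 0
    for _ in range(int(T / dt)):
        v += dt / tau * (-v + R * I)   # leaky integration of the input
        if v >= v_th:                  # threshold crossed: emit a spike
            spikes += 1
            v = v_reset                # reset after the spike
    return spikes / (T / 1e3)

# F-I curve: the rate is zero below the rheobase current (R * I < v_th),
# then grows with increasing input current.
rates = [lif_rate(I) for I in (0.5, 1.2, 2.0, 4.0)]
```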
**The First Models of Neurons - Warren McCulloch and Walter Pitts (1943)**
|![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image9.png]]|![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image10.png]]|
|---|---|
"A logical calculus of the ideas immanent in nervous activity"
The McCulloch & Pitts model quickly became extremely popular and dominated the **Artificial Neural Network** scene for decades.
**Why?** Isomorphism with the calculus of logical propositions. In the hands of John von Neumann, the McCulloch & Pitts model became the basis for the logical design of digital computers.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image11.png]]
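The isomorphism with propositional logic can be made concrete: a McCulloch & Pitts unit takes binary inputs, weights them, and fires iff the sum reaches a threshold. The weight and threshold choices below are illustrative, not from the 1943 paper:

```python
def mp_neuron(inputs, weights, threshold):
    """Fire (1) iff the weighted input sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

def AND(x, y):
    return mp_neuron((x, y), (1, 1), threshold=2)

def OR(x, y):
    return mp_neuron((x, y), (1, 1), threshold=1)

def NOT(x):  # inhibition as a negative weight with threshold 0
    return mp_neuron((x,), (-1,), threshold=0)

# Any Boolean function follows by composing units, e.g. XOR:
def XOR(x, y):
    return AND(OR(x, y), NOT(AND(x, y)))
```

Since AND, OR and NOT form a complete basis for propositional logic, networks of such units can compute any Boolean function.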
**The Turing Machine**
A universal computing machine: a finite state machine that reads and writes symbols on a tape can perform any computable logical operation.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image12.png]]
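A toy version of the idea, assuming the usual tape-plus-rule-table formulation (the machine below, which inverts a binary string, is my own illustration, not from the lecture):

```python
def run_tm(tape, rules, state="s", blank="_"):
    """Minimal Turing machine: rules maps (state, symbol) to
    (symbol_to_write, move 'L'/'R', next_state); halts in state 'H'."""
    tape, head = list(tape), 0
    while state != "H":
        sym = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, sym)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)       # extend the tape on demand
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Rule table: flip every bit, halt on the blank after the input.
invert = {
    ("s", "0"): ("1", "R", "s"),
    ("s", "1"): ("0", "R", "s"),
    ("s", "_"): ("_", "R", "H"),
}
result = run_tm("1011", invert)   # -> "0100"
```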
**Artificial vs Natural Intelligence**
How is the brain different from a computer?
**Hard and Easy Problems**
"*The main lesson of thirty-five years of AI research is that the hard problems are easy and the easy problems are hard" - Steve Pinker*
*"It is comparatively easy to make computer exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility" - Hans Moravec*
**Artificial vs Real Neural Networks**
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image13.png]]
**Artificial Intelligence: The Deep Network Revolution**
- Although the first successes of ANNs were demonstrated in the 1980s, they only started to outperform classical optimization and engineering approaches from 2009 onwards.
- In 2011, CNNs trained using **back-propagation** on GPUs achieved a 0.56% error rate in a visual pattern recognition contest, outperforming for the first time both humans (by a factor of 2x) and non-neural state-of-the-art algorithms (by a factor of 6x).
**Deep Networks Galore**
- As CNNs and DNNs outperformed classical approaches, many research groups started to extend and optimize them.
- The AI field is now (mostly) dominated by attempts to improve accuracy on standard benchmarks, by scaling up network size and parameter count.
**Deep Networks Computing Power Demands**
- GPT-3 is a network with 175 billion parameters and a memory footprint exceeding 350 GB.
- According to a conservative estimate, training GPT-3 cost **over \$4.6 million**.
- Google's energy consumption has more than quadrupled since GPUs began to be used for DNN training.
**Problems and Limitations of Artificial Intelligence**
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image14.png]]
**Neuromorphic Intelligence**
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image15.png]]
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image16.png]]
**Design Principles for Emulating Natural Intelligence**
- **Clock Speed**: brains outperform much faster computing systems in many sensory processing tasks, while running at lower speeds and with less power.
- **The Hardware is the Algorithm**: the brain uses the time evolution of the physical system to implement its computations. Neural circuits compute by exploiting the dynamics and the natural time evolution of the physical hardware substrate.
- **Animal brains**:
- Slow, noisy and variable processing elements.
- Local connectivity, small world networks.
- Massively parallel distributed computation.
- Closed-loop interaction with the environment.
- Real-time spatio-temporal signal processing.
- Continual always-on learning.
![[ETH/ETH - Introduction to Neuroinformatics/Images - ETH Introduction to Neuroinformatics/image17.png]]
**Brain-Inspired Principles: a Radical Paradigm Shift**
**Exploit Physical Space**
- Use parallel arrays of spiking neurons.
- Maximize fine-grained parallelism (no time-multiplexing).
- Co-localize memory and computation.
- Use dynamic, event-driven, passive synapse circuits.
- Exploit all the properties of transistors and memristors.
**Let Time Represent Itself**
- For interacting with the environment in real-time.
- To match the circuit time constants to the input signal dynamics.
- For inherently synchronizing with the real-world natural events.
- To process sensory signals efficiently.
**The Neuromorphic Computing Approach**
- Highly **interdisciplinary research** rooted in neuroscience, non-linear dynamical systems theory, physics, microelectronics, ...
- Exploit the **physics of silicon and emerging nano-technologies** in electronic circuits to reproduce the bio-physics of neural systems.
- Design spiking neural network processing chips using **mixed signal analog/digital** circuits and technologies.
- Build **real-time autonomous robotic agents** able to interact with the environment and exhibit intelligent behaviors and cognitive abilities.
**Edge-Computing Application Specific Tasks**
**Technology Transfer and Applications**
We are now entering the era of neuromorphic intelligence in which dedicated cognitive **chiplets** will be used to provide intelligence to a multitude of extreme edge-computing use cases.
**On-going research: Mind-Brain-Body Iterative Refinement**
- We study the **principles of computation** of real cortical circuits and validate them on neuromorphic systems that interact intelligently with the environment.
- We exploit progress in technology to develop mixed-signal **neuromorphic electronic circuits** for emulating neural dynamics and learning in real-time.
- We build analog/digital neural processing systems interfaced to sensors and robotic platforms that can learn to **exhibit cognitive abilities**.