# Online Course: Machine Learning for Physicists 2021
**Lecture Series by Florian Marquardt**
This is the website for the online lecture series on machine learning (April 12-July 12, 2021). Here we will collect the course overview and links to the forum, code, etc.
This online lecture series takes place every Monday at 6pm CEST (Central European Summer Time), until July 12.
Please **register** for the zoom online meetings at https://fau.zoom.us/meeting/register/u5YvdumpqD8rHNI5-UfwGcmIslkaHVEvD-Jq
(free of charge; you only need to register once for the whole series; I like to keep track of the number of expected participants and collect a bit of statistics, like where people are from and their field of study)
## Some of the questions we will address in this lecture series
- What do artificial neural networks look like?
- How do you train them efficiently?
- How do you recognize images?
- How do you learn a compact representation of data, without providing any extra information?
- How do you analyze time series and sentences?
- How do you discover strategies from scratch without a teacher?
- What are some modern applications in science/physics?
## Most important info
The course is **inverted-classroom style**. This means you watch one of the pre-recorded video lectures (about 90min) on your own, and then we use a live zoom meeting to: discuss the lecture, do live tutorials, and discuss homework problems! The live meetings take place every week on Monday, at 6pm German time (CEST = Central European Summer Time).
Students at FAU Erlangen-Nürnberg, please register in meinCampus; there will be a written exam at the end.
For all other students from anywhere in the world: Just like last year, I will offer you the possibility of doing a 'mini-project', i.e. working on a small project of your own choice. At the end, you will hand in a short write-up plus the code, and you will deliver a short online presentation of the results in front of your peers in the course. If you fulfill these requirements, I will send you a certificate stating that you have participated in this way in the online course. It is up to you to find out whether some dean of studies at your university would even accept this in some official manner - in any case, you could add it to your CV, if you like.
## Some extra info
This course has been delivered twice before 'in person', in the summers of 2017 and 2019, at the [physics department](https://www.physics.nat.fau.eu) of the [university in Erlangen](https://www.fau.eu) (Bavaria/Germany). Both versions have been recorded on video, and it is the 2019 recordings that we will use here (see below for the links). Please disregard any organizational announcements made in the recorded videos, as they of course relate to the 2019 course. We will make any up-to-date announcements in the live sessions.
The original website for the course is https://machine-learning-for-physicists.org, but the detailed up-to-date materials for the present online version will be found here!
I (Florian Marquardt) am a theoretical physicist working at the intersection of nanophysics and quantum optics, with a strong recent focus on applications of machine learning. My group is located at the [Max Planck Institute for the Science of Light](https://mpl.mpg.de/divisions/marquardt-division/), and I am also affiliated with the university in Erlangen. On my low-volume twitter account [FMarquardtGroup](https://twitter.com/FMarquardtGroup), you can find announcements of our most recent research papers as well as job openings (the latter also on the [group website](https://mpl.mpg.de/divisions/marquardt-division/)).
If you want to reach me with questions regarding the course, it is most efficient to use the forum. My email account is usually overflowing with administrative stuff, so I would likely miss your email in my inbox.
Finally, I also would like to thank Thomas Fösel, Leopoldo Sarra, Riccardo Porotti, and Victor Lopez-Pastor for their help in the earlier lectures.
## Link Map (literature and video links)
[Deep Learning Basics Link Map](https://florianmarquardt.github.io/deep_learning_basics_linkmap.html)
with links to a few selected original articles, as well as to the video lectures (YouTube links).
The lecture videos are available on:
[Lecture videos playlist on YouTube](https://www.youtube.com/playlist?list=PLemsnf33Vij4eFWwtoQCrt9AHjLe3uo9_)
[Lecture videos on FAU Erlangen platform (originals)](https://www.fau.tv/course/id/778)
[Lecture videos on iTunes University](https://podcasts.apple.com/us/podcast/id1490099216)
## Lecture Notes
Although there are no extended lecture notes covering all the material of this series, I have produced a condensed set of lecture notes (40 pages) on the occasion of the Les Houches Summer School 2019. You can download a draft here (including discussions of applications to quantum physics, and a bit about quantum machine learning):
[Les Houches 2019 Machine Learning Lecture Notes](https://arxiv.org/abs/2101.01759)
If you find these lecture notes useful for your research, please cite them, e.g. like this:
*F. Marquardt, "Machine Learning and Quantum Devices", arxiv:2101.01759, to appear in "Quantum Information Machines; Lecture Notes of the Les Houches Summer School 2019", eds. M. Devoret, B. Huard, and I. Pop*
Here are the slides used in the lectures, split into three parts (the first goes up to and including t-SNE; the third covers some applications to science).
[Machine Learning for Physicists, Slides, Part One (pdf, 25 MB)](https://owncloud.gwdg.de/index.php/s/qetLJgXMW6u3FwC)
[Machine Learning for Physicists, Slides, Part Two (pdf, 13 MB)](https://owncloud.gwdg.de/index.php/s/Sc89k0zzFlaw0g9)
[Machine Learning for Physicists, Slides, Part Three (pdf, 4 MB)](https://owncloud.gwdg.de/index.php/s/iLpJmXvDI2I7zpz)
## Material from previous series
[Link to the old 2020 lecture series website](https://pad.gwdg.de/Machine_Learning_For_Physicists_2020)
## Discussion Forum
This is the forum we already used for the 2020 version of this lecture series. I think some of the old posts may still be useful, so we continue there.
## Github code repository
All the python code for the tutorials, homework, and examples in the lectures (as jupyter notebooks and also as pure python scripts):
## Software needed
First of all, you want to install the [python](https://www.python.org) programming language. Also, I recommend [JupyterLab](https://jupyter.org) as a convenient notebook interface for python programming. Depending on your taste and your system, you might want to download these individually or as part of a full distribution like [Anaconda](https://www.anaconda.com).
An alternative, completely online solution is the [Colaboratory](https://colab.research.google.com/notebooks/intro.ipynb) platform by Google. This is a web-based jupyter notebook interface to python that comes with tensorflow & keras pre-installed and allows you to run your code for free on their GPU servers.
A third alternative is [deepnote](https://deepnote.com/). This is also a free online platform with a jupyter interface (and tensorflow & keras). But in contrast to colaboratory, several people can edit the same notebook (and execute cells) at the same time! We use it internally in our group for joint programming sessions to explore topics (discussing via a zoom session).
Starting in the 3rd lecture, we will use [keras](https://keras.io), the convenient high-level python neural-network software package. This is included automatically in every recent [tensorflow](https://www.tensorflow.org) installation, so I recommend installing tensorflow (after having python) and then getting access to keras commands in the form
```python
import tensorflow as tf
from tensorflow.keras.layers import Dense
```
Note: The examples shown in the course assumed a standalone keras installation (with tensorflow or a similar package underneath as the backend), so the import syntax there looks slightly different. Adapt as needed.
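As a quick check that tensorflow and keras are set up correctly, here is a minimal sketch using these imports (my own illustration, not code from the lectures; the layer sizes and activations are arbitrary):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense

# A tiny fully connected network: 2 inputs -> 30 hidden -> 1 output
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    Dense(30, activation="sigmoid"),
    Dense(1, activation="sigmoid"),
])

# Forward pass on a batch of 5 random input vectors
y = model(np.random.rand(5, 2)).numpy()
print(y.shape)  # (5, 1)
```

If this runs and prints `(5, 1)`, your installation is ready for the keras-based parts of the lectures.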
If you have trouble installing the software (all of which is free!), you may use the forum (link above) to ask other students in the course who might help you.
If you have suggestions for useful software or platforms etc., please post them on the forum! This is a community effort (since I will not be able to provide individual help with installations).
## Course overview: Lectures and online sessions
Note: All notebooks herein are distributed under the terms of the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license, with the exception of the few notebooks that contain code by other people (which is noted inside the notebook).
### Session 0: Introduction
There is no video lecture to watch before this session; it is just the introduction.
:calendar: Online session introducing the course: **April 12, 6pm** (all times CEST, i.e. time in Germany)
:movie_camera: [video recording](https://owncloud.gwdg.de/index.php/s/iadCsoIXbPgenRs)
We will discuss the lecture series program, prerequisites, and the format of the online sessions. I will also say some words about the pre-history of neural networks.
[Old 2020 Slides for online session (PDF, 14 MB)](https://owncloud.gwdg.de/index.php/s/icQRtIwFtwPvXfh)
Suggested historical reading: [Alan Turing's "Computing Machinery and Intelligence" (1950)](https://academic.oup.com/mind/article/LIX/236/433/986238), introducing what is now known as the "Turing test" for artificial intelligence. A wonderful (and very nontechnical) paper!
Some recent 'Turing test chat' with the most powerful neural language model (GPT-3): https://kirkouimet.medium.com/turing-test-2669daffae38 (beware: in the description he writes that he sometimes reruns the model's probabilistic responses until he finds something suitable, so this might be a bit of 'cheating'...)
Further historical reading: [Rosenblatt's concept of a "Perceptron" (1957)](http://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf)
### Lecture 1: Basic Structure of Neural Networks
:cinema: Recorded video: [Lecture 1](https://www.video.uni-erlangen.de/clip/id/10611) (slides: see above in section "Slides")
:calendar: Online Q&A session and tutorials about this material: **April 19, 6pm** (please watch the lecture video before).
Contents: Introduction (the power of deep neural networks in applications), brief discussion of the lecture outline, structure of a neural network and information processing steps, very brief introduction to python and jupyter, implementing a deep neural network efficiently in basic python (without any additional packages like tensorflow), illustration: complicated functions generated by deep neural networks
After this lecture, you will know the basic structure of a neural network and how to implement the 'forward-pass' in python, but you don't yet know how to adapt the network weights (training).
**Code (jupyter notebook):** [01_MachineLearning_Basics_NeuralNetworksPython.ipynb](https://owncloud.gwdg.de/index.php/s/Unl2Yru1HsqwQNK)
(or download [code as pure python script](https://owncloud.gwdg.de/index.php/s/WLfHIv2YXhq0Z29))
This notebook shows how to calculate the forward-pass through a neural network in pure python, and how to illustrate the results for randomly initialized deep neural networks (as shown in the lecture).
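In the same spirit (though not the notebook's exact code; the layer sizes, weight ranges, and sigmoid activation below are chosen purely for illustration), the forward pass through a randomly initialized network can be sketched in a few lines of numpy:

```python
import numpy as np

def apply_layer(y, w, b):
    """One layer: linear step (weights w, biases b) followed by a sigmoid."""
    z = np.dot(y, w) + b
    return 1 / (1 + np.exp(-z))

def forward_pass(y, weights, biases):
    """Propagate a batch of input vectors y through all layers."""
    for w, b in zip(weights, biases):
        y = apply_layer(y, w, b)
    return y

# Randomly initialized network: 2 inputs -> 30 hidden -> 1 output
rng = np.random.default_rng(0)
sizes = [2, 30, 1]
weights = [rng.uniform(-1, 1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [rng.uniform(-1, 1, n) for n in sizes[1:]]

batch = rng.uniform(-1, 1, (5, 2))   # batch of 5 input vectors
out = forward_pass(batch, weights, biases)
print(out.shape)  # (5, 1)
```

Each row of `batch` is one input vector, so the whole batch is processed with a single matrix multiplication per layer.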
**Notebooks for tutorials:**
[Tutorial: Network Visualization notebook](https://owncloud.gwdg.de/index.php/s/lugGF9eCxClOk56) (also as [pure python](https://owncloud.gwdg.de/index.php/s/SgogI8iVzi60owR))
[Tutorial: Curve Fitting notebook](https://owncloud.gwdg.de/index.php/s/63ok2nUCuTvYwb7) (also as [pure python](https://owncloud.gwdg.de/index.php/s/SgogI8iVzi60owR))
The "Network Visualization" notebook shows how to visualize arbitrary multilayer networks (with two input neurons and one output neuron, so the result can be displayed in the plane), including a visualization of the network structure.
The "Curve Fitting" notebook visualizes nonlinear curve fitting for a 1D function with a few parameters, via stochastic gradient descent. This is useful to understand what is going on in the higher-dimensional case of neural networks, where essentially the same concept is applied.
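As a minimal sketch of that idea (my own toy example, not the notebook's code; the model function, learning rate, and target parameters are all chosen for illustration), here is stochastic gradient descent fitting the two parameters of f(x) = a·sin(b·x):

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x, a, b):
    return a * np.sin(b * x)

a_true, b_true = 2.0, 1.5   # parameters that generate the "data"
a, b = 1.0, 1.0             # initial guess
eta = 0.02                  # learning rate

for step in range(10000):
    x = rng.uniform(-2, 2, 10)       # small random batch of x samples
    dev = f(x, a, b) - f(x, a_true, b_true)
    # gradients of the mean-squared deviation with respect to a and b
    grad_a = np.mean(2 * dev * np.sin(b * x))
    grad_b = np.mean(2 * dev * a * x * np.cos(b * x))
    a -= eta * grad_a
    b -= eta * grad_b

print(a, b)  # should end up close to a_true and b_true
```

Because each step only sees a small random batch of x values, the updates fluctuate around the true gradient; this 'sampling noise' is exactly the effect explored in the homework below.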
**Homework for Lecture 1**
(The solutions will be briefly discussed in the online session for lecture 2. In order to follow the lecture series, please do at least two of these problems – I suggest the ones with the *. And of course, if you do more, you will get more practice and quickly become a master!)
(1)* Implement a network that computes XOR (arbitrary number of hidden layers); meaning: for inputs y1, y2 the output should be +1 if y1·y2 < 0 and 0 otherwise!
(2) Implement a network that approximately or exactly computes XOR, with just 1 hidden layer(!)
(3)* Visualize the results of intermediate layers in a multi-layer randomly initialized NN (meaning: take a fixed randomly initialized multi-layer network, and then throw away the layers above layer n; and directly connect layer n to the output layer; see how results change when you vary n; you can start from the notebook [01_MachineLearning_Basics_NeuralNetworksPython.ipynb](https://owncloud.gwdg.de/index.php/s/Unl2Yru1HsqwQNK))
(4) What happens when you change the spread of the random weights? Smart weight initialization is an important point for NN training.
(5) Explore cases of curve fitting where there are several (non-equivalent) local minima. Is sampling noise helpful (i.e. the noise that comes about because of the small number of x samples)?