Deep Learning is a branch of Machine Learning that applies deep neural networks to model data such as language (labelling sentences as having a positive or negative sentiment, finding the main topics in documents, translating between languages), speech (transcribing speech to text), and images (detecting and labelling objects within images). We expect (and indeed we hope!) to attract attendees from a very wide range of backgrounds, e.g. statistics, information engineering, statistical physics, computational neuroscience, econometrics, and computational biology. So if you’re interested, you should apply.
The lectures will range from more introductory to more advanced topics, and will cover topics from across machine learning. We will aim throughout to solidify concepts and fill in gaps through discussions, forming study groups, and having practical sessions. If you’ve completed any previous machine learning course (at a university or an online course), you will have covered most of the necessary background concepts. See recordings from the past Indabas to get a better sense of the material.
You should be able to follow most of the lectures if you have a basic knowledge of these topics (we will also offer refreshers on most of these):
Linear algebra:
If you know what vectors and matrices are, and you can multiply a matrix with a vector, you’ll be fine. If not, see the following:
- Linear algebra (PDF), chapter 2 in Deep Learning, 2016.
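As a quick self-check, here is a matrix-vector product in Python with NumPy (a minimal sketch; the matrix and vector values are arbitrary examples, not from any course material):

```python
import numpy as np

# A 2x3 matrix A and a vector x of length 3 (arbitrary example values).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, 0.0, -1.0])

# Matrix-vector product: each entry of y is the dot product
# of one row of A with x.
y = A @ x
print(y)  # [-2. -2.]
```

If the result makes sense to you (row one: 1·1 + 2·0 + 3·(−1) = −2), you have the linear algebra background you need.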
Calculus:
If you know how to take derivatives of functions, and how to optimise a function in one variable (find its maximum or minimum), you are fine. If you’ve been exposed to working with multiple variables and know about gradients, you’re in excellent shape. If not, brush up by reading:
- Numerical computation (PDF), chapter 4 in Deep Learning, 2016, or
- Background mathematics (PDF), appendix A in Bayesian Reasoning and Machine Learning, 2017
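To illustrate the kind of optimisation meant here, the sketch below minimises a one-variable function by repeatedly stepping against its derivative (the function, starting point, and step size are arbitrary choices for illustration):

```python
# Minimise f(x) = (x - 3)^2 using its derivative f'(x) = 2*(x - 3).
def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0    # arbitrary starting point
lr = 0.1   # step size ("learning rate")
for _ in range(100):
    x = x - lr * grad(x)  # step downhill, against the derivative

print(x)  # converges towards 3.0, the minimiser of f
```

This same idea, applied to functions of many variables via their gradient, is how neural networks are trained.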
Probability and statistics:
If you know what random variables are and what expected values are, you should be fine. If you’ve heard of Gaussian/Normal distributions and Bayes’ Rule then you are in excellent shape! If not, read through:
- Probabilistic reasoning (PDF), chapter 1 in Bayesian Reasoning and Machine Learning, 2017, or
- Probability and information theory (PDF), chapter 3 in Deep Learning, 2016.
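As a self-check on Bayes’ Rule, here is a small worked example in Python (the prevalence and test-accuracy numbers are made up for illustration):

```python
# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B), applied to a
# hypothetical diagnostic test with made-up numbers:
p_disease = 0.01              # prior: 1% prevalence
p_pos_given_disease = 0.95    # test sensitivity
p_pos_given_healthy = 0.10    # false-positive rate

# Total probability of a positive test (law of total probability).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior probability of disease given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.088
```

If it is clear to you why the posterior is still under 10% despite a 95%-sensitive test, your probability background is in good shape.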
Programming:
Any programming experience would be helpful; e.g., if you know what variables are and can write if-statements (conditionals) and loops and use functions, you’ll be fine. In particular, if you’ve programmed in Python or have done any numerical computing in Matlab or Scipy, you are in excellent shape!
However, we don’t expect anyone to be an expert in any of these topics, and we cannot emphasise enough that if you are interested, we strongly encourage you to apply!