Course Info

Deep learning is widely used in a growing range of applications, including image classification and generation, text comprehension, signal processing, game playing, and more. This course will focus on the algorithms, programming frameworks, and new hardware and software interfaces that enable efficient execution of deep learning algorithms. It will provide both the necessary theoretical background and the hands-on experience required to become an effective deep learning practitioner, or to start on the path towards deep learning research.

Learning Outcomes

At the end of the course, the student will:

  1. Understand and be able to apply the fundamental notions of deep learning.
  2. Know how to effectively use leading Python machine learning and deep learning frameworks such as PyTorch (see the sketch after this list).
  3. Know how to optimize software and hardware performance in deep neural network applications.
  4. Know how to leverage GPUs and write custom computational kernels to accelerate both training and inference.
  5. Perform a small research project using the studied notions and techniques.
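
To give a flavour of outcomes 2 and 4, here is a minimal, illustrative PyTorch training loop. The model, the random toy data, and all hyperparameters are placeholders chosen for this sketch and are not part of the course materials.

```python
# A minimal sketch of a PyTorch training loop on random toy data.
import torch
import torch.nn as nn

# Run on a GPU if one is available (outcome 4), otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy data: 256 samples, 10 features, binary labels.
X = torch.randn(256, 10)
y = torch.randint(0, 2, (256,)).float()

# A small multi-layer perceptron and a standard optimizer/loss.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):
    optimizer.zero_grad()
    logits = model(X.to(device)).squeeze(1)
    loss = loss_fn(logits, y.to(device))
    loss.backward()    # autograd computes the gradients
    optimizer.step()   # SGD updates the parameters
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```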

Administration

Evaluation: 40% Homework assignments, 60% final project.

Language: The course will be taught in English.

Credits: 3.0.

Course Staff

Lecturers

Prof. Alex Bronstein

Prof. Avi Mendelson

TAs

Chaim Baskin (Head TA)

Aviv Rosenberg (Assignments TA)

Checkers

Evgenii Zheltonozhskii (Homework Checker)

Literature

The course does not follow any specific book. For your own reference, the following material may be useful.

  • Deep Learning

    Ian Goodfellow, Yoshua Bengio, Aaron Courville

    MIT Press, 2016

  • Deep Learning with PyTorch

    Vishnu Subramanian

    Packt, 2018

Detailed Syllabus

This semester, the course will be presented using a flipped-classroom approach.

Students are expected to watch and read the prerequisite material, available from the course Lectures page, before each class. Each in-class lecture will then be divided into a supplementary part, elaborating on the prerequisite material, and an introductory part presenting new material for the next lecture.

| Date | # | Prerequisite | Lecture | Tutorial | Homework |
|------|---|--------------|---------|----------|----------|
| 17/03/2019 | 1 | | Introduction to machine learning | Python, numpy and friends | |
| 24/03/2019 | 2 | Lecture 1b | Introductory: Supervised learning, probability and statistics | Logistic regression | HW1 |
| 31/03/2019 | 3 | Lecture 2 | Supplementary: performance evaluation, ROC, confusion matrix; Introductory: neural networks | MLP | |
| 07/04/2019 | 4 | Lecture 3 | Supplementary: CNN architectures; Introductory: training, calculus, optimization | CNNs | |
| 14/04/2019 | 5 | Lecture 4 | Training deep networks: optimization, generalization and regularization | | HW2 |
| 21/04/2019 | 6 | | No class | | |
| 28/04/2019 | 7 | Lecture 5 | Supplementary: Word embeddings, attention; Introductory: Unsupervised learning | RNNs | |
| 05/05/2019 | 8 | Lecture 6 | Supplementary: GANs, image generation, domain adaptation; Introductory: Reinforcement learning | Domain adaptation | |
| 12/05/2019 | 9 | Lecture 7 | Supplementary: Actor-critic, AutoML, NAS; Introductory: Non-Euclidean domains, harmonic analysis | | HW3 |
| 19/05/2019 | 10 | Lecture 11 | Supplementary: Applications of CNNs on graphs; Introductory: Hardware accelerators | Deep reinforcement learning | |
| 26/05/2019 | 11 | Lecture 8 | Neural network compression and pruning | | |
| 02/06/2019 | 12 | Lecture 9 | Supplementary: GPU architectures | Geometric deep learning | |
| 09/06/2019 | 13 | | No class | | HW4 |
| 16/06/2019 | 14 | Lecture 10 | Supplementary: Hardware for inference | CUDA | |
| 23/06/2019 | 15 | | Project presentations | | |