Description
This course covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites: DS-GA 1001 Intro to Data Science or a graduate-level machine learning course.
DS-GA 1008 · SPRING 2021
Instructors: Lectures – Yann LeCun | Practicum – Alfredo Canziani
Lectures: Mondays, 9:30 – 11:30am EST, Zoom
Practica: Tuesdays, 9:30 – 10:30am EST
Forum: r/NYU_DeepLearning
Discord: NYU DL
Material: 2021 Repo
Please note that we officially support direct communication with students taking this course online via our Reddit and Discord platforms.
2021 edition disclaimer
Check the repo’s README.md and learn about:
- The new organisation of the content
- The intellectual dilemma of the semester’s second half
- This semester’s repository
- Previous releases
Lectures
Most of the lectures, labs, and notebooks are similar to those of the previous edition; nevertheless, some are brand new. I will try to make clear which is which.
Legend: 🖥 slides, 📝 notes, 📓 Jupyter notebook, 🎥 YouTube video.
Translations
If you’re interested in assisting the Deep Learning team with translation, please contact Alfredo Canziani at canziani@nyu.edu.
Theme 1: Introduction
- History and resources 🎥 🖥
- Gradient descent and the backpropagation algorithm 🎥 🖥
- Neural nets inference 🎥 📓
- Modules and architectures 🎥
- Neural nets training 🎥 🖥 📓📓
- Homework 1: backprop
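As a taste of what Homework 1 involves, here is a hypothetical minimal sketch (not part of the course materials) of gradient descent with gradients computed by hand via the chain rule, for a 1-D linear model with squared loss; the data, learning rate, and iteration count are all illustrative choices.

```python
def loss_and_grads(w, b, xs, ys):
    """Mean squared error and its gradients w.r.t. w and b."""
    n = len(xs)
    dw = db = loss = 0.0
    for x, y in zip(xs, ys):
        e = w * x + b - y          # forward pass: prediction error
        loss += e * e / n
        dw += 2 * e * x / n        # backward pass: chain rule
        db += 2 * e / n
    return loss, dw, db

xs = [-1.0, 0.0, 1.0, 2.0]
ys = [3 * x + 1 for x in xs]       # data sampled from the line y = 3x + 1

w = b = 0.0
lr = 0.1
for _ in range(500):
    loss, dw, db = loss_and_grads(w, b, xs, ys)
    w -= lr * dw                   # gradient-descent update
    b -= lr * db

print(round(w, 2), round(b, 2))    # → 3.0 1.0
```

In the lectures and notebooks the manual gradients are replaced by PyTorch’s autograd, which performs the same chain-rule computation automatically.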
Theme 2: Parameter sharing
- Recurrent and convolutional nets 🎥 🖥 📝
- ConvNets in practice 🎥 🖥 📝
- Properties of natural signals and the convolution 🎥 🖥 📓
- Recurrent neural networks, vanilla and gated (LSTM) 🎥 🖥 📓📓
- Homework 2: RNN & CNN
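The parameter sharing this theme is named after can be sketched in a few lines: a 1-D convolution (strictly, cross-correlation) slides the same small kernel over every position of the signal, so one set of weights is reused everywhere. This toy example is an illustration, not course material.

```python
def conv1d(signal, kernel):
    """Valid cross-correlation of a 1-D signal with a kernel."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

x = [1, 2, 3, 4, 5]
w = [1, 0, -1]                 # finite-difference (edge-detecting) kernel
print(conv1d(x, w))            # → [-2, -2, -2]
```

The same three weights are applied at every position, which is why convolutional nets need far fewer parameters than fully connected ones on natural signals.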
Theme 3: Energy-based models, foundations
- Energy-based models (I) 🎥 🖥
- Inference for LV-EBMs 🎥 🖥
- What are EBMs good for? 🎥
- Energy-based models (II) 🎥 🖥 📝
- Training LV-EBMs 🎥 🖥
- Homework 3: structured prediction
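The core idea behind inference in these lectures can be sketched very simply: given an input x, pick the output y that minimises a scalar energy E(x, y). The quadratic energy and candidate grid below are hypothetical illustrations, not the course’s own examples.

```python
def energy(x, y):
    """Toy energy function: low when y ≈ x²."""
    return (y - x ** 2) ** 2

def infer(x, candidates):
    """Inference in an EBM = argmin of the energy over candidate outputs."""
    return min(candidates, key=lambda y: energy(x, y))

ys = [i / 10 for i in range(100)]   # discrete candidate grid 0.0 … 9.9
print(infer(2.0, ys))                # → 4.0
```

With a latent variable z, the same recipe minimises (or marginalises) over z as well; the lectures develop this for latent-variable EBMs (LV-EBMs).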
Theme 4: Energy-based models, advanced
- Energy-based models (III) 🎥 🖥
- Unsupervised learning and autoencoders 🎥 🖥
- Energy-based models (IV) 🎥 🖥
- From LV-EBM to target prop to (any) autoencoder 🎥 🖥
- Energy-based models (V) 🎥 🖥
- AEs with PyTorch and GANs 🎥 🖥 📓📓
Theme 5: Associative memories
Theme 6: Graphs
- Graph transformer nets [A][B] 🎥 🖥
- Graph convolutional nets (I) [from last year] 🎥 🖥
- Graph convolutional nets (II) 🎥 🖥 📓
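At its simplest, one graph-convolutional layer aggregates each node’s features over the node itself and its neighbours. The mean-aggregation rule and the tiny path graph below are a hypothetical sketch (real GCN layers add learned weight matrices and nonlinearities), not taken from the course notebooks.

```python
def gcn_layer(adj, feats):
    """One mean-aggregation step: node <- mean over {self} + neighbours."""
    out = []
    for i, row in enumerate(adj):
        nbrs = [j for j, a in enumerate(row) if a] + [i]
        out.append(sum(feats[j] for j in nbrs) / len(nbrs))
    return out

adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]                        # path graph 0 – 1 – 2
print(gcn_layer(adj, [1.0, 2.0, 3.0]))   # → [1.5, 2.0, 2.5]
```

Stacking such layers lets information propagate further across the graph, one hop per layer.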
Theme 7: Control
Theme 8: Optimisation
Miscellaneous
- SSL for vision [A][B] 🎥 🖥
- Low resource machine translation [A][B] 🎥 🖥
- Lagrangian backprop, final project, and Q&A 🎥 🖥 📝
DS-GA 1008 · SPRING 2020 · CDS
Instructors: Lectures – Yann LeCun | Practicum – Alfredo Canziani
Lectures: Mondays, 16:55 – 18:35
Practica: Tuesdays, 19:10 – 20:00
Material: Google Drive, Notebooks
NYU Deep Learning Reddit
Lectures
Legend: 🖥 slides, 📓 Jupyter notebook, 🎥 YouTube video.
Translations
🇬🇧 English | 🇸🇦 Arabic | 🇧🇩 Bengali, Bangla | 🇨🇳 Chinese | 🇫🇷 French | 🇭🇺 Hungarian | 🇮🇹 Italian | 🇯🇵 Japanese | 🇰🇷 Korean | 🇮🇷 Persian | 🇵🇹 Portuguese | 🇷🇺 Russian | 🇹🇷 Turkish | 🇷🇸 Serbian | 🇪🇸 Spanish | 🇻🇳 Vietnamese
People
— Yann
Deep Learning by New York University, Yann LeCun, Alfredo Canziani is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Based on a work at https://cds.nyu.edu/deep-learning/