CENG 501 - Deep Learning - Fall 2024
Description
This course assumes that the student has already taken a course on the fundamentals of deep learning and is familiar with conventional models such as Multi-Layer Perceptrons, Convolutional Neural Networks, Recurrent Neural Networks and Long Short-Term Memory Networks.
After a review of these models, the course will cover self-attention & transformers, large language models, vision-language models, generative models, self-supervised learning and reinforcement learning in detail.
Frequently Asked Questions
Please read the FAQ very carefully if you wish to take the course.
Syllabus
You can access the syllabus here.
Lectures
- Week 12 - Generative Models (Generative Adversarial Networks, Energy-based Models, Diffusion Models). [Slides]
- Week 11 - Generative Models (Autoregressive approaches, Variational methods, Flow-based methods). [Slides]
- Week 10 - Pretraining in Vision Transformers, Vision-Language Models (VisualBERT, VilBERT, Contrastive Approaches, Masking Approaches, Generative Approaches and Approaches Using Pretrained Backbones). [Slides]
- Week 9 - Vision Transformers (ViT, Swin v1 & v2 [ConvNeXt v1 & v2], FastViT, FasterViT). [Slides]
- Week 8 - In-context learning (prompting) strategies (chain of thought, self-consistency, tree of thoughts...), LLMs with tools, LLMs as agents, fine-tuning LLMs. [Slides]
- Week 7 - Pretraining in NLP, BERT, GPT-* models, other LLMs, limits and risks of LLMs. [Slides]
- Week 6 - Attention, self-attention, transformers, state-space models, Mamba. [Slides]
- Week 5 - Deep learning fundamentals (CNNs, RNNs). [Slides]
- Week 4 - Deep learning fundamentals (CNNs). [Slides]
- Week 3 - Deep learning fundamentals. [Slides]
- Week 2 - Deep learning fundamentals. [Slides]
- Week 1 - Overview of the course and introduction to deep learning fundamentals. [Slides]
Announcements