CENG 501 - Deep Learning - Spring 2026
Description
This course assumes that the student has already taken a course on the fundamentals of deep learning and is familiar with conventional models such as Multi-Layer Perceptrons, Convolutional Neural Networks, Recurrent Neural Networks and Long Short-Term Memory Networks.
After a review of these models, the course will cover self-attention & transformers, large language models, vision-language models, generative models, self-supervised learning and reinforcement learning in detail.
Frequently Asked Questions
Please read the FAQ very carefully if you wish to take the course.
Syllabus
You can access the syllabus here.
Lectures
- Week 1: Course logistics. History of Artificial Neuron Models. Perceptron Learning. Multi-Layer Perceptrons. Backpropagation. Gradient Descent.
Slides: slides.
- Week 2: Neural Engineering: More on Loss Functions, Activation Functions, Gradient Descent Strategies, Dealing with the Challenges of the Loss Surface.
Slides: slides.
- Week 3: Representational Capacity; Overfitting, Convergence, When to Stop Training; Data Preprocessing; Weight Initialization; Limitations of MLPs and Introduction to CNNs.
Slides: slides.
- Week 4: CNNs continued: Types of convolution, pooling, normalization, backpropagation through a CNN, transfer learning, popular CNN architectures.
Slides: slides.
Announcements
- Please don't miss the first lecture if you wish to take the course.