CEng 501 - Deep Learning - Fall 2022
Images generated by the “Stable
Diffusion” method. From left to right: “a bunch of students learning
deep neural networks” by Picasso, Van Gogh and Da Vinci.
Instructor: Asst. Prof. Dr. Emre Akbas; e-mail:
emre at ceng dot metu.edu.tr; Office: B-208; Office hours by
appointment
Lectures: Tuesday 9:40-12:30 at BMB-4
Online communication: (forum, homework
submissions) https://odtuclass.metu.edu.tr/
Syllabus: pdf
Announcements
- Sep 30, 2022 - I am receiving too many e-mails about taking this
course as a special student, as an undergraduate, from other
departments/universities, etc. I am not able to respond to each of them.
If you want to take the course, please make sure you attend the first
lecture. Also, please see my answers to the frequently asked
questions below.
- Oct 4, 2022 - Students with these
IDs can add the class during add-drops next week. Those who are
already registered do not need to do anything.
- Oct 6, 2022 - Homework assignment 1 is announced. See Week 1
below.
Late submission policy
Any work, e.g. an assignment solution, that is submitted past
its deadline will receive -10 points per day of delay. For example, if
the deadline is Oct 14, 23:55 and a student submits their work anytime
on Oct 15, that work will be evaluated over 90 instead of 100.
Detailed syllabus
An important note: Lecture slides provided below are by no
means a “complete” resource for studying. I frequently use the board in
class to supplement the material in the slides.
NOTE: Below, links to slides and homework material
are broken because I removed the files. Recent versions of these files
can be found in the Fall 2022 version of the course.
Week 1
Week 2
- Lecture topics: Machine learning background and basics (slides).
Week 3
- Lecture topics: Model selection, hyper-parameter tuning, maximum
likelihood and maximum a posteriori estimations; Biological neuron,
artificial neuron, Perceptron, Multilayer perceptrons (slides).
- Hands-on tutorial: developing a basic classifier using hinge loss:
Colab
notebook.
- Reading assignment for next week: sections 8.1 and 8.3 from “Chapter
8: Optimization for training deep models” from the book “Deep
Learning.”
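The multiclass hinge loss used in the hands-on tutorial can be sketched in plain NumPy. This is a minimal illustration for reference, not the notebook's actual code; the function name and the margin of 1.0 are my assumptions:

```python
import numpy as np

def hinge_loss(scores, y, margin=1.0):
    """Multiclass hinge (SVM) loss, averaged over the batch.

    scores: (N, C) class scores; y: (N,) indices of the correct classes.
    """
    N = scores.shape[0]
    correct = scores[np.arange(N), y][:, None]            # (N, 1) correct-class scores
    margins = np.maximum(0.0, scores - correct + margin)  # penalize scores within the margin
    margins[np.arange(N), y] = 0.0                        # no loss for the correct class itself
    return margins.sum() / N

# Two samples, three classes; only sample 0 has a competing class above the margin.
scores = np.array([[3.2, 5.1, -1.7],
                   [1.3, 4.9,  2.0]])
y = np.array([0, 1])
loss = hinge_loss(scores, y)   # (5.1 - 3.2 + 1.0) / 2 = 1.45
```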
Week 4
- Lecture topics: Multilayer Perceptrons, Activation Functions,
Backpropagation, Stochastic Gradient Descent, Momentum (slides)
- Lecture topics: convolutional neural networks (part 1) (slides)
- Reading assignment for next week: LeCun,
Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature,
521(7553), 436-444.
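The SGD-with-momentum update covered this week can be sketched in a few lines. This is an illustrative toy (the function name and hyper-parameters are my choices, not from the lecture), minimizing a 1-D quadratic so the damped-oscillation behavior of momentum is visible:

```python
def sgd_momentum_step(w, grad, velocity, lr=0.05, beta=0.9):
    """One SGD-with-momentum update: v <- beta*v - lr*grad; w <- w + v."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = w^2 (so grad = 2w), starting from w = 5.0.
w, v = 5.0, 0.0
for _ in range(100):
    w, v = sgd_momentum_step(w, 2.0 * w, v)
# w oscillates around 0 with shrinking amplitude and ends close to the minimum.
```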
Week 5
- Lecture topics: Convolutional neural networks continued, multiclass
hinge loss, derivation of cross-entropy loss, implementing
backpropagation in a modular way, AlexNet, data augmentation, dropout.
(slides)
- Reading assignment for next week: He, K., Zhang, X., Ren, S. and
Sun, J. Deep residual
learning for image recognition. In CVPR 2016.
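The cross-entropy loss derived in lecture (softmax followed by negative log-likelihood) can be sketched as below. This is a generic reference implementation with the standard max-shift for numerical stability, not code from the slides:

```python
import numpy as np

def cross_entropy(scores, y):
    """Softmax + cross-entropy, averaged over the batch.

    scores: (N, C) raw class scores; y: (N,) indices of the correct classes.
    """
    # Subtracting the row-wise max leaves softmax unchanged but avoids overflow in exp.
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    N = scores.shape[0]
    return -log_probs[np.arange(N), y].mean()

scores = np.array([[2.0, 1.0, 0.1]])
loss = cross_entropy(scores, np.array([0]))   # approx. 0.417
```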
Week 6
- Colab
notebook on building, training and evaluating a basic MLP.
- Colab
notebook on building, training and evaluating a basic CNN.
- Lecture topics: implementing a convolutional layer, initialization
of neural networks, normalization layers, adaptive learning rate methods
(slides).
- HW2 released on ODTUClass.
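A naive forward pass of a convolutional layer, as discussed this week, can be sketched with explicit loops. This is a single-channel, valid-mode illustration of the idea only (deep learning libraries actually compute cross-correlation, as done here, and use much faster implementations):

```python
import numpy as np

def conv2d_naive(x, w, b=0.0, stride=1):
    """Naive valid-mode 2-D convolution (cross-correlation) of one channel.

    x: (H, W) input; w: (kH, kW) filter; b: scalar bias.
    """
    H, W = x.shape
    kH, kW = w.shape
    out_h = (H - kH) // stride + 1
    out_w = (W - kW) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Dot product of the filter with the current input patch.
            patch = x[i * stride:i * stride + kH, j * stride:j * stride + kW]
            out[i, j] = np.sum(patch * w) + b
    return out

x = np.arange(16.0).reshape(4, 4)
y = conv2d_naive(x, np.ones((2, 2)))   # y has shape (3, 3); y[0, 0] = 0+1+4+5 = 10
```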
Week 7
- Lecture topics: some notable applications of CNNs and major CNN
architectures (slides).
- Colab
notebook on implementing a bare-bones object detector.
Week 8
Week 9
- Lecture topics: Some applications of RNNs (slides).
- An RNN example: solving simple arithmetic operations using a
sequence-to-sequence encoder-decoder model: Colab
notebook (in Keras)
Week 10
- Lecture topics: A brief intro to deep reinforcement learning (slides)
Week 11
- Lecture topics: Attention, self-attention, Transformer, some notable
applications of transformers (slides).
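The scaled dot-product attention at the core of the Transformer can be sketched in a few lines of NumPy. A minimal single-head illustration for reference (the shapes below are arbitrary), not code from the slides:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (num_queries, num_keys)
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # weighted mix of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 queries, d_k = 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out = attention(Q, K, V)      # shape (3, 4): one mixed value vector per query
```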
Week 12
- Lecture topics: Unsupervised/generative modeling, energy-based
models, Boltzmann machines, Restricted Boltzmann machines, deep belief
networks, generative adversarial networks, autoencoders, variational
autoencoders (slides).
Week 13
- Brief introductions to Diffusion and score-based (generative) models
(slides),
Graph Neural Networks, Dynamic Filtering, Self-supervised learning,
Implicit deep learning (slides).
Week 14
- Brief introductions to Implicit Deep Learning, Double Descent,
Knowledge Distillation, Forward Forward Algorithm (slides).
- Review.
Frequently
Asked Questions about taking the course
I am receiving too many e-mails about taking the course.
Unfortunately, I cannot reply to them one by one. Below are answers to
common questions.
Q0: Is it
going to be an online or face-to-face course?
A0: Face to face.
Q1: Can I take the course?
A1: There is a huge demand for the course from students of all kinds
of backgrounds. Thank you for your interest. However, I have the
responsibility to evaluate your learning outcomes and grade you.
Therefore, I need to limit the number of seats. Based on my experience
from previous years, this limit will be around 35-40.
Since this is a graduate METU CENG course, I need to give priority to
the graduate students in our department. Here is the priority order that
I will use to accept students to the class. From high to low
priority:
- Grad students from METU CENG,
- A limited number of 4th year undergraduate students from METU
CENG,
- Grad students from other METU departments,
- Special students (see http://oidb.metu.edu.tr/ozel-ogrenci,
you need to be a grad student in some other university to be
eligible).
I must note that the first two categories (METU CENG students) almost
fill up the whole capacity. So, unfortunately, there will not be much
room for the remaining two categories.
Also, precedence will be given to students who are actively doing
research in machine learning and related areas. This course is not a
PyTorch or Keras tutorial; we intend to go beyond the “user” level.
You might want to check out the other two DL courses given at the
Multimedia Informatics and Electrical Engineering departments.
Machine learning background is required. If you have not taken a
machine learning course before, please do not take this course.
Fluency in Python is required.
Q2: How can I register for
the course?
A2: Come to the first lecture. In the first lecture, I will collect
information from the participants and then decide (based on my answer
A1 above) who will be able to register. This enrollment list will be
announced within a couple of hours after the first lecture. Students on
this list will be able to add the course during the add-drop
period.
Q3:
I was able to take the course during the regular interactive
registration. Should I worry about not being accepted?
A3: No worries. You will stay.
Q4: Can I take
this course as a special student?
A4: Possible but unlikely. Please see my answer A1 above.
Q5:
Even if I don’t officially register for the class, can I audit it?
A5: Yes, definitely.