CEng 501 - Deep Learning - Fall 2021
Instructor: Asst. Prof. Dr. Emre Akbas; e-mail:
emre at ceng dot metu.edu.tr; Office: B-208; Office hours by
appointment
Lectures: Friday 9:40-12:30 at BMB-5
Online communication: (forum, homework
submissions) https://odtuclass.metu.edu.tr/
Syllabus: pdf
Announcements
- Nov 5, 2020 - The Zoom link below will be used for the remaining
weeks.
- Oct 23, 2020 - Students with these
IDs can add the class during add-drop next week (in addition to
those who are already registered).
- Oct 22, 2021 - Here is the link for the enrollment request form.
- Oct 22, 2021 - Hw1 is announced, see Week 1 below.
- Oct 21, 2021 - Zoom
link for the first meeting.
- Oct 12, 2021 - I am receiving many e-mails about taking this course
as a special student, as an undergraduate, from other
departments/universities, etc. I am not able to respond to each of them.
If you want to take the course, please make sure you attend the first
lecture. Also, please see my answers to frequently asked
questions.
Late submission policy
Any work, e.g. an assignment solution, that is submitted past its
deadline will receive a 10-point deduction per day of delay. For example,
if the deadline is Oct 14, 23:55 and a student submits their work anytime
on Oct 15, that work will be evaluated over 90 instead of 100.
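For concreteness, the penalty arithmetic can be sketched as a small Python function. The function name and the round-up-to-whole-days behavior are my illustrative assumptions, not an official grading script:

```python
from math import ceil
from datetime import datetime

def late_penalty_score(deadline: datetime, submitted: datetime,
                       raw_score: float) -> float:
    """Apply the 10-points-per-day late policy; a partial day counts as a full day."""
    if submitted <= deadline:
        return raw_score
    days_late = ceil((submitted - deadline).total_seconds() / 86400)
    return max(0.0, raw_score - 10 * days_late)

# Deadline Oct 14, 23:55; anything submitted on Oct 15 is 1 day late,
# so a 100-point solution is evaluated over 90.
print(late_penalty_score(datetime(2021, 10, 14, 23, 55),
                         datetime(2021, 10, 15, 10, 0), 100.0))  # -> 90.0
```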
Detailed syllabus
An important note: Lecture slides provided below are by no
means a “complete” resource for studying. I frequently use the board in
class to supplement the material in the slides.
NOTE: Below, links to slides and homework material
are broken because I removed the files. Recent versions of these files
can be found in the Fall 2022 version of the course.
Week 1
Week 2
- No class – October 29 Republic Day (29 Ekim Cumhuriyet Bayramı), a
national holiday
Week 3
- Lecture topics: A high-level introduction to Deep Learning (slides)
- Lecture topics: Machine learning background and basics (1 of 2) (slides)
- The link for the recorded lecture is available on ODTUClass.
Week 4
- Lecture topics: Machine learning background and basics (2 of 2) (slides)
- Colab
notebook for the in-class hands-on demo on binary classification,
gradient descent, hinge loss.
- Lecture topics: Biological neuron, artificial neuron, Perceptron (slides)
- Adding regularization to the hinge loss classifier: Colab
notebook
- Reading assignment for next week: “Chapter
8: Optimization for training deep models” from the book “Deep
Learning.”
- The link for the recorded lecture is available on ODTUClass.
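Since the original notebooks have been removed, here is a minimal NumPy sketch of the same ideas: a binary linear classifier trained by (sub)gradient descent on the hinge loss, with an L2 penalty added. The toy data, hyperparameters, and variable names are illustrative assumptions, not the contents of the original Colab notebooks:

```python
import numpy as np

# Toy, linearly separable data with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

w = np.zeros(2)
b = 0.0
lr, lam = 0.1, 1e-3  # learning rate and L2 strength (illustrative values)

for _ in range(100):
    margins = y * (X @ w + b)    # y_i * f(x_i)
    mask = margins < 1           # examples violating the margin
    # Subgradient of mean_i max(0, 1 - y_i f(x_i)) + (lam/2) * ||w||^2
    grad_w = -(y[mask, None] * X[mask]).sum(axis=0) / len(X) + lam * w
    grad_b = -y[mask].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

acc = np.mean(np.sign(X @ w + b) == y)  # training accuracy
```

The regularizer only changes the gradient by the extra `lam * w` term; dropping it recovers the plain hinge-loss classifier.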
Week 5
- Lecture topics: Multilayer Perceptrons, Backpropagation, Activation
Functions, Stochastic Gradient Descent, Momentum (slides)
- Lecture topics: convolutional neural networks (part 1) (slides)
- Colab
notebook on building, training and evaluating a basic MLP.
- Reading assignment for next week: LeCun,
Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature,
521(7553), 436-444.
- The link for the recorded lecture is available on ODTUClass.
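The momentum update covered this week fits in a few lines. This toy example is my own (hyperparameter values are illustrative): it minimizes the quadratic f(w) = ½‖w‖², whose gradient is simply w:

```python
import numpy as np

def sgd_momentum_step(w, v, grad, lr=0.1, mu=0.9):
    """Classical momentum: v <- mu*v - lr*grad;  w <- w + v."""
    v = mu * v - lr * grad
    return w + v, v

w = np.array([5.0, -3.0])
v = np.zeros_like(w)
for _ in range(200):
    grad = w                  # gradient of f(w) = 0.5 * ||w||^2
    w, v = sgd_momentum_step(w, v, grad)
# w is now very close to the minimizer at the origin
```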
Week 6
- Lecture topics: Convolutional neural networks continued, multiclass
hinge loss, derivation of cross-entropy loss, implementing
backpropagation in a modular way. (slides)
- Building, training and evaluating a basic CNN: Colab
notebook
- Reading assignment for next week: He, K., Zhang, X., Ren, S. and
Sun, J. Deep residual
learning for image recognition. In CVPR 2016.
- The initial list
of candidate papers for in-class presentations is up. The list will
be updated occasionally. Papers will be assigned on a
first-come-first-served basis. E-mail me with your preferences.
- HW2 released on ODTUClass.
- The link for the recorded lecture is available on ODTUClass.
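As a quick reference for the cross-entropy derivation, here is a hedged sketch of the numerically stable softmax cross-entropy (via the log-sum-exp shift) and its well-known gradient, softmax(logits) − one_hot(label). The function names are mine, not from the slides:

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    """Loss -log softmax(logits)[label], stable via the max-shift trick."""
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def softmax_cross_entropy_grad(logits, label):
    """d loss / d logits = softmax(logits) - one_hot(label)."""
    e = np.exp(logits - logits.max())
    p = e / e.sum()
    p[label] -= 1.0
    return p

loss = softmax_cross_entropy(np.array([2.0, 1.0, 0.1]), 0)
grad = softmax_cross_entropy_grad(np.array([2.0, 1.0, 0.1]), 0)
```

The gradient components always sum to zero, which is a handy sanity check when implementing backpropagation in a modular way.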
Week 7
- Lecture topics: Convolutional neural networks continued, convolution
as matrix multiplication, notes on initializing neural networks, batch
normalization, variants of stochastic gradient methods, adaptive
learning rate methods. (slides)
- Lecture topics: some applications of ConvNets. Image classification.
Residual networks (ResNets), object detection, artistic style transfer,
image segmentation, fully convolutional networks, deconvolution,
visualizing CNNs, CNNs for speech generation and NLP. (slides)
- Building a basic object detector: Colab
notebook
- Reading assignment for next week: A.
Karpathy’s blog post on recurrent neural networks
- The link for the recorded lecture is available on ODTUClass.
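The “convolution as matrix multiplication” idea can be sketched as follows: unroll each receptive field of the input into a row (often called im2col), after which a valid cross-correlation becomes a single matrix-vector product. This single-channel toy example uses my own function names, not code from the slides:

```python
import numpy as np

def im2col(img, k):
    """Stack every k-by-k patch of a 2-D image as a row of a matrix."""
    H, W = img.shape
    rows = [img[i:i + k, j:j + k].ravel()
            for i in range(H - k + 1)
            for j in range(W - k + 1)]
    return np.array(rows)                 # shape: (num_patches, k*k)

img = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.ones((3, 3))
# One matrix-vector product computes the whole 2x2 output map.
out = (im2col(img, 3) @ kernel.ravel()).reshape(2, 2)
```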
Week 8
- Lecture topics: Recurrent Neural Networks, LSTM, GRU. (slides)
- HW3 released on ODTUClass.
- The link for the recorded lecture is available on ODTUClass.
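As a minimal illustration of the recurrence behind this week’s topics, here is a vanilla (Elman) RNN cell unrolled over a short sequence; the shapes, initialization scale, and names are illustrative assumptions, not material from the slides:

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, bh):
    """h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh); returns all hidden states."""
    h = np.zeros(Whh.shape[0])
    hs = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        hs.append(h)
    return hs

rng = np.random.default_rng(0)
xs = [rng.normal(size=3) for _ in range(5)]   # sequence of 5 inputs, dim 3
Wxh = rng.normal(size=(4, 3)) * 0.1           # hidden dimension 4
Whh = rng.normal(size=(4, 4)) * 0.1
bh = np.zeros(4)
hs = rnn_forward(xs, Wxh, Whh, bh)
```

LSTM and GRU cells keep the same unrolled structure but replace the single tanh update with gated updates.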
Week 9
- Lecture topics: Some applications of RNNs. (slides)
- An RNN example: solving simple arithmetic operations using a
sequence-to-sequence encoder-decoder model: Colab
notebook (in Keras)
Week 10
- Lecture topics: Deep generative modeling (slides)
- The link for the recorded lecture is available on ODTUClass.
Week 11
- Lecture topics: A brief intro to deep reinforcement learning (slides)
- The link for the recorded lecture is available on ODTUClass.
Week 12
- Miscellaneous topics: Attention, Self-attention, Transformers,
Non-local neural networks, Vision Transformer, Dynamic Filtering,
Self-supervised learning, Implicit deep learning (slides)
- The link for the recorded lecture is available on ODTUClass.
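The scaled dot-product attention at the core of Transformers, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, fits in a few lines. This toy single-head example is my own sketch, not material from the slides:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention for 2-D Q, K, V arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 8))   # 2 queries of dimension 8
K = rng.normal(size=(5, 8))   # 5 keys
V = rng.normal(size=(5, 8))   # 5 values
out, attn = attention(Q, K, V)
```

Self-attention is the special case where Q, K, and V are all linear projections of the same input sequence.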
Week 13
Paper presentations and discussion:
- “Small
data, big decisions: Model selection in the small-data regime” by
Bornschein et al., ICML 2020, presented by Ceren Gürsoy.
- “Focal
loss for dense object detection” by Lin et al., ICCV 2017, presented
by Güneş Çepiç.
- “Attention
is all you need” by Vaswani et al., NeurIPS 2017, presented by Ege
Erdil.
- “A
unified approach to interpreting model predictions” by Lundberg et
al, NeurIPS 2017, presented by Aslı Umay Öztürk.
- “The
lottery ticket hypothesis: Finding sparse, trainable neural
networks” by Frankle and Carbin, ICLR 2018, presented by Yavuz
Kara.
- “Don’t
decay the learning rate, increase the batch size” by Smith et al.,
ICLR 2018, presented by Hıdır Yeşiltepe.
- “Masked
Autoencoders Are Scalable Vision Learners” by He et al., 2021,
presented by Faruk Uğurcalı.
- “Dynamic
filter networks” by Jia et al., NeurIPS 2016 presented by Sina
Şehlaver.
- “Kernel-predicting
convolutional networks for denoising Monte Carlo renderings” by Bako
et al., ACM Trans. Graph. 2017, presented by Kadir Cenk Alpay.
Week 14
Paper presentations and discussion:
- “What
uncertainties do we need in Bayesian deep learning for computer
vision?” by Kendall and Gal, NeurIPS 2017, presented by Alpay
Özkan.
- “MLP-mixer:
An all-mlp architecture for vision” by Tolstikhin et al., NeurIPS
2021, presented by Cihad Tekinbaş.
- “An
image is worth 16x16 words: Transformers for image recognition at
scale” by Dosovitskiy et al., ICLR 2021, presented by Süleyman Onat
Çelik.
- “Flexconv:
Continuous kernel convolutions with differentiable kernel sizes” by
Romero et al., 2021, presented by Ece Gökçay.
- “Projected
GANs Converge Faster” by Sauer et al., NeurIPS 2021, presented by
İlter Taha Aktolga.
- “Green
AI + Knowledge Distillation” by Schwartz et al. and Hinton et al.,
presented by Ahmed Khalil.
- “Reconciling
modern machine-learning practice and the classical bias–variance
trade-off” by Belkin et al., PNAS 2019, presented by Mert
Ergürtuna.
- “Pay
Attention to MLPs” by Liu et al., 2021, presented by Furkan
Aldemir.
- “Training
data-efficient image transformers & distillation through
attention” by Touvron et al., ICML 2021, presented by Burak
Akgül.
- “Mean
teachers are better role models: Weight-averaged consistency targets
improve semi-supervised deep learning results” by Tarvainen and
Valpola, NeurIPS 2017, presented by Tolunay Durmuş.
- “PnP-DETR:
Towards Efficient Visual Analysis with Transformers” by Wang et al.,
ICCV 2021, presented by Egemen Demiröz.
Frequently
Asked Questions about taking the course
I am receiving too many e-mails about taking the course.
Unfortunately, I cannot reply to them one by one. Below are answers to
common questions.
Q0: Is it
going to be an online or in-class course?
A0: This will be a hybrid course, which means it will primarily be an
in-class course with possible online components. I plan to stream the
live lecture via Zoom, so that students who cannot come to the class
that day can follow the lecture.
Q1: Can I take the course?
A1: There is huge demand for the course from all kinds of
backgrounds. Thank you for your interest. However, I have the
responsibility to evaluate your learning outcomes and grade you.
Therefore, I need to limit the number of seats. Based on my experience
in previous years, this limit will be around 35-40.
Since this is a graduate METU CENG course, I need to give priority
to the graduate students in our department. Here is the priority order
that I will use to accept students to the class, from high to low
priority:
- Grad students from METU CENG,
- A limited number of 4th year undergraduate students from METU
CENG,
- Grad students from other METU departments,
- Special students (see http://oidb.metu.edu.tr/ozel-ogrenci,
you need to be a grad student in some other university to be
eligible).
I must note that the first two categories (METU CENG students) almost
fill up the whole capacity. So, unfortunately, there will not be much
room for the remaining two categories.
Also, precedence will be given to students who are actively doing
research in machine learning and related areas. This course is not a
PyTorch or Keras tutorial; we intend to go beyond the “user” level.
You might want to check out the other two DL courses offered by the
Multimedia Informatics and Electrical Engineering departments.
Machine learning background is required. If you have not taken a
machine learning course before, please do not take this course.
Fluency in Python is required.
Q2: How can I register for
the course?
A2: Come to the first lecture. I will publicly announce the lecture
link on this page. In the first lecture, I will collect information from
the participants and then decide (based on my answer A1 above) who will
be able to register. This enrollment list will be announced within a
couple of hours after the first lecture. Students on this list will be
able to add the course during the add-drop period.
In the past, the number of students attending the first lecture has
been around 80. Since this is larger than the capacity of BMB-5,
the first lecture will be online. The Zoom link will be
posted on this page.
Q3:
I was able to take the course during the regular interactive
registration. Should I worry about not being accepted?
A3: No worries. You will stay.
Q4: Can I take
this course as a special student?
A4: Possible but unlikely. Please see my answer A1 above.
Q5:
Even if I don’t officially register for the class, can I audit it?
A5: Yes, definitely. However, there might not be sufficient seats
available in class, so you may need to follow via Zoom.