IT-3030: Deep Learning (DL)
This is the instructor's page for IT-3030. We use this webpage for (relatively) static content, while online meetings, discussions, etc. take place on Blackboard.
This course covers the theoretical basis of deep neural networks, including both the calculus and linear algebra underlying DL's cornerstone algorithm: backpropagation. In addition, students get hands-on experience both in a) programming deep networks from scratch and b) using popular deep-learning software (e.g., TensorFlow, PyTorch) for complex classification tasks.
Key concepts covered in this course include: standard backpropagation nets, convolutional nets, recurrent nets (e.g., long short-term memory, LSTM), regularization techniques, optimizers, autoencoders, generative adversarial networks (GANs), representation learning, structured probabilistic models (e.g., variational autoencoders, VAEs), and energy-based models (e.g., Boltzmann machines). See the Lecture plan for more details.
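To give a flavor of the "from scratch" part of the course, here is a minimal illustrative sketch (not course material, and not the course's own codebase): a tiny two-layer network trained with manual backpropagation on the XOR problem using plain NumPy. All names, the architecture, and the hyperparameters below are arbitrary choices made for illustration.

# Illustrative sketch only: manual backpropagation on XOR with NumPy.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters of a 2-4-1 network (hypothetical sizes)
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)         # hidden activations
    p = sigmoid(h @ W2 + b2)         # output predictions
    loss = np.mean((p - y) ** 2)     # mean squared error

    # Backward pass: chain rule applied layer by layer
    dp = 2 * (p - y) / len(X)            # dL/dp
    dz2 = dp * p * (1 - p)               # through output sigmoid
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0, keepdims=True)
    dh = dz2 @ W2.T
    dz1 = dh * h * (1 - h)               # through hidden sigmoid
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # Plain gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
print("predictions:", p.round(3).ravel())

In the course itself, the same ideas are then revisited with frameworks such as TensorFlow or PyTorch, which compute these gradients automatically.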
Instructors
PhD Assistant
Teaching Assistants and Their Availability
Ivar Nore is the teaching assistant for the course. Due to a conflict with Experts in Teams, the TA hours move from Wednesdays to Tuesdays, starting in week 7. See Blackboard for more info.
TA hours and location:
- Tuesdays 12:15 - 14:00 in Cybele. Map: https://link.mazemap.com/9JZtTh2Y
Lectures
- Where: Room S7, Sentralbygg 2
- When: Tuesdays 10:15 - 12:00
- First meeting: January 10, 2023
Exam
Important Links
- NTNU's official web page for IT-3030.