Welcome to STATS 315 / DATA SCI 315
This is an introductory deep learning course using the Python programming language and the TensorFlow deep learning library.
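To give a flavor of the kind of computation the course covers in its first weeks (linear regression fit by gradient descent, lectures 02–04), here is a minimal NumPy sketch. It is an illustration only, not code from the course slides or notebooks, and the synthetic data and hyperparameters are invented for the example:

```python
import numpy as np

# Synthetic data from the line y = 2x + 1, plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 1.0 + 0.01 * rng.normal(size=100)

w, b = 0.0, 0.0   # parameters: weight and bias
lr = 0.1          # learning rate

for _ in range(500):
    y_hat = w * x + b
    # Gradients of the mean squared error loss with respect to w and b
    grad_w = 2.0 * np.mean((y_hat - y) * x)
    grad_b = 2.0 * np.mean(y_hat - y)
    w -= lr * grad_w
    b -= lr * grad_b

# After training, w and b end up close to the true values 2.0 and 1.0
```

In the course itself the same model would be expressed with TensorFlow, which computes these gradients automatically.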
- Textbook: We will not follow any one textbook too closely. Here are a few references we might use:
  - Dive into Deep Learning by Zhang, Lipton, Li and Smola. An advanced text from research scientists at Amazon. It weaves together math, figures, and code in an interactive online resource available here. Code examples are provided in three frameworks: MXNet, PyTorch and TensorFlow.
  - Deep Learning with Python (2nd edition) by Chollet. A solid hands-on guide oriented towards programmers, from the creator of the Keras deep learning library. Ebook and print versions are available from Manning Publications.
  - Introduction to Deep Learning by Charniak. This undergraduate-level textbook is perhaps the closest to this course in terms of prerequisites and the level at which deep learning is presented. It is available from MIT Press. Unfortunately, it uses TensorFlow version 1, whereas we will use the more user-friendly version 2.
  - Deep Learning by Goodfellow, Bengio and Courville. Written by three top deep learning researchers, this comprehensive book is required reading if you want to pursue the study of deep learning at a more advanced level. The print version is available from MIT Press and an online version is here.
  - Neural Networks and Deep Learning by Nielsen. Uses the Theano library, which is no longer popular. However, most of this free online textbook focuses on conceptual issues that will not become out of date anytime soon.
- Undergraduate Courses on Deep Learning: Many universities now offer an introductory deep learning course, e.g., Berkeley, CMU, MIT, and Stanford.
- Canvas: You should access the Canvas class page for this course frequently. It will let you access important announcements and track course deliverables. (requires UM login)
- Slack: The course slack workspace is at um-wn22-stats315.slack.com (requires UM login)
- Days and Times: Tuesdays and Thursdays, 10-11:30
- Location: B1580 BUS (for virtual lectures, you can find the Zoom link on Canvas or Slack)
Instructor Information
Note: Due to the omicron surge, all office hours will be held via Zoom until further notice.
Name: Ambuj Tewari
Office Hours: Tue and Thu, 11:30-1 (Zoom link)
Email: tewaria@umich.edu
GSI Information
Name: Vinod Raman (Lab 002 Th 8:30-10)
Email: vkraman@umich.edu
Name: Yash Patel (Lab 003 Th 2:30-4)
Email: yppatel@umich.edu
Lab webpage (also has GSI office hours info): https://yashpatel5400.github.io/stats315-winter2022lab/
Grading
- Weekly quizzes (60%): Only your top 10 quiz scores will count.
- Homeworks (40%): Four homework assignments, each with a substantial coding component.
Academic Integrity
The University of Michigan community functions best when its members treat one another with honesty, fairness, respect, and trust. The college promotes the assumption of personal responsibility and integrity, and prohibits all forms of academic dishonesty and misconduct. All cases of academic misconduct will be referred to the LSA Office of the Assistant Dean for Undergraduate Education. Being found responsible for academic misconduct will usually result in a grade sanction, in addition to any sanction from the college. For more information, including examples of behaviors that are considered academic misconduct and potential sanctions, please see https://lsa.umich.edu/lsa/academics/academic-integrity.html
Accommodation for Students with Disabilities
If you think you need accommodation for a disability, please let me know at your earliest convenience. Some aspects of this course, the assignments, the in-class activities, and the way the course is usually taught may be modified to facilitate your participation and progress. As soon as you make me aware of your needs, we can work with the Office of Services for Students with Disabilities (SSD) to help us determine appropriate academic accommodations. SSD (734-763-3000; http://ssd.umich.edu/) typically recommends accommodations through a Verified Individualized Services and Accommodations (VISA) form. Any information you provide is private and confidential and will be treated as such.
Mental Health and Well-Being
Students may experience stressors that can impact both their academic experience and their personal well-being. These may include academic pressures and challenges associated with relationships, mental health, alcohol or other drugs, identities, finances, etc. If you are experiencing concerns, seeking help is a courageous thing to do for yourself and those who care about you. If the source of your stressors is academic, please contact me so that we can find solutions together. For personal concerns, U-M offers a variety of resources, many of which are listed on the Resources for Student Well-being webpage. You can also search for additional well-being resources here.
Schedule
- IDL = Introduction to Deep Learning by Charniak
- DLPy = Deep Learning with Python (2nd edition) by Chollet
- DL = Deep Learning by Goodfellow, Bengio and Courville
- NNDL = Neural Networks and Deep Learning by Nielsen
- D2L = Dive into Deep Learning by Zhang, Lipton, Li and Smola
Note: A “V” in the date column denotes a virtual lecture.
| Lecture No. | Date | Topic | Reading Assignment |
| --- | --- | --- | --- |
| **Basics** | | | |
| 01 | Jan 06 | Course logistics; Introduction (slides) | DLPy, Chap. 1; DL, Chap. 1; D2L, Chap. 1 |
| 02 | Jan 11 (V) | Basic Elements of Linear Regression (slides) | D2L, Sec. 3.1.1 |
| 03 | Jan 13 (V) | Regression Loss Functions and Gradient Descent (slides) | D2L, Sec. 3.1.1 |
| 04 | Jan 18 (V) | Regression Wrap-Up; Classification (slides) | D2L, Sec. 3.1.3-4; D2L, Sec. 3.4.1 |
| 05 | Jan 20 (V) | Softmax Operation; Cross-Entropy Loss Function (slides) | D2L, Sec. 3.4.2-4; D2L, Sec. 3.4.6.1 |
| 06 | Jan 25 | Softmax Derivatives; Information Theory Basics (slides) | D2L, Sec. 3.4.6.2-3; D2L, Sec. 3.4.7 |
| **TensorFlow/Keras** | | | |
| 07 | Jan 27 | TensorFlow, Keras, Google Colab (notebook) | DLPy, Sec. 3.1-4 |
| 08 | Feb 01 | First Steps with TensorFlow (notebook) | DLPy, Sec. 3.5.1-2; DLPy, Sec. 2.4.4 |
| – | Feb 03 | CANCELLED | |
| – | Feb 06 | HW 1 due | |
| 09 | Feb 08 | First Steps with TensorFlow (continued) (notebook) | DLPy, Sec. 3.5.3-4 |
| 10 | Feb 10 | Getting Started with NNs: Classification (MNIST) (notebook) | DLPy, Sec. 2.1 |
| 11 | Feb 15 | Getting Started with NNs: Classification (IMDB) (notebook) | DLPy, Sec. 4.1 |
| 12 | Feb 17 | Getting Started with NNs: Regression (Boston Housing Price) (notebook) | DLPy, Sec. 4.3 |
| 13 | Feb 22 | Generalization; Evaluating ML Models (notebook) | DLPy, Sec. 5.1-2 |
| 14 | Feb 24 | Improving Model Fit; Regularizing Your Model (notebook) | DLPy, Sec. 5.3; DLPy, Sec. 5.4.4 |
| – | Mar 01 | SPRING BREAK | |
| – | Mar 03 | SPRING BREAK | |
| **Linear Algebra Boot Camp** | | | |
| 15 | Mar 08 | Linear Algebra (notebook) | D2L, Sec. 18.1.1-2 |
| 16 | Mar 10 | Linear Algebra (continued) (notebook); HW 2 due | D2L, Sec. 18.1.3-5 |
| 17 | Mar 15 | Linear Algebra (continued) (notebook) | D2L, Sec. 18.1.6-7; D2L, Sec. 18.1.9 |
| **Convolutional Neural Networks** | | | |
| 18 | Mar 17 | From Fully-Connected Layers to Convolutions (notebook) | D2L, Sec. 6.1 |
| 19 | Mar 22 | Convolutions for Images (notebook); Padding and Stride (notebook); Multiple Input and Multiple Output Channels (notebook) | D2L, Sec. 6.2; D2L, Sec. 6.3-4 |
| – | Mar 24 | CANCELLED | |
| 20 | Mar 29 | Pooling (notebook); LeNet (notebook) | D2L, Sec. 6.5-6 |
| **Deep Learning for Time Series** | | | |
| – | Mar 31 | CANCELLED; HW 3 due | |
| 21 | Apr 05 (V) | A Temperature-Forecasting Example (notebook) | DLPy, Sec. 10.2 |
| 22 | Apr 07 | A Temperature-Forecasting Example (continued) (notebook) | DLPy, Sec. 10.2 |
| 23 | Apr 12 | Understanding Recurrent Neural Networks; Advanced Use of Recurrent Neural Networks (notebook) | DLPy, Sec. 10.3; DLPy, Sec. 10.4 |
| 24 | Apr 14 (V) | Recurrent Neural Networks (notebook); Backpropagation Through Time (notebook) | D2L, Sec. 8.4; D2L, Sec. 8.7.1 |
| 25 | Apr 19 | Course Conclusion; Ask Me Anything! (slides) | |
| – | Apr 25 | HW 4 due | |
| **Multivariable Calculus Boot Camp** | | | |
| – | – | Multivariable Calculus (notebook) | D2L, Sec. 18.4 |
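Two of the core ideas from the schedule above, the softmax operation and the cross-entropy loss (lectures 05–06), fit in a few lines of NumPy. This is a minimal sketch for orientation, not code from the course slides or notebooks; the example logits are made up:

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability;
    # the result is a probability vector (non-negative, sums to 1)
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy(p, y):
    # Negative log-probability the model assigns to the true class y
    return -np.log(p[y])

logits = np.array([2.0, 1.0, 0.1])   # raw scores for 3 classes
probs = softmax(logits)              # largest logit gets largest probability
loss = cross_entropy(probs, y=0)     # loss when the true class is 0
```

The max-subtraction trick does not change the output (it cancels in the ratio) but prevents overflow when logits are large.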