CS 342 - Neural networks - Fall 2018

Lectures: TTh 10am-11am, UTC 3.124
Sections: Fr 1pm-2pm / 2pm-3pm, ECJ 1.204

Instructor: Philipp Krähenbühl (philkr (at) utexas.edu)
Office hours: Tue 11am-12pm, GDC 4.824
Please visit only during office hours or by appointment (or if I haven't replied to your email after the third try).

TA hours: TBD

We will use Piazza for questions and Canvas for homework submissions.


schedule legend: class / section

Date | Topic | Concepts | Book chapters (optional; see below) | Notes and due dates
Aug 30 | Introduction | tensors, layers | Background: Ch 1-5 |
Aug 31 | Installing pytorch and building the first compute graph | | | HW1 out
Sep 4 | Simple networks | fully connected, ReLU | |
Sep 6 | Output transformations and loss functions | sigmoid, softmax, log likelihood, L1/L2 loss | 5.9 | HW1 due 11:59pm
Sep 7 | Logistic regression and MLP | | | HW2 out
Sep 11 | Back propagation | back prop | Ch 6 |
Sep 13 | Stochastic gradient descent | SGD, momentum | | HW2 due 11:59pm
Sep 14 | Regression vs classification | | | HW3 out
Sep 18 | Activation functions + Initialization | LeakyReLU, PReLU, Gaussian init, Xavier init | Ch 6 |
Sep 20 | Convolutions + Pooling | Conv2D, avg pooling, max pooling, striding/padding | 9.1-9.3 | HW3 due 11:59pm
Sep 21 | ConvNets | | | HW4 out
Sep 25 | Overfitting | early stopping, data augmentation, dropout, parameter sharing, ensembles | 7.4, 7.8, 7.9, 7.11, 7.12 |
Sep 27 | Optimization tips and tricks | weight regularization, batch normalization | 7.1-7.3 | HW4 due 11:59pm
Sep 28 | Regularization and normalization | | | HW5 out
Oct 2 | Popular Architectures I | LeNet, AlexNet, VGG, auxiliary losses | |
Oct 4 | Popular Architectures II | GoogLeNet, ResNet, DenseNet, residual connections | | HW5 due 11:59pm
Oct 5 | ResNets | | | HW6 out
Oct 9 | Object detection | ROI pooling, R-CNN, Fast R-CNN, YOLO | |
Oct 11 | Segmentation | up-convolution, FCN | | HW6 due 11:59pm
Oct 12 | Fully convolutional networks | | | HW7 out
Oct 16 | Visualization and understanding | gradients and saliency | |
Oct 18 | Generative models | reparametrization trick, adversarial loss | 20.10, 20.12, 20.13 | HW7 due 11:59pm
Oct 19 | Upconvolution and deep rendering | | | HW8 out
Oct 23 | Recurrent Networks I | RNN, LSTM, GRU | 10.1-10.11 |
Oct 25 | Recurrent Networks II | temporal CNNs | | HW8 due 11:59pm
Oct 26 | Future prediction | | | HW9 out
Oct 30 | Deep Reinforcement Learning I | policy gradient | |
Nov 1 | Deep Reinforcement Learning II | Q-learning, actor-critic | | HW9 due 11:59pm
Nov 2 | Acting through future prediction | | | HW10 out
Nov 6 | Deep Reinforcement Learning III | gradient-free optimization | |
Nov 8 | Acting with supervised learning | imitation learning, DAgger, direct future prediction | | HW10 due 11:59pm
Nov 9 | Gradient-free optimization | | | HW11 out
Nov 13 | Embedding learning | triplet loss, contrastive loss | |
Nov 15 | Style transfer | Gram matrix | | HW11 due 11:59pm
Nov 16 | Driving in SuperTuxKart | | | Final project out
Nov 20 | Language | | |
Nov 22 | no class | | |
Nov 23 | no class | | |
Nov 27 | Adversarial examples | adversarial attack | 7.13 |
Nov 29 | Negative results | | |
Nov 30 | Final project Q/A | | |
Dec 4 | Final project flash presentations | | | Final project due 11:59pm
Dec 6 | Final project competition | | |
Dec 7 | Nagging TA for better grade | | | no class

concepts legend: layers/architecture, algorithm, supervision, applications
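As a taste of what the back propagation lecture covers, below is a minimal sketch of reverse-mode automatic differentiation on scalars. It is illustrative only and not part of the course materials; the class itself uses PyTorch's built-in autograd, and the `Value` class and its methods are invented for this example.

```python
# Minimal sketch of reverse-mode autodiff ("back prop") on scalars.
# Illustrative only; the class uses PyTorch's autograd.

class Value:
    """A scalar node in a compute graph that remembers how it was computed."""

    def __init__(self, data, parents=(), local_grads=()):
        self.data = data                 # forward value
        self.grad = 0.0                  # d(output)/d(self), set by backward()
        self._parents = parents          # upstream Value nodes
        self._local_grads = local_grads  # d(self)/d(parent), one per parent

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Visit nodes in reverse topological order so each node's gradient
        # is fully accumulated before it is pushed further upstream
        # (this is the chain rule applied over the whole graph).
        topo, seen = [], set()

        def build(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    build(p)
                topo.append(node)

        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            for parent, local in zip(node._parents, node._local_grads):
                parent.grad += local * node.grad


# f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x
x, y = Value(3.0), Value(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

The reverse topological ordering matters: without it, a node reused by several downstream operations could propagate an incomplete gradient. PyTorch does this bookkeeping for you when you call `.backward()` on a tensor.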


Class overview

Goals of the class

After this class you should be able to


There will be in-class quizzes throughout the semester (roughly one per week). You may work with your peers on these quizzes during class. Each quiz is graded pass/fail.

There will be a homework assignment every week, but we will keep them small and manageable. Homework must be solved individually; if we detect duplicate solutions, you will lose all credit for the assignment. Most of the homework will be graded automatically. You'll have access to a partial grader while working on the assignment; we will use the same grader, but different test data, for your final grade. You'll submit your homework through Canvas.

Late policy:

Expected workload

Estimates of required effort to pass the class are:

Course material

The course should be self-contained, but if you need additional reading material, consult Deep Learning by Goodfellow, Bengio, and Courville (2016), www.deeplearningbook.org.


Syllabus subject to change.