DS4440 // practical neural networks // spring 2019
Course details
Byron Wallace
Office: 256 West Village H
Jay DeYoung (deyoung.j@husky.neu.edu)

MR 11:45-1:25 / Hastings Suite 118

A Piazza site is available for course discussion.
Books & Resources

Deep Learning with Python
Francois Chollet
ISBN-10 1617294438

Neural Network Methods for Natural Language Processing
Yoav Goldberg
ISBN-10 1627052984

PyTorch tutorials (official)

Course description

This course is a hands-on introduction to modern neural network ("deep learning") tools and methods. The course will cover the fundamentals of neural networks and introduce standard and newer architectures, from simple feedforward networks to recurrent neural networks. We will cover stochastic gradient descent (SGD) and backpropagation, along with related fitting techniques.
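To make the fitting idea above concrete, here is a minimal, framework-free sketch of SGD fitting a one-parameter linear model; the data and learning rate are illustrative, not taken from the course materials:

```python
# Minimal SGD sketch: fit w in y = w * x by following the gradient of
# squared error on one example at a time (the "stochastic" in SGD).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs with true slope 2
w = 0.0
lr = 0.05  # learning rate

for epoch in range(100):
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad              # step opposite the gradient

print(round(w, 2))  # -> 2.0, the true slope
```

Real networks apply exactly this update to every weight at once, with the per-weight gradients supplied by backpropagation.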

The course will have a particular emphasis on using these technologies in practice, via modern toolkits. We will specifically introduce (1) Keras (together with TensorFlow) and (2) PyTorch, which are illustrative of static and dynamic computation-graph implementations, respectively. Applications of these models to various types of data will be reviewed, including images and text. This iteration will have a bit of a bias toward the latter, reflecting instructor biases.
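The "dynamic" style mentioned above can be illustrated with a toy scalar autograd class; this is a deliberately simplified sketch of the define-by-run idea, not PyTorch's actual implementation:

```python
class Value:
    """Toy scalar autograd node: the computation graph is recorded as
    ordinary Python code executes (define-by-run), the 'dynamic' style
    PyTorch popularized."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self.parents = parents
        self.backward_fn = None

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def bw():  # local gradients of multiplication, chain rule applied
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out.backward_fn = bw
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def bw():  # addition passes the gradient through unchanged
            self.grad += out.grad
            other.grad += out.grad
        out.backward_fn = bw
        return out

    def backward(self):
        # Topologically order the recorded graph, then apply the chain
        # rule from the output back toward the inputs.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            if v.backward_fn:
                v.backward_fn()


x = Value(3.0); w = Value(2.0); b = Value(1.0)
y = w * x + b   # graph is built on the fly as this line runs
y.backward()
print(y.data, w.grad, x.grad)  # 7.0 3.0 2.0
```

A static framework would instead have you declare the whole graph first and then feed data through it; the dynamic style lets ordinary Python control flow (loops, conditionals) shape the graph per example.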

Grading: the final project counts for 40% of the course grade.

Prerequisites

Prior exposure to machine learning is recommended, and is enforced through a co-requisite with the ML course. A working knowledge of Python is required (or you must be willing to pick it up rapidly as we go). Familiarity with linear algebra, (basic) calculus, and probability will be assumed throughout.


Homeworks

Homeworks will consist of both written and programming components. The latter will be completed in Python, using the Keras/TensorFlow and PyTorch frameworks.


Midterm

The midterm will be given in class and will test understanding of the core material presented in the course regarding the fundamentals of neural networks, including backpropagation, as well as the differences between, and applicability of, the architectures introduced.


Final project

A big component of this course will be your project, which will involve picking a particular dataset on which to implement, train, and evaluate neural models. Collaboration is allowed (team sizes should be <= 3, however). The project will be broken down into several graded deliverables and will culminate in a report and a final in-class presentation to your peers.

Academic integrity policy

A commitment to the principles of academic integrity is essential to the mission of Northeastern University. The promotion of independent and original scholarship ensures that students derive the most from their educational experience and their pursuit of knowledge. Academic dishonesty violates the most fundamental values of an intellectual community and undermines the achievements of the entire University. For more information, please refer to the Academic Integrity Web page.

Schedule outline

Meeting   Topic(s) & etc.
1/7       Course aims, expectations, logistics; review of supervised learning / the perceptron; intro to Colab
1/10      Logistic Regression and Estimation via SGD
1/14      Beyond Linear Models: The Multi-Layer Perceptron
1/17      Layers, activations, and loss functions (HW 1 due)
1/21      No class (MLK Day)
1/24      Backpropagation I
1/28      Backpropagation II
1/31      Learning continuous representations of discrete things: Embeddings
2/4       Convolutional Neural Networks (CNNs) I (HW 2 due)
2/7       Convolutional Neural Networks (CNNs) II
2/11      Recurrent Neural Networks (RNNs) I
2/14      Recurrent Neural Networks (RNNs) II
2/18      No class (Presidents' Day); HW 3 due
2/21      Optimizer matters: training neural networks in practice
2/25      Neural Sequence Tagging
          Spring break
3/11      Sequence-to-Sequence Models I
3/14      Sequence-to-Sequence Models II
3/18      Summarization Models (HW 4 due)
3/21      Variational Auto-Encoders (VAEs)
3/25      Generative Adversarial Networks (GANs)
3/28      Deep Reinforcement Learning I (HW 5 due)
4/1       Deep Reinforcement Learning II
4/4       Advanced Topics (TBD) / built-in slack
4/8       Final project presentations/discussion I
4/11      Final project presentations/discussion II

HTML/CSS/JS used (and modified), with permission, courtesy of Prof. Alan Mislove, (c) 2015