CS 533: Natural Language Processing (NLP)

Instructor: Karl Stratos
TA: TBD
Time and location: Wednesday 12-3pm at BE 252
Instructor office hours: Wednesday 3:20-4:30pm at Tillett 111H

Course description. This project-centered graduate course covers the technical foundations of modern NLP. Students are expected to begin working on their course projects at the start of the semester and to continue throughout, culminating in (1) in-class project presentations and (2) written reports that aspire to conference publication quality. The course has two parts that run in parallel. The first consists of standard lecture-based classes in which the instructor introduces fundamental concepts and applications in the field. The second consists of ongoing discussion and brainstorming about course projects and self-initiated research efforts. There is no required textbook: all materials are publicly available online.

Please use the Canvas site to ask questions regarding lectures/homeworks/projects, to submit assignments, and to find announcements.

Goals.
  1. Gaining an understanding of the foundational concepts and tools used in modern NLP
  2. Developing the ability to critically read and accurately evaluate conference papers in NLP
  3. Initiating research projects that persist beyond this course

Audience and prerequisites. No previous exposure to NLP is assumed. However, this is a fast-paced course designed for self-motivated graduate or advanced undergraduate students with a solid technical background in probability and statistics, calculus, and linear algebra. Technical requirements include:
  1. Probabilistic reasoning (e.g., What is the conditional probability of Y=y given X=x, assuming the knowledge of a joint distribution over X and Y?)
  2. Intimate and intuitive understanding of matrix and vector operations (e.g., What is the shape of a matrix product? How similar are two vectors?)
  3. Mathematical notions in optimization (e.g., What does it mean for a function to have zero derivative at a certain point?)
If you cannot complete A1 comfortably, you should consult the instructor about whether your background meets the prerequisites. Significant programming experience in Python is necessary for the programming assignments and course projects.
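
As a rough self-check (a minimal sketch, not part of any assignment; the numbers and names below are purely illustrative), the following Python snippet exercises the three requirements above: conditioning a joint distribution, reasoning about matrix shapes and vector similarity, and verifying a zero derivative numerically. If each step feels routine, your background is likely sufficient.

    import numpy as np

    # 1. Probabilistic reasoning: P(Y=y | X=x) from a joint distribution.
    #    Rows index values of X, columns index values of Y.
    joint = np.array([[0.10, 0.30],
                      [0.40, 0.20]])    # P(X=x, Y=y); entries sum to 1
    p_x = joint.sum(axis=1)             # marginal P(X=x)
    p_y_given_x = joint / p_x[:, None]  # P(Y=y | X=x); each row sums to 1
    print(p_y_given_x[0, 1])            # P(Y=1 | X=0) = 0.3 / 0.4 = 0.75

    # 2. Matrix and vector operations: product shapes, vector similarity.
    A, B = np.random.randn(3, 4), np.random.randn(4, 5)
    print((A @ B).shape)                # (3, 4) times (4, 5) gives (3, 5)
    u, v = np.random.randn(4), np.random.randn(4)
    print(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))  # cosine, in [-1, 1]

    # 3. Optimization: f(x) = (x - 2)^2 has zero derivative at its minimum.
    f = lambda x: (x - 2.0) ** 2
    eps = 1e-6
    print((f(2.0 + eps) - f(2.0 - eps)) / (2 * eps))  # central difference at x = 2: ~0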

Grading.
  1. Project: 40% (written report 30%, presentation 10%)
  2. Exam (in-class and open book): 30%
  3. Assignments: 20%
  4. Participation: 10%
The assignment report must be written in LaTeX using the provided assignment report template. Likewise, the project report must be written in LaTeX using the provided project report template; it will be reviewed by the instructor as if it were a conference submission.

Tentative plan.
Date | Topics | Readings | Assignments
Week 1 (January 22) | Logistics, Introduction, Language Modeling | Michael Collins notes on n-gram models and log-linear models | A1 [code] (due 2/4)
Week 2 (January 29) | Deep Learning for NLP: Neural Language Modeling | Colah's blogs on deep learning and LSTMs; NLM papers using feedforward (Bengio et al., 2003), recurrent (Mikolov et al., 2010; Melis et al., 2018), and attention-based (GPT-2) architectures |
Week 3 (February 5) | Deep Learning for NLP: Conditional Neural Language Modeling | |
Week 4 (February 12) | Deep Learning for NLP: An Overview of Other Techniques and Applications | |
Week 5 (February 19) | Structured Prediction in NLP: Tagging | |
Week 6 (February 26) | Structured Prediction in NLP: Parsing | |
Week 7 (March 4) | Unsupervised Learning in NLP: Latent-Variable Models and the EM Algorithm | |
Week 8 (March 11) | Unsupervised Learning in NLP: Variational Autoencoders | |
Spring Recess | | |
Week 9 (March 25) | Unsupervised Learning in NLP: Pretrained Neural Text Representations | |
Week 10 (April 1) | Exam | |
Week 11 (April 8) | Special Topics: TBD (Information Extraction, Question Answering) | | Project proposal due
Week 12 (April 15) | Special Topics: TBD (Dialogue, Grounding) | |
Week 13 (April 22) | Special Topics: TBD (Maximal Mutual Information Representation Learning) | |
Week 14 (April 29) | Project Presentations | |


Other resources.
  1. Speech and Language Processing (3rd edition) by Dan Jurafsky and James H. Martin
  2. A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg
  3. Natural Language Processing by Jacob Eisenstein