The schedule is tentative; check regularly for updates. Readings beginning with "E" are from Eisenstein's *Natural Language Processing*, and readings beginning with "JM" are from Jurafsky and Martin's *Speech and Language Processing*.
| DATE | TOPIC | READINGS | ASSIGNMENT | LINKS |
|------|-------|----------|------------|-------|
| 1/18 | Two Cultures of Linguistics | E1 | | |
| 1/21 | Two Cultures of Linguistics (Part 2) and Naive Bayes | E2.1-2.2; 4.1; 4.3-4.4; JM4 | | |
| 1/25 | Naive Bayes (Part 2) and Logistic Regression | E2.5-2.7; JM5 | HW1 out | |
| 1/28 | Logistic Regression (Part 2) | | | |
| 2/1 | Stochastic Gradient Descent | | | |
| 2/4 | Stochastic Gradient Descent (Part 2) | | HW1 Due; HW2 out | |
| 2/8 | Multinomial Logistic Regression and Perceptrons | E2.3-2.4; 2.8 | | |
| 2/11 | Perceptrons (Part 2) and Multilayer Perceptrons | E3.1-3.3; JM7 | | |
| 2/15 | Neural Networks and Computation Graphs | Kleene 1951 | | |
| 2/18 | Computation Graphs (Part 2) | Baydin et al. 2015 | HW2 Due | |
| 3/1 | Vector Semantics | E14; JM6 | HW3 out | |
| 3/4 | Vector Semantics (Part 2) | | | |
| 3/8 | Quiz | | | |
| 3/11 | Word2vec | | | |
| 3/15 | Singular Value Decomposition and Hidden Markov Models | E7.1-7.4; 8.1; JM8; JMA | HW3 Due | |
| 3/18 | Hidden Markov Models (Part 2) and Structured Perceptrons | E7.5 | HW4 out | |
| 3/22 | Structured Prediction and Conditional Models | | | |
| 3/25 | Syntax | E9.2; JM12 | | |
| 3/29 | Earley Algorithm | | | |
| 4/1 | CKY Algorithm | E10.1-10.2; JM13 | HW4 Due; HW5 out | |
| 4/5 | Probabilistic Context-Free Grammars | E10.3-10.4; JMC | | |
| 4/8 | Probabilistic Context-Free Grammars (Part 2) and Dependency Parsing | E11; JM14 | | Stanza |
| 4/12 | Recurrent Neural Networks | E6.3; 7.6; JM9 | HW5 Due; HW6 out | 1; 2; 3; 4; 5 |
| 4/26 | Attention and Transformers | | | |
| 4/29 | Quiz Review Session | | HW6 Due | |
| 5/3 | Attention and Transformers (Part 2) | | HW7 out | |
| 5/5 | | | Quiz out | |
| 5/12 | | | HW7 Due; Quiz Due | |