The lab component of the course covers the mathematical concepts and computational libraries used in computational linguistics. The purpose of the lab is to familiarize you with linear algebra, differential and vector calculus, Numpy, and PyTorch. These concepts are crucial for understanding the computational linguistics and natural language processing algorithms covered in lecture.
Lab instructors: Kenneth Lai, Haibo Sun
Place and Time: Fridays, 2:20–3:50, in Lown 002 / Zoom.
Lab notes and exercises
Notes and exercises from the lab will be posted here as the semester progresses.
- 1/20: Intro to Numpy and Naive Bayes in Numpy | Numpy tutorial, Numpy tutorial supplement, Naive Bayes in Numpy slides
- 1/27: Intro to Numpy (Part 2) and Linear Classifiers | slides
- 2/3: Broadcasting in Numpy and Linear Classifiers (Part 2) | slides, exercise
- 2/10: Linear Classifiers (Part 3) and Intro to Pandas | slides, gradient supplement, Pandas tutorial
- 2/17: Intro to Pandas (Part 2) and Neural Networks | slides, backpropagation supplement, Goodfellow, Bengio, and Courville book, Nielsen book
- 3/3: Word Vectors | slides, word2vec papers: 1, 2
- 3/10: Structured Perceptrons | slides, exercise
- 3/17: Intro to PyTorch | code, data
- 3/24: Context-Free Grammars and CKY Algorithm | slides, Iyyer slides
- 3/31: Dependency Parsing and Constituency Parsing (Part 2) | Stymne slides, exercise
- 4/14: Recurrent Neural Networks and Transformers | slides, Understanding LSTM Networks, The Illustrated Transformer, papers: Vaswani et al. 2017, Wolf et al. 2020
- 4/21: Contextualized Word Embeddings | slides, The Illustrated BERT, ELMo, and co., papers: ELMo, BERT, GPT: 1, 2, 3, 3.5, 4
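As a taste of the Numpy material from the early labs, here is a minimal sketch of broadcasting (the arrays and examples are hypothetical, not the ones used in lab):

```python
import numpy as np

# A (3, 1) column and a (4,) row broadcast to a (3, 4) grid:
# Numpy aligns trailing dimensions and stretches size-1 axes.
col = np.array([[0], [10], [20]])   # shape (3, 1)
row = np.array([1, 2, 3, 4])        # shape (4,)
grid = col + row                    # shape (3, 4)
print(grid)

# A common NLP use: turn each row of a count matrix into a
# probability distribution by dividing by its row sum.
counts = np.array([[2.0, 1.0, 1.0],
                   [0.0, 3.0, 1.0]])
probs = counts / counts.sum(axis=1, keepdims=True)  # (2, 3) / (2, 1)
print(probs.sum(axis=1))            # each row sums to 1
```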
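For the linear classifier and PyTorch sessions, a training loop typically looks like the following. This is a toy sketch on random data, not the lab's actual code; the model size and hyperparameters are made up:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: 8 examples, 5 features, 3 classes (all hypothetical).
X = torch.randn(8, 5)
y = torch.randint(0, 3, (8,))

model = nn.Linear(5, 3)              # a linear classifier
loss_fn = nn.CrossEntropyLoss()      # softmax + negative log-likelihood
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    opt.zero_grad()                  # clear old gradients
    loss = loss_fn(model(X), y)
    loss.backward()                  # backpropagation computes gradients
    opt.step()                       # gradient descent update

print(loss.item())
```

The same loop structure carries over to the neural networks covered later; only the model definition changes.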
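The CKY session works with context-free grammars in Chomsky normal form. A minimal recognizer, using a tiny hypothetical grammar rather than the one from lab, can be sketched as:

```python
from itertools import product

# A tiny CNF grammar (hypothetical): rules are A -> B C or A -> word.
binary = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("V", "NP"): {"VP"}}
lexical = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "saw": {"V"}}

def cky_recognize(words):
    """Return True iff the sentence is derivable from S under the grammar."""
    n = len(words)
    # chart[i][j] = set of nonterminals spanning words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(lexical.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # try every split point
                for B, C in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= binary.get((B, C), set())
    return "S" in chart[0][n]

print(cky_recognize("the dog saw the cat".split()))  # True
print(cky_recognize("dog the saw".split()))          # False
```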