The lab component of the course covers the mathematical concepts and computational libraries used in computational linguistics. Its purpose is to familiarize you with linear algebra, differential and vector calculus, Numpy, and PyTorch. These concepts are crucial for understanding the computational linguistics and natural language processing algorithms covered in lecture.
Lab instructor: Kenneth Lai
Place and Time: Thursdays, 3:30–5:00, on Zoom / in Shapiro Science Center LL16.
Lab notes and exercises
Notes and exercises from the lab will be posted here as the semester progresses.
- 1/27: Intro to Numpy and Naive Bayes in Numpy | Numpy tutorial, Numpy tutorial supplement, Naive Bayes in Numpy slides
- 2/3: More Intro to Numpy and Linear Classifiers (Part 1) | slides
- 2/10: Broadcasting in Numpy and Linear Classifiers (Part 2) | slides, exercise
- 2/17: Linear Classifiers (Part 3) | slides, gradient supplement
- 3/3: Neural Networks and Intro to Pandas | slides, backpropagation supplement, Pandas tutorial, Goodfellow, Bengio, and Courville book, Nielsen book
- 3/10: More Intro to Pandas and Word Vectors (Part 1) | slides
- 3/17: Word Vectors (Part 2) and Viterbi Algorithm in Numpy | word vectors slides, Viterbi algorithm in Numpy slides, word2vec papers: 1, 2
- 3/24: Structured Perceptrons | slides, exercise
- 3/31: Intro to PyTorch | code
- 4/7: Context-Free Grammars and CKY Algorithm | slides, exercise
- 4/14: Dependency Parsing and More Intro to PyTorch | Stymne dependency parsing slides, last year’s RNN slides
- 4/28: Contextualized Word Embeddings | slides | Olah blog: Understanding LSTM Networks | Alammar blog: The Illustrated Transformer | papers: transformers, ELMo, BERT
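As a small taste of the first lab's material, here is a minimal sketch of multinomial Naive Bayes prediction in Numpy. The function name and argument layout are illustrative, not taken from the lab code: classes are rows, vocabulary words are columns, and everything is kept in log space.

```python
import numpy as np

def nb_predict(log_prior, log_likelihood, counts):
    """Predict the most probable class for one document.

    log_prior: (C,) class log-priors, log P(c)
    log_likelihood: (C, V) word log-likelihoods, log P(w | c)
    counts: (V,) word-count vector for the document

    log P(c | doc) is proportional to
        log P(c) + sum_w counts[w] * log P(w | c),
    which is one matrix-vector product per document.
    """
    scores = log_prior + log_likelihood @ counts
    return int(np.argmax(scores))
```

The matrix-vector product `log_likelihood @ counts` computes the weighted sum over the vocabulary for every class at once, which is the core trick of vectorizing the classifier.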
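The 2/10 lab covers broadcasting, the Numpy rule that lets arrays of different shapes combine in one operation. A short illustrative example (not from the lab handout): trailing dimensions are aligned, and size-1 axes are stretched without copying data.

```python
import numpy as np

X = np.arange(6).reshape(2, 3)             # shape (2, 3)
row_means = X.mean(axis=1, keepdims=True)  # shape (2, 1), thanks to keepdims
centered = X - row_means                   # (2, 3) - (2, 1) broadcasts to (2, 3)
print(centered.sum(axis=1))                # each row now sums to 0
```

Without `keepdims=True` the means would have shape `(2,)`, which broadcasts against the wrong axis; keeping the size-1 axis is what makes the row-wise subtraction line up.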
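For the 3/17 lab, the Viterbi algorithm vectorizes nicely in Numpy. The sketch below is one possible layout, assuming all HMM parameters are given as log-probability arrays (the function name and argument shapes are my own conventions, not the lab's):

```python
import numpy as np

def viterbi(log_init, log_trans, log_emit, obs):
    """Most likely state sequence for an HMM, all inputs in log space.

    log_init: (S,) initial state log-probs
    log_trans: (S, S) log_trans[i, j] = log P(state j | state i)
    log_emit: (S, V) log_emit[s, w] = log P(word w | state s)
    obs: sequence of observation indices, length T
    """
    S = log_init.shape[0]
    T = len(obs)
    delta = np.empty((T, S))             # best log-prob of a path ending in each state
    back = np.empty((T, S), dtype=int)   # back-pointers (row 0 is unused)
    delta[0] = log_init + log_emit[:, obs[0]]
    for t in range(1, T):
        # scores[i, j] = delta[t-1, i] + log_trans[i, j], via broadcasting
        scores = delta[t - 1][:, None] + log_trans
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    # Follow back-pointers from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

The inner maximization over previous states is a single `(S, S)` array built by broadcasting, so the only Python loop is over time steps.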