The lab component of the course covers mathematical concepts used in computational linguistics. Its purpose is to familiarize you with not-so-basic probability theory, information theory, Bayesian inference, linear algebra, and descriptive and inferential statistics. These concepts are crucial to understanding the computational linguistics and natural language processing algorithms covered in lecture.

Lab instructors: Kenneth Lai and Keigh Rim

Place and Time: Fridays, 12:00 – 2:00, on Zoom.

**Lab notes**

Notes from the lab will be posted here as the semester progresses.

- 2/5: Intro to NumPy | tutorial, NumPy tutorial supplement, Naive Bayes in NumPy slides, review of Naive Bayes slides
- 2/12: Review of Mathematics, Gradient Descent | jamboard
- 2/19: Logistic Regression in NumPy, Information Theory | jamboard1, jamboard2
- 2/26: From Logistic Regression to Neural Networks (Part 1) | slides, review of logistic regression slides
- 3/5: From Logistic Regression to Neural Networks (Part 2) | slides, Goodfellow, Bengio, and Courville book, Nielsen book
- 3/12: From Logistic Regression to word2vec (skip-gram) | onenote, code
- 3/19: Viterbi Algorithm in NumPy | slides
- 3/26: Perceptrons and Structured Perceptrons | slides, whiteboard1, whiteboard2
- 4/9: Intro to PyTorch | onenote, code
- 4/16: Context-Free Grammars and (Probabilistic) CKY Algorithm | slides
- 4/23: Dependency Parsing and Recurrent Neural Networks | Stymne dependency parsing slides, RNN slides
- 4/30: Encoder-Decoder Models, Attention, Transformers, and Contextualized Word Embeddings | slides | Alammar blog: Seq2seq Models With Attention, The Illustrated Transformer | papers: transformers, ELMo, BERT
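To give a flavor of the kind of exercise done in lab (e.g., the 3/19 session on the Viterbi algorithm in NumPy), here is a minimal sketch of a log-space Viterbi decoder. The toy HMM below (Hot/Cold weather states, ice-cream-count observations) is an invented illustration, not course material.

```python
import numpy as np

def viterbi(log_init, log_trans, log_emit, obs):
    """Most likely HMM state sequence, computed in log space.

    log_init:  (S,)   log initial state probabilities
    log_trans: (S, S) log transition probabilities, rows = from-state
    log_emit:  (S, V) log emission probabilities
    obs:       list of observation indices
    """
    S = log_init.shape[0]
    T = len(obs)
    delta = np.empty((T, S))            # best log prob of a path ending in state s at time t
    back = np.empty((T, S), dtype=int)  # backpointers
    delta[0] = log_init + log_emit[:, obs[0]]
    for t in range(1, T):
        # (S, S) grid of scores: previous state (rows) x next state (columns)
        scores = delta[t - 1][:, None] + log_trans
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    # Trace back the best path from the final time step
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy HMM: states 0=Hot, 1=Cold; observations = ice creams eaten (indices 0-2)
log_init = np.log([0.8, 0.2])
log_trans = np.log([[0.6, 0.4],
                    [0.5, 0.5]])
log_emit = np.log([[0.2, 0.4, 0.4],   # Hot
                   [0.5, 0.4, 0.1]])  # Cold
print(viterbi(log_init, log_trans, log_emit, [2, 0, 2]))  # -> [0, 1, 0]
```

Working in log space avoids the numerical underflow that raw probability products cause on longer sequences; the labs use the same trick for Naive Bayes.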