Neural Networks II
Lecture Slides (pdf) (ipynb)
Tutorial Exercise (pdf) (ipynb)
This week we continue our discussion of neural networks and look at their application to text analysis problems.
Required Readings
- Jurafsky, Daniel, and James H. Martin. 2026. Speech and Language Processing, Ch. 8: "Transformers."
- Mikolov, Tomas, et al. 2013. “Efficient Estimation of Word Representations in Vector Space.” arXiv:1301.3781. https://arxiv.org/abs/1301.3781
Additional Readings
- Bengio, Yoshua, Réjean Ducharme, Pascal Vincent, and Christian Janvin. 2003. “A Neural Probabilistic Language Model.” The Journal of Machine Learning Research 3:1137–55. http://www.jmlr.org/papers/volume3/bengio03a/bengio03a.pdf
- Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. 2017. “Attention Is All You Need.” Advances in Neural Information Processing Systems. Vol. 30. https://proceedings.neurips.cc/paper_files/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
Tutorial
- Applying neural networks to text
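To preview the flavor of the tutorial, the sketch below applies a simple neural approach to text: each document is represented by the average of its word vectors (a "neural bag of words"), and a logistic classifier head is trained on top with gradient descent. This is not the course's actual tutorial code; the toy corpus, embedding dimension, and hyperparameters are all illustrative assumptions, and the embeddings are random rather than pretrained word2vec vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy corpus (not from the course materials): 1 = positive, 0 = negative.
texts = ["good great fun", "bad awful boring", "great fun", "boring bad"]
labels = np.array([1, 0, 1, 0])

# Build a vocabulary and random word embeddings. In practice these would
# come from a method like word2vec (Mikolov et al. 2013).
vocab = sorted({w for t in texts for w in t.split()})
word2id = {w: i for i, w in enumerate(vocab)}
dim = 8
E = rng.normal(size=(len(vocab), dim))  # embedding matrix, one row per word

def featurize(text):
    """Represent a text as the average of its word vectors."""
    ids = [word2id[w] for w in text.split() if w in word2id]
    return E[ids].mean(axis=0)

X = np.stack([featurize(t) for t in texts])

# A logistic classifier head trained by full-batch gradient descent
# on the cross-entropy loss.
w = np.zeros(dim)
b = 0.0
lr = 0.5
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid of the logits
    grad = p - labels                        # d(cross-entropy) / d(logits)
    w -= lr * (X.T @ grad) / len(texts)
    b -= lr * grad.mean()

preds = ((X @ w + b) > 0).astype(int)
print("train accuracy:", (preds == labels).mean())
```

Replacing the random embedding matrix with pretrained vectors, and the averaging step with a learned composition (e.g. attention, as in Vaswani et al. 2017), recovers the progression this week's readings follow.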