Lectures 6-10 of the DL in NLP course

A brief summary of our classes in the second half of the course.

CS224n lecture recordings:

  1. https://youtu.be/Keqep_PKrY8
  2. https://youtu.be/QuELiw8tbx8
  3. https://youtu.be/IxQtK2SjWWM

Slides from our lectures:

  1. Seminar 9. Attention is All You Need
  2. Seminar 10. 2018 is the Year of Transfer Learning in NLP

Additional materials:

  1. The Unreasonable Effectiveness of Recurrent Neural Networks
  2. Deep Learning Book
  3. Understanding LSTM Networks
  4. On the difficulty of training recurrent neural networks
  5. Attention
  6. Neural Machine Translation by Jointly Learning to Align and Translate
  7. The Annotated Encoder-Decoder with Attention
  8. Good practices in Modern Tensorflow for NLP
  9. Attention is All You Need
  10. The Illustrated Transformer
  11. A proposal of good practices for files, folders and models architecture
  12. Deep contextualized word representations
  13. Universal Language Model Fine-tuning for Text Classification
  14. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  15. The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
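
Since several of the materials above center on the attention mechanism (the attention posts, the Bahdanau et al. paper, and "Attention is All You Need"), here is a minimal NumPy sketch of scaled dot-product attention as a quick reference; the function name and toy shapes are illustrative and not taken from any of the linked resources.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from "Attention is All You Need".

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns an (n_queries, d_v) matrix of attended values.
    """
    d_k = Q.shape[-1]
    # Similarity of each query with each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value vectors
    return weights @ V

# Toy example: 2 queries attending over 3 key/value pairs
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 8)
```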
Written on October 26, 2018