The NLPwDL course provides in-depth knowledge of deep learning methods in Natural Language Processing (NLP). It covers methods and architectures such as RNNs, LSTMs, attention, and Transformers, and studies them in NLP applications such as language modeling, document classification, machine translation, and abstractive/extractive summarization. The course also discusses the interpretability and energy consumption of NLP models, as well as the mitigation of societal biases in NLP systems.
Covered topics:
- Convolutional Neural Networks in document classification
- Recurrent Neural Networks for language modeling
- Sequence-to-sequence models in abstractive summarization
- The attention mechanism in sequence-to-sequence models (a minimal sketch follows this list)
- Transformers for neural machine translation
- Large language models (BERT, GPT-x, etc.)
- Advanced deep learning topics: energy consumption and model compression, parameter-efficient learning, interpretability, and learning fair and invariant representations
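To give a flavour of the material, below is a minimal sketch of scaled dot-product attention, the core operation behind the attention and Transformer topics above. This is not course code: the function name and the toy tensors are purely illustrative, and only plain PyTorch is assumed.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Illustrative attention; q, k, v have shape (batch, seq_len, d_model)."""
    d_k = q.size(-1)
    # Similarity of each query with each key, scaled so softmax gradients stay stable.
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 receive zero attention weight.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)  # attention distribution over the keys
    return torch.matmul(weights, v)          # weighted sum of the value vectors

# Tiny smoke test with random tensors.
q = k = v = torch.randn(2, 5, 16)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 5, 16])
```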
Prerequisites:
- It is strongly suggested to first take the VL+UE Natural Language Processing course offered in winter semesters; the NLP course covers all necessary prerequisites for NLPwDL. If the NLP course has not been taken, the following skills are expected:
  - Knowledge of machine learning, neural networks, and linear algebra.
  - Knowledge of text processing, document classification, word embeddings, and the principles of NLP in general.
  - Good programming skills in Python; you should be able to implement a classification task in PyTorch (at roughly the level of the sketch below).
- Prior knowledge of deep learning models is not required.
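As a rough yardstick for the expected programming level, here is a minimal, self-contained PyTorch classification sketch. It is not an assignment from the course; the model, hyperparameters, and random stand-in data are hypothetical.

```python
import torch
import torch.nn as nn

# Hypothetical toy setup: 1000-word vocabulary, 3 classes, random data
# standing in for a real corpus.
vocab_size, num_classes, seq_len, batch = 1000, 3, 20, 32

class BagOfEmbeddingsClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 64)
        self.fc = nn.Linear(64, num_classes)

    def forward(self, token_ids):
        # Average the word embeddings over the sequence, then classify.
        return self.fc(self.embed(token_ids).mean(dim=1))

model = BagOfEmbeddingsClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(5):  # a few dummy training steps on random batches
    x = torch.randint(0, vocab_size, (batch, seq_len))
    y = torch.randint(0, num_classes, (batch,))
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.3f}")
```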
Teaching materials from the latest iteration of the course (Summer Semester 2022):
- N-gram Embeddings with Convolutional Neural Networks (slides)
- Recurrent Neural Networks (slides)
- RNNs: Language and Sequence to Sequence Models (slides)
- Attention Networks (slides)
- Transformers (slides)
- Large Language Models (slides)