Organisation
| Time | Tuesday, 10:00h-11:30h |
|---|---|
| Room | 142 |
| Credits | 2 SWS / 3 ECTS |
| Exam | written exam |
Exam | written exam |
Announcements
- First lesson in term WS 24-25: Tuesday, 08.10.2024
Natural Language Processing
Natural Language Processing (NLP) deals with techniques that enable computers to understand the meaning of text written in a natural language. NLP therefore constitutes an essential part of Human-Computer Interaction (HCI). As a science, NLP can be considered the field where Computer Science, Artificial Intelligence, Machine Learning and Linguistics overlap.
The image below lists popular NLP use cases in the leftmost column. To provide these applications, various NLP-specific tools, listed in the center column, are applied. These NLP tools in turn implement more general algorithms, e.g. from Machine Learning (right column). Today, one specific type of Deep Neural Network, the Transformer, is applied in particular in Large Language Models (LLMs) like GPT.
All of the above-mentioned NLP applications, tools and algorithms are addressed in this lecture. At its heart, however, the lecture places a strong emphasis on Large Language Models and how they can be applied in the context of Retrieval Augmented Generation (RAG). Moreover, the Transformer, the Neural Network architecture underlying all LLMs, is explained in detail.
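To give a first impression of the RAG idea treated in the lecture, the following minimal sketch illustrates the two core steps: retrieve the documents most relevant to a query, then combine them with the question into a prompt for an LLM. The toy retriever, its word-overlap scoring, and the prompt template are all illustrative assumptions; real systems use embedding similarity and an actual LLM call.

```python
def retrieve(query, documents, k=1):
    """Toy retriever: return the k documents sharing the most words
    with the query (real RAG systems use embedding similarity)."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, context_docs):
    """Combine the retrieved context and the question into a single
    prompt string, which would then be passed to an LLM."""
    context = "\n".join(context_docs)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Tiny illustrative document collection (assumed data).
docs = [
    "The transformer architecture relies on self-attention.",
    "Stuttgart is a city in Germany.",
]
query = "What does the transformer rely on?"
prompt = build_prompt(query, retrieve(query, docs))
print(prompt)
```

The point of the sketch is only the control flow: retrieval narrows the collection down to relevant context, and the generation step answers on the basis of that context rather than from the model's parameters alone.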
Content
- Jupyterbook as .html: https://lectures.mi.hdm-stuttgart.de/mi7nlp/intro.html (ask the lecturer for the access credentials)
- Gitlab repo of Jupyterbook and all sources: https://gitlab.mi.hdm-stuttgart.de/maucher/nlpbook
- Link to Checker Quests (old version - will be adapted soon).