  • MLC_NLP
  • Duration: 1 day
  • 0 ITK points
  • 0 scheduled dates
  • Czech Republic: 4 990 Kč
  • Slovakia: 210 €

This course is intended for anyone fascinated by the capabilities of large language models and generative artificial intelligence who wants to understand this field beyond the level of an ordinary user. Together, we will explore transformers—the fundamental building blocks of modern language models—introduce the most well-known architectures, and demonstrate how large language models can be used in various applications. No paid third-party accounts are required for the practical exercises. We will use open-source models which, when used correctly, can be just as good as the largest commercial ones.


Prerequisites

  • Basic knowledge of programming in Python.
  • Knowledge of machine learning at the level of an Introduction to Machine Learning course.

Course outline

  • Generative AI for text and images
  • Evolution of language modeling
  • Transformers
  • Types of transformers for language modeling (encoder, decoder, encoder–decoder)
  • Reinforcement learning from human feedback (RLHF)
  • Selected transformer-based language models (BERT, GPT, LLaMA, T5, BART…)
  • Practical example of text classification using transformers with the HuggingFace library in Google Colab (see the first sketch after this outline)
  • Prompt engineering: in-context learning; zero-shot, one-shot, and few-shot prompting; key configuration parameters of the generation process
  • Practical example of in-context learning using the HuggingFace library in Google Colab (sketch below)
  • Fine-tuning large language models and parameter-efficient fine-tuning (LoRA)
  • Evaluation of generative language models (ROUGE, BLEU), with a sketch below
  • Practical example of parameter-efficient fine-tuning using the HuggingFace library in Google Colab (sketch below)
  • Retrieval-Augmented Generation (RAG), with a sketch below
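
To give a flavour of the classification exercise, here is a minimal sketch using the Hugging Face pipeline API. The checkpoint name is an illustrative assumption, not necessarily the model used in the course notebook.

```python
# Minimal text-classification sketch with the Hugging Face pipeline API.
# The checkpoint below is an illustrative choice, not necessarily the one
# used in the course exercise.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("This course explains transformers really well."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```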
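
In-context learning needs no weight updates: the task is demonstrated directly in the prompt. A minimal few-shot sketch follows; the small GPT-2 checkpoint and the generation parameters (temperature, top_p) are illustrative assumptions, and a stronger open-source model would complete the pattern far better.

```python
# Few-shot prompting sketch: the task is shown in the prompt itself and
# no parameters are trained. GPT-2 is used only because it is tiny.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Translate English to French.\n"
    "English: cheese\nFrench: fromage\n"
    "English: bread\nFrench: pain\n"
    "English: water\nFrench:"
)

output = generator(
    prompt,
    max_new_tokens=5,
    do_sample=True,
    temperature=0.7,  # key generation parameter: sampling temperature
    top_p=0.9,        # key generation parameter: nucleus-sampling cutoff
)
print(output[0]["generated_text"])
```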
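
Metrics such as ROUGE score a generated text against a reference text by n-gram overlap. Here is a minimal sketch with the Hugging Face evaluate library; note that it needs the rouge_score package installed.

```python
# ROUGE sketch using the Hugging Face `evaluate` library.
# Requires: pip install evaluate rouge_score
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["the cat sat on the mat"],
    references=["a cat was sitting on the mat"],
)
print(scores)  # rouge1, rouge2, rougeL, rougeLsum overlap scores
```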
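
LoRA freezes the base model and trains only small low-rank adapter matrices. The following configuration sketch uses the peft library; the GPT-2 base model and the hyperparameters (r, lora_alpha, dropout) are illustrative assumptions.

```python
# Parameter-efficient fine-tuning sketch: wrap a frozen base model with
# small trainable LoRA adapters via the `peft` library.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")

config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, config)
model.print_trainable_parameters()
# Prints roughly: trainable params ~0.3M of ~124M total, i.e. well under
# 1% of the weights are updated during fine-tuning.
```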
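
RAG augments the prompt with retrieved context before generation. Below is a minimal sketch of the retrieval step using sentence-transformers; the embedding model, the toy corpus, and the prompt template are illustrative assumptions.

```python
# Retrieval-Augmented Generation sketch: embed a corpus, retrieve the
# passage most similar to the query, and build an augmented prompt.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")

docs = [
    "Transformers use self-attention to relate tokens to each other.",
    "LoRA trains small low-rank adapters instead of all model weights.",
    "ROUGE measures n-gram overlap between generated and reference text.",
]
doc_emb = embedder.encode(docs, normalize_embeddings=True)

query = "How does LoRA reduce the number of trainable parameters?"
q_emb = embedder.encode([query], normalize_embeddings=True)

# On normalized vectors, cosine similarity is a plain dot product.
best = int(np.argmax(doc_emb @ q_emb[0]))
prompt = f"Context: {docs[best]}\n\nQuestion: {query}\nAnswer:"
print(prompt)  # this augmented prompt is then passed to a generator model
```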

Prices do not include VAT.

Custom Training

Didn’t find a suitable date or need training tailored to your team’s specific needs? We’ll be happy to prepare custom training for you.