Czech Republic (23 000 CZK)
Slovakia (1 000 €)
This course is designed for programmers who want to understand how large language models (LLMs) work under the hood, progressing hands-on from simple statistical language models through RNNs/LSTMs to transformers and mini-GPTs. In the second part of the course, participants will learn how to work with ready-made models (Hugging Face), perform fine-tuning (including LoRA), and build a practical application on top of their own documents using RAG (retrieval-augmented generation). Production aspects are also covered: latency, optimization, quantization, and deployment as an API, including Docker.
Fundamentals of neural networks and NLP
Prices do not include VAT.
Didn’t find a suitable date or need training tailored to your team’s specific needs? We’ll be happy to prepare custom training for you.