The Leanpub Podcast 🎙️ Feat. Andriy Burkov, Author of The Hundred-Page Language Models Book and The Hundred-Page Language Models Course
Episode Details
In this episode of the Leanpub Podcast, Len Epp interviews Andriy Burkov, author of The Hundred-Page Language Models Book: Hands-on with PyTorch and The Hundred-Page Language Models Course.
Andriy shares his deep expertise in machine learning and artificial intelligence, discussing the rapid rise of LLMs like GPT, the realities behind AGI hype, and the current state of AI research and applications.
They also explore Andriy’s experience writing the new book, the challenges of teaching technical concepts in an age of short attention spans, and the lessons learned from his earlier bestseller The Hundred-Page Machine Learning Book. With a background that includes work in autonomous driving, Andriy offers a critical perspective on the limits of current technology and what’s truly feasible today.
This interview was recorded on June 23, 2025.
Thank you for watching! Please like and leave a comment; we'd love to hear from you.
About the Book

Master language models through mathematics, illustrations, and code―and build your own from scratch!
The Hundred-Page Language Models Book by Andriy Burkov, the follow-up to his bestselling The Hundred-Page Machine Learning Book (now in 12 languages), offers a concise yet thorough journey from language modeling fundamentals to the cutting edge of modern Large Language Models (LLMs). Within Andriy's famous "hundred-page" format, readers will master both theoretical concepts and practical implementations, making it an invaluable resource for developers, data scientists, and machine learning engineers.
The Hundred-Page Language Models Book allows you to:
- Master the mathematical foundations of modern machine learning and neural networks
- Build and train three architectures of language models in Python
- Understand and code a Transformer language model from scratch in PyTorch
- Work with LLMs, including instruction finetuning and prompt engineering
Written in a hands-on style with working Python code examples, this book progressively builds your understanding from basic machine learning concepts to advanced language model architectures. All code examples run on Google Colab, making it accessible to anyone with a modern laptop.
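To give a flavour of the kind of hands-on code the book works toward (this is an illustrative sketch, not code taken from the book; the names and dimensions are made up), here is the core operation inside a Transformer block, single-head scaled dot-product self-attention, in PyTorch:

```python
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention.

    x: (seq_len, d_model) token embeddings
    w_q, w_k, w_v: (d_model, d_head) projection matrices
    """
    q = x @ w_q                                # queries, (seq_len, d_head)
    k = x @ w_k                                # keys,    (seq_len, d_head)
    v = x @ w_v                                # values,  (seq_len, d_head)
    scores = q @ k.T / math.sqrt(k.shape[-1])  # similarity of every token with every other token
    weights = torch.softmax(scores, dim=-1)    # attention weights, each row sums to 1
    return weights @ v                         # each token becomes a weighted mix of values

# Toy usage: 5 tokens, 16-dimensional embeddings, an 8-dimensional head.
torch.manual_seed(0)
x = torch.randn(5, 16)
w_q, w_k, w_v = (torch.randn(16, 8) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # torch.Size([5, 8])
```

A full Transformer adds causal masking, multiple heads, and learned projections wrapped in `nn.Module` layers, which is the level of detail the book builds up to.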
About the technology
Language models have evolved from simple n-gram statistics to become one of the most transformative technologies in AI, rivaling only personal computers in their impact. This book spans the complete evolution—from count-based methods to modern Transformer architectures—delivering a thorough understanding of both how these models work and how to implement them.
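As a concrete taste of the count-based starting point (again an illustrative sketch, not code from the book), a bigram model estimates the probability of the next word simply from how often it followed the current word in a corpus:

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count how often each word follows each other word.
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def next_word_probs(word):
    """P(next word | word), estimated from relative bigram frequencies."""
    counts = bigram_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # 'cat': 2/3, 'mat': 1/3
```

Neural and Transformer models replace these brittle counts with learned representations that generalize to word sequences never seen verbatim in the training data.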
About the book
The Hundred-Page Language Models Book takes a unique approach by introducing language modeling concepts gradually, starting with foundational methods before advancing to modern architectures. Each chapter builds upon the previous one, making complex concepts accessible through clear explanations, diagrams, and practical implementations.
What's inside
- Essential machine learning and neural network fundamentals
- Text representation techniques and basic language modeling
- Implementation of RNNs and Transformer architectures with PyTorch
- Practical guidance on finetuning language models and prompt engineering
- Important considerations on hallucinations and ways to evaluate models (see the short perplexity sketch after this list)
- Additional resources for advanced topics through the book's wiki
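As one concrete example of model evaluation (a hedged sketch with made-up numbers, not the book's code), perplexity is the exponential of the average per-token cross-entropy, so lower values mean the model is less surprised by held-out text:

```python
import math

# Probabilities a model assigned to each actual next token in a held-out text
# (these numbers are invented purely for illustration).
token_probs = [0.25, 0.10, 0.60, 0.05, 0.30]

# Average negative log-likelihood per token (cross-entropy, in nats).
avg_nll = -sum(math.log(p) for p in token_probs) / len(token_probs)

perplexity = math.exp(avg_nll)
print(f"cross-entropy: {avg_nll:.3f} nats, perplexity: {perplexity:.2f}")
```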
The complete code and additional resources are available through the book's website at thelmbook.com/wiki.
About the reader
Readers should have programming experience in Python. While familiarity with PyTorch and tensors is helpful, it's not required. College-level math knowledge is beneficial, but the book presents mathematical concepts intuitively with clear examples and diagrams.
About the Course

Hi everyone, Andriy here: This Leanpub course is basically the book content, plus 6 quizzes and 27 exercises to help you focus. It also includes about 3 hours of videos where I talk about some of the fundamental concepts. It's for people who want to learn the material for work, but whose company would rather pay for them to do a course than have them sit around in their pyjamas and fuzzy slippers reading a book :) Let me know what you think!
Master language models through mathematics, illustrations, and code―and build your own from scratch!
The Hundred-Page Language Models Course by Andriy Burkov, the follow-up to his bestselling The Hundred-Page Machine Learning Book (now in 12 languages), offers a concise yet thorough journey from language modeling fundamentals to the cutting edge of modern Large Language Models (LLMs). Within Andriy's famous "hundred-page" format, readers will master both theoretical concepts and practical implementations, making it an invaluable resource for developers, data scientists, and machine learning engineers.
The Hundred-Page Language Models Course allows you to:
- Master the mathematical foundations of modern machine learning and neural networks
- Build and train three architectures of language models in Python
- Understand and code a Transformer language model from scratch in PyTorch
- Work with LLMs, including instruction finetuning and prompt engineering
Written in a hands-on style with working Python code examples, this course progressively builds your understanding from basic machine learning concepts to advanced language model architectures. All code examples run on Google Colab, making it accessible to anyone with a modern laptop.
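To show what "build and train" looks like in practice (an illustrative sketch with toy dimensions, not the course's own code), here is a single training step for a tiny recurrent next-token predictor in PyTorch:

```python
import torch
import torch.nn as nn

class TinyRNNLM(nn.Module):
    """A deliberately small recurrent language model: embed tokens, run a GRU, project to vocabulary logits."""
    def __init__(self, vocab_size=100, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        hidden_states, _ = self.rnn(self.embed(token_ids))
        return self.head(hidden_states)              # (batch, seq_len, vocab_size)

torch.manual_seed(0)
model = TinyRNNLM()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Fake batch of token ids: the model learns to predict each token from the ones before it.
tokens = torch.randint(0, 100, (8, 21))              # (batch, seq_len + 1)
inputs, targets = tokens[:, :-1], tokens[:, 1:]

optimizer.zero_grad()
logits = model(inputs)                               # (8, 20, 100)
loss = loss_fn(logits.reshape(-1, 100), targets.reshape(-1))
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```

Swapping the GRU for stacked self-attention layers is, in essence, the step from recurrent models to Transformers.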
About the technology
Language models have evolved from simple n-gram statistics to become one of the most transformative technologies in AI, rivaling only personal computers in their impact. This course spans the complete evolution—from count-based methods to modern Transformer architectures—delivering a thorough understanding of both how these models work and how to implement them.
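And to see how these models work at generation time (again a sketch with a stand-in model rather than course code), every autoregressive language model produces text the same basic way: feed in the tokens so far, get a distribution over the next token, append a choice, and repeat.

```python
import torch

torch.manual_seed(0)
vocab = ["<s>", "the", "cat", "sat", "on", "mat", "."]

# Stand-in for a trained model: a fixed random table of next-token logits.
fake_logits = torch.randn(len(vocab), len(vocab))

def next_token_logits(token_ids):
    """A toy 'model' whose next-token distribution depends only on the last token."""
    return fake_logits[token_ids[-1]]

tokens = [0]                                 # start with the <s> token
for _ in range(6):
    probs = torch.softmax(next_token_logits(tokens), dim=-1)
    tokens.append(int(torch.argmax(probs)))  # greedy decoding: always pick the most likely token

print(" ".join(vocab[t] for t in tokens[1:]))
```

A real LLM replaces the toy lookup table with a deep Transformer and usually samples from the distribution instead of taking the argmax, but the autoregressive loop is the same.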
About the course
The Hundred-Page Language Models Course takes a unique approach by introducing language modeling concepts gradually, starting with foundational methods before advancing to modern architectures. Each chapter builds upon the previous one, making complex concepts accessible through clear explanations, diagrams, and practical implementations.
What's inside
- Essential machine learning and neural network fundamentals
- Text representation techniques and basic language modeling
- Implementation of RNNs and Transformer architectures with PyTorch
- Practical guidance on finetuning language models and prompt engineering
- Important considerations on hallucinations and ways to evaluate models
- Additional resources for advanced topics through the book's wiki
The complete code and additional resources are available through the course's website at thelmbook.com/wiki.
About the reader
Readers should have programming experience in Python. While familiarity with PyTorch and tensors is helpful, it's not required. College-level math knowledge is beneficial, but the course presents mathematical concepts intuitively with clear examples and diagrams.
Testimonials for the book version
Vint Cerf, Internet pioneer and Turing Award recipient: "This book cleared up a lot of conceptual confusion for me about how Machine Learning actually works - it is a gem of clarity."
Tomáš Mikolov, the author of word2vec and FastText: "The book is a good start for anyone new to language modeling who aspires to improve on state of the art."
About the Author

Andriy Burkov holds a Ph.D. in Artificial Intelligence and is a recognized expert in machine learning and natural language processing. As a machine learning leader, he has successfully led dozens of production-grade AI projects in different business domains at Fujitsu and Gartner. His previous books have been translated into a dozen languages and are used as textbooks in many universities worldwide, and his work has impacted millions of machine learning practitioners and researchers.
Follow the author here!
Leanpub course LAUNCH 🚀 The Hundred-Page Language Models Course
Clips From This Episode of the Leanpub Podcast
