The Ultimate Guide to Fine-Tuning Large Language Models with PyTorch and HuggingFace
About the Book
This book is a practical guide to fine-tuning Large Language Models (LLMs), offering both a high-level overview and detailed instructions on how to train these models for specific tasks. It covers essential topics, from loading quantized models and setting up LoRA adapters to properly formatting datasets and selecting the right training arguments. The book focuses on the key details and "knobs" you need to adjust to effectively handle LLM fine-tuning. Additionally, it provides a comprehensive overview of the tools you'll need along the way, including HuggingFace's Transformers, Datasets, and PEFT packages, as well as BitsAndBytes, Llama.cpp, and Ollama. The final chapter includes troubleshooting tips for common error messages and exceptions that may arise.
Table of Contents
- Frequently Asked Questions (FAQ)
- 100% Human Writing
- Why Fine-Tune LLMs?
- How Difficult Is It to Fine-Tune an LLM?
- Why This Book?
- Who Should Read This Book?
- What Do I Need to Know?
- What Setup Do I Need?
- How to Read This Book
- Chapter 0: TL;DR
- Loading a Quantized Base Model
- Setting Up Low-Rank Adapters (LoRA)
- Formatting Your Dataset
- Fine-Tuning with SFTTrainer
- Querying the Model
- Chapter 1: Pay Attention to LLMs
- Language Models, Small and Large
- Transformers
- Attention is All You Need
- No Such Thing As Too Much RAM
- Flash Attention and SDPA
- Types of Fine-Tuning
- Self-Supervised
- Supervised
- Instruction
- Preference
- Chapter 2: Loading a Quantized Base Model
- Quantization in a Nutshell
- Half-Precision Weights
- The Brain Float
- Loading Models
- Mixed Precision
- BitsAndBytes
- 8-bit Quantization
- 4-bit Quantization
- The Secret Lives of Dtypes
- Chapter 3: Low-Rank Adaptation (LoRA)
- Low-Rank Adaptation in a Nutshell
- Parameter Types and Gradients
- PEFT
- target_modules
- The PEFT Model
- modules_to_save
- Embeddings
- Managing Adapters
- Chapter 4: Formatting Your Dataset
- Formatting in a Nutshell
- Applying Templates
- Supported Formats
- BYOFF (Bring Your Own Formatting Function)
- BYOFD (Bring Your Own Formatted Data)
- The Tokenizer
- Data Collators
- Packed Dataset
- Advanced - BYOT (Bring Your Own Template)
- Chat Template
- Custom Template
- Special Tokens FTW
- Chapter 5: Training with HuggingFace
- Training in a Nutshell
- Fine-Tuning with SFTTrainer
- SFTConfig
- Memory Usage Arguments
- Mixed-Precision Arguments
- Dataset-Related Arguments
- Typical Training Arguments
- Environment and Logging Arguments
- The Actual Training (For Real!)
- Attention
- Flash Attention 2
- PyTorch's SDPA
- Studies, Ablation-Style
- Chapter 6: Deploying It Locally
- Deploying in a Nutshell
- Querying the Model
- Llama.cpp
- GGUF File Format
- Converting Adapters
- Converting Full Models
- Serving Models
- Ollama
- Llama.cpp
- Chapter -1: Troubleshooting
- Appendix A: Setting Up Your GPU Pod
- Appendix B: Data Types' Internal Representation