Ollama in Action: Building Safe, Private AI with LLMs, Function Calling and Agents

Mark Watson

Table of Contents

  • Preface
    • About the Author
    • Requests from the Author
    • Why Should We Care About Privacy?
  • Setting Up Your Computing Environment for Using Ollama and Using Book Example Programs
    • Python Build Tools
  • Using Ollama From the Command Line
    • Using JSON Format
    • Analysis of Images
    • Analysis of Source Code Files
  • Short Examples
    • Using The Ollama Python SDK with Image and Text Prompts
    • Using the OpenAI Compatibility APIs with Local Models Running on Ollama
  • LLM Tool Calling with Ollama
    • Example Showing the Use of Tools Developed Later in this Chapter
    • Tool for Reading and Writing File Contents
    • Tool for Getting File Directory Contents
    • Tool for Accessing SQLite Databases Using Natural Language Queries
    • Tool for Summarizing Text
    • Tool for Web Search and Fetching Web Pages
    • Tools Wrap Up
  • Pydantic AI Experiments
    • Weather Lookup Tool Use Example
    • DuckDuckGo Search Summary Tool Example
    • Wrap Up for the Pydantic AI Library
  • Automatic Evaluation of LLM Results: More Tool Examples
    • Tool For Judging LLM Results
    • Evaluating LLM Responses Given a Chat History
    • A Tool for Detecting Hallucinations
    • Wrap Up
  • Prompt Caching
    • Caching is Implicit with Ollama
    • Example Code to Show Caching Effectiveness
    • Wrap Up for Prompt Caching
  • Building Agents with Ollama and the Hugging Face Smolagents Library
    • Choosing Specific LLMs for Writing Agents
    • Installation notes
    • Overview of the Hugging Face Smolagents Library
    • Overview of LLM Agents (optional section)
    • Let’s Write Some Code
    • Output from Third Example: “Read the text in the file ‘data/economics.txt’ file and then summarize this text.”
    • Agents Wrap Up
  • Using AG2 Open-Source AgentOS LLM-Based Agent Framework for Generating and Executing Python Code
    • Example Implementation
    • Example Output
    • Wrap Up for Using AG2’s Agent Framework for Generating and Executing Python Code
  • Using the Unsloth Library on Google Colab to Fine-Tune Models for Ollama
    • Colab Notebook 1: A Quick Test of Fine Tuning and Deployment to Ollama on a Laptop
    • Fine Tuning Using a Fun Things To Do in Arizona Data Set
    • Third Colab Notebook That Fine Tunes a Larger Model
    • Fine Tuning Wrap Up
  • DSP Experiments
    • DSP Uses Pydantic for Type Signatures - a First Ollama Example
  • Reasoning with Large Language Models
    • A Simple Example
    • Key Features of Reasoning Models
    • A More Complex Example: City Traffic Planning
  • Using Property Graph Database with Ollama
    • Overview of Property Graphs
    • Example Using Ollama, LangChain, and the Kuzu Property Graph Database
    • Using LLMs to Create Graph Databases from Text Data
  • LangGraph
  • Using the Open Codex Command Line Interface Coding Agent
    • Example Use Cases
    • Open Codex Wrap Up
  • Long Term Persistence Using Mem0 and Chroma
    • Code Example Using Mem0 and Chroma
    • Example Output
  • Using Ollama Cloud Services
    • Ollama Cloud Services: Power and Knowledge on Demand
    • Augmenting Models with the Web Search API
    • Wrap-Up: A Unified Local and Cloud Strategy
  • Semantic Navigator App Using Gradio
    • Overview of the Semantic Web and Linked Data
    • Design Goals for the Semantic Navigator App
    • Implementation of the Semantic Navigator App Using Gradio
  • RAG Using zvec Vector Datastore and Local Model
    • Introduction and Architecture
    • Design Analysis: Dependency Minimization
    • Implementation Walkthrough
    • Example Run
    • Wrap Up for RAG Using zvec Vector Datastore and Local Model
  • Book Wrap Up
Overview

Use Ollama to run LLMs locally for privacy and control of your tech stack. We cover tool/function calling and agents in detail.

The book contains 21 chapters.