Using Local LLMs With Ollama

Running local models with Ollama offers several practical advantages for Common Lisp developers, especially those of us building exploratory or long-lived AI systems:

  • Local inference eliminates network latency and external API dependencies, which simplifies debugging, improves reproducibility, and enables fully offline workflows—important when iterating on symbolic/LLM hybrids or REPL-driven experiments.
  • Data never leaves the machine, providing strong privacy guarantees and avoiding compliance issues that can arise when sending prompts or intermediate representations to third-party services.
  • Cost and rate-limit concerns disappear: once a model is downloaded, usage is bounded only by local compute, making it ideal for background agents, continuous evaluation, or batch reasoning tasks initiated from Lisp.
  • Ollama’s simple HTTP interface fits naturally with Common Lisp’s strengths (process control, incremental development, and meta-programming), allowing developers to treat local language models as just another deterministic(ish) subsystem under their control.

The ollama package developed here provides code for basic generative AI text completions as well as for tool use (function calling), in the directory loving-common-lisp/src/ollama.

Design Notes (Optional Material)

Here we describe the design and architecture of the Ollama Common Lisp library, which provides an interface to the Ollama API for running local LLMs.

1. Common Utilities

The shared utilities are defined in ollama-helper.lisp and provide foundational functionality used by both basic completions and tool-calling APIs.

Configuration

  • *model-host* — The Ollama API endpoint URL, defaulting to http://localhost:11434/api/chat

JSON Handling

  • lisp-to-json-string — Converts Lisp data structures (alists) to JSON strings using cl-json
  • substitute-subseq — Replaces the first occurrence of a substring; used to work around cl-json’s encoding of nil as null (the Ollama API requires false for the stream parameter), as sketched below
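
For a concrete feel for that workaround, here is a quick REPL sketch (assuming the system developed later in this chapter is loaded; return values are shown in comments):

(ollama::lisp-to-json-string
 (list (cons :|model| "mistral:v0.3") (cons :|stream| nil)))
;; => "{\"model\":\"mistral:v0.3\",\"stream\":null}"   ; nil became null
(ollama::substitute-subseq
 "{\"model\":\"mistral:v0.3\",\"stream\":null}"
 ":null" ":false" :test #'string=)
;; => "{\"model\":\"mistral:v0.3\",\"stream\":false}"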

HTTP Communication

  • ollama-helper — Core request handler that:
    1. Executes curl commands via uiop:run-program
    2. Parses JSON responses
    3. Extracts message content and tool calls from the response structure
    4. Returns multiple values: (content function-calls)

Package Definition

The ollama package (defined in package.lisp) exports:

  • completions, completions-with-tools — Main API functions
  • summarize, answer-question — Convenience wrappers
  • *model-name*, *tool-model-name*, *model-host* — Configuration variables

2. Generative AI

Basic generative AI functionality is provided in ollama.lisp for simple text completions without tool calling.

Configuration

  • *model-name* — Model identifier, defaulting to "mistral:v0.3"

Core Functions

  • completions — Sends a user prompt to the LLM and returns the text response
    • Constructs a message with role “user” and the provided content
    • Builds the request payload with model, stream (false), and messages
    • Uses ollama-helper to execute the request and extract content

Convenience Wrappers

  • summarize — Prepends “Summarize: ” to the input text and calls completions
  • answer-question — Formats input as a Q&A prompt and calls completions

Request Flow

User Text → Message Construction → JSON Encoding → curl Command → Ollama API → JSON Response → Content Extraction → Return String
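
For example, the commented-out test call in ollama.lisp exercises this entire flow:

(ollama:completions "Complete the following text: The President went to")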

3. Generative AI with Tools

Tool-calling (function calling) support is implemented in ollama-tools.lisp, enabling the LLM to invoke registered functions.

Configuration

  • *tool-model-name* — Model for tool calling, defaults to "qwen3:1.7b"
  • *available-functions* — Hash table storing registered tool functions

Data Structures

  • ollama-function — Struct containing:
    • name — Function identifier string
    • description — Human-readable description for the LLM
    • parameters — JSON Schema defining expected arguments
    • handler — Common Lisp function to invoke when called

Function Registration

  • register-tool-function — Registers a tool with the system
    • Parameters: name, description, parameters (JSON Schema), handler (Lisp function)
    • Stores an ollama-function struct in *available-functions*

Tool Execution

  • handle-tool-function-call — Processes an LLM tool call
    • Extracts function name and arguments from the response
    • Falls back to infer-function-name-from-args if the model returns an empty name
    • Looks up the registered handler and invokes it with the arguments
  • infer-function-name-from-args — Workaround for models that return empty function names
    • Inspects argument keys to determine which function was intended

Main API

  • completions-with-tools — Enhanced completion with tool support
    • Accepts prompt text and optional list of function names to enable
    • Builds tool definitions from registered functions
    • Sends request to Ollama with tools specification
    • Automatically invokes handlers when LLM returns tool calls

Built-in Tools

Two sample tools are pre-registered:

  1. get_weather — Returns mock weather data for a location
    • Parameters: location (string) — The city name
    • Returns: Formatted weather string
  2. calculate — Evaluates mathematical expressions
    • Parameters: expression (string) — Math expression like “2 + 2”
    • Uses Common Lisp’s eval to compute results

Tool Call Flow

User Prompt + Tool Names → Build Tool Definitions → JSON Request → Ollama API → Response with Tool Calls → Parse Function Call → Lookup Handler → Invoke with Arguments → Return Result

Example Usage

(ollama::completions-with-tools
  "What's the weather like in New York?"
  '("get_weather" "calculate"))
;; => "Weather in New York: Sunny, 72°F"

System Definition

The ASDF system (ollama.asd) loads components in dependency order:

  1. package — Package definition
  2. ollama-helper — Shared utilities
  3. ollama-tools — Tool-calling support
  4. ollama — Basic completions

Dependencies: uiop, cl-json
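
Assuming the src/ollama directory is visible to ASDF/Quicklisp (for example, via a symbolic link under ~/quicklisp/local-projects), loading and using the library looks like:

(ql:quickload :ollama)
(in-package :ollama)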

Implementation of Common Helper Code

The defpackage form for the #:ollama library establishes an isolated namespace for interacting with local Large Language Models. By inheriting functionality from #:cl, #:uiop, and #:cl-json, the package handles core logic, system-level process invocation, and the JSON-heavy communication required by the Ollama REST API. The exported symbols define a public interface, ranging from high-level text processing functions like summarize and answer-question to the configuration variables *model-name*, *tool-model-name*, and *model-host*.

Listing of package.lisp:

;;;; package.lisp

(defpackage #:ollama
  (:use #:cl #:uiop #:cl-json)
  (:export #:completions #:completions-with-tools
           #:summarize #:answer-question
           #:*model-name* #:*tool-model-name* #:*model-host*))

Listing of ollama.asd that defines a defsystem for this package:

;;;; ollama.asd

(asdf:defsystem #:ollama
  :description "Library for using the ollama APIs"
  :author "Mark Watson"
  :license "Apache 2"
  :depends-on (#:uiop #:cl-json)
  :components ((:file "package")
               (:file "ollama-helper")
               (:file "ollama-tools")
               (:file "ollama")))

The following implementation establishes a bridge between Common Lisp and the local Ollama API, providing the infrastructure for handling structured LLM interactions. Working inside the dedicated ollama package, the code sets a default local Ollama server host variable, creating a controlled environment for external communication. The utility functions here address two technical hurdles: converting Lisp data structures into JSON-compliant strings for API consumption, and a string substitution routine for patching the request payload. At the heart of this listing is a helper function that orchestrates a system-level curl call, captures the resulting output, and parses the returned JSON. It then traverses the response object to isolate the model’s textual content and any tool calls, returning both in a form that Lisp can easily manipulate for downstream logic.

Listing of ollama-helper.lisp:

(in-package #:ollama)

(defvar *model-host* "http://localhost:11434/api/chat")

(defun lisp-to-json-string (data)
  (with-output-to-string (s)
    (json:encode-json data s)))

(defun substitute-subseq (string old new &key (test #'eql))
  "Replace the first occurrence of OLD in STRING with NEW."
  (let ((pos (search old string :test test)))
    (if pos
        (concatenate 'string
                     (subseq string 0 pos)
                     new
                     (subseq string (+ pos (length old))))
        string)))

(defun ollama-helper (curl-command)
  (princ curl-command)
  (terpri)
  (handler-case
      (let ((response
             (uiop:run-program
              curl-command
              :output :string
              :error-output :string)))
        (princ "Raw response: ")
        (princ response)
        (terpri)
        (with-input-from-string
            (s response)
          (let* ((json-as-list (json:decode-json s))
                 (message (cdr (assoc :message json-as-list)))
                 (content (cdr (assoc :content message)))
                 ;; cl-json decodes the JSON key "tool_calls" as :tool--calls
                 (tool-calls (cdr (assoc :tool--calls message)))
                 ;; Extract function details from each tool call
                 (function-calls (mapcar (lambda (tc)
                                           (cdr (assoc :function tc)))
                                         tool-calls)))
            (values content function-calls))))
    (error (e)
      (format t "Error executing curl command: ~a~%" e)
      nil)))

The code begins by setting up the environment with a global variable for the Ollama endpoint and helper functions for data transformation. The function lisp-to-json-string leverages the cl-json library to serialize data, while substitute-subseq replaces the first occurrence of a substring, which is all the JSON patching the request payloads require. These utilities ensure that the data sent to the model is formatted exactly as the API expects.

The core logic resides in ollama-helper, which uses uiop:run-program to execute a shell command and capture its output. The function is designed with error handling to manage potential connectivity or execution failures gracefully. Once a response is received, it decodes the JSON and performs an association list lookup to extract both the natural language message and any structured function calls, returning them as multiple values for the caller to process.
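
Here is a minimal sketch of consuming those multiple values; the curl command is hand-written for illustration (in normal use, completions and completions-with-tools construct it):

(multiple-value-bind (content tool-calls)
    (ollama::ollama-helper
     "curl http://localhost:11434/api/chat -d '{\"model\": \"mistral:v0.3\", \"stream\": false, \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}'")
  (if tool-calls
      (format t "Tool call requested: ~a~%" (first tool-calls))
      content))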

Implementation of Generative AI Functionality

In this section, we examine a practical implementation of a Common Lisp client designed to interface with the Ollama local LLM inference service. The code defines a workflow for sending synchronous requests to a Large Language Model (LLM) by wrapping the system’s curl utility to communicate with the Ollama API. By utilizing the mistral:v0.3 model as a default, the program demonstrates how to structure Lisp data, specifically association lists, into the JSON format required by the endpoint. It includes a specific handling mechanism for boolean conversion, ensuring that Lisp’s nil is correctly interpreted as a JSON false to disable streaming. Beyond the core transport logic, the listing provides high-level abstractions for common natural language processing tasks, such as summarization and question answering, illustrating how simple string concatenation can be used to format prompts that guide the model toward specific generative behaviors.

Listing of ollama.lisp:

(in-package #:ollama)

;;; Basic Ollama completions without tool calling support
;;; For tool calling, see ollama-tools.lisp

(defvar *model-name* "mistral:v0.3")

(defun completions (starter-text)
  "Simple completion without function/tool calling support."
  (let* ((message (list (cons :|role| "user")
                        (cons :|content| starter-text)))
         (data (list (cons :|model| *model-name*)
                     (cons :|stream| nil)
                     (cons :|messages| (list message))))
         (json-data (lisp-to-json-string data))
         ;; Hack: cl-json encodes nil as null, but we need false for stream
         (fixed-json-data (substitute-subseq json-data ":null" ":false" :test #'string=))
         (curl-command
          (format nil "curl ~a -d ~s"
                  ollama::*model-host*
                  fixed-json-data)))
    (multiple-value-bind (content function-call)
        (ollama-helper curl-command)
      (declare (ignore function-call))
      (or content "No response content"))))

;;(ollama:completions "Complete the following text: The President went to")

;; Helper functions for summarization and question answering
(defun summarize (some-text)
  (completions (concatenate 'string "Summarize: " some-text)))

(defun answer-question (some-text)
  (completions (concatenate 'string "
Q: " some-text "
A:")))

The core of this implementation lies in the completions function, which manages the transformation of Lisp structures into a command-line request. A notable detail is the manual string substitution used on the JSON payload; since many Common Lisp JSON libraries represent nil as null, the code explicitly replaces these occurrences with false to satisfy the Ollama API’s requirement for the stream parameter. This ensures the function waits for a complete response rather than processing a continuous stream of tokens, simplifying the return value for the caller.

The program also showcases the extensibility of the base completion logic through the summarize and answer-question helper functions. These functions act as specialized wrappers that prepend task-specific instructions to the user input, effectively demonstrating “prompt engineering” within a programmatic context. By delegating the heavy lifting to the ollama-helper and the external curl command, the code remains focused on message preparation and providing a clean, functional interface for Lisp-based AI applications.
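
For example (model responses will of course vary from run to run):

(ollama:completions "Complete the following text: The President went to")
(ollama:summarize "Jupiter is the fifth planet from the Sun and the largest planet in the Solar System.")
(ollama:answer-question "What is the capital of France?")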

Implementation of Tool Use/Function Calling Generative AI Functionality

The following listing shows an experimental implementation of tool calling (also known as function calling) within a Common Lisp environment using the Ollama API. By defining a custom ollama-function structure and a global registry via a hash table, the code allows developers to map Large Language Model (LLM) tool requests directly to native Lisp handlers. The primary entry point, completions-with-tools, handles the task of serializing Lisp data structures into the specific JSON format required by Ollama’s /api/chat endpoint, including a necessary workaround for JSON boolean representation. Furthermore, the implementation includes a defensive “inference” mechanism to recover function names from argument keys if the model returns an incomplete response, ensuring that calls to registered tools like get_weather or calculate are dispatched correctly even when the model’s output is slightly malformed.

Listing of ollama-tools.lisp:

(in-package #:ollama)

;;; Ollama completions with tool/function calling support
;;; Uses shared utilities from ollama-helper.lisp

(defvar *tool-model-name* "qwen3:1.7b")

(defvar *available-functions* (make-hash-table :test 'equal))

(defstruct ollama-function
  name
  description
  parameters
  handler)  ;; Common Lisp function to handle the call

(defun register-tool-function (name description parameters handler)
  "Register a function that can be called by the LLM via tool calling.
   HANDLER is a Common Lisp function that takes an alist of arguments."
  (setf (gethash name *available-functions*)
        (make-ollama-function
         :name name
         :description description
         :parameters parameters
         :handler handler)))

(defun infer-function-name-from-args (args)
  "Infer the function name based on argument keys
   (workaround for models that return empty name)."
  (let ((arg-keys (mapcar #'car args)))
    (cond
      ((member :location arg-keys) "get_weather")
      ((member :expression arg-keys) "calculate")
      (t nil))))

(defun handle-tool-function-call (function-call)
  "Handle a function call returned from the LLM
   by invoking the registered handler."
  (format t "~%DEBUG handle-tool-function-call: ~a~%" function-call)
  (let* ((raw-name (cdr (assoc :name function-call)))
         (args (cdr (assoc :arguments function-call)))
         ;; If name is empty, try to infer from arguments
         (name (if (or (null raw-name) (string= raw-name ""))
                   (infer-function-name-from-args args)
                   raw-name))
         (func (gethash name *available-functions*)))
    (format t "DEBUG raw-name=~a inferred-name=~a args=~a func=~a~%"
            raw-name name args func)
    (if func
        (let ((handler (ollama-function-handler func)))
          (if handler
              (funcall handler args)
              (format nil
                      "No handler for function ~a, args: ~a" name args)))
        (error "Unknown function: ~a" name))))

(defun completions-with-tools (starter-text &optional functions)
  "Completion with function/tool calling support.
   STARTER-TEXT is the prompt to send to the LLM.
   FUNCTIONS is an optional list of registered function names
   to make available."
  (let* ((function-defs
           (when functions
             (mapcar
              (lambda (f)
                (let ((func (gethash f *available-functions*)))
                  (list
                   (cons :|name| (ollama-function-name func))
                   (cons :|description|
                         (ollama-function-description func))
                   (cons :|parameters|
                         (ollama-function-parameters func)))))
              functions)))
         (message (list (cons :|role| "user")
                        (cons :|content| starter-text)))
         (base-data (list (cons :|model| *tool-model-name*)
                          (cons :|stream| nil)
                          (cons :|messages| (list message))))
         (data (if function-defs
                   (append base-data
                           (list (cons :|tools| function-defs)))
                   base-data))
         (json-data (lisp-to-json-string data))
         ;; Hack: cl-json encodes nil as null, but we need false
         (fixed-json-data
           (substitute-subseq json-data ":null" ":false"
                              :test #'string=))
         (curl-command
           (format nil "curl ~a -d ~s"
                   ollama::*model-host*
                   fixed-json-data)))
    (multiple-value-bind (content function-call)
        (ollama-helper curl-command)
      (if function-call
          (handle-tool-function-call (car function-call))
          (or content "No response content")))))

;; Define handler functions

(defun get_weather (args)
  "Handler for get_weather tool. ARGS is an alist with :location key."
  (format t "get_weather called with args: ~a~%" args)
  (let ((location (cdr (assoc :location args))))
    (format nil "Weather in ~a: Sunny, 72°F" (or location "Unknown"))))

(defun calculate (args)
  "Handler for calculate tool. ARGS is an alist with :expression key."
  (let ((expression (cdr (assoc :expression args))))
    (if expression
        (handler-case
            (format nil "Result: ~a"
                    (eval (read-from-string expression)))
          (error (e) (format nil "Error calculating: ~a" e)))
        "No expression provided")))

;; Register sample functions with handlers
(register-tool-function
 "get_weather"
 "Get current weather for a location"
 (list (cons :|type| "object")
       (cons :|properties|
             (list (cons :|location|
                         (list (cons :|type| "string")
                               (cons :|description|
                                     "The city name")))))
       (cons :|required| '("location")))
 #'get_weather)

(register-tool-function
 "calculate"
 "Perform a mathematical calculation"
 (list (cons :|type| "object")
       (cons :|properties|
             (list (cons :|expression|
                         (list (cons :|type| "string")
                               (cons :|description|
                                     "Math expression like 2 + 2")))))
       (cons :|required| '("expression")))
 #'calculate)

The core of this system lies in the decoupling of tool definitions from their execution logic. By using the register-tool-function routine, you can define the JSON Schema for a tool, specifying required parameters and types, while simultaneously binding it to a specific Lisp function. This allows the handle-tool-function-call dispatcher to act as a bridge, looking up the appropriate handler in the *available-functions* hash table and executing it with the arguments extracted from the LLM’s response.
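
As an illustration of this decoupling, here is a hypothetical get_time tool (not part of the library) registered exactly like the built-ins. Note that infer-function-name-from-args would also need a clause for :timezone before models that return empty function names could dispatch to it:

;; Hypothetical example tool, not part of ollama-tools.lisp:
(ollama::register-tool-function
 "get_time"
 "Get the current local time"
 (list (cons :|type| "object")
       (cons :|properties|
             (list (cons :|timezone|
                         (list (cons :|type| "string")
                               (cons :|description|
                                     "IANA time zone name, e.g. America/New_York")))))
       (cons :|required| '("timezone")))
 (lambda (args)
   (let ((tz (cdr (assoc :timezone args))))
     (multiple-value-bind (second minute hour)
         (get-decoded-time)
       (format nil "Time (~a): ~2,'0d:~2,'0d:~2,'0d"
               (or tz "local") hour minute second)))))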

One particularly noteworthy aspect of this implementation is its handling of Lisp’s unique syntax and data types during the JSON conversion process. Because standard Lisp libraries often encode nil as null, the code performs a string substitution to ensure the API receives false, which is mandatory for the stream parameter in the Ollama schema. Additionally, the calculate tool demonstrates the power of this integration by using read-from-string and eval, allowing the LLM to execute dynamic mathematical expressions directly within a Common Lisp program. Note that read-from-string reads a single Lisp form, so the expression works best in prefix syntax such as “(+ 2 2)”; an infix string like “2 + 2” is read as just the number 2.
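
You can also exercise a handler directly at the REPL, bypassing the LLM entirely:

(ollama::calculate '((:expression . "(+ 2 2)")))
;; => "Result: 4"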

Sample output (I include a lot of debug printout):

* (ql:quickload :ollama)
To load "ollama":
  Load 1 ASDF system:
    ollama
; Loading "ollama"
[package ollama].

use:

(in-package :ollama)
nil
* (ollama::completions-with-tools "Use the get_weather tool for: What's the weather like in New York?" '("get_weather" "calculate"))
curl http://localhost:11434/api/chat -d "{\"model\":\"qwen3:1.7b\",\"stream\":false,\"messages\":[{\"role\":\"user\",\"content\":\"Use the get_weather tool for: What's the weather like in New York?\"}],\"tools\":[{\"name\":\"get_weather\",\"description\":\"Get current weather for a location\",\"parameters\":{\"type\":\"object\",\"properties\":{\"location\":{\"type\":\"string\",\"description\":\"The city name\"}},\"required\":[\"location\"]}},{\"name\":\"calculate\",\"description\":\"Perform a mathematical calculation\",\"parameters\":{\"type\":\"object\",\"properties\":{\"expression\":{\"type\":\"string\",\"description\":\"Math expression like 2 + 2\"}},\"required\":[\"expression\"]}}]}"
Raw response: {"model":"qwen3:1.7b","created_at":"2025-12-28T18:01:28.789187Z","message":{"role":"assistant","content":"","thinking":"Okay, the user is asking about the weather in New York. Let me check the available tools. There's a get_weather tool, which I think is meant to fetch weather information. The function name is probably \"get_weather\" and it takes a parameter, maybe the location. The user specified \"New York,\" so I need to call the get_weather function with \"New York\" as the argument. Let me make sure the parameters are correct. The tool's parameters are described as having a type \"properties\" but no specific details. Since the user provided the location, I'll pass that directly. I should structure the tool call with the name and arguments as a JSON object. Alright, that's it. Just call get_weather with \"New York\" as the argument.\n","tool_calls":[{"id":"call_fadz9if9","function":{"index":0,"name":"","arguments":{"location":"New York"}}}]},"done":true,"done_reason":"stop","total_duration":2964277542,"load_duration":79982833,"prompt_eval_count":150,"prompt_eval_duration":111485791,"eval_count":181,"eval_duration":2745854365}

DEBUG handle-tool-function-call: ((index . 0) (name . )
                                  (arguments (location . New York)))
DEBUG raw-name= inferred-name=get_weather args=((location . New York)) func=#S(ollama-function
    :name get_weather
    :description Get current weather for a location
    :parameters ((type . object)
                 (properties (location (type . string)
                                       (description . The city name)))
                 (required location))
    :handler #<function ollama::get_weather>)
get_weather called with args: ((location . New York))
"Weather in New York: Sunny, 72°F"
*