Ollama Tools/Function Calling in Racket
One of the most powerful features of modern LLMs is their ability to call external functions (tools) during a conversation. This allows the model to perform actions beyond just generating text — it can fetch live data, interact with files, call APIs, and more.
Ollama supports tool/function calling through its chat API. When you provide a list of available tools with their schemas, the model can decide to call one or more tools, and your code executes them and returns the results back to the model.
The examples for this chapter are in the directory Racket-AI-book/source-code/ollama_tools.
How Tool Calling Works
The flow is:
- You define tools — functions with JSON schemas describing their parameters
- Send request to Ollama — include the tool definitions and user prompt
- Model decides — if it needs a tool, it returns a tool_calls array
- You execute the tool — call your Racket function with the arguments
- Return result — add the tool result to the message history
- Model responds — uses the tool output to generate its final answer
This creates a conversation loop in which the LLM can request information that is not contained in its training data.
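Concretely, a tool-calling exchange with Ollama's /api/chat endpoint looks roughly like the following sketch. The field names (tools, tool_calls, etc.) follow the Ollama chat API; the model name and weather tool shown here just mirror the examples later in this chapter:

```json
{
  "model": "qwen3:1.7b",
  "messages": [
    {"role": "user", "content": "What is the weather in London?"}
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "City name"}
          },
          "required": ["location"]
        }
      }
    }
  ],
  "stream": false
}
```

If the model decides it needs the tool, the response message carries a tool_calls array instead of a final answer:

```json
{
  "message": {
    "role": "assistant",
    "content": "",
    "tool_calls": [
      {"function": {"name": "get_weather", "arguments": {"location": "London"}}}
    ]
  }
}
```

Your code then runs the matching function, appends a message with "role": "tool" and the result as its content, and calls the API again so the model can write its final answer.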
A Racket Tools Library
The following code defines a reusable library for Ollama tool calling. It provides:
- A tool registry to register functions with their schemas
- Built-in tools for common operations (weather, files, Wikipedia)
- API communication to call Ollama and handle tool responses
This example demonstrates how to bridge the gap between Large Language Models and local system capabilities by implementing a tool-calling framework in Racket. The code provides a structured way to register Racket functions as “tools” that Ollama-hosted models can invoke to perform real-world tasks such as fetching live weather data, searching Wikipedia, or interacting with the local file system. By defining a clear registry system and using JSON schema for parameter validation, the module automates the complex loop of sending prompts to the LLM, parsing its request for a function call, executing the corresponding Racket code, and returning the results back to the model for a final synthesis. This pattern is essential for building “agentic” applications where the AI is not just a chatbot, but a functional interface capable of executing logic and retrieving dynamic data.
The following file tools.rkt contains both the library code for creating and using tools and several example tool implementations:
```racket
#lang racket

;;; Copyright (C) 2026 Mark Watson <markw@markwatson.com>
;;; Apache 2 License
;;;
;;; Ollama Tools/Function Calling Example for Racket
;;;
;;; This module demonstrates how to use Ollama's tool/function calling
;;; capability from Racket. It defines tools (functions) that the LLM
;;; can call, registers them, and handles the tool call flow.

(require net/http-easy)
(require json)
(require racket/date)

(provide register-tool
         get-tool
         call-ollama-with-tools
         get-current-datetime
         get-weather
         list-directory
         read-file-contents
         *available-tools*
         *ollama-host*
         *default-model*)

;;; -----------------------------------------------------------------------------
;;; Configuration

(define *default-model* (make-parameter (or (getenv "OLLAMA_MODEL") "qwen3:1.7b")))
(define *ollama-host* (make-parameter (or (getenv "OLLAMA_HOST") "http://localhost:11434")))

;;; -----------------------------------------------------------------------------
;;; Tool Registry

(define *available-tools* (make-hash))

(define (register-tool name description parameters handler)
  "Register a tool that can be called by the LLM.
   NAME: string - the tool name
   DESCRIPTION: string - what the tool does
   PARAMETERS: hash - JSON schema for parameters
   HANDLER: function - Racket function to execute the tool"
  (hash-set! *available-tools* name
             (hash 'name name
                   'description description
                   'parameters parameters
                   'handler handler)))

(define (get-tool name)
  "Get a registered tool by name."
  (hash-ref *available-tools* name #f))

;;; -----------------------------------------------------------------------------
;;; Tool Implementations

(define (get-current-datetime args)
  "Returns the current date and time as a string."
  ;; Note: date->string's optional second argument is a time? boolean; the
  ;; output format is controlled by the date-display-format parameter.
  (parameterize ([date-display-format 'iso-8601])
    (date->string (current-date) #t)))

(define (get-weather args)
  "Fetches current weather for a location using wttr.in.
   ARGS should contain 'location' key."
  (let ([location (hash-ref args 'location "unknown")])
    (with-handlers ([exn:fail? (lambda (e)
                                 (format "Error fetching weather: ~a"
                                         (exn-message e)))])
      (let* ([url (format "https://wttr.in/~a?format=3"
                          (string-replace location " " "+"))]
             [response (get url)]
             [body (response-body response)])
        (string-trim (bytes->string/utf-8 body))))))

(define (list-directory args)
  "Lists files in the current directory.
   ARGS: empty hash (no parameters needed)"
  (let ([files (directory-list (current-directory))])
    (format "Files in ~a: ~a"
            (current-directory)
            (string-join (map path->string files) ", "))))

(define (read-file-contents args)
  "Reads contents of a file.
   ARGS should contain 'file_path' key."
  (let ([file-path (hash-ref args 'file_path #f)])
    (if (and file-path (file-exists? file-path))
        (with-handlers ([exn:fail? (lambda (e)
                                     (format "Error reading file: ~a"
                                             (exn-message e)))])
          (file->string file-path))
        (format "File not found: ~a" file-path))))

(define (search-wikipedia args)
  "Searches Wikipedia for a query and returns summary.
   ARGS should contain 'query' key."
  (let ([query (hash-ref args 'query #f)])
    (if query
        (with-handlers ([exn:fail? (lambda (e)
                                     (format "Error searching Wikipedia: ~a"
                                             (exn-message e)))])
          (let* ([url (format "https://en.wikipedia.org/api/rest_v1/page/summary/~a"
                              (string-replace query " " "_"))]
                 [response (get url
                                #:headers (hash 'user-agent "RacketOllamaTools/1.0"))]
                 [data (response-json response)])
            (hash-ref data 'extract "No summary available")))
        "No query provided")))

;;; -----------------------------------------------------------------------------
;;; Register Default Tools

(register-tool
 "get_current_datetime"
 "Get the current date and time"
 (hash 'type "object"
       'properties (hash)
       'required '())
 get-current-datetime)

(register-tool
 "get_weather"
 "Get the current weather for a location"
 (hash 'type "object"
       'properties (hash 'location (hash 'type "string"
                                         'description "City name, e.g., 'London' or 'New York'"))
       'required '("location"))
 get-weather)

(register-tool
 "list_directory"
 "List files in the current directory"
 (hash 'type "object"
       'properties (hash)
       'required '())
 list-directory)

(register-tool
 "read_file_contents"
 "Read the contents of a file"
 (hash 'type "object"
       'properties (hash 'file_path (hash 'type "string"
                                          'description "Path to the file to read"))
       'required '("file_path"))
 read-file-contents)

(register-tool
 "search_wikipedia"
 "Search Wikipedia and return a summary"
 (hash 'type "object"
       'properties (hash 'query (hash 'type "string"
                                      'description "Search query"))
       'required '("query"))
 search-wikipedia)

;;; -----------------------------------------------------------------------------
;;; Ollama API Communication

(define (make-tool-schemas tool-names)
  "Build tool schemas for the Ollama API request."
  (for/list ([name tool-names])
    (let ([tool (get-tool name)])
      (if tool
          (hash 'type "function"
                'function (hash 'name (hash-ref tool 'name)
                                'description (hash-ref tool 'description)
                                'parameters (hash-ref tool 'parameters)))
          (error (format "Unknown tool: ~a" name))))))

(define (call-ollama-api messages tools)
  "Call the Ollama chat API with tools.
   MESSAGES: list of message hashes with 'role and 'content
   TOOLS: list of tool schemas"
  (let* ([data (hash 'model (*default-model*)
                     'messages messages
                     'tools tools
                     'stream #f)]
         [json-data (jsexpr->string data)]
         [response (post (string-append (*ollama-host*) "/api/chat")
                         #:data json-data
                         #:headers (hash 'content-type "application/json"))]
         [result (response-json response)])
    result))

(define (handle-tool-call tool-call)
  "Execute a tool call from the LLM response."
  (let* ([func (hash-ref tool-call 'function (hash))]
         [func-name (hash-ref func 'name #f)]
         [args-str (hash-ref func 'arguments "{}")]
         ;; Arguments may arrive as a JSON string or as an already-parsed hash.
         [args (if (string? args-str)
                   (string->jsexpr args-str)
                   args-str)]
         [tool (get-tool func-name)])
    (if tool
        (let ([handler (hash-ref tool 'handler #f)])
          (if handler
              (let ([result (handler args)])
                (hash 'role "tool"
                      'content result))
              (hash 'role "tool"
                    'content (format "No handler for tool: ~a" func-name))))
        (hash 'role "tool"
              'content (format "Unknown tool: ~a" func-name)))))

(define (call-ollama-with-tools prompt tool-names #:model [model (*default-model*)])
  "Call Ollama with tools and handle the tool calling loop.
   PROMPT: the user's prompt
   TOOL-NAMES: list of tool names to make available
   MODEL: optional model override

   Returns the final response text after any tool calls are processed."
  (parameterize ([*default-model* model])
    (let* ([tools (make-tool-schemas tool-names)]
           [messages (list (hash 'role "user" 'content prompt))])
      (let loop ([msgs messages]
                 [max-iterations 10])
        (if (<= max-iterations 0)
            "Max iterations reached"
            (let* ([response (call-ollama-api msgs tools)]
                   [message (hash-ref response 'message (hash))]
                   [tool-calls (hash-ref message 'tool_calls #f)])
              (if tool-calls
                  ;; Process tool calls and continue
                  (let* ([tool-results (for/list ([tc tool-calls])
                                         (handle-tool-call tc))]
                         [assistant-msg (hash 'role "assistant"
                                              'content (hash-ref message 'content "")
                                              'tool_calls tool-calls)]
                         [new-msgs (append msgs (list assistant-msg) tool-results)])
                    (loop new-msgs (- max-iterations 1)))
                  ;; No tool calls, return the content
                  (hash-ref message 'content "No response"))))))))

;;; -----------------------------------------------------------------------------
;;; Example Usage (commented out for library use)

#|
(require "tools.rkt")

;; Example 1: Get current date/time
(displayln (call-ollama-with-tools
            "What is the current date and time?"
            '("get_current_datetime")))

;; Example 2: Get weather
(displayln (call-ollama-with-tools
            "What is the weather in Phoenix Arizona?"
            '("get_weather")))

;; Example 3: Multiple tools available
(displayln (call-ollama-with-tools
            "Tell me about the Eiffel Tower"
            '("get_weather" "search_wikipedia" "get_current_datetime")))

;; Example 4: List files
(displayln (call-ollama-with-tools
            "What files are in the current directory?"
            '("list_directory")))
|#
```
This tool use implementation relies on a central registry, *available-tools*, which stores tool metadata and the associated handler functions. When a user sends a prompt, the call-ollama-with-tools function packages the available tool definitions into the format expected by the Ollama API. The model then decides whether to answer the query directly or request a tool execution. If the model returns a tool_calls array, the Racket handler dynamically dispatches the request to the local function, processes the output, and feeds it back into the conversation history.
A key technical highlight is the use of the net/http-easy and json libraries to manage the RESTful communication with the Ollama service. The named-let loop within call-ollama-with-tools (capped at ten iterations) ensures that the system can handle multi-step reasoning, where a model might need to call one tool to get a piece of information before calling another to complete the task. This structure lets developers expand the LLM's capabilities simply by registering new Racket functions with the registry.
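Extending the registry is just another call to register-tool. The sketch below is a condensed, self-contained version of the same registry/dispatch pattern so it can run on its own; the reverse_string tool is hypothetical, invented here for illustration. In a real program you would instead (require "tools.rkt") and call its register-tool with the same arguments:

```racket
#lang racket

;; Condensed registry/dispatch sketch mirroring tools.rkt.
(define *available-tools* (make-hash))

(define (register-tool name description parameters handler)
  (hash-set! *available-tools* name
             (hash 'name name
                   'description description
                   'parameters parameters
                   'handler handler)))

;; A hypothetical custom tool: reverse the characters of a string.
(define (reverse-string args)
  (list->string (reverse (string->list (hash-ref args 'text "")))))

(register-tool
 "reverse_string"
 "Reverse the characters of a string"
 (hash 'type "object"
       'properties (hash 'text (hash 'type "string"
                                     'description "Text to reverse"))
       'required '("text"))
 reverse-string)

;; Dispatch the way handle-tool-call would, given parsed tool_calls arguments:
(define tool (hash-ref *available-tools* "reverse_string"))
(displayln ((hash-ref tool 'handler) (hash 'text "Racket")))  ; prints tekcaR
```

Once registered, the tool's name can be passed in the tool-names list of call-ollama-with-tools and the model can invoke it like any of the built-ins.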
Complete Example Using the Tools Library and Example Tools
Here we use the example tools that were implemented in the file tools.rkt.
The file main.rkt in the ollama_tools directory provides an interactive menu for testing the tools:
```racket
#lang racket

;;; Copyright (C) 2026 Mark Watson <markw@markwatson.com>
;;; Apache 2 License
;;;
;;; Ollama Tools Example - Interactive Demo
;;;
;;; Run with: racket main.rkt

(require "tools.rkt")

(define (display-menu)
  (displayln "\n=== Ollama Tools Demo ===")
  (displayln "1. Get current date and time")
  (displayln "2. Get weather for a location")
  (displayln "3. List files in current directory")
  (displayln "4. Read a file")
  (displayln "5. Search Wikipedia")
  (displayln "6. Custom prompt (all tools available)")
  (displayln "7. Exit")
  (display "Select option: "))

(define (run-demo)
  (displayln (format "Using model: ~a" (*default-model*)))
  (displayln (format "Ollama host: ~a" (*ollama-host*)))
  (displayln "Make sure Ollama is running and the model is pulled.")
  (newline)

  (let loop ()
    (display-menu)
    (let ([choice (read-line)])
      (cond
        [(string=? choice "1")
         (displayln "\n>>> Calling get_current_datetime...")
         (displayln (call-ollama-with-tools
                     "What is the current date and time?"
                     '("get_current_datetime")))
         (loop)]

        [(string=? choice "2")
         (display "Enter location: ")
         (let ([location (read-line)])
           (displayln (format "\n>>> Getting weather for ~a..." location))
           (displayln (call-ollama-with-tools
                       (format "What is the weather in ~a?" location)
                       '("get_weather"))))
         (loop)]

        [(string=? choice "3")
         (displayln "\n>>> Listing directory...")
         (displayln (call-ollama-with-tools
                     "What files are in the current directory?"
                     '("list_directory")))
         (loop)]

        [(string=? choice "4")
         (display "Enter file path: ")
         (let ([filepath (read-line)])
           (displayln (format "\n>>> Reading ~a..." filepath))
           (displayln (call-ollama-with-tools
                       (format "Read the contents of ~a and summarize it" filepath)
                       '("read_file_contents"))))
         (loop)]

        [(string=? choice "5")
         (display "Enter search query: ")
         (let ([query (read-line)])
           (displayln (format "\n>>> Searching Wikipedia for ~a..." query))
           (displayln (call-ollama-with-tools
                       (format "Tell me about ~a" query)
                       '("search_wikipedia"))))
         (loop)]

        [(string=? choice "6")
         (display "Enter your prompt: ")
         (let ([prompt (read-line)])
           (displayln "\n>>> Processing with all tools...")
           (displayln (call-ollama-with-tools
                       prompt
                       '("get_current_datetime" "get_weather"
                         "list_directory" "read_file_contents"
                         "search_wikipedia"))))
         (loop)]

        [(string=? choice "7")
         (displayln "Goodbye!")]

        [else
         (displayln "Invalid choice, try again.")
         (loop)]))))

(run-demo)
```
Here is some example output:
```
$ racket main.rkt
Using model: qwen3:1.7b
Ollama host: http://localhost:11434
Make sure Ollama is running and the model is pulled.

=== Ollama Tools Demo ===
1. Get current date and time
2. Get weather for a location
3. List files in current directory
4. Read a file
5. Search Wikipedia
6. Custom prompt (all tools available)
7. Exit
Select option: 1

>>> Calling get_current_datetime...
The current date and time is **Wednesday, April 8th, 2026 11:28:40am**.

=== Ollama Tools Demo ===
1. Get current date and time
2. Get weather for a location
3. List files in current directory
4. Read a file
5. Search Wikipedia
6. Custom prompt (all tools available)
7. Exit
Select option: 3

>>> Listing directory...
The current directory contains the following files:

- `README.md`
- `compiled`
- `main.rkt`
- `main.rkt~` (modified)
- `tools.rkt`
- `tools.rkt~` (modified)

These files are located in the directory `/Users/markwatson/GITHUB/Racket-AI-book/source-code/ollama_tools/`. The ~ symbols indicate modified files.

=== Ollama Tools Demo ===
1. Get current date and time
2. Get weather for a location
3. List files in current directory
4. Read a file
5. Search Wikipedia
6. Custom prompt (all tools available)
7. Exit
Select option: 5
Enter search query: Flagstaff Arizona

>>> Searching Wikipedia for Flagstaff Arizona...
Flagstaff, Arizona, is a city located in the Phoenix metropolitan area, known for its scenic beauty, historical landmarks, and outdoor activities. It is part of the Grand Canyon Railway system and is home to the Grand Canyon Railway Museum. The city also features the historic Flagstaff Historical Society and the Flagstaff Art Center. Flagstaff is situated near the Colorado River and is a popular destination for outdoor recreation, including hiking, camping, and visiting the Grand Canyon. While specific Wikipedia summaries may not be available, Flagstaff is recognized for its natural beauty, cultural heritage, and community spirit.

=== Ollama Tools Demo ===
1. Get current date and time
2. Get weather for a location
3. List files in current directory
4. Read a file
5. Search Wikipedia
6. Custom prompt (all tools available)
7. Exit
Select option:
```
Summary
Tool calling transforms LLMs from passive text generators into active agents that can:
- Access live data — weather, news, stock prices
- Interact with the system — read/write files, run commands
- Call external APIs — databases, web services
- Chain operations — multiple tools in sequence
This is foundational for building AI agents and assistants. In the next chapter on agents, we’ll see how tools enable more complex autonomous behavior.