Agents Using the Agno Agent Framework Running On a Local Ollama Model
The example in this chapter uses a local LLM running on Ollama. The examples for this chapter are found in the directory agents_agno. If you skipped the previous chapter, please review its opening material on running the Ollama service.
An Agent For Answering Questions About A Specific Web Site
Here we construct a sophisticated web scraping agent using the agno library. This program defines a specialized tool, scrape-website-content, which leverages the requests and BeautifulSoup libraries to fetch and parse the textual content from a given URL, stripping away common non-content elements such as navigation bars and scripts. The tool is then handed to an Agent powered by a local Ollama model. The agent is configured with a detailed description, a step-by-step instruction set, and a defined output format, guiding it to first scrape a user-provided URL and then answer a specific question based only on the extracted information, ensuring a focused and verifiable response.
(import textwrap [dedent])
(import os requests)
(import bs4 [BeautifulSoup])
(import agno.agent [Agent])
(import agno.models.ollama [Ollama])
(import agno.tools [tool])

(tool ;; in Python this would be a @tool decorator
 (defn scrape-website-content [url]
  "Fetches and extracts the clean, textual content from a given webpage URL.
  Use this tool when you need to read the contents of a specific web page to
  answer a question.

  Args:
      url (str): The full, valid URL of the webpage to be scraped
                 (e.g., 'https://example.com').

  Returns:
      str: The extracted text content of the webpage.
  "
  (try
    ;; Set a User-Agent header to mimic a real browser.
    (let
      [headers
       {"User-Agent" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36\
 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36"}
       response (requests.get url :headers headers :timeout 10)]
      (.raise_for_status response)
      (let [soup (BeautifulSoup response.text "html.parser")]
        ;; Remove tags that hold boilerplate rather than content.
        (for [tag (soup ["script" "style" "nav" "footer" "aside"])]
          (.decompose tag))
        (let [text (.get_text soup :separator "\n" :strip True)]
          (if (not text)
              f"Connected to {url}, but no text content could be extracted."
              f"Successfully scraped content from {url}:\n\n{text}"))))
    ;; Return errors as text so the agent can report the failure.
    (except [e Exception]
      f"Error scraping {url}: {e}"))))
37
;; Initialize the web scraping and analysis agent
(setv scraper-agent (Agent
  :model (Ollama :id "qwen3:30b")
  :tools [scrape-website-content]
  :description (dedent
    "You are an expert web scraping and analysis agent. You follow a strict process:

     - Given a URL in a prompt, you will first use the appropriate tool to scrape
       its content.
     - You will then carefully read the scraped content to understand it thoroughly.
     - Finally, you will answer the user's question based *only* on the information
       contained within that specific URL's content.")

  ;; The instructions are refined to provide a clear, step-by-step reasoning process.
  :instructions (dedent
    "1. Scrape Phase
        - Analyze the user's prompt to identify the target URL.
        - Invoke the `scrape_website_content` tool with the identified URL.

     2. Analysis Phase
        - Carefully read the entire content returned by the tool.
        - Systematically extract the specific information required to answer the
          user's question.

     3. Answering Phase
        - Formulate a concise and accurate answer based exclusively on the scraped
          information.
        - If the information is not present, state that clearly.

     4. Quality Control
        - Reread the original query and your answer to ensure it is accurate
          and relevant.")

  :expected_output (dedent
    "# {Answer based on website content}

     **Source:** {URL provided by the user}")

  :markdown True
  :show_tool_calls True
  :add_datetime_to_instructions True))

;; Main execution block
(when (= __name__ "__main__")
  (setv prompt "Using the web site https://markwatson.com Consultant Mark Watson has\
 written Common Lisp, semantic web, Clojure, Java, and AI books. What musical instru\
ments does he play?")

  (.print-response scraper-agent
                   prompt
                   :stream True))
This code is divided into two main parts: the tool definition and the agent configuration.
The first part is the definition of the scrape-website-content function, which acts as the agent's primary capability. It takes a URL, uses the requests library to perform an HTTP GET request (while mimicking a browser's User-Agent header to improve compatibility), and then processes the resulting HTML with BeautifulSoup. Critically, it removes tags like <script>, <style>, <nav>, and <footer> that typically contain boilerplate or non-essential content. This cleaning step is vital: it provides the language model with a concise and relevant block of text, free from the noise of web page structure and styling, allowing it to focus on the core information needed to answer the user's query.
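The same tag-stripping idea can be illustrated without any third-party dependencies. The sketch below uses only Python's standard-library html.parser instead of BeautifulSoup; the class name TextExtractor and the SKIP_TAGS set are my own illustrative names, not part of the chapter's code, but the set of tags dropped mirrors the decompose() calls in the Hy tool.

```python
from html.parser import HTMLParser

# Tags whose contents are boilerplate and should be dropped,
# mirroring the decompose() calls in the Hy tool above.
SKIP_TAGS = {"script", "style", "nav", "footer", "aside"}

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0      # > 0 while inside a skipped tag
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in SKIP_TAGS:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in SKIP_TAGS and self.depth > 0:
            self.depth -= 1

    def handle_data(self, data):
        # Keep text only when we are outside every skipped tag.
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

def extract_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.chunks)

html = ("<html><body><nav>Menu</nav>"
        "<p>Mark plays guitar.</p>"
        "<script>x = 1</script></body></html>")
print(extract_text(html))  # -> Mark plays guitar.
```

The navigation text and the script body are discarded; only the paragraph text survives, which is exactly the signal-versus-noise trade the chapter's tool makes before handing text to the model.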
The second part initializes the Agent from the agno library. This is where the AI's behavior is defined. It's configured to use a specific Ollama model and is given access to the scrape-website-content tool we defined. The description and instructions parameters are crucial; they act as a system prompt that programs the agent's workflow, forcing it into a strict sequence of scraping, analyzing, and then answering. By specifying expected_output, we enforce a consistent structure on the agent's final response. The main execution block demonstrates a practical example, asking the agent to find information about musical instruments from a specific website, which triggers the entire scrape-and-answer process.
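For readers more at home in Python than Hy, the listing above transliterates roughly as follows. This is an untested configuration sketch, not a second reference implementation: it assumes agno, requests, and bs4 are installed, that the qwen3:30b model has been pulled, and that an Ollama server is running locally, and the description/instructions strings are abbreviated here for space.

```python
from agno.agent import Agent
from agno.models.ollama import Ollama
from agno.tools import tool

@tool
def scrape_website_content(url: str) -> str:
    """Fetch a page and return its text (full version in the Hy listing)."""
    import requests
    from bs4 import BeautifulSoup
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    for t in soup(["script", "style", "nav", "footer", "aside"]):
        t.decompose()   # drop boilerplate tags, as in the Hy tool
    return soup.get_text(separator="\n", strip=True)

scraper_agent = Agent(
    model=Ollama(id="qwen3:30b"),
    tools=[scrape_website_content],
    description="You are an expert web scraping and analysis agent. ...",
    instructions="1. Scrape the URL in the prompt. 2. Read the result. "
                 "3. Answer using only that content. ...",
    markdown=True,
    show_tool_calls=True,
    add_datetime_to_instructions=True,
)

if __name__ == "__main__":
    scraper_agent.print_response(
        "Using the web site https://markwatson.com ... "
        "What musical instruments does he play?",
        stream=True)
```

Note how the Hy `(tool (defn ...))` form corresponds to Python's `@tool` decorator, and kebab-case Hy names such as scrape-website-content become snake_case in the Python tool registry.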
Note: The Agno framework prints colored bounding boxes, drawn with Unicode box-drawing characters, around blocks of output text; the colors are lost in the following listing.
$ uv run hy web_site_qa.hy
┏━ Message ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃                                                                            ┃
┃ Using the web site https://markwatson.com Consultant Mark Watson has       ┃
┃ written Common Lisp, semantic web, Clojure, Java, and AI books. What       ┃
┃ musical instruments does he play?                                          ┃
┃                                                                            ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
┏━ Tool Calls ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃                                                                            ┃
┃ • scrape_website_content(url=https://markwatson.com)                       ┃
┃                                                                            ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
┏━ Response (11.8s) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃                                                                            ┃
┃ Mark Watson plays the guitar, didgeridoo, and American Indian flute.       ┃
┃                                                                            ┃
┃ Source: https://markwatson.com                                             ┃
┃                                                                            ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
Wrap Up for Agno Agent Example
The Python source code repository for Agno is found here: https://github.com/agno-agi/agno.
Documentation is found here: https://docs.agno.com/introduction.
There were a few Hy-specific nuances to using Agno with the Hy language. Hopefully, dear reader, this example serves as a good starting point for writing your own agents in Hy.