Multi-prompt Search using LLMs, the DuckDuckGo Search API, and Local Ollama Models
The short example we develop is inspired by commercial LLM apps like Perplexity. I subscribe to the Perplexity Pro plan and find it useful, so I wanted to implement my own simple, minimalist Python library that makes the same kind of multiple LLM passes over a query and its search results, finally producing a relevant summary.
We will start by looking at example uses of this library and then, dear reader, you can decide if you want to hack on this example code and make it your own.
The example code uses three simple prompt templates: one to filter out non-useful search results, one to summarize the text fetched from each search result web link, and one to write a final summary:
1 prompt1 = "return concisely either 'Y' or 'N' if this query | %s | is matched well b\
2 y the following text: %s"
3 prompt2 = "Using the query | %s | summarize the following text including only materi\
4 al relevant to the query:\n%s"
5 prompt3 = "Using the query | %s | summarize in multiple paragraphs the following tex\
6 t including only material relevant to the query:\n%s"
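As a quick illustration of how the %s placeholders are filled in at run time, here is a small sketch using standard Python %-formatting; the query is taken from Example 2 below, but the snippet text is a made-up stand-in for a search result description:

query = "Common Lisp and Deep Learning consultant"
snippet = "Mark Watson is a consultant specializing in Common Lisp and deep learning."

# Python %-style formatting substitutes the query and the snippet into prompt1:
p = prompt1 % (query, snippet)
print(p)
# return concisely either 'Y' or 'N' if this query | Common Lisp and Deep Learning
# consultant | is matched well by the following text: Mark Watson is a consultant
# specializing in Common Lisp and deep learning.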
Example 1: “Write a business plan for a new startup using LLMs and expertise in medical billing.”
The example code prints a lot of debug output, so here we only look at the final output summary:
Title: Business Plan for Mercury Concierge Medical Practice Startup

The business plan for the new startup, Mercury Concierge Medical Practice, is designed to transition an existing medical practice to a concierge model, leveraging the expertise of the Mercury Advisory Group. The primary objective is to provide personalized and comprehensive healthcare services while ensuring financial sustainability.

To achieve this, the startup will develop a mission statement, explore strategic options, and create a budget. Key areas of focus include market entry strategy, operations plan, HIM technology requirements, revenue management strategy, organizational design, goal setting, and business unit strategies.

Mercury Concierge Medical Practice offers an initial flat-fee consultation service priced at $3500. This service includes a full day of private coaching, Q&A sessions, travel costs, a workbook, and a signed copy of the Handbook of Concierge Medical Practice Design.

Beyond the initial consultation, Mercury Advisory Group provides additional services such as conducting SWOT (Strengths, Weaknesses, Opportunities, Threats), PEST (Political, Economic, Social, Technological) analyses, and Porter's Five Forces assessments to develop a business sustainability strategy. They also assist with the development of a lean/continuous improvement strategy, staffing plan, promotional strategy, and operations plan.

The startup caters to various situations, including physicians who wish to jettison non-converting patients, those starting a new practice from scratch, or those transitioning while bound by contracts with health plans. The service is currently available in the continental USA, with additional travel surcharges for Alaska, Guam, Puerto Rico, Hawaii, and USVI.

In summary, Mercury Concierge Medical Practice aims to provide high-quality, personalized healthcare services through a concierge model. By leveraging the expertise of the Mercury Advisory Group, the startup will create a sustainable business strategy, develop an effective operations plan, and offer valuable consulting services to medical professionals.
Example 2: “Common Lisp and Deep Learning consultant”
Here we only look at the final output summary:
Mark Watson is a professional deep learning and artificial intelligence consultant who specializes in various programming languages such as Common Lisp, Clojure, Python, Java, Haskell, and Ruby. He has authored over 20 books on AI, deep learning, and other related topics, with his clients including notable companies like Google, Capital One, Disney, and Olive AI.

For those aiming to start with AI, Mark recommends several free courses: Generative AI by Google, AI for Beginners by Microsoft, and Artificial Intelligence (6.034) by MIT. These courses cover topics such as generative AI, neural networks, deep learning, computer vision, natural language processing, and more. Each course is text-based and offers exercises to help solidify understanding.
Example Code for Multi-prompt Search using LLMs, the DuckDuckGo Search API, and Local Ollama Models
The example script uses LangChain’s Ollama interface, the Duckduckgo class from the ddg library for accessing DuckDuckGo’s internal quick results data (for low-bandwidth, non-commercial use), and the Trafilatura library for fetching plain text from a web URI (a minimal standalone Trafilatura sketch follows the listing):
from ddg import Duckduckgo
from langchain_community.llms.ollama import Ollama

# pip install llama-index html2text trafilatura
import trafilatura
from pprint import pprint

ddg_api = Duckduckgo()

llm = Ollama(
    model="mistral:v0.3",
    verbose=False,
)

prompt1 = "return concisely either 'Y' or 'N' if this query | %s | is matched well by the following text: %s"
prompt2 = "Using the query | %s | summarize the following text including only material relevant to the query:\n%s"
prompt3 = "Using the query | %s | summarize in multiple paragraphs the following text including only material relevant to the query:\n%s"

def llm_search(query):
    # Get DuckDuckGo search results for the query.
    results = ddg_api.search(query)
    data = results['data']
    good_results = []
    good_summaries = []
    for d in data:
        # Ask the LLM whether this search result description is relevant (prompt1).
        description = d['description']
        p = prompt1 % (query, description)
        s = llm.invoke(p)
        print(f"Prompt: {p}\nResponse: {s}\n\n")
        if s.strip()[0:1] == 'Y':
            # Relevant result: fetch the web page, extract plain text,
            # and summarize it with respect to the query (prompt2).
            good_results.append(d)
            uri = d['url']
            downloaded = trafilatura.fetch_url(uri)
            text = trafilatura.extract(downloaded)
            p2 = prompt2 % (query, text)
            s2 = llm.invoke(p2)
            good_summaries.append(s2)
    # Combine the per-page summaries into a final multi-paragraph summary (prompt3).
    p3 = prompt3 % (query, "\n\n".join(good_summaries))
    final_summary = llm.invoke(p3)

    return (good_results, good_summaries, final_summary)
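As mentioned above, Trafilatura handles fetching a web page and extracting its plain text. Here is a minimal standalone sketch of just that step, separate from the search and LLM calls; the URL is only a placeholder, and both calls can return None when a fetch or extraction fails:

import trafilatura

url = "https://example.com"  # placeholder URL, substitute any article page
downloaded = trafilatura.fetch_url(url)      # raw page content, or None on failure
if downloaded is not None:
    text = trafilatura.extract(downloaded)   # extracted plain text, or None
    print(text)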
Here is an example use of this code:
(results, summaries, final_summary) = llm_search(
    "Write a business plan for a new startup using LLMs and expertise in medical billing.")

print(f"\n\n****** Good Results ******\n\n")
print(results)

print(f"\n\n****** Good Summaries ******\n\n")
print(summaries)

print(f"\n\n****** Final Summary ******\n\n")
print(final_summary)
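Since results and summaries are Python lists, one optional follow-up is to pretty-print them using the pprint import already in the script and to save the final summary for later reading; the Markdown file name below is just an example:

from pprint import pprint

# Pretty-print the filtered search results and the per-page summaries.
pprint(results)
pprint(summaries)

# Save the final multi-paragraph summary to a Markdown file.
with open("final_summary.md", "w") as f:
    f.write(final_summary)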