WebUI extension for LLM web search using DuckDuckGo or SearXNG
This extension for oobabooga/text-generation-webui enables local Large Language Models (LLMs) to perform web searches. It targets users who want to augment their LLM's knowledge with real-time information from the internet, giving the model a mechanism to access and process external data.
How It Works
The extension uses regular expressions to intercept specific commands in the LLM's output (e.g., Search_web("query")). It then performs the search with DuckDuckGo (or a SearXNG instance) and retrieves the results. An ensemble of a dense embedding model and either Okapi BM25 or SPLADE extracts the most relevant snippets from the search results, which are appended to the LLM's output. This approach provides contextually relevant information retrieval, enhancing the LLM's ability to answer questions with up-to-date data.
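The interception step can be sketched as follows. This is a minimal illustration, not the extension's actual code: the exact command pattern, function names, and output format are assumptions.

```python
import re

# Hypothetical pattern matching commands like Search_web("some query");
# the real extension's regex and command syntax may differ.
SEARCH_COMMAND = re.compile(r'Search_web\("([^"]*)"\)')

def intercept(llm_output: str, search_fn) -> str:
    """Scan LLM output for a search command; if one is found, run the
    search and append the retrieved snippets to the output."""
    match = SEARCH_COMMAND.search(llm_output)
    if match is None:
        return llm_output  # no command: pass the output through unchanged
    query = match.group(1)
    snippets = search_fn(query)  # stand-in for a DuckDuckGo/SearXNG lookup
    return llm_output + "\nSearch results:\n" + "\n".join(snippets)

# Toy search backend used only for illustration.
def fake_search(query: str) -> list[str]:
    return [f"snippet about {query}"]

result = intercept('Search_web("weather Berlin")', fake_search)
```

In the real extension, the snippets returned by the search backend would first be ranked by the embedding/BM25 (or SPLADE) ensemble before being appended.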
Quick Start & Requirements
Install the extension through text-generation-webui's "Install or update an extension" option. Alternatively, install the dependencies with pip (potentially using the unofficial faiss-cpu package) or update the conda environment using the provided environment.yml. Then launch with python server.py --extension LLM_Web_search.
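In practice, the manual route might look like the following. The file paths and the requirements file name are assumptions; check the repository's README for the exact commands.

```shell
# From inside the text-generation-webui directory (hypothetical layout):

# Option A: update the conda environment from the provided environment.yml
conda env update -f extensions/LLM_Web_search/environment.yml

# Option B: install the Python dependencies with pip
# (requirements.txt path is an assumption)
pip install -r extensions/LLM_Web_search/requirements.txt

# Launch the web UI with the extension enabled
python server.py --extension LLM_Web_search
```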
Highlighted Details
Maintenance & Community
The extension is part of the oobabooga/text-generation-webui ecosystem, benefiting from that project's community and ongoing development.
Licensing & Compatibility
Its dependence on text-generation-webui implies it is intended for use within that ecosystem.
Limitations & Caveats
Installation may require the unofficial faiss-cpu package, which may not work on all systems.