Chatbot for question answering over LangChain documentation
This repository implements a chatbot designed for question answering over the LangChain documentation. It targets developers and users seeking to query technical documentation efficiently, leveraging real-time updates and streaming capabilities.
How It Works
The system comprises two main components: ingestion and question answering. Ingestion fetches HTML from the LangChain documentation site and GitHub codebase, loads it with RecursiveURLLoader and SitemapLoader, splits documents using RecursiveCharacterTextSplitter, and indexes the chunks in a Weaviate vectorstore with OpenAI embeddings. The question-answering pipeline first condenses the user input and chat history into a standalone question using an LLM, retrieves relevant documents from the vectorstore, and then passes these to a model that streams the answer.
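The two-stage flow above can be sketched in plain Python. This is a minimal, self-contained illustration of the control flow only: the splitter, embedder, vectorstore, and question-condensing step are hypothetical stand-ins for the real components (RecursiveCharacterTextSplitter, OpenAI embeddings, Weaviate, and an LLM call), so it runs without external services.

```python
# Sketch of the ingestion + question-answering pipeline described above.
# All components below are simplified stand-ins, not the project's actual code.

def split_text(text, chunk_size=100, overlap=20):
    # Stand-in for RecursiveCharacterTextSplitter: fixed-size chunks with overlap.
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def embed(text):
    # Stand-in for OpenAI embeddings: a bag-of-words count vector.
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    # Stand-in for the Weaviate vectorstore.
    def __init__(self):
        self.docs = []

    def add(self, chunks):
        self.docs.extend((chunk, embed(chunk)) for chunk in chunks)

    def retrieve(self, query, k=2):
        ranked = sorted(self.docs, key=lambda d: cosine(embed(query), d[1]),
                        reverse=True)
        return [doc for doc, _ in ranked[:k]]

def condense_question(chat_history, question):
    # Stand-in for the LLM call that rewrites input + history
    # into a standalone question.
    return " ".join(chat_history[-1:] + [question]) if chat_history else question

# Ingestion: load -> split -> index.
store = VectorStore()
store.add(split_text("LangChain provides loaders, splitters, and vectorstores for RAG."))

# Question answering: condense -> retrieve -> (answer streamed from an LLM).
standalone = condense_question([], "What does LangChain provide?")
context = store.retrieve(standalone)
```

In the real system, `context` would be formatted into a prompt and sent to the answering model, whose tokens are streamed back to the user.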
Quick Start & Requirements
This project is deployed via LangGraph Cloud and is not designed for local execution without a LangGraph Cloud account. For a local, albeit feature-limited, version, refer to the code and documentation from the specified branch.
Maintenance & Community
No specific details on contributors, sponsorships, or community channels (like Discord/Slack) are provided in the README.
Licensing & Compatibility
The repository's licensing is not specified in the provided README.
Limitations & Caveats
The project is currently deployed via LangGraph Cloud, limiting local execution without an account. A separate branch offers a local-runnable version but with a reduced feature set.