DocQA by afaqueumer

Streamlit app for question answering using LLMs

created 2 years ago
294 stars

Top 90.9% on sourcepulse

Project Summary

DocQA is a Streamlit web application enabling generative question answering over custom documents using LLMs and the LangChain framework. It targets users who want to interact conversationally with their own data without complex setup.

How It Works

The application uses LangChain to split uploaded text files into chunks, embed them, and index them in a vector store. When a user asks a natural-language question, the most relevant chunks are retrieved and passed to an LLM, which generates an answer grounded in the document content. This approach simplifies the integration of LLM-powered Q&A for personal document analysis.
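
The snippet below is a minimal sketch of this kind of pipeline, assuming LangChain's classic API with OpenAI embeddings and a FAISS index; the repository may use a different LLM, text splitter, or vector store.

```python
# Minimal sketch of a LangChain document-QA pipeline (not the repo's exact code).
# Assumes the legacy LangChain API, OpenAI models, and a FAISS vector store.
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI

# 1. Read the uploaded text file and split it into overlapping chunks.
with open("my_document.txt") as f:
    raw_text = f.read()
chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_text(raw_text)

# 2. Embed the chunks and index them in a vector store.
vector_store = FAISS.from_texts(chunks, OpenAIEmbeddings())

# 3. Retrieval QA chain: fetch the most relevant chunks, then let the LLM
#    generate an answer grounded in them.
qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=vector_store.as_retriever())
print(qa.run("What is this document about?"))
```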

Quick Start & Requirements

  • Install dependencies using setup_env.bat.
  • Launch the app with run_app.bat.
  • Requires a Python environment.

Highlighted Details

  • Web application built with Streamlit and LangChain (see the sketch after this list).
  • Enables question answering over custom uploaded text files.
  • Provides a conversational AI interface for local document interaction.
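
As a rough illustration of how a Streamlit front end can wrap the pipeline above, here is a hypothetical sketch (not the repository's actual app.py):

```python
# Hypothetical Streamlit front end wrapping the QA pipeline above;
# not the repository's actual app.py.
import streamlit as st
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI

st.title("DocQA")

uploaded = st.file_uploader("Upload a text file", type=["txt"])
question = st.text_input("Ask a question about the document")

if uploaded and question:
    # Split, embed, and index the uploaded document, then answer the question.
    text = uploaded.read().decode("utf-8")
    chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_text(text)
    store = FAISS.from_texts(chunks, OpenAIEmbeddings())
    qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=store.as_retriever())
    st.write(qa.run(question))
```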

Maintenance & Community

Contributions are welcome via issues or pull requests.

Licensing & Compatibility

MIT License. Permissive, allowing commercial use and inclusion in closed-source projects.

Limitations & Caveats

Setup and launch rely on Windows batch scripts (.bat), so the project appears Windows-oriented; on macOS or Linux, users will likely need to install dependencies and start the Streamlit app manually.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

5 stars in the last 90 days
