Flask web app for local knowledge graph exploration
This application provides a local, Flask-based web interface for querying a Llama language model, generating step-by-step reasoning, and visualizing this process as an interactive knowledge graph. It is designed for users who want to explore complex queries and understand the underlying reasoning process locally, without relying on external APIs.
How It Works
The application uses a local Llama language model served via Ollama (defaulting to http://localhost:11434) to process user queries. It generates a step-by-step reasoning chain, which is then transformed into a knowledge graph using NetworkX. Semantic similarity, computed with Annoy and scikit-learn, is used to find and display related questions and answers.
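A minimal sketch of that pipeline, assuming Ollama's /api/generate endpoint, one reasoning step per output line, and TF-IDF vectors as the embedding; the project's actual prompts and embedding choices may differ:

```python
import requests
import networkx as nx
from annoy import AnnoyIndex
from sklearn.feature_extraction.text import TfidfVectorizer

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def ask_model(query: str, model: str = "llama3.1:8b") -> list[str]:
    """Ask the local model for step-by-step reasoning; return one string per step."""
    resp = requests.post(OLLAMA_URL, json={
        "model": model,
        "prompt": f"Answer step by step, one step per line:\n{query}",
        "stream": False,
    })
    resp.raise_for_status()
    return [s.strip() for s in resp.json()["response"].splitlines() if s.strip()]

def build_graph(query: str, steps: list[str]) -> nx.DiGraph:
    """Chain the query and each reasoning step into a directed graph."""
    graph = nx.DiGraph()
    previous = query
    for step in steps:
        graph.add_edge(previous, step)
        previous = step
    return graph

def nearest_steps(steps: list[str], text: str, k: int = 3) -> list[str]:
    """Return the k reasoning steps most similar to `text` (TF-IDF + Annoy)."""
    vectorizer = TfidfVectorizer()
    vectors = vectorizer.fit_transform(steps).toarray()
    index = AnnoyIndex(vectors.shape[1], "angular")  # angular ~ cosine distance
    for i, vec in enumerate(vectors):
        index.add_item(i, vec)
    index.build(10)  # 10 trees; more trees trade build time for accuracy
    query_vec = vectorizer.transform([text]).toarray()[0]
    return [steps[i] for i in index.get_nns_by_vector(query_vec, k)]
```

The graph here is a simple chain (query → step 1 → step 2 → …); the app's actual graph structure may be richer, but the same NetworkX edge-building idea applies.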
Quick Start & Requirements
Install dependencies and start the app:

pip install -r requirements.txt
python app.py

Requires a local Ollama server with a Llama model (default: llama3.1:8b) running on http://localhost:11434.
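Before launching, you can confirm the Ollama server is reachable and the model is pulled; a small sketch using Ollama's /api/tags model-listing endpoint (the model name is this README's stated default):

```python
# Sketch: check that the local Ollama server is up and list its pulled models.
import requests

resp = requests.get("http://localhost:11434/api/tags")
resp.raise_for_status()
models = [m["name"] for m in resp.json()["models"]]
print(models)  # should include "llama3.1:8b" (or whichever Llama model you pulled)
```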
Maintenance & Community
No information on contributors, community channels, or roadmap is provided in the README.
Licensing & Compatibility
The README does not specify a license. Compatibility for commercial or closed-source use is not stated.
Limitations & Caveats
The application strictly requires a locally running Llama model accessible at http://localhost:11434. No alternative model providers or endpoints are mentioned. The project status (e.g., alpha, beta) and potential for breaking changes are not indicated.