RAG framework and context engine
Top 36.6% on SourcePulse
Canopy is an open-source Retrieval Augmented Generation (RAG) framework designed for developers and researchers to quickly build and experiment with RAG applications. It simplifies the entire RAG workflow, from data ingestion and embedding to query optimization and context generation, enabling users to augment LLMs with their own data and reduce hallucinations.
How It Works
Canopy implements a full RAG pipeline, comprising a `KnowledgeBase` for data chunking, embedding, and storage (Pinecone/Qdrant), a `ContextEngine` for retrieving relevant document chunks and formulating context, and a `ChatEngine` for managing chat history and generating LLM responses. This modular design allows for flexibility in data management and LLM integration.
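As a rough illustration of how these three stages fit together, here is a toy sketch of the same chunk/embed/store → retrieve/formulate-context flow. This is not the canopy-sdk API: the class names, bag-of-words "embedding", and chunk size are invented stand-ins.

```python
# Toy sketch of the first two RAG stages Canopy modularizes:
# (1) chunk + embed + store, (2) retrieve chunks and formulate context.
# Bag-of-words counts stand in for a real embedding model.
from collections import Counter
import math


def embed(text):
    # Stand-in embedding: term-frequency bag of words.
    return Counter(text.lower().split())


def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class ToyKnowledgeBase:
    """Stage 1: chunk, embed, and store documents."""

    def __init__(self, chunk_size=8):
        self.chunks = []  # list of (chunk_text, vector) pairs
        self.chunk_size = chunk_size

    def upsert(self, doc):
        words = doc.split()
        for i in range(0, len(words), self.chunk_size):
            chunk = " ".join(words[i:i + self.chunk_size])
            self.chunks.append((chunk, embed(chunk)))

    def query(self, text, top_k=2):
        qv = embed(text)
        ranked = sorted(self.chunks, key=lambda c: cosine(qv, c[1]), reverse=True)
        return [chunk for chunk, _ in ranked[:top_k]]


class ToyContextEngine:
    """Stage 2: retrieve relevant chunks and join them into a context string."""

    def __init__(self, kb):
        self.kb = kb

    def build_context(self, query):
        return "\n".join(self.kb.query(query))


kb = ToyKnowledgeBase()
kb.upsert("Canopy is a RAG framework. It chunks and embeds documents for retrieval.")
engine = ToyContextEngine(kb)
context = engine.build_context("what does the framework chunk?")
print(context)
```

In the real pipeline, a ChatEngine would then prepend this formulated context to the chat history before calling the LLM, which is the third stage described above.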
Quick Start & Requirements
Install with `pip install canopy-sdk`; optional extras include `grpc`, `torch`, `transformers`, `cohere`, and `qdrant`. Configuration is via environment variables, including `PINECONE_API_KEY`, `OPENAI_API_KEY`, and `INDEX_NAME`; optional keys enable Anyscale, Cohere, Jina, Azure OpenAI, and OctoAI. Run `canopy new` to create a Pinecone index, `canopy upsert <data>` to load documents, and `canopy start` to run the server.
Highlighted Details
The running server exposes an OpenAI-compatible API at `http://host:port/v1`.
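Because the endpoint is OpenAI-compatible, any OpenAI-style client can point at that base URL. A minimal sketch that builds (but does not send) a chat-completions request with the standard library; the host, port, and model name below are assumed placeholders:

```python
# Build (but don't send) a chat request against the server's
# OpenAI-compatible endpoint. Host, port, and model name are placeholders.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # substitute your host:port

payload = {
    "model": "gpt-3.5-turbo",  # placeholder model name
    "messages": [{"role": "user", "content": "What is in my knowledge base?"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would return an OpenAI-style chat completion
# whose answer is grounded in the documents upserted earlier.
```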
Maintenance & Community
The repository is no longer actively maintained by the Canopy team, who recommend the Pinecone Assistant for a managed RAG solution.
Licensing & Compatibility
The license is not explicitly stated in the README, but it is an open-source project from Pinecone. Compatibility for commercial use or closed-source linking would require clarification on the specific license.
Limitations & Caveats
The project is no longer maintained, meaning no future updates or bug fixes are expected. The evaluation chat tool's side-by-side comparison feature is currently only supported with OpenAI.
Last activity: 9 months ago. Status: Inactive.