farfalle by rashadphz

AI search engine for local/cloud LLMs

Created 1 year ago
3,453 stars

Top 14.0% on SourcePulse

Project Summary

Farfalle is an open-source AI-powered search engine designed to provide an alternative to services like Perplexity. It allows users to self-host their search experience, leveraging either local Large Language Models (LLMs) or cloud-based APIs for question answering and search result summarization. The target audience includes privacy-conscious users, developers, and researchers who want a customizable and self-managed search solution.

How It Works

Farfalle integrates multiple search providers, including SearXNG, Tavily, Serper, and Bing, to gather raw search results. It then uses LLMs to summarize those results and answer user queries directly, optionally employing an agent-based approach for more complex search planning and execution. This hybrid model offers flexibility: users can choose between the cost-effectiveness and privacy of local LLMs (such as Llama3, Gemma, Mistral, or Phi3 via Ollama) or the power of cloud models (OpenAI, Groq) through LiteLLM integration.
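
To make the flow concrete, here is a minimal shell sketch of the same retrieve-then-answer pattern, assuming a self-hosted SearXNG instance with its JSON output format enabled and a Llama3 model already pulled into Ollama. It illustrates the general approach, not farfalle's actual implementation:

```bash
# Illustrative sketch of the retrieve-then-answer loop; NOT farfalle's code.
# Assumes: SearXNG at http://localhost:8080 with format=json enabled, and
# Ollama running locally with the llama3 model pulled.
QUERY="what is an AI answer engine"

# 1. Gather raw web results from the self-hosted SearXNG JSON API.
SNIPPETS=$(curl -s "http://localhost:8080/search?q=${QUERY// /+}&format=json" \
  | python3 -c "import sys, json; print('\n'.join(r.get('content','') for r in json.load(sys.stdin)['results'][:5]))")

# 2. Have a local LLM answer the query using only those snippets.
printf 'Using only these search snippets:\n%s\n\nAnswer the question: %s\n' \
  "$SNIPPETS" "$QUERY" | ollama run llama3
```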

Quick Start & Requirements

  • Install/Run: clone the repository, configure .env, then run docker-compose -f docker-compose.dev.yaml up -d (see the sketch after this list).
  • Prerequisites: Docker; Ollama for local LLMs; optional API keys for cloud services (Tavily, Serper, OpenAI, Bing, Groq).
  • Setup: official documentation covers custom setups beyond the default environment variables.
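
Put together, a first run looks roughly like the sketch below. The .env variable names are hypothetical placeholders; check the repository's example env file for the keys it actually expects:

```bash
# Typical first run (the .env variable names below are hypothetical
# placeholders; consult the repo's example env file for the exact keys).
git clone https://github.com/rashadphz/farfalle.git
cd farfalle

# Local-only use via Ollama + SearXNG needs no API keys; cloud models and
# hosted search providers need the corresponding keys.
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
TAVILY_API_KEY=tvly-...
EOF

docker-compose -f docker-compose.dev.yaml up -d
```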

Highlighted Details

  • Supports a wide range of local LLMs via Ollama and cloud LLMs via LiteLLM.
  • Features an agent that plans and executes searches for improved results.
  • Integrates with multiple search APIs; the self-hostable SearXNG option removes external API dependencies for basic search.
  • Offers a live demo at farfalle.dev (cloud models only).

Maintenance & Community

The project is actively maintained by rashadphz. Community interaction is encouraged via GitHub issues and Twitter. A roadmap is available outlining planned features.

Licensing & Compatibility

The repository does not explicitly state a license in the provided README. Users should verify licensing for commercial use or integration with closed-source projects.

Limitations & Caveats

The project is described as a "clone" of Perplexity, so full feature parity should not be assumed. The "Chat with local files" feature is listed as a future enhancement. Users relying solely on local models must manage LLM downloads and Ollama setup themselves (a minimal example follows).
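
For a local-only setup, the Ollama side amounts to pulling the models listed above before starting the stack, for example:

```bash
# Pull whichever local models you plan to use (names match the ones this
# summary lists); Ollama must be installed first.
ollama pull llama3
ollama pull mistral   # gemma and phi3 are pulled the same way
ollama serve          # only needed if the Ollama daemon is not already running
```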

Health Check

  • Last Commit: 11 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 3
  • Issues (30d): 3

Star History

55 stars in the last 30 days
