llama-fs by iyaja

Self-organizing file manager using LLMs

Created 1 year ago
5,634 stars

Top 9.1% on SourcePulse

View on GitHub: https://github.com/iyaja/llama-fs
Project Summary

LlamaFS is an AI-powered file organization tool designed to automatically rename and structure files based on their content and user-defined conventions. It targets users struggling with disorganized digital spaces, offering both batch processing for immediate organization and a continuous "watch mode" that learns from user renaming habits. The primary benefit is automated, intelligent file management that reduces manual effort and improves discoverability.

How It Works

LlamaFS leverages the Llama 3 model, primarily via Groq's inference API for speed, to analyze file content and suggest organizational structures. It supports various file types, including images (via Moondream) and audio (via Whisper). The system operates in two modes: batch processing for one-off organization and a daemon that monitors directories, learning from user interactions to proactively organize new files. An "incognito mode" routes requests through Ollama for local processing, enhancing privacy.
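To make the watch-mode idea concrete, here is a minimal sketch of the general pattern: monitor a directory, send the content of each new file to Llama 3 on Groq, and print a suggested name. This is illustrative only, not LlamaFS's actual implementation; it assumes the watchdog and groq Python packages, a GROQ_API_KEY environment variable, and a Groq model name (llama3-70b-8192) that may have changed since writing.

    import os
    import time
    from pathlib import Path

    from groq import Groq
    from watchdog.observers import Observer
    from watchdog.events import FileSystemEventHandler

    client = Groq(api_key=os.environ["GROQ_API_KEY"])

    class RenameSuggester(FileSystemEventHandler):
        def on_created(self, event):
            if event.is_directory:
                return
            # Read a snippet of the new file and ask Llama 3 for a name suggestion.
            snippet = Path(event.src_path).read_text(errors="ignore")[:2000]
            reply = client.chat.completions.create(
                model="llama3-70b-8192",  # assumed Groq model name; adjust as needed
                messages=[{
                    "role": "user",
                    "content": "Suggest a short, descriptive filename for this content:\n" + snippet,
                }],
            )
            print(event.src_path, "->", reply.choices[0].message.content.strip())

    # Watch the Downloads folder for newly created files.
    observer = Observer()
    observer.schedule(RenameSuggester(), path=os.path.expanduser("~/Downloads"), recursive=False)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()

LlamaFS itself adds caching, learning from user renames, and actual renaming on top of this basic loop.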

Quick Start & Requirements

  • Install: Clone the repository, navigate to the directory, and run pip install -r requirements.txt.
  • Prerequisites: Python 3.10+ and pip. Requires API keys for Groq and AgentOps (obtainable from their respective sites). Ollama and Moondream are needed only for incognito mode.
  • Setup: Update .env with API keys.
  • Usage: Run fastapi dev server.py to start the server. Example API call: curl -X POST http://127.0.0.1:8000/batch -H "Content-Type: application/json" -d '{"path": "/Users/<username>/Downloads/", "instruction": "string", "incognito": false}' (a Python equivalent is sketched after this list).
  • Docs: https://github.com/iyaja/llama-fs
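The same /batch request can be made from Python. The sketch below mirrors the curl example above; it assumes the requests package, a LlamaFS server already running locally via fastapi dev server.py, and uses a free-form instruction string of our own choosing (the "instruction" field accepts any user-defined convention).

    import requests

    # Ask the local LlamaFS server to propose an organization for a folder.
    payload = {
        "path": "/Users/<username>/Downloads/",  # placeholder path from the README example
        "instruction": "Group files by topic and give them short, descriptive names",
        "incognito": False,
    }
    resp = requests.post("http://127.0.0.1:8000/batch", json=payload, timeout=300)
    resp.raise_for_status()
    print(resp.json())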

Highlighted Details

  • File operations processed in <500ms in watch mode, attributed to smart caching and Groq's inference speed.
  • Supports image and audio file analysis for organization.
  • Offers an "incognito mode" for local processing via Ollama (see the sketch after this list).
  • Learns user renaming conventions in watch mode.
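The Ollama-backed incognito mode is flagged as a TODO in the README (see Limitations below), but the general shape of local inference through Ollama looks roughly like the following sketch. It is illustrative rather than LlamaFS's actual code and assumes the ollama Python package with a locally pulled llama3 model.

    import ollama  # pip install ollama; requires a running Ollama server and `ollama pull llama3`

    # Ask a local Llama 3 model for a rename suggestion; no data leaves the machine.
    reply = ollama.chat(
        model="llama3",
        messages=[{
            "role": "user",
            "content": "Suggest a short, descriptive filename for a PDF of a 2023 tax return.",
        }],
    )
    print(reply["message"]["content"])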

Maintenance & Community

The project is maintained by iyaja. Further community or roadmap details are not explicitly provided in the README.

Licensing & Compatibility

The repository's license is not specified in the README. Compatibility for commercial use or closed-source linking is not detailed.

Limitations & Caveats

The README mentions that "filesystem diffs are hard," suggesting potential complexities or limitations in tracking and managing file changes. The integration of Ollama for incognito mode is marked as a "TODO," indicating it might not be fully implemented or stable.

Health Check

  • Last Commit: 1 month ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 1
  • Issues (30d): 0
  • Star History: 260 stars in the last 30 days

Explore Similar Projects

Starred by Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems"), Daniel Han (cofounder of Unsloth), and 1 more.

synthetic-data-kit by meta-llama

  • Top 0.8% on SourcePulse
  • 1k stars
  • Synthetic data CLI tool for LLM fine-tuning
  • Created 5 months ago
  • Updated 1 month ago