LinguaHaru by YANG-Haruka

LLM-powered AI translation app for documents

Created 1 year ago
251 stars

Top 99.9% on SourcePulse

View on GitHub
Project Summary

Summary

LinguaHaru is a next-generation AI translation tool powered by LLMs, offering high-quality, precise translations for common file formats (Office docs, PDF, TXT) with a single click. It targets users needing efficient, multi-format document localization across multiple languages, simplifying complex translation workflows.

How It Works

This tool leverages cutting-edge LLMs for high translation quality with minimal user interaction. It supports DOCX, XLSX, PPTX, PDF, TXT, and SRT files. A key advantage is its flexible engine configuration, allowing seamless switching between local Ollama models and online APIs (Deepseek, OpenAI) for diverse operational environments.
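As an illustration of that engine flexibility, both kinds of back end speak plain HTTP. The following is a minimal sketch, not LinguaHaru's actual code: the model names (`qwen2`, `deepseek-chat`) and the `DEEPSEEK_API_KEY` environment variable are placeholders.

```shell
# Local engine: Ollama's generate endpoint on its default port 11434.
# Assumes the model was fetched first, e.g. `ollama pull qwen2`.
curl http://localhost:11434/api/generate -d '{
  "model": "qwen2",
  "prompt": "Translate the following to English: 你好，世界",
  "stream": false
}'

# Online engine: an OpenAI-compatible chat completions endpoint
# (Deepseek shown; the API key is read from the environment).
curl https://api.deepseek.com/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user",
                  "content": "Translate the following to English: 你好，世界"}]
  }'
```

Because both engines expose chat-style HTTP APIs, switching between them is largely a matter of changing the base URL, payload shape, and credentials.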

Quick Start & Requirements

  • Installation: Create/activate Conda env (conda create -n lingua-haru python=3.10, conda activate lingua-haru), then pip install -r requirements.txt.
  • Running: Execute python app.py. Default access: http://127.0.0.1:9980.
  • Prerequisites: Python 3.10, CUDA (11.7/12.1 tested). Local LLM requires Ollama and downloaded models (e.g., QWen).
  • Links: User Guide (Wiki) [Link not provided].
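Taken together, the steps above amount to the following shell session. The Ollama step applies only if you want a local engine, and `qwen2` is an illustrative model name, not a project requirement:

```shell
# Create and activate the environment, then install dependencies.
conda create -n lingua-haru python=3.10
conda activate lingua-haru
pip install -r requirements.txt

# Optional: set up a local engine by installing Ollama and pulling a model.
ollama pull qwen2

# Launch the app, then open http://127.0.0.1:9980 in a browser.
python app.py
```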

Highlighted Details

  • Supports DOCX, XLSX, PPTX, PDF, TXT, SRT formats; plans for expansion.
  • Covers 10+ languages (e.g., Chinese, English, Japanese, Korean, Russian), with ongoing additions.
  • Flexible translation engines: local (Ollama) and online APIs (Deepseek, OpenAI).
  • Features LAN sharing for efficient collaborative work.
  • One-click rapid translation with minimal user operation.

Maintenance & Community

Active development is evident from recent updates in Jan 2026 and May 2025, indicating ongoing feature additions and bug fixes. No specific community channels or notable contributors/sponsorships are mentioned in the README.

Licensing & Compatibility

The software is fully open-source under the GPL-3.0 license. This copyleft license permits free use but requires derivative works to also be licensed under GPL-3.0, potentially restricting integration into closed-source commercial products.

Limitations & Caveats

Local LLM support is currently limited to Ollama. Features like "continue translation functionality" are noted as under development, suggesting potential incompleteness or ongoing refinement.

Health Check

  • Last Commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 1
  • Star History: 8 stars in the last 30 days
