MagicWX by Pangu-Immortal

On-device LLM inference app for Android

Created 6 years ago
797 stars

Top 44.1% on SourcePulse

View on GitHub
Project Summary

Summary

MagicWX combines two capabilities: no-root Android device modification, including device spoofing and unlimited WeChat multi-instance, and an Android-native application that downloads and runs 10 mainstream Large Language Models (LLMs) locally and fully offline, with no server infrastructure required.

How It Works

The application employs a dual inference architecture, supporting both RWKV models (a recurrent design with a constant-size state) and Transformer models (with a KV-cache that grows with context length). It uses ONNX Runtime 1.20.0 for on-device inference, with support for several quantization levels (FP32, FP16, INT8, INT4) to trade accuracy for speed and memory. The UI is built with Jetpack Compose and Material3, offering a streamlined experience for model management and chat interaction.
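The practical difference between the two inference styles above can be sketched with toy classes (an illustration only, not the project's actual model code): an RWKV-style state machine updates a fixed-size state in place, while a Transformer-style decoder appends to a KV-cache that grows by one entry per generated token.

```python
# Toy contrast between RNN-state and KV-cache inference loops.
# The update rules here are placeholders, not real model math.

class RwkvLikeState:
    """Recurrent style: memory cost is constant regardless of sequence length."""
    def __init__(self):
        self.state = 0.0  # fixed-size recurrent state

    def step(self, token):
        # Fold the new token into the existing state in place.
        self.state = 0.5 * self.state + token
        return self.state


class TransformerLikeCache:
    """Transformer style: the KV-cache grows with every decoded token."""
    def __init__(self):
        self.kv_cache = []  # one cached entry per past token

    def step(self, token):
        self.kv_cache.append(token)
        # "Attention" here is just an average over all cached tokens.
        return sum(self.kv_cache) / len(self.kv_cache)


rnn, tf = RwkvLikeState(), TransformerLikeCache()
for t in [1.0, 2.0, 3.0]:
    rnn.step(t)
    tf.step(t)
# rnn still holds a single float; tf's cache now holds all three tokens
```

This is why RWKV-style models are attractive on memory-constrained phones: their per-token cost does not grow with conversation length.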

Quick Start & Requirements

Clone the repository (git clone https://github.com/Pangu-Immortal/MagicWX.git), open it in Android Studio, and build with ./gradlew assembleDebug. Install the resulting APK on an Android device running API levels 24-35 (Android 7.0 through 15). Users can then select a model, download it (including its tokenizer), and begin local inference. Key dependencies include ONNX Runtime 1.20.0 and Jetpack Compose (BOM 2024.12.01).
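The steps above can be run end to end from a terminal. This is a sketch assuming a connected device with USB debugging enabled; the APK output path is the standard Gradle default and may differ for this project.

```shell
# Clone, build, and install the debug APK (adb ships with the Android SDK).
git clone https://github.com/Pangu-Immortal/MagicWX.git
cd MagicWX
./gradlew assembleDebug
# Assumed default Gradle output path for the debug build:
adb install app/build/outputs/apk/debug/app-debug.apk
```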

Highlighted Details

  • Supports one-click download and local inference for 10 popular LLMs, including RWKV-7 World, DeepSeek-R1, Phi-3 Mini, and Llama 3.2.
  • Enables advanced Android modifications like device spoofing ("一键新机") and unlimited WeChat multi-instance without requiring root privileges.
  • Inference engine supports FP32, FP16, INT8, and INT4 quantization for optimized model execution.
  • The project also lists extensive paid services for app survival, anti-detection, and other deep Android customizations.
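The quantization levels listed above shrink model weights at the cost of some precision. A minimal sketch of symmetric per-tensor INT8 quantization illustrates the idea; the project's actual scheme (per-channel scales, zero points, etc.) may differ.

```python
# Symmetric INT8 quantization: map FP32 weights into [-128, 127]
# using a single scale, then dequantize to recover approximations.

def quantize_int8(weights):
    """Return (int8 values, scale) for a list of float weights."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from int8 values and the scale."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.02, 1.0]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each weight is stored in 1 byte instead of 4, with error bounded by scale/2.
```

INT4 halves the storage again by packing two values per byte, which is why the lowest quantization levels matter most on phones with limited RAM.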

Maintenance & Community

Contributions are welcomed via pull requests. Users can join the Telegram group for discussions and community support. Issue tracking is managed through GitHub Issues.

Licensing & Compatibility

The project is licensed under the MIT License. However, the README's disclaimer states it is for "personal learning and exchange only" and prohibits any commercial use. Because that restriction contradicts MIT's permissive terms, prospective commercial users should treat the project as effectively non-commercial.

Limitations & Caveats

The project explicitly restricts commercial use, despite its MIT license. A significant portion of its functionality is offered as paid services, distinct from the open-source LLM inference component. The "no-root" Android modification features may vary in effectiveness and stability across different Android versions and device configurations. The combination of LLM inference with deep system modification tools may raise security and privacy considerations.

Health Check
Last Commit

2 days ago

Responsiveness

Inactive

Pull Requests (30d)
0
Issues (30d)
0
Star History
11 stars in the last 30 days

Explore Similar Projects

Starred by Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla and OpenAI; author of CS 231n), Gabriel Almeida (Cofounder of Langflow), and 2 more.

torchchat by pytorch

4k
PyTorch-native SDK for local LLM inference across diverse platforms
Created 1 year ago
Updated 5 months ago