AR-Mahjong-Assistant-preview by LYiHub

AR Mahjong assistant for real-time strategy

Created 1 week ago · 418 stars · Top 70.3% on SourcePulse


Summary

ARmahjongAssist is a Mahjong assistance system designed for RayNeo AR glasses, leveraging localized AI to provide real-time, privacy-preserving gameplay insights. It targets Mahjong players seeking an edge by offering optimal move suggestions and situational analysis directly within their field of view, enhancing gameplay without relying on external cloud services.

How It Works

The system comprises an Android client running on the AR glasses and a Python server. The client captures images of the player's hand and transmits them for processing. The server runs YOLOv8 (exported to ONNX) for local, low-latency tile recognition. A dedicated Mahjong library then calculates optimal plays based on '向听' (shanten, the number of tile exchanges away from tenpai) and '进张' (ukeire, the count of draws that reduce shanten). Integrated Faster-Whisper handles offline speech-to-text, enabling LLM-powered natural-language queries such as '绝张' (last-remaining-copy tile) analysis. This fully localized approach prioritizes user privacy and responsiveness.
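The tile-recognition step works like any YOLO-family detector: the model emits many overlapping candidate boxes, which are then filtered by a confidence threshold and non-maximum suppression (the IoU threshold exposed in the project's debug tool). Below is a minimal, dependency-free sketch of that post-processing stage; the box format, threshold values, and tile labels are illustrative assumptions, not the project's actual code:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(detections, conf_thres=0.5, iou_thres=0.45):
    """Keep the highest-confidence box in each cluster of overlapping detections.

    detections: list of (box, score, label) with box = (x1, y1, x2, y2).
    Note: this is class-agnostic for brevity; real YOLO post-processing
    typically runs NMS per class.
    """
    candidates = sorted(
        (d for d in detections if d[1] >= conf_thres),
        key=lambda d: d[1], reverse=True,
    )
    kept = []
    for det in candidates:
        if all(iou(det[0], k[0]) < iou_thres for k in kept):
            kept.append(det)
    return kept
```

Raising `conf_thres` trades recall for precision (fewer ghost tiles, more missed ones), which is why the debug tool exposes both knobs for tuning under different lighting conditions.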

Quick Start & Requirements

  • Installation: Recommended via Docker Compose (docker-compose up -d --build).
  • Prerequisites: Docker Desktop, a local LLM service (e.g., LM Studio, Ollama) with an OpenAI-compatible API, and network connectivity between server and client. Client-side development requires Android Studio (Ladybug+) and JDK 17+.
  • Hardware: RayNeo X3 AR glasses or compatible Android device.
  • Configuration: Requires setting environment variables for LLM endpoint and client IP address.
  • Links: Project repository: https://github.com/fAres4s/ARmahjongAssist.git
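A typical bring-up might look like the following. The environment-variable names here are hypothetical placeholders (the README only says an LLM endpoint and client IP must be configured); check the repository's compose file for the names actually used:

```shell
# Hypothetical variable names -- consult the project's docker-compose.yml
# for the real ones.
export LLM_API_BASE="http://192.168.1.10:1234/v1"   # OpenAI-compatible endpoint (e.g. LM Studio, Ollama)
export CLIENT_IP="192.168.1.42"                     # address of the AR-glasses client on the LAN

# Build and start the server stack in the background
docker-compose up -d --build
```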

Highlighted Details

  • Full Localization: All AI processing, including YOLOv8 inference via ONNX Runtime, occurs locally, ensuring data privacy and minimal latency.
  • Voice Interaction: Supports offline speech-to-text (Faster-Whisper) and LLM integration for natural language commands, enabling features like real-time '绝张' (last-remaining-copy tile) analysis.
  • YOLO Debug Tool: A web interface (http://localhost:8000/static/yolo_debug.html) allows real-time tuning of recognition parameters (Confidence, IoU Thresholds) for improved accuracy in varied conditions.
  • 牌效分析 (Play Efficiency Analysis): Leverages a Mahjong library to compute '向听' (Shanten) and '进张' (Ukeire) for optimal move recommendations.
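The two quantities named above can be illustrated concretely: shanten is the minimum number of tile exchanges needed to reach tenpai (a complete hand is −1, tenpai is 0), and ukeire counts how many unseen tiles would lower that number. The following is a self-contained, brute-force sketch over the standard 34-tile encoding, covering regular hands only (no seven-pairs or thirteen-orphans shortcuts); it is a teaching re-implementation, not the library the project uses:

```python
def to_counts(man="", pin="", sou="", honors=""):
    """Build a 34-entry count array (0-8 man, 9-17 pin, 18-26 sou, 27-33 honors)."""
    counts = [0] * 34
    for digits, offset in ((man, 0), (pin, 9), (sou, 18), (honors, 27)):
        for ch in digits:
            counts[int(ch) - 1 + offset] += 1
    return counts

def shanten(counts):
    """Shanten for a standard hand (4 melds + 1 pair), by exhaustive search."""
    counts = list(counts)

    def search(i, melds, partials, pair):
        while i < 34 and counts[i] == 0:
            i += 1
        if i == 34:
            p = min(partials, 4 - melds)   # at most 5 blocks incl. the pair
            return 8 - 2 * melds - p - pair
        best = 8
        pos = i % 9 if i < 27 else 9       # honors can't form sequences
        if counts[i] >= 3:                 # triplet
            counts[i] -= 3
            best = min(best, search(i, melds + 1, partials, pair))
            counts[i] += 3
        if pos <= 6 and counts[i + 1] and counts[i + 2]:  # sequence
            for j in (i, i + 1, i + 2):
                counts[j] -= 1
            best = min(best, search(i, melds + 1, partials, pair))
            for j in (i, i + 1, i + 2):
                counts[j] += 1
        if counts[i] >= 2:                 # pair (the pair, or a partial triplet)
            counts[i] -= 2
            if not pair:
                best = min(best, search(i, melds, partials, 1))
            best = min(best, search(i, melds, partials + 1, pair))
            counts[i] += 2
        if pos <= 7 and counts[i + 1]:     # adjacent partial sequence
            counts[i] -= 1; counts[i + 1] -= 1
            best = min(best, search(i, melds, partials + 1, pair))
            counts[i] += 1; counts[i + 1] += 1
        if pos <= 6 and counts[i + 2]:     # gapped partial sequence (kanchan)
            counts[i] -= 1; counts[i + 2] -= 1
            best = min(best, search(i, melds, partials + 1, pair))
            counts[i] += 1; counts[i + 2] += 1
        saved, counts[i] = counts[i], 0    # leave remaining copies isolated
        best = min(best, search(i + 1, melds, partials, pair))
        counts[i] = saved
        return best

    return search(0, 0, 0, 0)

def ukeire(counts):
    """For each of the 34 tile kinds, check whether drawing one copy lowers
    the shanten number; return (total useful tiles, list of useful kinds)."""
    base = shanten(counts)
    total, kinds = 0, []
    for t in range(34):
        if counts[t] == 4:
            continue
        counts[t] += 1
        if shanten(counts) < base:
            kinds.append(t)
            total += 4 - (counts[t] - 1)   # copies not already in our hand
        counts[t] -= 1
    return total, kinds
```

For example, the tenpai hand 123m 456p 35789s 11z has shanten 0 and waits only on 4s, so its ukeire is the four unseen copies of that tile. A production implementation would additionally handle seven pairs and kokushi, and subtract tiles visible in discards from the ukeire count.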

Maintenance & Community

The project acknowledges contributions to datasets and libraries (Mahjong Dataset, riichi-mahjong-tiles, faster-whisper, nanoemoji). No explicit community channels (Discord, Slack) or roadmap details are provided in the README.

Licensing & Compatibility

  • License: Not specified in the provided documentation.
  • Compatibility: Primarily targets Android (Min SDK 31) and requires specific AR hardware. The server requires Python 3.9+. With no license specified, the code is all-rights-reserved by default, which effectively bars redistribution and commercial use.

Limitations & Caveats

The project is designated a "preview," indicating potential instability or incomplete features. It depends heavily on specific hardware (RayNeo AR glasses) and requires manual network configuration. The absence of a license is a significant adoption blocker, since usage and redistribution rights remain undefined.

Health Check

  • Last Commit: 1 week ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 450 stars in the last 12 days
