stevelaskaridis — LLMs and tools optimized for mobile and embedded hardware
Top 91.2% on SourcePulse
Summary
This repository is a curated "awesome list" of Large Language Models (LLMs) and related research tailored for mobile and embedded hardware. It serves researchers, engineers, and practitioners aiming to deploy LLM technology on resource-constrained devices. The list consolidates information on mobile-first LLMs, deployment infrastructure, benchmarking, optimization techniques, applications, and multimodal models, accelerating edge AI development.
How It Works
The project is a structured directory linking to LLMs, papers, code, and deployment frameworks for mobile/embedded LLM inference and training. Resources are grouped into categories such as "Mobile-First LLMs," "Infrastructure / Deployment," "Benchmarking," and "Mobile-Specific Optimisations." This organization highlights essential models (e.g., sub-3B-parameter LLMs) and tools (e.g., llama.cpp, MLC-LLM, PyTorch ExecuTorch, MLX), making it quick to identify relevant resources and trends.
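The emphasis on sub-3B-parameter models follows directly from device memory budgets. A back-of-the-envelope sketch of weight storage at common precisions (the parameter counts and bit widths below are illustrative assumptions, not figures taken from the list):

```python
# Approximate on-device memory needed just for LLM weights at
# common precisions. Ignores KV cache, activations, and runtime
# overhead, which add to the total in practice.

def weight_memory_gib(n_params: float, bits_per_weight: int) -> float:
    """Weight storage in GiB for a model with n_params parameters."""
    return n_params * bits_per_weight / 8 / (1024 ** 3)

for bits in (16, 8, 4):
    print(f"3B params @ {bits}-bit: ~{weight_memory_gib(3e9, bits):.1f} GiB")
```

At 16-bit precision a 3B model already needs roughly 5.6 GiB for weights alone, which is why the quantized formats used by tools like llama.cpp (e.g., 4-bit) are central to fitting such models on phones.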
Quick Start & Requirements
This is a curated list, not a runnable project, so there are no direct installation steps; users navigate the list to find specific projects. Requirements vary widely across entries, often including specific hardware (mobile or edge devices), operating systems (Android, iOS), software dependencies (Python, CUDA, ML frameworks), and sometimes API keys or datasets. Links to official quick-start guides and documentation are provided for many projects.