wasm-micro-runtime by bytecodealliance

Lightweight WebAssembly runtime for embedded, IoT, edge, and cloud

created 6 years ago
5,470 stars

Top 9.4% on sourcepulse

View on GitHub
Project Summary

WebAssembly Micro Runtime (WAMR) is a versatile and highly configurable WebAssembly runtime designed for a wide range of applications, from embedded systems and IoT to cloud-native environments and trusted execution environments. It offers a small footprint, high performance, and multiple execution modes, making it suitable for developers needing to run Wasm modules efficiently across diverse hardware and software platforms.

How It Works

WAMR's core is its VMcore, which supports an interpreter, Ahead-of-Time (AOT) compilation, and Just-in-Time (JIT) compilation, including a two-tier setup of Fast JIT and LLVM JIT with dynamic tier-up, so execution can be tuned to the target environment's needs. The project also ships iwasm, a standalone executable with WASI support, and wamrc, the AOT compiler that produces modules for iwasm to run. Key advantages include a self-implemented AOT loader for broad platform compatibility and a choice between a built-in C library subset and WASI for the application's libc.
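
Regardless of execution mode, the host-side embedding flow is the same; the buffer handed to the runtime can contain either Wasm bytecode (run by the interpreter or a JIT) or a wamrc-produced AOT module. The following is a minimal sketch of that flow against the wasm_export.h embedding API; the stack and heap sizes are illustrative and exact signatures may vary between WAMR versions.

    #include <stdio.h>
    #include <stdlib.h>
    #include "wasm_export.h"

    int main(int argc, char *argv[])
    {
        char error_buf[128];

        if (argc < 2) {
            fprintf(stderr, "usage: %s <module.wasm | module.aot>\n", argv[0]);
            return 1;
        }

        /* Read the module file into memory: Wasm bytecode or a wamrc .aot file. */
        FILE *f = fopen(argv[1], "rb");
        if (!f)
            return 1;
        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        fseek(f, 0, SEEK_SET);
        uint8_t *buf = malloc((size_t)size);
        if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size)
            return 1;
        fclose(f);

        /* Initialize the runtime, then load and instantiate the module. */
        if (!wasm_runtime_init())
            return 1;

        wasm_module_t module =
            wasm_runtime_load(buf, (uint32_t)size, error_buf, sizeof(error_buf));
        if (!module) {
            fprintf(stderr, "load failed: %s\n", error_buf);
            return 1;
        }

        wasm_module_inst_t inst =
            wasm_runtime_instantiate(module, 8 * 1024 /* stack */,
                                     64 * 1024 /* heap */,
                                     error_buf, sizeof(error_buf));
        if (!inst) {
            fprintf(stderr, "instantiate failed: %s\n", error_buf);
            return 1;
        }

        /* Run the module's main(); a trap message is retrievable afterwards. */
        if (!wasm_application_execute_main(inst, argc - 1, argv + 1))
            fprintf(stderr, "trap: %s\n", wasm_runtime_get_exception(inst));

        wasm_runtime_deinstantiate(inst);
        wasm_runtime_unload(module);
        wasm_runtime_destroy();
        free(buf);
        return 0;
    }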

Quick Start & Requirements

  • Install/Run: Build from source (CMake-based).
  • Prerequisites: C compiler (GCC, Clang), CMake, Python 3.x. Specific platform builds may have additional requirements (e.g., Zephyr SDK for Zephyr).
  • Resources: Minimal for interpreter mode; AOT compilation (wamrc) and LLVM JIT builds require LLVM and substantially more build time and disk space.
  • Docs: Guide, Build WAMR, Embed WAMR (see the native-symbol sketch after this list).
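
The Embed WAMR guide also describes how a host exposes native functions to a module's imports. Below is a hedged sketch of that registration step using wasm_runtime_register_natives from wasm_export.h; the "env" module name, the host_log function, and the "(i)i" signature string are illustrative, not part of WAMR itself.

    #include <stdio.h>
    #include "wasm_export.h"

    /* Host functions receive the execution environment as an implicit first argument. */
    static int32_t host_log(wasm_exec_env_t exec_env, int32_t value)
    {
        (void)exec_env; /* unused in this sketch */
        printf("value from wasm: %d\n", value);
        return value;
    }

    static NativeSymbol native_symbols[] = {
        /* "(i)i": one i32 parameter, one i32 result (WAMR signature-string syntax). */
        { "host_log", host_log, "(i)i", NULL },
    };

    int main(void)
    {
        if (!wasm_runtime_init())
            return 1;

        /* Modules can now import ("env", "host_log"); register natives before
           loading and instantiating the module. */
        if (!wasm_runtime_register_natives("env", native_symbols,
                                           sizeof(native_symbols)
                                               / sizeof(native_symbols[0])))
            return 1;

        /* ... load, instantiate, and run the module as in the earlier sketch ... */

        wasm_runtime_destroy();
        return 0;
    }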

Highlighted Details

  • Small runtime binary size (e.g., ~29.4K for AOT runtime on Cortex-M4F).
  • Supports multiple architectures (x86, ARM, AArch64, RISC-V) and platforms (Linux, macOS, Windows, Zephyr, Android, etc.).
  • Features include multi-threading, WASI support, Intel SGX support, source-level debugging, and XIP (execution in place).
  • Post-MVP features like SIMD, Reference Types, and GC are supported.

Maintenance & Community

The project is part of the Bytecode Alliance. Notable contributors include members from Sony, Amazon, Intel, Xiaomi, and Ant Group. Community interaction is available via chat.

Licensing & Compatibility

Licensed under Apache 2.0 with LLVM exception, permitting commercial use and integration into closed-source products.

Limitations & Caveats

The WAMR-IDE is noted as experimental. While the feature set is extensive, advanced capabilities such as GC and exception handling are listed under "post-MVP features," implying they may be less mature or still under active development than the core MVP functionality.

Health Check

  • Last commit: 1 day ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 63
  • Issues (30d): 34
Star History
176 stars in the last 90 days

Explore Similar Projects

Starred by Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla and OpenAI; author of CS 231n), Nat Friedman (former CEO of GitHub), and 32 more.

llama.cpp by ggml-org

  • Top 0.4% on sourcepulse
  • 84k stars
  • C/C++ library for local LLM inference
  • Created 2 years ago; updated 18 hours ago