High-performance deep learning inference and graph execution within Redis
Redis-inference-optimization is a Redis module for serving deep learning and machine learning models directly within Redis, maximizing computation throughput and reducing latency through data locality. It simplifies the deployment and serving of ML graphs by leveraging Redis's production-proven infrastructure. The project was formerly known as RedisAI and was renamed in January 2025. It targets developers and organizations that want to integrate ML inference into data-intensive applications. Note, however, that the project is no longer actively maintained or supported.
How It Works
This module acts as a "workhorse" for model serving, with out-of-the-box support for popular ML frameworks. It executes deep learning/machine learning models and manages their data according to the principle of data locality: tensors are stored and processed where they already reside, inside Redis, which minimizes data-transfer overhead. The module simplifies the deployment and serving of ML graphs by integrating directly into Redis's production-proven infrastructure.
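As a sketch of that flow, the session below stores a tensor, loads a model, and runs inference without the data ever leaving the server. The AI.* commands follow the RedisAI command reference; the key names, tensor shapes, and model file path are illustrative, and a running Redis instance with the module loaded is assumed.

```shell
# Store a 1x2 float input tensor directly in Redis
redis-cli AI.TENSORSET in FLOAT 1 2 VALUES 2.0 3.0

# Load a serialized TensorFlow graph as a model
# (model.pb, and the node names a/c, are placeholders)
redis-cli -x AI.MODELSTORE mymodel TF CPU INPUTS 1 a OUTPUTS 1 c BLOB < model.pb

# Execute the model where the data lives
redis-cli AI.MODELEXECUTE mymodel INPUTS 1 in OUTPUTS 1 out

# Read the output tensor back
redis-cli AI.TENSORGET out VALUES
```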
Quick Start & Requirements
The quickest way to try Redis-inference-optimization is via Docker.

CPU:

```shell
docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
```

GPU:

```shell
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic
```
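To confirm the module is available after starting a container, you can list the loaded Redis modules. This is a minimal sketch; the container name is illustrative, and it assumes a local Docker daemon.

```shell
docker run -d --name redisai -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
# The loaded module should appear in the MODULE LIST output
docker exec redisai redis-cli MODULE LIST
```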
Building from source requires cloning the repository with its submodules (git clone --recursive), fetching the backend dependencies (bash get_deps.sh for CPU, or bash get_deps.sh gpu for GPU), and then compiling the module (make -C opt clean ALL=1 followed by make -C opt, or make -C opt GPU=1 for a GPU build).
Highlighted Details
Maintenance & Community
Redis-inference-optimization is no longer actively maintained or supported. The project acknowledges the community's past interest and support. Further information on Redis's current AI offerings can be found on the Redis website.
Licensing & Compatibility
The project is licensed under the choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1). Neither is an OSI-approved open-source license, and both place restrictions on offering the software as a commercial service, so review the terms before production or commercial use.
Limitations & Caveats
The primary limitation is that Redis-inference-optimization is no longer actively maintained or supported, indicating a lack of ongoing development, bug fixes, or security updates. Users should be aware of potential compatibility issues with newer versions of Redis or ML frameworks, and the absence of community support for new deployments.