Caution
RedisAI is no longer actively maintained or supported.
We are grateful to the RedisAI community for their interest and support.
RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. It is designed to be a "workhorse" for model serving, providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.
To read RedisAI docs, visit redisai.io. To see RedisAI in action, visit the demos page.
RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
The following sections describe how to get started with RedisAI.
The quickest way to try RedisAI is by launching its official Docker container images.
```sh
docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
```
For GPU support you will need a machine with an Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit, and Docker 19.03+ installed. For detailed information, check out the nvidia-docker documentation.
```sh
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic
```
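Once a container is up, a quick way to verify that the module was loaded is to list the server's modules (the port below matches the docker run commands above):

```sh
# List the modules loaded by the Redis server inside the container;
# the output should include an entry named "ai".
redis-cli -p 6379 MODULE LIST
```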
You can compile and build the module from its source code. The Developer page has more information about the design and implementation of the RedisAI module and how to contribute.
- Packages: git, python3, make, wget, g++/clang, and unzip
- CMake 3.0 or higher
- CUDA 11.3 and cuDNN 8.1 or higher, if GPU support is required
- Redis v6.0.0 or greater
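As a quick sanity check before building, you can confirm that the main prerequisites are available; a minimal sketch (exact package names and versions will vary by distribution):

```sh
# Print the versions of the core build prerequisites.
git --version
python3 --version
cmake --version
make --version
redis-server --version
```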
You can obtain the module's source code by cloning the project's repository using git like so:
```sh
git clone --recursive https://github.com/RedisAI/RedisAI
```
Switch to the project's directory with:
```sh
cd RedisAI
```
Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:
```sh
bash get_deps.sh
```
Alternatively, you can run the following to fetch the backends with GPU support.
```sh
bash get_deps.sh gpu
```
Once the dependencies have been built, you can build the RedisAI module with:
```sh
make -C opt clean ALL=1
make -C opt
```
Alternatively, run the following to build RedisAI with GPU support:
```sh
make -C opt clean ALL=1
make -C opt GPU=1
```
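Assuming the default build layout (the same one used in the loading example below), the compiled module ends up under install-cpu, or install-gpu for the GPU build:

```sh
# Confirm that the shared library was produced
# (use install-gpu/redisai.so for the GPU build).
ls install-cpu/redisai.so
```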
RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and the supported backends. This is extremely important, since the serialization mechanism of one version might not match that of another. To make sure your model will work with a given RedisAI version, check the backend documentation for incompatible features between the version of your backend and the version RedisAI is built with.
| RedisAI | PyTorch | TensorFlow | TFLite | ONNXRuntime |
|---------|---------|------------|--------|-------------|
| 1.0.3   | 1.5.0   | 1.15.0     | 2.0.0  | 1.2.0       |
| 1.2.7   | 1.11.0  | 2.8.0      | 2.0.0  | 1.11.1      |
| master  | 1.11.0  | 2.8.0      | 2.0.0  | 1.11.1      |
Note: Keras and TensorFlow 2.x are supported through graph freezing. See this script for an example of exporting a frozen graph from Keras and TensorFlow 2.x.
To load the module upon starting the Redis server, simply use the --loadmodule command line switch, the loadmodule configuration directive, or the Redis MODULE LOAD command with the path to the module's library.
For example, to load the module from the project's path with a server command line switch use the following:
```sh
redis-server --loadmodule ./install-cpu/redisai.so
```
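The other two loading methods mentioned above look like this (the paths are placeholders for wherever redisai.so was built or installed):

```sh
# In redis.conf:
#   loadmodule /path/to/RedisAI/install-cpu/redisai.so

# Or load it into an already running server:
redis-cli MODULE LOAD /path/to/RedisAI/install-cpu/redisai.so
```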
Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module are described here.
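For example, a minimal round trip with the tensor commands (the key name and values are arbitrary):

```sh
# Store a 2x2 FLOAT tensor under the key "mytensor" ...
redis-cli AI.TENSORSET mytensor FLOAT 2 2 VALUES 1 2 3 4
# ... and read its values back.
redis-cli AI.TENSORGET mytensor VALUES
```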
Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones:
| Project | Language | License | Author | URL |
|---------|----------|---------|--------|-----|
| JRedisAI | Java | BSD-3 | RedisLabs | Github |
| redisai-py | Python | BSD-3 | RedisLabs | Github |
| redisai-go | Go | BSD-3 | RedisLabs | Github |
| redisai-js | Typescript/Javascript | BSD-3 | RedisLabs | Github |
| redis-modules-sdk | TypeScript | BSD-3-Clause | Dani Tseitlin | Github |
| redis-modules-java | Java | Apache-2.0 | dengliming | Github |
| smartredis | C++ | BSD-2-Clause | Cray Labs | Github |
| smartredis | C | BSD-2-Clause | Cray Labs | Github |
| smartredis | Fortran | BSD-2-Clause | Cray Labs | Github |
| smartredis | Python | BSD-2-Clause | Cray Labs | Github |
The full documentation for RedisAI's API can be found at the Commands page.
Read the docs at redisai.io.
If you have questions, want to provide feedback, or would like to report an issue or contribute some code, here's where we're listening to you:
RedisAI is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).