
Commit 8fc77bc

Merge pull request #806 from RedisAI/backends_update
changed PT and TF versions
2 parents: 7aca1eb + b8f34ec

38 files changed: +198 −2393 lines

.circleci/config.yml

Lines changed: 2 additions & 2 deletions
```diff
@@ -359,7 +359,7 @@ jobs:
       enabled: true
       docker_layer_caching: true
       resource_class: gpu.nvidia.small
-      image: ubuntu-1604-cuda-11.1:202012-01
+      image: ubuntu-2004-cuda-11.2:202103-01

     steps:
       - build-and-test-gpu-steps
@@ -371,7 +371,7 @@ jobs:
       enabled: true
       docker_layer_caching: true
       resource_class: gpu.nvidia.small
-      image: ubuntu-1604-cuda-11.1:202012-01
+      image: ubuntu-2004-cuda-11.2:202103-01

     steps:
       - only_run_if_forked_pull_request
```
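For context, the fields touched by these hunks belong to a CircleCI GPU machine executor. A sketch of the stanza after this commit might look like the following; the exact key nesting is an assumption, only the field values come from the diff:

```yaml
# Sketch of the GPU executor config after the image bump.
# Key nesting is assumed; field values are taken from the diff above.
machine:
  enabled: true
  docker_layer_caching: true
  resource_class: gpu.nvidia.small
  image: ubuntu-2004-cuda-11.2:202103-01
```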

Dockerfile.arm

Lines changed: 0 additions & 83 deletions
This file was deleted.

Dockerfile.jetson

Lines changed: 0 additions & 102 deletions
This file was deleted.

docs/developer-backends.md

Lines changed: 8 additions & 11 deletions
````diff
@@ -1,14 +1,17 @@
 # RedisAI Development Backends

-This document describes how a backend for RedisAI can be built, from this repository. It highlights the supported compilation devices on a per-backend basis, and highlights the tools and commands required. Unless indicated otherwise, a backend is compiled in a docker, which is responsible for the configuration and installation of all tools required for a given backend on a per-platform basis.
+This document describes how the ONNXRuntime backend can be built from this repository.
+We build the ONNXRuntime library with the DISABLE_EXTERNAL_INITIALIZERS=ON build flag. As a result, loading ONNX models that use external files to store the initial (usually very large) values of the model's operations is invalid. Hence, initialization values must be part of the serialized model, which is also the standard use case.
+
+It is compiled in a docker, which is responsible for the configuration and installation of all tools required for the build process.

 To follow these instructions, this repository must be cloned with all of its submodules (i.e. *git clone --recursive https://github.com/redisai/redisai*)

-GNU Make is used as a runner for the dockerfile generator. Python is the language used for the generator script, and jinja is the templating library used to create the docker file from a template *dockerfile.tmpl* that can be found in the directory of a given backend listed below.
+GNU Make is used as a runner for the dockerfile generator. Python is the language used for the generator script, and jinja is the templating library used to create the docker file from the template *dockerfile.tmpl*, located in the `/opt/build/onnxruntime` directory.

-## Tools
+### Tools

-Building the backends requires installation of the following tools:
+Building the backend requires installation of the following tools:

 1. gnu make
 1. python (3.0 or higher)
@@ -21,17 +24,11 @@ On ubuntu bionic these can be installed by running the following steps, to insta
 sudo apt install python3 python3-dev make docker
 python3 -m venv /path/to/venv
 source /path/to/venv/bin/activate
-pip install jinja
+pip install jinja2
 ```

 -------

-## Backends
-
-### onnxruntime
-
-We build Onnxruntime library with DISABLE_EXTERNAL_INITIALIZERS=ON build flag. This means that loading ONNX models that use external files to store the initial (usually very large) values of the model's operations, is invalid. That is, initializers values must be part of the serialized model, which is also the standard use case.
-
 **Compilation target devices:**

 1. x86\_64 bit linux systems
````
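The generator flow described above (a Python script rendering *dockerfile.tmpl* into a concrete dockerfile) can be sketched roughly as follows. This is an illustrative sketch only: the real generator uses jinja, while this example uses the stdlib `string.Template` to stay dependency-free, and the template text and variable names are hypothetical, not the contents of the actual *dockerfile.tmpl*.

```python
# Illustrative sketch of a template-driven dockerfile generator.
# The real RedisAI generator uses jinja; string.Template is used here only
# to keep the example self-contained. Template text and variable names
# are hypothetical.
from string import Template

DOCKERFILE_TMPL = Template(
    "FROM ${base_image}\n"
    "RUN apt-get update && apt-get install -y ${build_deps}\n"
    "RUN make -C opt build_deps\n"
)

def render_dockerfile(base_image: str, build_deps: str) -> str:
    """Substitute platform-specific values into the dockerfile template."""
    return DOCKERFILE_TMPL.substitute(base_image=base_image, build_deps=build_deps)

if __name__ == "__main__":
    print(render_dockerfile("ubuntu:18.04", "build-essential git"))
```

A jinja version would behave the same way, with `{{ base_image }}`-style placeholders and `Template(...).render(...)` instead of `substitute`.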

docs/developer.md

Lines changed: 9 additions & 2 deletions
````diff
@@ -170,9 +170,16 @@ sudo ./opt/system-setup.py

 **Building**

-To compile RedisAI, run *make -C opt all*, from the root of the repository.
+Before compiling the RedisAI module, make sure that you have the backends' shared libraries in the `deps` directory. These can be easily obtained by running:
+```./get_deps```
+or
+```./get_deps gpu```
+for GPU support.
+Except for the ONNXRuntime backend, which is stored in an S3 bucket, the backends' binary files are downloaded directly from their official websites.
+Further details about building the ONNXRuntime backend for RedisAI are described in [this document](developer-backends.md).
+
+To compile RedisAI, run `make -C opt` or `make -C opt GPU=1` from the root of the repository. This will create the module's shared library `redisai.so` in the `install-cpu`/`install-gpu` directory respectively.

-Build the backends is described in [this document](developer-backends.md).

 ### Testing
````
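The CPU/GPU split in the build instructions above (`./get_deps` vs `./get_deps gpu`, `make -C opt` vs `make -C opt GPU=1`, installing into `install-cpu` or `install-gpu`) can be sketched as a small wrapper. This is a hypothetical helper, not a script shipped in the repository; it only prints the commands it would run:

```shell
#!/bin/sh
# Hypothetical wrapper mirroring the build flow described above.
# It prints the commands rather than executing them.
GPU="${GPU:-0}"
if [ "$GPU" = "1" ]; then
    DEPS_ARG="gpu"
    MAKE_ARGS="GPU=1"
    INSTALL_DIR="install-gpu"
else
    DEPS_ARG=""
    MAKE_ARGS=""
    INSTALL_DIR="install-cpu"
fi
echo "./get_deps $DEPS_ARG"
echo "make -C opt $MAKE_ARGS"
echo "module will be at: $INSTALL_DIR/redisai.so"
```

Running it with `GPU=1` in the environment selects the GPU variant of each step; by default it prints the CPU flow.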
