[docs] Remove DLR from documents #468

Merged 1 commit on Feb 9, 2023
1 change: 0 additions & 1 deletion README.md
@@ -17,7 +17,6 @@ You can install extra extensions to enable the following models:

- PaddlePaddle model
- TFLite model
- Neo DLR (TVM) model
- XGBoost model
- LightGBM model
- Sentencepiece model
1 change: 0 additions & 1 deletion benchmark/README.md
@@ -191,7 +191,6 @@ By default, the above script will use MXNet as the default Engine, but you can a
-e OnnxRuntime # pytorch
-e TFLite # TFLite
-e TensorRT # TensorRT
-e DLR # Neo DLR
-e XGBoost # XGBoost
-e LightGBM # LightGBM
-e Python # Python script
1 change: 0 additions & 1 deletion benchmark/build.gradle
@@ -15,7 +15,6 @@ dependencies {
runtimeOnly "ai.djl.mxnet:mxnet-model-zoo"
runtimeOnly "ai.djl.paddlepaddle:paddlepaddle-model-zoo"
runtimeOnly "ai.djl.tflite:tflite-engine"
runtimeOnly "ai.djl.dlr:dlr-engine"
runtimeOnly "ai.djl.ml.xgboost:xgboost"
runtimeOnly project(":engines:python")
runtimeOnly "ai.djl.tensorrt:tensorrt"
1 change: 0 additions & 1 deletion benchmark/snapcraft/snapcraft.yaml
@@ -16,7 +16,6 @@ description: |
- ONNXRuntime
- TensorRT
- TensorFlow Lite
- Neo DLR
- XGBoost
- Python

3 changes: 0 additions & 3 deletions benchmark/src/main/java/ai/djl/benchmark/Benchmark.java
@@ -124,9 +124,6 @@ private static void configEngines(boolean multithreading) {
if (System.getProperty("ai.djl.tflite.disable_alternative") == null) {
System.setProperty("ai.djl.tflite.disable_alternative", "true");
}
if (System.getProperty("ai.djl.dlr.disable_alternative") == null) {
System.setProperty("ai.djl.dlr.disable_alternative", "true");
}
if (System.getProperty("ai.djl.paddlepaddle.disable_alternative") == null) {
System.setProperty("ai.djl.paddlepaddle.disable_alternative", "true");
}
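For reference, the guard being trimmed in configEngines sets an engine's disable_alternative system property only when the user has not already supplied one. A minimal standalone sketch of that pattern, using the property keys visible in the hunk above (the class and helper names are hypothetical, not part of DJL):

```java
public final class EngineFlags {

    private EngineFlags() {}

    /** Sets the property to "true" only if the user has not already set it. */
    static void disableAlternativeIfUnset(String key) {
        if (System.getProperty(key) == null) {
            System.setProperty(key, "true");
        }
    }

    public static void main(String[] args) {
        // Keys shown in the hunk above; the ai.djl.dlr key is the one this PR retires.
        disableAlternativeIfUnset("ai.djl.tflite.disable_alternative");
        disableAlternativeIfUnset("ai.djl.paddlepaddle.disable_alternative");
        System.out.println(System.getProperty("ai.djl.tflite.disable_alternative")); // prints: true
    }
}
```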
@@ -55,7 +55,6 @@
<el-option label="PaddlePaddle" value="PaddlePaddle"></el-option>
<el-option label="TFLite" value="TFLite"></el-option>
<el-option label="XGBoost" value="XGBoost"></el-option>
<el-option label="DLR" value="DLR"></el-option>
</el-select>
</el-form-item>

Expand Down
14 changes: 3 additions & 11 deletions serving/docs/configurations.md
@@ -63,18 +63,11 @@ DJLServing build on top of Deep Java Library (DJL). Here is a list of settings f
| PADDLE_LIBRARY_PATH | env var/system prop | User provided custom PaddlePaddle native library |
| ai.djl.paddlepaddle.disable_alternative | system prop | Disable alternative engine |

### Neo DLR (TVM)

| Key | Type | Description |
|--------------------------------|-------------|-----------------------------------------|
| DLR_LIBRARY_PATH | env var | User provided custom DLR native library |
| ai.djl.dlr.disable_alternative | system prop | Disable alternative engine |

### Huggingface tokenizers

| Key | Type | Description |
|------------------|---------|-----------------------------------------|
| TOKENIZERS_CACHE | env var | User provided custom DLR native library |
| Key | Type | Description |
|------------------|---------|-----------------------------------------------------------|
| TOKENIZERS_CACHE | env var | User provided custom Huggingface tokenizer native library |

### Python

@@ -193,7 +186,6 @@ The follow table show some engine specific environment variables that is overrid
| TF_NUM_INTEROP_THREADS | TensorFlow | default 1, OMP_NUM_THREADS will override this value |
| TF_NUM_INTRAOP_THREADS | TensorFlow | default 1 |
| TF_CPP_MIN_LOG_LEVEL | TensorFlow | default 1 |
| TVM_NUM_THREADS | DLR/TVM | default 1, OMP_NUM_THREADS will override this value |
| MXNET_ENGINE_TYPE | MXNet | this value must be `NaiveEngine` |

## Appendix
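As context for the configuration tables above: keys typed "env var/system prop" can be supplied either way. A minimal sketch of that dual lookup, assuming the JVM system property takes precedence over the environment variable (the precedence is an assumption for illustration, not taken from DJL Serving's code):

```java
import java.util.Optional;

public final class ConfigLookup {

    private ConfigLookup() {}

    /** Reads a key as a system property first, then as an environment variable. */
    static Optional<String> read(String key) {
        String value = System.getProperty(key);
        if (value == null) {
            value = System.getenv(key);
        }
        return Optional.ofNullable(value);
    }

    public static void main(String[] args) {
        // PADDLE_LIBRARY_PATH is listed above as "env var/system prop".
        System.out.println(read("PADDLE_LIBRARY_PATH").orElse("<not configured>"));
    }
}
```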
3 changes: 0 additions & 3 deletions serving/src/main/java/ai/djl/serving/ModelServer.java
@@ -582,9 +582,6 @@ private String inferEngine(Path modelDir, String modelName) {
return "PaddlePaddle";
} else if (Files.isRegularFile(modelDir.resolve(modelName + ".json"))) {
return "XGBoost";
} else if (Files.isRegularFile(modelDir.resolve(modelName + ".dylib"))
|| Files.isRegularFile(modelDir.resolve(modelName + ".so"))) {
return "DLR";
}
logger.warn("Failed to detect engine of the model: {}", modelDir);
return null;
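The branch deleted here was one of inferEngine's file-extension heuristics. A minimal sketch of that style of detection with the DLR (.so/.dylib) branch gone, reproducing only the XGBoost check visible in the hunk; the real ModelServer.inferEngine performs more checks than shown:

```java
import java.nio.file.Files;
import java.nio.file.Path;

public final class EngineSniffer {

    private EngineSniffer() {}

    /** Guesses the engine from files in the model directory, in the spirit of inferEngine above. */
    static String inferEngine(Path modelDir, String modelName) {
        if (Files.isRegularFile(modelDir.resolve(modelName + ".json"))) {
            return "XGBoost"; // an XGBoost model is stored as <modelName>.json
        }
        // Previously a <modelName>.so or <modelName>.dylib file mapped to "DLR"; that branch is removed.
        return null; // the caller logs a warning when no engine can be detected
    }

    public static void main(String[] args) {
        System.out.println(inferEngine(Path.of("build/models/test_model"), "test_model")); // prints: null
    }
}
```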
@@ -114,9 +114,6 @@ public void installEngine(String engineName) throws IOException {
case "XGBoost":
installDependency("ai.djl.ml.xgboost:xgboost:" + djlVersion);
break;
case "DLR":
installDependency("ai.djl.dlr:dlr-engine:" + djlVersion);
break;
default:
break;
}
1 change: 0 additions & 1 deletion serving/src/main/puml/architecture.puml
@@ -50,7 +50,6 @@ package "DJL Serving - single process" {
MXNet
OnnxRuntime
TFLite
DLR
XGBoost
]
}
5 changes: 0 additions & 5 deletions serving/src/test/java/ai/djl/serving/ModelServerTest.java
@@ -188,11 +188,6 @@ public void testModelStore()

String expected = modelDir.toUri().toURL().toString();

Path dlr = modelDir.resolve("test_model.so");
Files.createFile(dlr);
url = server.mapModelUrl(modelDir);
assertEquals(url, "test_model::DLR:*=" + expected);

Path xgb = modelDir.resolve("test_model.json");
Files.createFile(xgb);
url = server.mapModelUrl(modelDir);