
Commit 0890e94

letonghan and yinghu5 authored
Refine CodeTrans README (#1960)
Signed-off-by: letonghan <letong.han@intel.com>
Co-authored-by: Ying Hu <ying.hu@intel.com>
1 parent 581e954 commit 0890e94


5 files changed: +43 −9 lines changed


CodeTrans/README.md

Lines changed: 8 additions & 9 deletions
@@ -22,12 +22,11 @@ This Code Translation use case demonstrates Text Generation Inference across mul
 
 The table below lists currently available deployment options. They outline in detail the implementation of this example on selected hardware.
 
-| Category               | Deployment Option    | Description                                                        |
-| ---------------------- | -------------------- | ------------------------------------------------------------------ |
-| On-premise Deployments | Docker compose       | [CodeTrans deployment on Xeon](./docker_compose/intel/cpu/xeon)    |
-|                        |                      | [CodeTrans deployment on Gaudi](./docker_compose/intel/hpu/gaudi)  |
-|                        |                      | [CodeTrans deployment on AMD ROCm](./docker_compose/amd/gpu/rocm)  |
-|                        | Kubernetes           | [Helm Charts](./kubernetes/helm)                                   |
-|                        |                      | [GMC](./kubernetes/gmc)                                            |
-|                        | Azure                | Work-in-progress                                                   |
-|                        | Intel Tiber AI Cloud | Work-in-progress                                                   |
+| Category               | Deployment Option    | Description                                                                  |
+| ---------------------- | -------------------- | ---------------------------------------------------------------------------- |
+| On-premise Deployments | Docker compose       | [CodeTrans deployment on Xeon](./docker_compose/intel/cpu/xeon/README.md)    |
+|                        |                      | [CodeTrans deployment on Gaudi](./docker_compose/intel/hpu/gaudi/README.md)  |
+|                        |                      | [CodeTrans deployment on AMD ROCm](./docker_compose/amd/gpu/rocm/README.md)  |
+|                        | Kubernetes           | [Helm Charts](./kubernetes/helm/README.md)                                   |
+|                        | Azure                | Work-in-progress                                                             |
+|                        | Intel Tiber AI Cloud | Work-in-progress                                                             |
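
To make the Docker Compose row concrete, here is a minimal sketch of launching the Xeon deployment linked in the table. The directory path comes from the table; the `host_ip` variable and `compose.yaml` file are referenced elsewhere in this change, while the `HUGGINGFACEHUB_API_TOKEN` variable name is an assumption following common OPEA convention, so verify the exact setup against the linked per-platform README.

```bash
# Minimal sketch: bring up CodeTrans on Xeon with Docker Compose.
# Paths follow the table above; environment variable names are assumptions,
# so check them against the linked README before use.
export host_ip=$(hostname -I | awk '{print $1}')   # host IP the services bind to
export HUGGINGFACEHUB_API_TOKEN="your_hf_token"    # needed for gated Hugging Face models
cd CodeTrans/docker_compose/intel/cpu/xeon
docker compose up -d    # start the CodeTrans microservices in the background
docker compose ps       # confirm the containers are running
```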

CodeTrans/README_miscellaneous.md

Lines changed: 35 additions & 0 deletions
@@ -44,3 +44,38 @@ Some HuggingFace resources, such as some models, are only accessible if the deve
 
 2. (Docker only) If all microservices work well, check port ${host_ip}:7777; the port may already be allocated by another user, in which case you can modify the port mapping in `compose.yaml`.
 3. (Docker only) If you get errors like "The container name is in use", change the container name in `compose.yaml`.
+
+## Monitoring OPEA Services with Prometheus and Grafana Dashboard
+
+OPEA microservice deployments can easily be monitored through Grafana dashboards using data collected via Prometheus. Follow the [README](https://github.com/opea-project/GenAIEval/blob/main/evals/benchmark/grafana/README.md) to set up Prometheus and Grafana servers and import dashboards to monitor the OPEA services.
+
+![example dashboards](./assets/img/example_dashboards.png)
+![tgi dashboard](./assets/img/tgi_dashboard.png)
+
+## Tracing with OpenTelemetry and Jaeger
+
+> NOTE: This feature is disabled by default. Please use the compose.telemetry.yaml file to enable this feature.
+
+OPEA microservices and [TGI](https://huggingface.co/docs/text-generation-inference/en/index)/[TEI](https://huggingface.co/docs/text-embeddings-inference/en/index) serving can easily be traced through [Jaeger](https://www.jaegertracing.io/) dashboards in conjunction with the [OpenTelemetry](https://opentelemetry.io/) Tracing feature. Follow the [README](https://github.com/opea-project/GenAIComps/tree/main/comps/cores/telemetry#tracing) to trace additional functions if needed.
+
+Tracing data is exported to http://{EXTERNAL_IP}:4318/v1/traces via Jaeger. Users can get the external IP via the command below.
+
+```bash
+ip route get 8.8.8.8 | grep -oP 'src \K[^ ]+'
+```
+
+Access the Jaeger dashboard UI at http://{EXTERNAL_IP}:16686.
+
+For TGI serving on Gaudi, users can see different services such as opea, TEI and TGI.
+![Screenshot from 2024-12-27 11-58-18](https://github.com/user-attachments/assets/6126fa70-e830-4780-bd3f-83cb6eff064e)
+
+Here is a screenshot of one trace of a TGI serving request.
+![Screenshot from 2024-12-27 11-26-25](https://github.com/user-attachments/assets/3a7c51c6-f422-41eb-8e82-c3df52cd48b8)
+
+There are also OPEA-related traces. Users can understand the time breakdown of each service request by looking into each opea:schedule operation.
+![image](https://github.com/user-attachments/assets/6137068b-b374-4ff8-b345-993343c0c25f)
+
+There may be asynchronous functions such as `llm/MicroService_asyn_generate`; users need to check the trace of such a function in another operation like opea:llm_generate_stream.
+![image](https://github.com/user-attachments/assets/a973d283-198f-4ce2-a7eb-58515b77503e)
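
For the port conflict mentioned in troubleshooting item 2 of this diff, a small sketch of how one might check whether port 7777 is already taken, assuming a Linux host with `ss` available (the port number and `compose.yaml` come from the note above; everything else is illustrative).

```bash
# Illustrative check: is port 7777 from the troubleshooting note already bound?
# If it is, change the host-side port mapping in compose.yaml and restart the stack.
if ss -ltn | grep -q ':7777 '; then
  echo "Port 7777 is already in use - adjust the port mapping in compose.yaml"
else
  echo "Port 7777 is free"
fi
```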
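For the tracing section added above, a short sketch of enabling telemetry and checking the Jaeger endpoints. The overlay file name and both ports come from the text; the `-f` overlay invocation is an assumption about how compose.telemetry.yaml is combined with the base compose file, so confirm it against the deployment README.

```bash
# Sketch: enable the (default-off) telemetry overlay and verify the Jaeger endpoints.
# The -f file combination is an assumption; the file name comes from the NOTE above.
docker compose -f compose.yaml -f compose.telemetry.yaml up -d

# Resolve the host's external IP (same command as in the section above).
EXTERNAL_IP=$(ip route get 8.8.8.8 | grep -oP 'src \K[^ ]+')
echo "OTLP trace endpoint: http://${EXTERNAL_IP}:4318/v1/traces"
echo "Jaeger UI:           http://${EXTERNAL_IP}:16686"

# Quick reachability check for the Jaeger UI.
curl -s -o /dev/null -w "Jaeger UI HTTP status: %{http_code}\n" "http://${EXTERNAL_IP}:16686"
```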
3 image files changed (binary): -29.8 KB, 100 KB, 414 KB
