
add workflow test for microservice #97

Merged
merged 29 commits into main from suyue/ci_comps
May 29, 2024
Commits
89fcaa4
add workflow test for microservice
chensuyue May 27, 2024
fd4787c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 27, 2024
9292749
for test only
chensuyue May 27, 2024
c70a030
fix typo
chensuyue May 27, 2024
9c2c0fa
Merge branch 'suyue/ci_comps' of https://github.com/opea-project/GenA…
chensuyue May 27, 2024
4af67f7
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 27, 2024
d397417
trigger test
chensuyue May 27, 2024
dee0395
Merge branch 'suyue/ci_comps' of https://github.com/opea-project/GenA…
chensuyue May 27, 2024
15d8153
bug fix
chensuyue May 28, 2024
8b7efdf
add sleep time
chensuyue May 28, 2024
87d801c
fix port
chensuyue May 28, 2024
d0b70d4
Merge remote-tracking branch 'origin/main' into suyue/ci_comps
Spycsh May 29, 2024
7f62840
fix test issues
Spycsh May 29, 2024
2dfb63b
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 29, 2024
61551e6
Merge branch 'main' into suyue/ci_comps
letonghan May 29, 2024
b2a0a84
test_retrievers.sh
Spycsh May 29, 2024
e6a2099
Merge branch 'suyue/ci_comps' of https://github.com/opea-project/GenA…
Spycsh May 29, 2024
3dd1bc8
revert docker rm
Spycsh May 29, 2024
8a69325
update readmes
letonghan May 29, 2024
b29ca36
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 29, 2024
3f26940
update test workflow
chensuyue May 29, 2024
9433b13
for test
chensuyue May 29, 2024
c331ca7
bug fix
chensuyue May 29, 2024
f6abd92
for test
chensuyue May 29, 2024
223eec8
for test
chensuyue May 29, 2024
fea4db1
fix embedding issue and refactor readme
Spycsh May 29, 2024
341bbf8
Merge branch 'suyue/ci_comps' of https://github.com/opea-project/GenA…
Spycsh May 29, 2024
da7453e
remove test code
chensuyue May 29, 2024
0387349
Merge branch 'suyue/ci_comps' of https://github.com/opea-project/GenA…
chensuyue May 29, 2024
50 changes: 0 additions & 50 deletions .github/workflows/embeddings-comp-test.yml

This file was deleted.

50 changes: 0 additions & 50 deletions .github/workflows/llms-comp-test.yml

This file was deleted.

87 changes: 87 additions & 0 deletions .github/workflows/microservice-test.yml
@@ -0,0 +1,87 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

name: MicroService-test

on:
  pull_request:
    branches: [main]
    types: [opened, reopened, ready_for_review, synchronize] # added `ready_for_review` since draft is skipped
    paths:
      - comps/**
      - "!**.md"
      - "!**.txt"
      - .github/workflows/microservice-test.yml

# If there is a new commit, the previous jobs will be canceled
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  job1:
    name: Get-test-matrix
    runs-on: ubuntu-latest
    outputs:
      run_matrix: ${{ steps.get-test-matrix.outputs.run_matrix }}
    steps:
      - name: Checkout Repo
        uses: actions/checkout@v4
        with:
          ref: "refs/pull/${{ github.event.number }}/merge"
          fetch-depth: 0
      - name: Get test matrix
        id: get-test-matrix
        run: |
          set -xe
          changed_files=$(git diff --name-only ${{ github.event.pull_request.base.sha }} ${{ github.event.pull_request.head.sha }} \
            | grep 'comps/' | grep -vE '\.md$|\.txt$')
          services=$(printf '%s\n' "${changed_files[@]}" | grep '/' | cut -d'/' -f2 | sort -u)
          run_matrix="{\"include\":["
          for service in ${services}; do
            hardware="gaudi" # default hardware, set based on the changed files
            run_matrix="${run_matrix}{\"service\":\"${service}\",\"hardware\":\"${hardware}\"},"
          done
          run_matrix=$run_matrix"]}"
          echo "run_matrix=${run_matrix}" >> $GITHUB_OUTPUT

  Microservice-test:
    needs: job1
    strategy:
      matrix: ${{ fromJSON(needs.job1.outputs.run_matrix) }}
    runs-on: ${{ matrix.hardware }}
    continue-on-error: true
    steps:
      - name: Clean Up Working Directory
        run: sudo rm -rf ${{github.workspace}}/*

      - name: Checkout Repo
        uses: actions/checkout@v4
        with:
          ref: "refs/pull/${{ github.event.number }}/merge"

      - name: Run microservice test
        env:
          HUGGINGFACEHUB_API_TOKEN: ${{ secrets.HUGGINGFACEHUB_API_TOKEN }}
          service: ${{ matrix.service }}
          hardware: ${{ matrix.hardware }}
        run: |
          cd tests
          if [ -f test_${service}.sh ]; then timeout 10m bash test_${service}.sh; else echo "Test script not found, skip test!"; fi

      - name: Clean up container
        env:
          service: ${{ matrix.service }}
          hardware: ${{ matrix.hardware }}
        if: cancelled() || failure()
        run: |
          cid=$(docker ps -aq --filter "name=test-comps-*")
          if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
          echo y | docker system prune

      - name: Publish pipeline artifact
        if: ${{ !cancelled() }}
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.service }}-${{ matrix.hardware }}
          path: ${{ github.workspace }}/tests/*.log
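The `Get-test-matrix` job above turns the list of changed files into a JSON strategy matrix. The same string-building logic can be sketched as a standalone bash function (`build_run_matrix` is a hypothetical helper, reading file paths from stdin instead of `git diff`; the trailing comma the workflow leaves before `]` is stripped here for strict-JSON safety):

```shell
#!/usr/bin/env bash
# Sketch of job1's run_matrix construction: read changed file paths on
# stdin, keep files under comps/, drop docs, and emit a JSON matrix.
build_run_matrix() {
  local services service hardware="gaudi" # default hardware, as in the workflow
  # keep only comps/ files, exclude markdown/text, take the component directory name
  services=$(grep '^comps/' | grep -vE '\.md$|\.txt$' | cut -d'/' -f2 | sort -u)
  local run_matrix='{"include":['
  for service in ${services}; do
    run_matrix="${run_matrix}{\"service\":\"${service}\",\"hardware\":\"${hardware}\"},"
  done
  echo "${run_matrix%,}]}" # drop the trailing comma before closing the array
}
```

For example, `printf 'comps/asr/asr.py\n' | build_run_matrix` prints `{"include":[{"service":"asr","hardware":"gaudi"}]}`, which is the shape `fromJSON()` consumes in the `Microservice-test` job.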
22 changes: 12 additions & 10 deletions comps/asr/README.md
@@ -2,43 +2,45 @@

The ASR (Audio-Speech-Recognition) microservice helps users convert speech to text. When building a talking bot with an LLM, users need to convert their audio inputs (what they say, or audio from other sources) to text so that the LLM can tokenize the text and generate an answer. This microservice is built for that conversion stage.

# 🚀Start Microservice with Python
# 🚀1. Start Microservice with Python (Option 1)

To start the ASR microservice with Python, first install the required Python packages.

## Install Requirements
## 1.1 Install Requirements

```bash
pip install -r requirements.txt
```

## Start ASR Service with Python Script
## 1.2 Start ASR Service with Python Script

```bash
python asr.py
```

# 🚀Start Microservice with Docker
# 🚀2. Start Microservice with Docker (Option 2)

Alternatively, you can start the ASR microservice with Docker.

## Build Docker Image
## 2.1 Build Docker Image

```bash
cd ../../
docker build -t opea/gen-ai-comps:asr --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/asr/Dockerfile .
docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/asr/Dockerfile .
```

## Run Docker with CLI
## 2.2 Run Docker with CLI

```bash
docker run -p 9099:9099 --network=host --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy opea/gen-ai-comps:asr
docker run -p 9099:9099 --network=host --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy opea/asr:latest
```

# Test
# 🚀3. Consume ASR Service

You can use the following `curl` command to check whether the service is up. Note that the first request may be slow because the models need to be downloaded.

```bash
curl http://localhost:9099/v1/audio/transcriptions -H "Content-Type: application/json" -d '{"url": "https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample_2.wav"}'
curl http://localhost:9099/v1/audio/transcriptions \
-H "Content-Type: application/json" \
-d '{"url": "https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample_2.wav"}'
```
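Since that first request can stall while models download, a small readiness probe saves debugging time. A minimal sketch, assuming the service listens on port 9099 as above (`wait_for_port` is a hypothetical helper, not part of this repo; it uses bash's `/dev/tcp` pseudo-device):

```shell
#!/usr/bin/env bash
# Poll a TCP port until it accepts connections, or give up after N tries.
wait_for_port() {
  local host=$1 port=$2 retries=${3:-30}
  local i
  for ((i = 1; i <= retries; i++)); do
    # opening /dev/tcp/host/port in a subshell attempts a TCP connection;
    # the fd is closed automatically when the subshell exits
    if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
      echo "ready"
      return 0
    fi
    sleep 1
  done
  echo "timeout"
  return 1
}

# Example (run once the container is up):
# wait_for_port localhost 9099 30 && echo "safe to send the curl request above"
```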
45 changes: 28 additions & 17 deletions comps/dataprep/redis/README.md
@@ -1,18 +1,18 @@
# Dataprep Microservice with Redis

# 🚀Start Microservice with Python
# 🚀1. Start Microservice with Python (Option 1)

## Install Requirements
## 1.1 Install Requirements

```bash
pip install -r requirements.txt
```

## Start Redis Stack Server
## 1.2 Start Redis Stack Server

Please refer to this [readme](../../../vectorstores/langchain/redis/README.md).

## Setup Environment Variables
## 1.3 Setup Environment Variables

```bash
export REDIS_URL="redis://${your_ip}:6379"
@@ -22,46 +22,57 @@
export LANGCHAIN_API_KEY=${your_langchain_api_key}
export LANGCHAIN_PROJECT="opea/gen-ai-comps:dataprep"
```
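A quick sanity check before starting the service catches unset variables early. A minimal sketch (`check_env` is a hypothetical helper, not part of this repo, applied to the variables exported above):

```shell
#!/usr/bin/env bash
# Report any of the named environment variables that are empty or unset.
check_env() {
  local missing=0 var
  for var in "$@"; do
    if [ -z "${!var}" ]; then # indirect expansion: the value of the variable named by $var
      echo "missing: $var"
      missing=1
    fi
  done
  return $missing
}

# Example:
# check_env REDIS_URL INDEX_NAME LANGCHAIN_API_KEY
```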

## Start Document Preparation Microservice for Redis with Python Script
## 1.4 Start Document Preparation Microservice for Redis with Python Script

Start the document preparation microservice for Redis with the command below.

```bash
python prepare_doc_redis.py
```

# 🚀Start Microservice with Docker
# 🚀2. Start Microservice with Docker (Option 2)

## Build Docker Image
## 2.1 Start Redis Stack Server

```bash
cd ../../../../
docker build -t opea/dataprep-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/redis/docker/Dockerfile .
```
Please refer to this [readme](../../../vectorstores/langchain/redis/README.md).

## Run Docker with CLI
## 2.2 Setup Environment Variables

```bash
export REDIS_URL="redis://${your_ip}:6379"
export INDEX_NAME=${your_index_name}
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=${your_langchain_api_key}
export LANGCHAIN_PROJECT="opea/gen-ai-comps:dataprep"
export LANGCHAIN_PROJECT="opea/dataprep"
```

## 2.3 Build Docker Image

docker run -d --name="dataprep-redis-server" -p 6007:6007 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME opea/dataprep-redis:latest
```bash
cd ../../../../
docker build -t opea/dataprep-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/redis/docker/Dockerfile .
```

## 2.4 Run Docker with CLI (Option A)

```bash
docker run -d --name="dataprep-redis-server" -p 6007:6007 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME -e TEI_ENDPOINT=$TEI_ENDPOINT opea/dataprep-redis:latest
```

## Run Docker with Docker Compose
## 2.5 Run with Docker Compose (Option B)

```bash
cd comps/dataprep/redis/docker
docker compose -f docker-compose-dataprep-redis.yaml up -d
```

# Invoke Microservice
# 🚀3. Consume Microservice

Once the document preparation microservice for Redis is started, you can use the command below to invoke it; the microservice converts the document to embeddings and saves them to the database.

```bash
curl -X POST -H "Content-Type: application/json" -d '{"path":"/path/to/document"}' http://localhost:6007/v1/dataprep
curl -X POST \
-H "Content-Type: application/json" \
-d '{"path":"/path/to/document"}' \
http://localhost:6007/v1/dataprep
```
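To ingest several documents, the same request can be wrapped in a loop. A minimal sketch, assuming the endpoint above; `make_payload` and `ingest_dir` are hypothetical helpers, not part of this repo:

```shell
#!/usr/bin/env bash
# Build the {"path": ...} JSON body the dataprep endpoint expects.
# (No JSON escaping: paths containing quotes would need jq instead.)
make_payload() {
  printf '{"path":"%s"}' "$1"
}

# POST every file in a directory to the dataprep service (which must be running).
ingest_dir() {
  local dir=$1 doc
  for doc in "$dir"/*; do
    curl -s -X POST -H "Content-Type: application/json" \
      -d "$(make_payload "$doc")" \
      http://localhost:6007/v1/dataprep
  done
}
```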