Python Model API package: add main documentation #3268

Merged
add readme.md
anzhella-pankratova authored and vladimir-dudnik committed Mar 1, 2022
commit 94d397637fa1ac63f512bcc77b2490cf9cb9b069
145 changes: 145 additions & 0 deletions demos/common/python/openvino/model_zoo/model_api/README.md
# Python* Model API package

The Model API package is a set of wrapper classes for particular tasks and model architectures. It simplifies data preprocessing and postprocessing, as well as routine procedures such as model loading and asynchronous execution.
Model API wrappers hide the task-specific code inside and work as black boxes: the application feeds a model class with input data, and the model returns post-processed output data in a user-friendly format.

## Package structure

The Python* Model API consists of 3 libraries:
* _adapters_ - implement a common interface that allows using Model API wrappers with different executors: OpenVINO, ONNX, etc. See the [Model API Adapters](#model-api-adapters) section
* _models_ - implement wrappers for each architecture. See the [Model API Wrappers](#model-api-wrappers) section
* _pipelines_ - implement pipelines for model inference and manage synchronous/asynchronous execution. See the [Model API Pipelines](#model-api-pipelines) section
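
As a rough sketch of how these parts fit together (the class and function names below are taken from the usage example at the end of this document; other wrappers, adapters and pipelines are combined the same way):

```python
from openvino.model_zoo.model_api.adapters import OpenvinoAdapter, create_core  # adapters
from openvino.model_zoo.model_api.models import SSD                             # models
from openvino.model_zoo.model_api.pipelines import AsyncPipeline                # pipelines

# The adapter binds a model file to an executor, the wrapper adds task-specific
# pre/postprocessing, and the pipeline manages asynchronous execution.
adapter = OpenvinoAdapter(create_core(), "public/mobilenet-ssd/FP32/mobilenet-ssd.xml", device="CPU")
ssd_model = SSD(adapter, preload=True)
pipeline = AsyncPipeline(ssd_model)
```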

## Building Python* Model API package
Python (version 3.6 or higher) is required for installation. The package is installed from source.

Use the following command to install Python Model API from source:
```sh
pip install <omz_dir>/demos/common/python
```

Alternatively, you can build and install the package as a wheel. Follow the steps below:
1. Build the wheel.

```sh
python <omz_dir>/demos/common/python/setup.py bdist_wheel
```
The wheel should appear in the `dist` folder.
Example name: `openmodelzoo_modelapi-0.0.0-py3-none-any.whl`

2. Install the package in a clean environment with the `--force-reinstall` option.
```sh
pip install openmodelzoo_modelapi-0.0.0-py3-none-any.whl --force-reinstall
```

To verify that the package is installed, you can use the following command:
```sh
python -c "from openvino.model_zoo import model_api"
```

> **NOTE**: On Linux and macOS, you may need to type `python3` instead of `python`. You may also need to [install pip](https://pip.pypa.io/en/stable/installation/).
> For example, on Ubuntu execute the following command to get pip installed: `sudo apt install python3-pip`.

## Model API Wrappers

The Python* Model API package provides ready-to-use model wrappers, which implement standardized preprocessing/postprocessing functions per task type and can be reused in applications as "black-box" models.
The simple wrapper interface also allows you to create custom wrappers covering other architectures.

The following tasks can be solved with the provided wrappers:

| Task type | Model API wrappers |
|----------------------------|--------------------|
| Background Matting | `VideoBackgroundMatting`, `ImageMattingWithBackground` |
| Classification | `Classification` |
| Deblurring | `Deblurring` |
| Human Pose Estimation | `HpeAssociativeEmbedding`, `OpenPose` |
| Instance Segmentation | `MaskRCNNModel`, `YolactModel` |
| Monocular Depth Estimation | `MonoDepthModel` |
| Named Entity Recognition | `BertNamedEntityRecognition` |
| Object Detection | `CenterNet`, `DETR`, `CTPN`, `FaceBoxes`, `RetinaFace`, `RetinaFacePyTorch`, `SSD`, `UltraLightweightFaceDetection`, `YOLO`, `YoloV3ONNX`, `YoloV4`, `YOLOF`, `YOLOX` |
| Question Answering | `BertQuestionAnswering` |
| Salient Object Detection | `SalientObjectDetectionModel` |
| Semantic Segmentation | `SegmentationModel` |
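
Every wrapper in the table is used the same way as the `SSD` wrapper in the example at the end of this document. A minimal sketch with a different wrapper (the model path is a placeholder, and it is assumed that the wrapper accepts the same `(adapter, preload=...)` constructor arguments as `SSD`):

```python
import cv2

from openvino.model_zoo.model_api.adapters import OpenvinoAdapter, create_core
from openvino.model_zoo.model_api.models import SegmentationModel

# "segmentation-model.xml" is a placeholder for any semantic segmentation model in IR format.
adapter = OpenvinoAdapter(create_core(), "segmentation-model.xml", device="CPU")
seg_model = SegmentationModel(adapter, preload=True)

# Calling the wrapper runs preprocessing, synchronous inference and postprocessing.
result = seg_model(cv2.imread("sample.png"))
```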

## Model API Adapters

Model API wrappers are executor-agnostic: they do not implement model loading or inference themselves. Instead, a wrapper can be used with any executor that implements the common interface methods in the corresponding adapter class.

Currently, `OpenvinoAdapter` and `OVMSAdapter` are supported.

#### OpenVINO executor

`OpenvinoAdapter` hides the OpenVINO™ toolkit API and allows launching Model API wrappers with models represented in the Intermediate Representation (IR) format.
It accepts a path to either an `xml` model file or an `onnx` model file.
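
For instance (the IR path matches the usage example at the end of this document; the ONNX path is only an illustration):

```python
from openvino.model_zoo.model_api.adapters import OpenvinoAdapter, create_core

# IR model: pass the path to the .xml file (the .bin weights file is found next to it).
adapter = OpenvinoAdapter(create_core(), "public/mobilenet-ssd/FP32/mobilenet-ssd.xml", device="CPU")

# ONNX model: an .onnx file can be passed instead.
# adapter = OpenvinoAdapter(create_core(), "model.onnx", device="CPU")
```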

To use the OpenVINO executor, install the requirements:
```sh
pip install -r <omz_dir>/demos/common/python/requirements_openvino.txt
```

#### OpenVINO Model Server executor

`OVMSAdapter` hides the OpenVINO Model Server Python client API and allows launching Model API wrappers with models served by OVMS.

Refer to __[`OVMSAdapter`](adapters/ovms_adapter.md)__ to learn about running demos with OVMS.
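
As a rough sketch of the idea (the single-argument constructor and the `<address>:<port>/models/<model_name>` target string are assumptions based on the linked `OVMSAdapter` document; check it for the exact format):

```python
from openvino.model_zoo.model_api.adapters import OVMSAdapter
from openvino.model_zoo.model_api.models import SSD

# "localhost:9000/models/ssd" is a placeholder for a model already served by OVMS.
adapter = OVMSAdapter("localhost:9000/models/ssd")
ssd_model = SSD(adapter, preload=True)
```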

To use the OpenVINO Model Server executor, install the requirements:
```sh
pip install -r <omz_dir>/demos/common/python/requirements_ovms.txt
```

**Collaborator:** can it be done with extra modules in model api install?

**Collaborator:** `-r` missed in command

**Contributor Author:**
> can it be done with extra modules in model api install?

I'll investigate it.

## Model API Pipelines

Model API Pipelines are high-level wrappers that manage submitting input data to a model and accessing its results.
They handle data submission for model inference, check the inference status (whether the result is ready or not), and provide access to the results.

The `AsyncPipeline` is available; it handles the asynchronous execution of a single model.
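
A minimal sketch of the asynchronous flow, assuming the `AsyncPipeline` interface used by the Open Model Zoo demos (`submit_data`, `await_all`, `get_result`); refer to the demos linked at the end of this document for complete, maintained code:

```python
import cv2

from openvino.model_zoo.model_api.adapters import OpenvinoAdapter, create_core
from openvino.model_zoo.model_api.models import SSD
from openvino.model_zoo.model_api.pipelines import AsyncPipeline

adapter = OpenvinoAdapter(create_core(), "public/mobilenet-ssd/FP32/mobilenet-ssd.xml", device="CPU")
pipeline = AsyncPipeline(SSD(adapter, preload=True))

frame = cv2.imread("sample.png")
pipeline.submit_data(frame, 0, {"frame": frame})  # schedule inference for frame id 0
pipeline.await_all()                              # wait until all submitted frames are processed
detections, meta = pipeline.get_result(0)         # postprocessed results for frame id 0
```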

## Ready-to-use Model API solutions

To apply Model API wrappers in custom applications, study the example below, which shows a common scenario of using the Python* Model API.

In the example, an SSD model is used to predict bounding boxes on the input image `"sample.png"`. The model is executed by `OpenvinoAdapter`, so we pass the path to the model's `xml` file. The model is loaded on the CPU device inside the adapter.

Once the SSD model wrapper instance is created, we get the model predictions in one line: `ssd_model(input_data)` - the wrapper runs the preprocess method, synchronous inference on the OpenVINO side, and the postprocess method.

```python
import cv2
from openvino.model_zoo.model_api.models import SSD
from openvino.model_zoo.model_api.adapters import OpenvinoAdapter, create_core


# a helper function for bboxes visualization
def draw_detections(image, detections):
    for detection in detections:
        class_id = int(detection.id)
        color = (255, 0, 0)
        det_label = '#{}'.format(class_id)
        xmin, ymin, xmax, ymax = detection.get_coords()
        cv2.rectangle(image, (xmin, ymin), (xmax, ymax), color, 2)
        cv2.putText(image, '{} {:.1%}'.format(det_label, detection.score),
                    (xmin, ymin - 7), cv2.FONT_HERSHEY_COMPLEX, 0.6, color, 1)
    return image


def main():
    input_data = cv2.imread("sample.png")
    model_path = "public/mobilenet-ssd/FP32/mobilenet-ssd.xml"

    model_adapter = OpenvinoAdapter(create_core(), model_path, device="CPU")
    ssd_model = SSD(model_adapter, preload=True)

    results = ssd_model(input_data)

    image_with_bboxes = draw_detections(input_data, results)
    cv2.imshow('Detection Results', image_with_bboxes)
    key = cv2.waitKey(0)
    if key in {ord('q'), ord('Q'), 27}:
        return


if __name__ == '__main__':
    main()
```
**Collaborator:** We won't ever verify that the snippet works. To me that's a strong reason to delete it. You should refer to a demo instead.

**Contributor Author:** If the majority prefers to delete it, I will do it.

**Collaborator:** This is not about voting.

  1. This is about explaining why your solution is correct.
  2. Even if you insist on conducting a poll, your voice won't count. My voice won't count either. Only the voice of the person responsible for OMZ matters, which is @vladimir-dudnik's.

**Contributor Author:** My point is that the demos contain only complex cases with asynchronous model execution, and that is not the only usage scenario. We should provide an example of a simple synchronous model call somewhere.

I didn't get the point about verifying that the snippet works. Many packages have documentation with API examples. The package is going to be updated with new releases, and with new releases the documentation, including the snippet, will be updated as well.

**Collaborator:** You can write a dedicated sample and refer to it. You will need to cover your lib with tests anyway; the sample could be part of the tests.

> Didn't get the point about the snippet work verification. Many packages have documentation with API examples.

Such packages have people whose job is to continuously check that the examples stay consistent. We don't have such people. Packages that don't do that usually end up having broken examples.

**Collaborator:** Here is the solution: keep this section, but remove it after tests are added.

**Contributor Author:** Okay.


To study more complex scenarios, refer to [Open Model Zoo Python* demos](https://github.com/openvinotoolkit/open_model_zoo/tree/master/demos), where asynchronous inference is applied.