Merge pull request PaddlePaddle#1417 from MissPenguin/dygraph
update hubserving
MissPenguin authored Dec 14, 2020
2 parents 0e32093 + af0f81d commit b1623d6
Showing 14 changed files with 17 additions and 726 deletions.
2 changes: 1 addition & 1 deletion deploy/hubserving/ocr_cls/params.py
100644 → 100755
@@ -12,7 +12,7 @@ def read_params():
     cfg = Config()

     #params for text classifier
-    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v1.1_cls_infer/"
+    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v2.0_cls_infer/"
     cfg.cls_image_shape = "3, 48, 192"
     cfg.label_list = ['0', '180']
     cfg.cls_batch_num = 30
12 changes: 1 addition & 11 deletions deploy/hubserving/ocr_det/params.py
100644 → 100755
@@ -13,7 +13,7 @@ def read_params():

     #params for text detector
     cfg.det_algorithm = "DB"
-    cfg.det_model_dir = "./inference/ch_ppocr_mobile_v1.1_det_infer/"
+    cfg.det_model_dir = "./inference/ch_ppocr_mobile_v2.0_det_infer/"
     cfg.det_limit_side_len = 960
     cfg.det_limit_type = 'max'

@@ -27,16 +27,6 @@ def read_params():
     # cfg.det_east_cover_thresh = 0.1
     # cfg.det_east_nms_thresh = 0.2

-    # #params for text recognizer
-    # cfg.rec_algorithm = "CRNN"
-    # cfg.rec_model_dir = "./inference/ch_det_mv3_crnn/"
-
-    # cfg.rec_image_shape = "3, 32, 320"
-    # cfg.rec_char_type = 'ch'
-    # cfg.rec_batch_num = 30
-    # cfg.rec_char_dict_path = "./ppocr/utils/ppocr_keys_v1.txt"
-    # cfg.use_space_char = True
-
     cfg.use_zero_copy_run = False
     cfg.use_pdserving = False

7 changes: 4 additions & 3 deletions deploy/hubserving/ocr_system/params.py
100644 → 100755
@@ -13,7 +13,7 @@ def read_params():

     #params for text detector
     cfg.det_algorithm = "DB"
-    cfg.det_model_dir = "./inference/ch_ppocr_mobile_v1.1_det_infer/"
+    cfg.det_model_dir = "./inference/ch_ppocr_mobile_v2.0_det_infer/"
     cfg.det_limit_side_len = 960
     cfg.det_limit_type = 'max'

@@ -29,7 +29,7 @@ def read_params():

     #params for text recognizer
     cfg.rec_algorithm = "CRNN"
-    cfg.rec_model_dir = "./inference/ch_ppocr_mobile_v1.1_rec_infer/"
+    cfg.rec_model_dir = "./inference/ch_ppocr_mobile_v2.0_rec_infer/"

     cfg.rec_image_shape = "3, 32, 320"
     cfg.rec_char_type = 'ch'
@@ -41,13 +41,14 @@ def read_params():

     #params for text classifier
     cfg.use_angle_cls = True
-    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v1.1_cls_infer/"
+    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v2.0_cls_infer/"
     cfg.cls_image_shape = "3, 48, 192"
     cfg.label_list = ['0', '180']
     cfg.cls_batch_num = 30
     cfg.cls_thresh = 0.9

     cfg.use_zero_copy_run = False
     cfg.use_pdserving = False
+    cfg.drop_score = 0.5

     return cfg
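The `read_params()` pattern changed by these hunks can be sketched as a minimal, self-contained example. The attribute names and default values below are taken from the diff above; the bare `Config` class is a stand-in for the real one in `params.py`, so treat this as an illustration rather than the actual module:

```python
class Config:
    """Minimal stand-in for the config object used by read_params()."""
    pass

def read_params():
    # Sketch of deploy/hubserving/ocr_system/params.py after this commit:
    # all three default model paths now point at the v2.0 inference models.
    cfg = Config()

    # params for text detector
    cfg.det_algorithm = "DB"
    cfg.det_model_dir = "./inference/ch_ppocr_mobile_v2.0_det_infer/"

    # params for text recognizer
    cfg.rec_algorithm = "CRNN"
    cfg.rec_model_dir = "./inference/ch_ppocr_mobile_v2.0_rec_infer/"

    # params for text classifier
    cfg.use_angle_cls = True
    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v2.0_cls_infer/"

    # added in this commit: results below this score are dropped
    cfg.drop_score = 0.5
    return cfg
```

Callers receive a plain attribute bag, so swapping in custom models only requires editing these path strings.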
8 changes: 4 additions & 4 deletions deploy/hubserving/readme.md
100644 → 100755
@@ -33,11 +33,11 @@ pip3 install paddlehub --upgrade -i https://pypi.tuna.tsinghua.edu.cn/simple
 ```

 ### 2. Download the inference model
-Before installing the service module, you need to prepare the inference model and put it in the correct path. By default, the ultra-lightweight v1.1 model is used; the default model paths are:
+Before installing the service module, you need to prepare the inference model and put it in the correct path. By default, the ultra-lightweight v2.0 model is used; the default model paths are:
 ```
-detection model: ./inference/ch_ppocr_mobile_v1.1_det_infer/
-recognition model: ./inference/ch_ppocr_mobile_v1.1_rec_infer/
-direction classifier: ./inference/ch_ppocr_mobile_v1.1_cls_infer/
+detection model: ./inference/ch_ppocr_mobile_v2.0_det_infer/
+recognition model: ./inference/ch_ppocr_mobile_v2.0_rec_infer/
+direction classifier: ./inference/ch_ppocr_mobile_v2.0_cls_infer/
 ```

 **The model paths can be viewed and modified in `params.py`.** More models can be downloaded from the [model list](../../doc/doc_ch/models_list.md) provided by PaddleOCR, or you can substitute your own trained and converted models.
8 changes: 4 additions & 4 deletions deploy/hubserving/readme_en.md
100644 → 100755
@@ -34,11 +34,11 @@ pip3 install paddlehub --upgrade -i https://pypi.tuna.tsinghua.edu.cn/simple
 ```

 ### 2. Download inference model
-Before installing the service module, you need to prepare the inference model and put it in the correct path. By default, the ultra lightweight model of v1.1 is used, and the default model path is:
+Before installing the service module, you need to prepare the inference model and put it in the correct path. By default, the ultra lightweight model of v2.0 is used, and the default model path is:
 ```
-detection model: ./inference/ch_ppocr_mobile_v1.1_det_infer/
-recognition model: ./inference/ch_ppocr_mobile_v1.1_rec_infer/
-text direction classifier: ./inference/ch_ppocr_mobile_v1.1_cls_infer/
+detection model: ./inference/ch_ppocr_mobile_v2.0_det_infer/
+recognition model: ./inference/ch_ppocr_mobile_v2.0_rec_infer/
+text direction classifier: ./inference/ch_ppocr_mobile_v2.0_cls_infer/
 ```

 **The model path can be found and modified in `params.py`.** More models provided by PaddleOCR can be obtained from the [model library](../../doc/doc_en/models_list_en.md). You can also use models trained by yourself.
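Since the service modules fail at startup when a default model directory is absent, a small pre-flight check can catch a missing download early. This helper is hypothetical (not part of PaddleOCR); only the three directory paths come from the README above:

```python
import os

# Default v2.0 model directories from the README; the helper itself is
# a hypothetical convenience, not part of the PaddleOCR codebase.
DEFAULT_MODEL_DIRS = [
    "./inference/ch_ppocr_mobile_v2.0_det_infer/",
    "./inference/ch_ppocr_mobile_v2.0_rec_infer/",
    "./inference/ch_ppocr_mobile_v2.0_cls_infer/",
]

def missing_model_dirs(dirs=DEFAULT_MODEL_DIRS, root="."):
    """Return the subset of model directories that do not exist under root."""
    return [d for d in dirs if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_model_dirs()
    if missing:
        print("Missing model directories:", missing)
```

Run it from the repository root before `hub install`; an empty result means all three default paths are in place.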
79 changes: 0 additions & 79 deletions deploy/pdserving/det_local_server.py

This file was deleted.

78 changes: 0 additions & 78 deletions deploy/pdserving/det_web_server.py

This file was deleted.

114 changes: 0 additions & 114 deletions deploy/pdserving/ocr_local_server.py

This file was deleted.
