
Paddle2Onnx does not support converting quantized models #1740

Open
@ChihoJack

Description

  • FastDeploy version: fastdeploy-python 0.0.0

  • Build command: Rockchip RK3588 build command

  • System platform: Linux x64 (Ubuntu 18.04)

  • Hardware: Rockchip RK3588S

  • Language: Python 3.8.16

  • Problem: the model cannot be converted

  • Command:
    paddle2onnx --model_dir ch_PP-OCRv3_det_infer
    --model_filename inference.pdmodel
    --params_filename inference.pdiparams
    --save_file ch_PP-OCRv3_det_infer/ch_PP-OCRv3_det_infer.onnx
    --enable_dev_version True

  • Error message:
    [Paddle2ONNX] Start to parse PaddlePaddle model...
    [Paddle2ONNX] Model file path: ch_PP-OCRv3_det_infer/inference.pdmodel
    [Paddle2ONNX] Paramters file path: ch_PP-OCRv3_det_infer/inference.pdiparams
    [Paddle2ONNX] Start to parsing Paddle model...
    [Paddle2ONNX] [Info] The Paddle model is a quantized model.
    [Paddle2ONNX] Oops, there are some operators not supported yet, including fake_channel_wise_quantize_dequantize_abs_max,fake_quantize_dequantize_moving_average_abs_max,
    [ERROR] Due to the unsupported operators, the conversion is aborted.
    Aborted (core dumped)
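As a quick way to confirm whether a `.pdmodel` file contains the fake-quantize operators named in the error log before attempting conversion, one can scan the serialized program for the operator names as raw byte strings. This is a rough heuristic (operator type names are stored as plain strings inside the protobuf), not an official Paddle or Paddle2ONNX API; the `QUANT_MARKERS` list is taken from the error message above:

```python
# Heuristic check: operator type names appear as plain strings inside the
# .pdmodel protobuf, so a byte scan can reveal fake-quantize ops without
# loading Paddle itself. QUANT_MARKERS is assumed from the error log above.
QUANT_MARKERS = (
    b"fake_channel_wise_quantize_dequantize_abs_max",
    b"fake_quantize_dequantize_moving_average_abs_max",
)

def find_quant_ops(model_path):
    """Return the fake-quantize marker names found in the model file."""
    with open(model_path, "rb") as f:
        data = f.read()
    return [m.decode() for m in QUANT_MARKERS if m in data]
```

If the returned list is non-empty, the `paddle2onnx` invocation above will hit the same unsupported-operator abort, since the model is a quantized one.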
