
Model (SDv1.4) ONNX -> MNN conversion error #2477

Closed
fuerpy opened this issue Jul 7, 2023 · 5 comments
Labels
Converter, question (Further information is requested)

Comments

fuerpy commented Jul 7, 2023

Platform (include target platform as well if cross-compiling):

Linux

GitHub Version:

If you downloaded the source as a ZIP, provide the download date and the git revision from the ZIP's comment section (obtainable by running 7z l PATH/TO/ZIP and searching for Comment in the output, e.g. Comment = bc80b11110cd440aacdabbf59658d630527a7f2b). If you used git clone, provide the commit id from the first line of git log.

Compiling Method:

./MNNConvert -f ONNX --modelFile model.onnx --MNNModel unet.mnn --bizCode biz --fp16

Build Log:

The device support i8sdot:0, support fp16:0, support i8mm: 0
Start to Convert Other Model Format To MNN Model...
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxConverter.cpp:98: ONNX Model ir version: 7
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxConverter.cpp:99: ONNX Model opset version: 14

[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
[10:48:10] /media/zzy/4T/stable_diffusion_mnn/ALIBABA_MNN/MNN/tools/converter/source/onnx/onnxOpConverter.cpp:300: Don't support 10
Start to Optimize the MNN Net...
Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Not support data type.Convert MatMul Convolution use shared const B inputs, may increase the model size
81 op name is empty or dup, set to Unsqueeze81
433 op name is empty or dup, set to BinaryOp433
436 op name is empty or dup, set to BinaryOp436
565 op name is empty or dup, set to Unsqueeze565
745 op name is empty or dup, set to BinaryOp745
746 op name is empty or dup, set to Unsqueeze746
1281 op name is empty or dup, set to BinaryOp1281
1282 op name is empty or dup, set to Unsqueeze1282
1335 op name is empty or dup, set to BinaryOp1335
1336 op name is empty or dup, set to Unsqueeze1336
1337 op name is empty or dup, set to Const1337
1373 op name is empty or dup, set to Unsqueeze1373
2073 op name is empty or dup, set to BinaryOp2073
2074 op name is empty or dup, set to Unsqueeze2074
2488 op name is empty or dup, set to Const2488
10883 op name is empty or dup, set to BinaryOp10883
10884 op name is empty or dup, set to Unsqueeze10884
10886 op name is empty or dup, set to Unsqueeze10886
10887 op name is empty or dup, set to StridedSlice10887
10889 op name is empty or dup, set to BinaryOp10889
11687 op name is empty or dup, set to Unsqueeze11687
11704 op name is empty or dup, set to BinaryOp11704
11705 op name is empty or dup, set to Unsqueeze11705
12494 op name is empty or dup, set to Unsqueeze12494
12497 op name is empty or dup, set to Unsqueeze12497
12499 op name is empty or dup, set to Unsqueeze12499
10202 op name is empty or dup, set to BinaryOp10202
10267 op name is empty or dup, set to Unsqueeze10267
11742 op name is empty or dup, set to BinaryOp11742
11751 op name is empty or dup, set to Unsqueeze11751
11795 op name is empty or dup, set to Unsqueeze11795
11804 op name is empty or dup, set to BinaryOp11804
inputTensors : [ sample, timestep, encoder_hidden_states, ]
outputTensors: [ out_sample, ]
The model has subgraphs, please use MNN::Module to run it
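Note: in the log above, "Don't support 10" refers to ONNX data type 10, i.e. TensorProto.FLOAT16, which matches the fp16 diagnosis in the comments below. The last line says the converted model contains subgraphs, so it has to be run through MNN's Module API rather than the plain session interface. A minimal sketch, assuming MNN's pymnn bindings (MNN.nn / MNN.expr) and using the input/output names reported in the log; the shapes are illustrative assumptions for the SDv1.4 UNet, not taken from the issue:

import MNN
import MNN.expr as F
import numpy as np

# Load the converted model as a Module, with the tensor names from the log.
net = MNN.nn.load_module_from_file(
    "unet.mnn",
    ["sample", "timestep", "encoder_hidden_states"],
    ["out_sample"])

# Dummy inputs; all shapes below are assumptions.
sample = F.const(np.zeros([1, 4, 64, 64], dtype=np.float32), [1, 4, 64, 64])
timestep = F.const(np.array([1.0], dtype=np.float32), [1])
hidden = F.const(np.zeros([1, 77, 768], dtype=np.float32), [1, 77, 768])

# onForward takes and returns lists of Vars for multi-input modules.
out_sample = net.onForward([sample, timestep, hidden])[0]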

fuerpy (Author) commented Jul 7, 2023

jxt1234 (Collaborator) commented Jul 9, 2023

Only the "Not support data type" messages indicate a problem; the rest of the log is normal. You can first validate the model with testMNNFromOnnx.py.
An unsupported data type in ONNX is usually fp16; check whether your ONNX model was compressed to fp16.
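One way to check whether the exported model was compressed to fp16 is to inspect the initializer data types with the onnx Python package (a quick sketch, assuming model.onnx from the command above):

import onnx

# onnx.load also resolves an external weights.pb sitting next to the file.
model = onnx.load("model.onnx")
fp16_tensors = [t.name for t in model.graph.initializer
                if t.data_type == onnx.TensorProto.FLOAT16]
print("fp16 initializers:", len(fp16_tensors))

testMNNFromOnnx.py ships with the MNN source (under tools/script) and is typically invoked as python testMNNFromOnnx.py model.onnx.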

jxt1234 added the question (Further information is requested) and Converter labels on Jul 9, 2023
fuerpy (Author) commented Jul 9, 2023

@jxt1234 I wanted the UNet in fp16, so I converted it to fp16 when exporting to ONNX, producing a file about 1.6 GB in size. If I understand your reply correctly, should I keep fp32 when exporting to ONNX and then convert to fp16 with the MNN conversion tool? Also, I recall that if an ONNX file exceeds 2 GB it gets split into a .onnx file plus a weights.pb; how should MNNConvert be used in that case? Thanks for your reply.
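For reference, the >2 GB split is ONNX's external-data mechanism, which the onnx Python package handles transparently; a sketch (filenames here are assumptions):

import onnx

# Loading resolves the external weights.pb placed next to the .onnx.
model = onnx.load("unet.onnx")

# Re-saving with external data keeps the protobuf itself under the 2 GB limit.
onnx.save_model(model, "unet_fp32.onnx",
                save_as_external_data=True,
                all_tensors_to_one_file=True,
                location="weights.pb")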

jxt1234 (Collaborator) commented Sep 4, 2023

In that case, just pass XXX.onnx as before; the information in weights.pb will be read automatically.
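For example, with weights.pb placed in the same directory as the model, the original command should work unchanged (unet.onnx is an assumed filename here): ./MNNConvert -f ONNX --modelFile unet.onnx --MNNModel unet.mnn --bizCode biz --fp16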

jxt1234 closed this as completed on Sep 4, 2023
jxt1234 (Collaborator) commented Sep 4, 2023

ONNX fp16 has been supported since 2.7.0.
