Use load_param_mem with ncnn CPU inference (chaiNNer-org#2142)

Co-authored-by: Jeremy Rand <jeremyrand@danwin1210.de>
JeremyRand and Jeremy Rand authored Aug 25, 2023
1 parent 7a11c2b commit 5f69c2e
Showing 1 changed file with 1 addition and 6 deletions.
7 changes: 1 addition & 6 deletions backend/src/nodes/impl/ncnn/session.py
```diff
@@ -36,18 +36,13 @@ def create_ncnn_net(
         net.set_vulkan_device(exec_options.ncnn_gpu_index)
 
     # Load model param and bin
+    net.load_param_mem(model.model.write_param())
     if use_gpu:
-        net.load_param_mem(model.model.write_param())
         net.load_model_mem(model.model.bin)
     else:
         with tempfile.TemporaryDirectory() as tmp_model_dir:
-            param_filename = tmp_model_dir + "/ncnn-model.param"
             bin_filename = tmp_model_dir + "/ncnn-model.bin"
-
-            model.model.write_param(param_filename)
             model.model.write_bin(bin_filename)
-
-            net.load_param(param_filename)
             net.load_model(bin_filename)
 
     return net
```
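After this commit, the param data is loaded from memory on both the GPU and CPU paths; only the bin data still round-trips through a temporary directory on CPU. A minimal sketch of the resulting control flow, using a hypothetical `FakeNet` stand-in and plain file I/O in place of the real ncnn `Net` calls (neither `FakeNet` nor `create_net` is part of chaiNNer or ncnn):

```python
import os
import tempfile


class FakeNet:
    """Hypothetical stand-in for ncnn.Net, used only to illustrate the flow."""

    def __init__(self):
        self.param = None
        self.bin = None

    def load_param_mem(self, param_str):
        # Stands in for parsing the param text directly from memory
        self.param = param_str

    def load_model_mem(self, bin_bytes):
        # Stands in for loading the weight blob directly from memory
        self.bin = bin_bytes

    def load_model(self, bin_path):
        # Stands in for loading the weight blob from a file on disk
        with open(bin_path, "rb") as f:
            self.bin = f.read()


def create_net(param_str, bin_bytes, use_gpu):
    net = FakeNet()
    # Param is now loaded from memory on both paths
    net.load_param_mem(param_str)
    if use_gpu:
        net.load_model_mem(bin_bytes)
    else:
        # Only the bin still goes through a temp file on the CPU path
        with tempfile.TemporaryDirectory() as tmp_model_dir:
            bin_filename = os.path.join(tmp_model_dir, "ncnn-model.bin")
            with open(bin_filename, "wb") as f:
                f.write(bin_bytes)
            net.load_model(bin_filename)
    return net
```

On the CPU branch the temporary directory is deleted as soon as the `with` block exits, so the bin file exists only for the duration of the load, mirroring the original code's use of `tempfile.TemporaryDirectory`.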
