Add export & inference for hPINNs #902

Merged
merged 4 commits on May 16, 2024
20 changes: 20 additions & 0 deletions docs/zh/examples/hpinns.md
@@ -26,6 +26,26 @@
python holography.py mode=eval EVAL.pretrained_model_path=https://paddle-org.bj.bcebos.com/paddlescience/models/hPINNs/hpinns_pretrained.pdparams
```

=== "Model export command"

``` sh
python holography.py mode=export
```

=== "Model inference command"

``` sh
# linux
wget -nc https://paddle-org.bj.bcebos.com/paddlescience/datasets/hPINNs/hpinns_holo_train.mat -P ./datasets/
wget -nc https://paddle-org.bj.bcebos.com/paddlescience/datasets/hPINNs/hpinns_holo_valid.mat -P ./datasets/
# windows
# curl https://paddle-org.bj.bcebos.com/paddlescience/datasets/hPINNs/hpinns_holo_train.mat --output ./datasets/hpinns_holo_train.mat
# curl https://paddle-org.bj.bcebos.com/paddlescience/datasets/hPINNs/hpinns_holo_valid.mat --output ./datasets/hpinns_holo_valid.mat
python holography.py mode=infer
```



| Pretrained model | Metrics |
|:--|:--|
| [hpinns_pretrained.pdparams](https://paddle-org.bj.bcebos.com/paddlescience/models/hPINNs/hpinns_pretrained.pdparams) | loss(opt_sup): 0.05352<br>MSE.eval_metric(opt_sup): 0.00002<br>loss(val_sup): 0.02205<br>MSE.eval_metric(val_sup): 0.00001 |
20 changes: 20 additions & 0 deletions examples/hpinns/conf/hpinns.yaml
@@ -25,6 +25,7 @@ seed: 42
output_dir: ${hydra:run.dir}
DATASET_PATH: ./datasets/hpinns_holo_train.mat
DATASET_PATH_VALID: ./datasets/hpinns_holo_valid.mat
log_freq: 20

# set working condition
TRAIN_MODE: aug_lag # "soft", "penalty", "aug_lag"
@@ -65,3 +66,22 @@ TRAIN:
# evaluation settings
EVAL:
  pretrained_model_path: null

# inference settings
INFER:
  pretrained_model_path: "https://paddle-org.bj.bcebos.com/paddlescience/models/hPINNs/hpinns_pretrained.pdparams"
  export_path: ./inference/hpinns
  pdmodel_path: ${INFER.export_path}.pdmodel
  pdiparams_path: ${INFER.export_path}.pdiparams
  output_keys: ["e_re", "e_im", "eps"]
  device: gpu
  engine: native
  precision: fp32
  onnx_path: ${INFER.export_path}.onnx
  ir_optim: true
  min_subgraph_size: 10
  gpu_mem: 8000
  gpu_id: 0
  batch_size: 128
  max_batch_size: 128
  num_cpu_threads: 4
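The derived artifact paths in this INFER block use OmegaConf-style `${...}` interpolation against `export_path`. As a minimal stdlib sketch (not using OmegaConf itself), the resolved values look like this:

```python
# Stand-in for OmegaConf's "${INFER.export_path}" interpolation (illustrative only)
infer = {"export_path": "./inference/hpinns"}
infer["pdmodel_path"] = infer["export_path"] + ".pdmodel"
infer["pdiparams_path"] = infer["export_path"] + ".pdiparams"
infer["onnx_path"] = infer["export_path"] + ".onnx"

print(infer["pdmodel_path"])  # ./inference/hpinns.pdmodel
```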
67 changes: 66 additions & 1 deletion examples/hpinns/holography.py
@@ -409,14 +409,79 @@ def evaluate(cfg: DictConfig):
    solver.eval()


def export(cfg: DictConfig):
    # set model
    model_re = ppsci.arch.MLP(**cfg.MODEL.re_net)
    model_im = ppsci.arch.MLP(**cfg.MODEL.im_net)
    model_eps = ppsci.arch.MLP(**cfg.MODEL.eps_net)

    # register transform
    model_re.register_input_transform(func_module.transform_in)
    model_im.register_input_transform(func_module.transform_in)
    model_eps.register_input_transform(func_module.transform_in)

    model_re.register_output_transform(func_module.transform_out_real_part)
    model_im.register_output_transform(func_module.transform_out_imaginary_part)
    model_eps.register_output_transform(func_module.transform_out_epsilon)

    # wrap to a model_list
Contributor:

Currently, inference after export fails with:

```
RuntimeError: (PreconditionNotMet) The variable named x is not found in the scope of the executor.
[Hint: scope->FindVar(name) should not be null.] (at ../paddle/fluid/inference/api/analysis_predictor.cc:2333)
```

This is caused by a mismatch between the input_keys used at export time and those used at inference time. This model has an input transform, i.e. ("x", "y") -> transform -> ("x_cos_1", "y_cos_1", ...) -> forward -> ("e_re", ...). The input_keys in the current export function are ("x_cos_1", "y_cos_1", ...), while infer uses ("x", "y"). Therefore, two changes are needed:

1. Add the transform registration code here:

    ```py
    # register transform
    model_re.register_input_transform(func_module.transform_in)
    model_im.register_input_transform(func_module.transform_in)
    model_eps.register_input_transform(func_module.transform_in)

    model_re.register_output_transform(func_module.transform_out_real_part)
    model_im.register_output_transform(func_module.transform_out_imaginary_part)
    model_eps.register_output_transform(func_module.transform_out_epsilon)
    ```

2. Update the input_spec below.

Contributor (Author):

Fixed, thanks for the correction 🌹
    model_list = ppsci.arch.ModelList((model_re, model_im, model_eps))

    # initialize solver
    solver = ppsci.solver.Solver(
        model_list,
        pretrained_model_path=cfg.INFER.pretrained_model_path,
    )

    # export model
    from paddle.static import InputSpec

    input_spec = [
        {key: InputSpec([None, 1], "float32", name=key) for key in ["x", "y"]},
    ]
    solver.export(input_spec, cfg.INFER.export_path)
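A toy, pure-Python sketch of the transform chain discussed in the review (the functions below are hypothetical stand-ins, not PaddleScience's actual API): registering the input transform before export is what makes the exported graph accept the raw ("x", "y") keys that the predictor later feeds.

```python
import math

# Hypothetical stand-ins for the chain:
# ("x", "y") -> input transform -> ("x_cos_1", ...) -> forward -> ("e_re", ...)
def transform_in(raw):
    # derives the network's true input features from the raw coordinates
    return {"x_cos_1": math.cos(raw["x"]), "y_cos_1": math.cos(raw["y"])}

def forward(feats):
    # placeholder forward pass over the transformed features
    return {"e_re": feats["x_cos_1"] + feats["y_cos_1"]}

def exported_model(raw):
    # With the transform registered before export, the exported graph's
    # input keys are ("x", "y"), matching what inference feeds it.
    return forward(transform_in(raw))

out = exported_model({"x": 0.0, "y": 0.0})  # cos(0) + cos(0) == 2.0
```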


def inference(cfg: DictConfig):
    from deploy.python_infer import pinn_predictor

    predictor = pinn_predictor.PINNPredictor(cfg)

    valid_dict = ppsci.utils.reader.load_mat_file(
        cfg.DATASET_PATH_VALID, ("x_val", "y_val", "bound")
    )
    input_dict = {"x": valid_dict["x_val"], "y": valid_dict["y_val"]}

    output_dict = predictor.predict(input_dict, cfg.INFER.batch_size)

    # mapping data to cfg.INFER.output_keys
    output_dict = {
        store_key: output_dict[infer_key]
        for store_key, infer_key in zip(cfg.INFER.output_keys, output_dict.keys())
    }

    ppsci.visualize.save_vtu_from_dict(
        "./hpinns_pred.vtu",
Contributor:

Change to `osp.join(cfg.output_dir, "hpinns_pred.vtu")` to prevent the file from being overwritten.

Collaborator:

> Change to `osp.join(cfg.output_dir, "hpinns_pred.vtu")` to prevent the file from being overwritten.

Let's not put the inference artifact under output_dir; keeping it as hpinns_pred.vtu is better.

        {**input_dict, **output_dict},
        input_dict.keys(),
        cfg.INFER.output_keys,
    )
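The key-remapping step in `inference` pairs the predictor's raw output names positionally with `cfg.INFER.output_keys` via `zip`, relying on dict insertion order (guaranteed since Python 3.7). A toy illustration with hypothetical raw names:

```python
# Hypothetical raw output names; the real names depend on the exported graph
raw_outputs = {"out_0": [1.0], "out_1": [2.0], "out_2": [3.0]}
output_keys = ["e_re", "e_im", "eps"]

# zip pairs the configured keys with the raw keys in insertion order
mapped = {
    store_key: raw_outputs[infer_key]
    for store_key, infer_key in zip(output_keys, raw_outputs.keys())
}

print(mapped)  # {'e_re': [1.0], 'e_im': [2.0], 'eps': [3.0]}
```

Note the mapping is purely positional: if the exported model's output order ever changed, the names in `output_keys` would silently attach to the wrong tensors.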


@hydra.main(version_base=None, config_path="./conf", config_name="hpinns.yaml")
def main(cfg: DictConfig):
    if cfg.mode == "train":
        train(cfg)
    elif cfg.mode == "eval":
        evaluate(cfg)
    elif cfg.mode == "export":
        export(cfg)
    elif cfg.mode == "infer":
        inference(cfg)
    else:
        raise ValueError(
            f"cfg.mode should in ['train', 'eval', 'export', 'infer'], but got '{cfg.mode}'"
        )


if __name__ == "__main__":