【inference】Inference support config is based on the path and model prefix #65813
Conversation
Your PR was submitted successfully. Thank you for your contribution to the open source project!
    }
    params_file_ = prog_file + "/" + params_file + ".pdiparams";
  } else if (fs::is_directory(params_file)) {
Is this branch there to support passing the prefix first and the directory second? Is that really necessary?
  return false;
}

AnalysisConfig::AnalysisConfig(const std::string &prog_file,
- The SetModel function needs the same handling (a sketch follows below);
- What if a relative path is passed?
- After changing the Config constructor and SetModel, their two input parameters no longer carry the prog_file / params_file meaning, so the parameter names and the comments on the corresponding interfaces need to be updated. These are user-facing APIs; the changed semantics must also be reflected in the documentation.
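For context, a minimal sketch (not the PR's actual code) of how SetModel could mirror the constructor's directory-or-prefix handling from the diff above; `IsDirectory` and the `enable_pir_api_` member are stand-ins for the helper in helper.h and `FLAGS_enable_pir_api`.

```cpp
#include <sys/stat.h>

#include <string>

namespace sketch {

// Stand-in for the directory check used in the PR.
bool IsDirectory(const std::string &path) {
  struct stat info;
  return stat(path.c_str(), &info) == 0 && (info.st_mode & S_IFDIR);
}

struct AnalysisConfigSketch {
  std::string prog_file_;
  std::string params_file_;
  bool enable_pir_api_ = false;  // stand-in for FLAGS_enable_pir_api

  // prog_file_or_model_dir: a model file path, or the model directory.
  // params_file_or_model_prefix: a params file path, or the model prefix.
  void SetModel(const std::string &prog_file_or_model_dir,
                const std::string &params_file_or_model_prefix) {
    if (IsDirectory(prog_file_or_model_dir)) {
      const std::string prefix =
          prog_file_or_model_dir + "/" + params_file_or_model_prefix;
      prog_file_ = prefix + (enable_pir_api_ ? ".json" : ".pdmodel");
      params_file_ = prefix + ".pdiparams";
    } else {
      // Plain file paths (relative or absolute) pass through unchanged; the
      // reviewer's relative-path question would be handled around here.
      prog_file_ = prog_file_or_model_dir;
      params_file_ = params_file_or_model_prefix;
    }
  }
};

}  // namespace sketch
```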
6c2fd00 to 2a6764c (Compare)
bool use_pir = config_.use_pir_;
bool use_new_executor = config_.use_new_executor_;
These two temporary variables are redundant; just read the members directly inside the enforce.
/// \param[in] prog_file model file path of the combined model or the
/// directory path containing the model. \param[in] params_file params file
/// path of the combined model or the model prefix.
This comment is a bit off: the parameter name has already changed from prog_file to prog_file_or_model_dir. The same applies in the other places.
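A possible rewording of that Doxygen comment with the renamed parameters (suggested wording only, not the PR's final text):

```cpp
/// \param[in] prog_file_or_model_dir model file path of the combined model,
/// or the directory path containing the model.
/// \param[in] params_file_or_model_prefix params file path of the combined
/// model, or the model prefix used to resolve files inside that directory.
```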
std::ifstream fin(prog_file_, std::ios::in | std::ios::binary);
PADDLE_ENFORCE_EQ(
    static_cast<bool>(fin.is_open()),
    true,
    platform::errors::NotFound(
        "Cannot open file %s, please confirm whether the file is normal.",
        prog_file_));
paddle/fluid/inference/api/helper.h already provides an IsFileExists function you can use.
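A sketch of how the check above could use that helper instead of opening an std::ifstream by hand; the exact IsFileExists signature (path in, bool out) is assumed from its name, so treat this as illustrative only:

```cpp
// Assumes helper.h declares something like:
//   bool IsFileExists(const std::string &path);
PADDLE_ENFORCE_EQ(
    IsFileExists(prog_file_),
    true,
    platform::errors::NotFound(
        "Cannot open file %s, please confirm whether the file is normal.",
        prog_file_));
```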
bool is_directory(const std::string &path) {
  struct stat info;
  if (stat(path.c_str(), &info) != 0) {
    return false;
  } else if (info.st_mode & S_IFDIR) {
    return true;
  }
  return false;
}
Please move this function into paddle/fluid/inference/api/helper.h as well.
AnalysisConfig::AnalysisConfig(const std::string &prog_file_or_model_dir,
                               const std::string &params_file_or_model_prefix) {
  if (is_directory(prog_file_or_model_dir)) {
    if (FLAGS_enable_pir_api) {
      prog_file_ =
          prog_file_or_model_dir + "/" + params_file_or_model_prefix + ".json";
    } else {
      prog_file_ = prog_file_or_model_dir + "/" + params_file_or_model_prefix +
                   ".pdmodel";
    }
    params_file_ = prog_file_or_model_dir + "/" + params_file_or_model_prefix +
                   ".pdiparams";
  } else {
    prog_file_ = prog_file_or_model_dir;
    params_file_ = params_file_or_model_prefix;
  }

  std::ifstream fin(prog_file_, std::ios::in | std::ios::binary);
  PADDLE_ENFORCE_EQ(
      static_cast<bool>(fin.is_open()),
      true,
      platform::errors::NotFound(
          "Cannot open file %s, please confirm whether the file is normal.",
          prog_file_));
This constructor could simply call SetModel, so the same code doesn't have to be maintained in two places.
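A minimal sketch of that suggestion, with the constructor delegating to SetModel so the directory/prefix resolution is maintained in one place (signatures taken from the diff above):

```cpp
AnalysisConfig::AnalysisConfig(const std::string &prog_file_or_model_dir,
                               const std::string &params_file_or_model_prefix) {
  // Reuse SetModel; the suffix completion and the existence check then live
  // in a single code path.
  SetModel(prog_file_or_model_dir, params_file_or_model_prefix);
}
```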
paddle/fluid/inference/api/helper.h (Outdated)
  return exists;
}

bool is_directory(const std::string &path) {
Mark it static.
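Putting the two comments together, a sketch of how the helper might look once moved into paddle/fluid/inference/api/helper.h with internal linkage (placement and exact form are assumptions):

```cpp
#include <sys/stat.h>

#include <string>

// Sketch: the directory check moved into helper.h and marked static as suggested.
static bool is_directory(const std::string &path) {
  struct stat info;
  if (stat(path.c_str(), &info) != 0) {
    return false;
  }
  return (info.st_mode & S_IFDIR) != 0;
}
```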
…refix (PaddlePaddle#65813) * config * config * fix * fix * fix windows * fix cinn compile bug * fix * add comments and SetModel * add comments and SetModel * fix case where PIR is not enabled * fix environment variable not taking effect on CI * fix bug * fix test_analyzer_capi_exp_pd_config unit test * resolve insufficient unit-test coverage * fix * fix * fix
PR Category
Inference
PR Types
New features
Description
card-71500
Inference now supports Config accepting either the model file name and params file name (as before) or a model directory path and a model prefix.
config.set_model can likewise take the model file name, and now also accepts a model directory path and a model prefix.
The completed suffix depends on FLAGS_enable_pir_api: if it is true, the model suffix is filled in as .json; if false, as .pdmodel.
In addition, a bug in test_save_optimized_pass.py was fixed.
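A hedged usage sketch of the two call styles described above, using the C++ inference API; the ./mobilenet directory, the inference prefix, and the include path are illustrative placeholders:

```cpp
#include "paddle_inference_api.h"  // header name may differ per install layout

int main() {
  // Style 1 (unchanged): explicit model file + params file.
  paddle_infer::Config file_config("./mobilenet/inference.pdmodel",
                                    "./mobilenet/inference.pdiparams");

  // Style 2 (this PR): model directory + model prefix. The model suffix is
  // completed as .json when FLAGS_enable_pir_api is true, otherwise .pdmodel,
  // and the params file as <prefix>.pdiparams.
  paddle_infer::Config dir_config("./mobilenet", "inference");

  auto predictor = paddle_infer::CreatePredictor(dir_config);
  return predictor != nullptr ? 0 : 1;
}
```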