Add a simple C++ inference example for fluid #7097
Conversation
Force-pushed from 0c588de to c9e09e9.
Force-pushed from c9e09e9 to 332fa97.
paddle/inference/inference.cc (Outdated)

```cpp
namespace paddle {

void InferenceDesc::LoadInferenceModel(std::string& dirname) {
```
Desc means description; it should be used only to describe an internal data structure, the way OpDesc describes an operator. It should not have Execute, Load, etc. Maybe Inference or InferenceEngine is a better name.
Use `const T&` here (pass `dirname` by const reference).
Yes, I know. The class name will be designed carefully; I just wrote a simple example to understand fluid's inference process. All the code will be rewritten in later PRs.
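As a hedged sketch of what the suggested rename might look like (the class name, method, and member here are hypothetical, not the PR's final API), combining the `InferenceEngine` suggestion with the `const T&` parameter suggestion:

```cpp
#include <string>

// Hypothetical sketch: an InferenceEngine that loads a model,
// rather than a *Desc that merely describes one.
class InferenceEngine {
 public:
  // Pass the directory name by const reference, as the review suggests.
  void LoadInferenceModel(const std::string& dirname) {
    // Placeholder for the real deserialization logic.
    model_dir_ = dirname;
  }

  const std::string& model_dir() const { return model_dir_; }

 private:
  std::string model_dir_;
};
```

This keeps "description" types (`OpDesc`, `ProgramDesc`) purely declarative, while the engine owns the behavior.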
paddle/inference/inference.cc (Outdated)

```cpp
void InferenceDesc::LoadInferenceModel(std::string& dirname) {
#ifdef PADDLE_USE_PTOOLS
  std::string model_filename = dirname + "/__model__";
  LOG(INFO) << "Using PicklingTools, loading model from " << model_filename;
```
I think we should not support pickle in C++.
Maybe we should have a protobuf data structure for inference? I am not sure.
I couldn't agree more.
paddle/inference/inference.h (Outdated)

```cpp
 public:
  InferenceDesc() : program_(nullptr), load_program_(nullptr) {}
  ~InferenceDesc() {
    if (!program_) {
```
Just `delete program_;` is OK. In C++, `delete ptr;` is valid when `ptr` is `nullptr`, and will do nothing.
Done.
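The reviewer's point can be shown with a minimal sketch (the `ProgramDescStub` type and `Inference` class are hypothetical stand-ins, not the PR's actual code): `delete` on a null pointer is defined by the C++ standard to do nothing, so the destructor needs no null check.

```cpp
#include <cstddef>

// Hypothetical stand-in for framework::ProgramDesc, for illustration only.
struct ProgramDescStub {
  int dummy = 0;
};

// Destructor after the review: no `if (program_)` guard is needed,
// because `delete nullptr` is a well-defined no-op.
class Inference {
 public:
  Inference() : program_(nullptr), load_program_(nullptr) {}
  ~Inference() {
    delete program_;       // safe even when program_ is nullptr
    delete load_program_;  // likewise
  }

 private:
  ProgramDescStub* program_;
  ProgramDescStub* load_program_;
};
```

Note this also avoids the inverted condition in the original snippet, where the guard tested `!program_` rather than `program_`.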
paddle/inference/inference.cc
Outdated
} | ||
|
||
bool InferenceDesc::IsParameter(framework::VarDesc* var) { | ||
if (var->Persistable()) { |
It is very tricky: not every persistable variable is a parameter.
Maybe it is better to have a protobuf structure for inference.
Yes. In the inference process we only need to load parameters from storage. This is correct only when no persistable variables other than parameters are used in the inference network. As shown in the example's output, there are many unused variables, so we also need to optimize the inference program.
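The heuristic under discussion can be sketched as follows (the `VarDescStub` type is a hypothetical stand-in for `framework::VarDesc`, exposing only the `Persistable()` and `Name()` accessors that appear in the PR's diff):

```cpp
#include <string>

// Hypothetical stand-in for framework::VarDesc, for illustration only.
struct VarDescStub {
  std::string name;
  bool persistable = false;

  bool Persistable() const { return persistable; }
  const std::string& Name() const { return name; }
};

// The PR's heuristic: treat every persistable variable as a parameter.
// As the review notes, this over-approximates -- variables such as feed
// and fetch targets may also be persistable without being parameters,
// so a real implementation needs a more precise criterion.
bool IsParameter(const VarDescStub* var) { return var->Persistable(); }
```

This makes the reviewer's objection concrete: persistability is a necessary but not sufficient condition for being a parameter.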
Excellent work!
However, maybe we can make a long-term plan for an inference SDK?
Force-pushed from 332fa97 to bc1a542.
@reyoung Thanks for your review.
Force-pushed from bc1a542 to 9b3f2c3.
paddle/inference/example.cc (Outdated)

```cpp
int main(int argc, char* argv[]) {
  std::string dirname =
      "/home/work/liuyiqun/PaddlePaddle/Paddle/paddle/inference/"
```
Do not use absolute/personal path
Maybe you can upload this model, and download it by verifying MD5.
> Do not use absolute/personal path

Done.

> Maybe you can upload this model, and download it by verifying MD5.

I think the current version of the model is not stable, so it is not suitable to upload for testing.
Can we add a unit test for the inference example?
paddle/inference/inference.cc (Outdated)

```cpp
  if (IsParameter(var)) {
    LOG(INFO) << "parameter's name: " << var->Name();

    // framework::VarDesc new_var = *var;
```
Remove the unused line.
Done.
Force-pushed from 08749e2 to 5b3cf4e.
I think that after removing the dependency on Pickle, it will be easy to add this example as a unit test.
Merge it first for developing faster.
To test this PR, you need to:
1. Compile PicklingTools: in its `C++` sub-directory, run `make -f Makefile.Linux libptools.so`.
2. Set `PTOOLS_ROOT` to the root directory of PicklingTools, then build PaddlePaddle from the source code of this PR.
3. Run `pip install -U /home/liuyiqun01/PaddlePaddle/Paddle/build_paddle/dist/opt/paddle/share/wheels/paddlepaddle_gpu-0.11.0-cp27-cp27mu-linux_x86_64.whl` to install the latest Paddle.
4. Run `python test_recognize_digits_mlp.py` to train the model; the inference model will be written to the sub-directory `recognize_digits_mlp.inference.model`.
After these steps, you can run the example using:
You will get output like the following: