doc/inference api #11332
Conversation
};

struct PaddleBuf {
  void* data;  // pointer to the data memory.
need to document the data layout. Is it row-major?
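For reference, a minimal sketch (using an invented `ToyBuf` stand-in, not the real `PaddleBuf`) of what documenting a row-major layout would mean for element access; whether `PaddleBuf` is actually row-major is exactly the open question here:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy mirror of PaddleBuf, only for illustration -- the real struct is
// declared in paddle_inference_api.h. This sketch just shows what
// "row-major" would mean for indexing into the raw buffer.
struct ToyBuf {
  void* data;     // pointer to the data memory
  size_t length;  // size of the memory in bytes
};

// Flat offset of element (row, col) in a row-major rows x cols matrix.
size_t RowMajorIndex(size_t row, size_t col, size_t cols) {
  return row * cols + col;
}

// Read one float element from the buffer under the row-major assumption.
float ReadElem(const ToyBuf& buf, size_t row, size_t col, size_t cols) {
  return static_cast<const float*>(buf.data)[RowMajorIndex(row, col, cols)];
}
```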
The main interface is `PaddlePredictor`, there are following methods

- `bool Run(const std::vector<PaddleTensor>& inputs, std::vector<PaddleTensor>* output_data)`
  - take inputs and output `output_data`
Document pointer ownership?
Document thread-safety?
The return is a `unique_ptr`, so the ownership will transparently transfer.
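A toy illustration of that ownership rule (the type here is invented; the real API returns `std::unique_ptr<PaddlePredictor>`): a factory returning `std::unique_ptr` hands exclusive ownership to the caller, and ownership moves again on `std::move`.

```cpp
#include <cassert>
#include <memory>
#include <utility>

// Invented stand-in factory; only demonstrates unique_ptr ownership
// transfer, not the real PaddlePredictor creation.
std::unique_ptr<int> MakeValue() {
  return std::make_unique<int>(42);
}
```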
Does `high_level_api.md` need to be moved under `doc/fluid`?
# Inference High-level APIs
This document describes the high-level inference APIs one can use to easily deploy a Paddle model for an application.

The APIs are described in `paddle_inference_api.h`, just one header file, and two libaries `libpaddle_fluid.so` and `libpaddle_fluid_api.so` are needed.
For the API, would providing just `libpaddle_fluid_api.so` be enough?
Can it be compiled in? I found earlier that it doesn't work; `paddle_fluid.so` is still needed.

## PaddleTensor
Forget the hard-to-understand `LoDTensor`,
we provide the `PaddleTensor` data structure is to give a general tensor interface.
Line 8 has two predicates, "provide" and "is".

The data is stored in a continuous memory `PaddleBuf`, and tensor's data type is specified by a `PaddleDType`.
The `name` field is used to specify the name of input variable,
that is important when there are multiple inputs and need to distiuish which variable to set the content.
"which variable to set the content" is missing a predicate.
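The `name`-based lookup that the hunk above describes could be sketched like this (hypothetical, stripped-down tensor type; the real `PaddleTensor` also carries shape, dtype, and a `PaddleBuf`, and the input names are invented):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical simplified PaddleTensor with just the fields needed to show
// how the `name` field disambiguates multiple inputs.
struct ToyTensor {
  std::string name;
  std::vector<float> data;
};

// Return the input whose name matches, or nullptr if there is none.
const ToyTensor* FindInput(const std::vector<ToyTensor>& inputs,
                           const std::string& name) {
  for (const auto& t : inputs) {
    if (t.name == name) return &t;
  }
  return nullptr;
}
```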
that is important when there are multiple inputs and need to distiuish which variable to set the content.

## engine
The inference APIs has different underlying implementation, currently there are two valid engines:
have different underlying implementations
## engine
The inference APIs has different underlying implementation, currently there are two valid engines:

- the native engine, which is consists of the native operators and framework,
Please fix the grammar of "is consists of".
This sentence looks fine to me.
The inference APIs has different underlying implementation, currently there are two valid engines:

- the native engine, which is consists of the native operators and framework,
- the Anakin engine, which is a Anakin library embeded.
A link to Anakin could be added here.

The native engine takes a native Paddle model as input, and supports any model that trained by Paddle.

The Anakin engine can only take the Anakin model as input(user need to manully transform the format first) and currently not all Paddle models are supported.
Should line 41 be moved after line 38, and line 43 after line 39?
done
- take inputs and output `output_data`
- `Clone` to clone a predictor from an existing one, with model parameter shared.

There is a factory method to help create a predictor
Missing a period.
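The factory-plus-`Clone` pattern described in the hunk above could be sketched as follows, using invented toy types; the real signatures live in `paddle_inference_api.h` and may differ:

```cpp
#include <assert.h>
#include <memory>
#include <string>

// Hypothetical config; a real one would name the model to load.
struct ToyConfig {
  std::string model_dir;  // invented field
};

struct ToyPredictor {
  std::shared_ptr<int> params;  // stands in for the shared model parameters

  // Clone shares parameters with the original, mirroring "with model
  // parameter shared" from the doc.
  std::unique_ptr<ToyPredictor> Clone() const {
    auto copy = std::make_unique<ToyPredictor>();
    copy->params = params;
    return copy;
  }
};

// Factory: builds a predictor from a config and transfers ownership
// to the caller via unique_ptr.
std::unique_ptr<ToyPredictor> CreateToyPredictor(const ToyConfig& config) {
  (void)config;  // a real factory would load the model from config.model_dir
  auto p = std::make_unique<ToyPredictor>();
  p->params = std::make_shared<int>(0);
  return p;
}
```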
## Reference

- [paddle_inference_api.h](./paddle_inference_api.h)
- [demos](./demo)
Will add it in another PR.
The APIs are described in `paddle_inference_api.h`, just one header file, and two libaries `libpaddle_fluid.so` and `libpaddle_fluid_api.so` are needed.

## PaddleTensor
Forget the hard-to-understand `LoDTensor`,
Can remove this line