Consider adding a PADDLE_INFERENCE option and a PADDLE_MOBILE macro #4221

Closed
@hedaoyuan

Description

@hedaoyuan

At present, when we do model inference in a mobile environment, we want Paddle to be as small as possible. So, when compiling Paddle for mobile (Android, iOS), we need to be able to strip out unneeded Paddle modules, thereby reducing the size of the inference program.

Based on the previous survey in #1845, we found several modules (such as libpaddle_pserver.a, libpaddle_trainer_lib.a, and libpaddle_api.a) that are not related to inference but still take up space in the final inference binary. So, consider adding a PADDLE_INFERENCE switch to strip these modules at compile time. Some of the CMakeLists.txt files already use WITH_C_API to do something similar. The remaining work is to replace WITH_C_API with PADDLE_INFERENCE and to refine the CMakeLists.txt files for module stripping.
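A minimal sketch of what such a switch could look like, assuming hypothetical subdirectory names (the real Paddle directory layout may differ; this is only to illustrate the idea of gating non-inference modules behind the option and defining a PADDLE_MOBILE-style macro):

```cmake
# Hypothetical sketch: a PADDLE_INFERENCE option that strips
# training-only modules from the build.
option(PADDLE_INFERENCE "Build only the modules needed for inference" OFF)

if(PADDLE_INFERENCE)
    # Inference-only build: define a macro so C++ sources can also
    # compile out training-only code paths (name is illustrative).
    add_definitions(-DPADDLE_MOBILE)
else()
    # Full build: include pserver, trainer, and API modules,
    # which are not needed for inference.
    add_subdirectory(pserver)
    add_subdirectory(trainer)
    add_subdirectory(api)
endif()
```

A mobile cross-compile would then configure with something like `cmake -DPADDLE_INFERENCE=ON ..`, and the pserver/trainer/api libraries would simply never be built or linked.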
