Commit

[Dy2Stat] Fix ProgramTranslator.save_inference_model API Doc (PaddlePaddle#24584)

As the title.
zhhsplendid authored May 15, 2020
1 parent c4dd596 commit 5ff4535
Showing 1 changed file with 11 additions and 11 deletions.
22 changes: 11 additions & 11 deletions python/paddle/fluid/dygraph/dygraph_to_static/program_translator.py
@@ -559,14 +559,14 @@ def save_inference_model(self, dirname, feed=None, fetch=None):
 Args:
     dirname (str): the directory to save the inference model.
-    feed (list[int], optional): the input variable indices of the saved
-        inference model. If None, all input variables of the
-        ProgramTranslator would be the inputs of the saved inference
-        model. Default None.
-    fetch (list[int], optional): the output variable indices of the
-        saved inference model. If None, all output variables of the
-        TracedLayer object would be the outputs of the saved inference
-        model. Default None.
+    feed (list[int], optional): the indices of the input variables of the
+        dygraph function which will be saved as input variables in the
+        inference model. If None, all input variables of the dygraph
+        function would be the inputs of the saved inference model. Default None.
+    fetch (list[int], optional): the indices of the returned variables of the
+        dygraph function which will be saved as output variables in the
+        inference model. If None, all output variables of the dygraph
+        function would be the outputs of the saved inference model. Default None.
 Returns:
     None
 Examples:
@@ -599,12 +599,12 @@ def forward(self, x):
     adam.minimize(loss)
     net.clear_gradients()
     # Save inference model.
-    # Note that fetch=[0] means we set 'y' as the inference output.
+    # Note that fetch=[0] means we set 'z' as the inference output.
     prog_trans = ProgramTranslator()
     prog_trans.save_inference_model("./dy2stat_infer_model", fetch=[0])
-    # In this example, the inference model will be pruned based on input (x) and
-    # output (y). The pruned inference program is going to be saved in the folder
+    # In this example, the inference model will be pruned based on output (z).
+    # The pruned inference program is going to be saved in the folder
     # "./dy2stat_infer_model" and parameters are going to be saved in separate
     # files in the folder.
 """
