This repository was archived by the owner on Nov 22, 2022. It is now read-only.

make predict_iter support running through multiple batches #251

Closed
wants to merge 1 commit

Conversation


@liaimi liaimi commented Jan 28, 2019

Summary: This diff adds a flow for offline prediction using pytext models, making predict_iter support running through multiple batches; it also includes a minor fix for the glue_benchmark script.

Differential Revision: D13831370

fbshipit-source-id: d532a976c0a8881fd51a8a913d19465f9c958c6d
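
For context, here is a minimal, self-contained sketch of the behavior the summary describes, using plain Python sequences rather than pytext's DataHandler machinery. The `predict_iter` function below is an illustrative stand-in, and the `batch_size=None` default reproducing the old single-batch behavior is an assumption, not the PR's actual code:

```python
# Illustrative sketch only -- not the PR's actual implementation.
# With batch_size=None the whole dataset comes back as one batch (the old
# behavior); with batch_size set, every batch is yielded in turn (the new one).
from typing import Any, Iterator, List, Optional, Sequence


def predict_iter(
    examples: Sequence[Any], batch_size: Optional[int] = None
) -> Iterator[List[Any]]:
    if not examples:
        return
    size = batch_size if batch_size is not None else len(examples)
    for start in range(0, len(examples), size):
        yield list(examples[start : start + size])


# Three examples with batch_size=2 yield two batches: [a, b] and [c].
assert list(predict_iter(["a", "b", "c"], batch_size=2)) == [["a", "b"], ["c"]]
# Without batch_size, everything is a single batch, as before.
assert list(predict_iter(["a", "b", "c"])) == [["a", "b", "c"]]
```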
@facebook-github-bot added the CLA Signed label on Jan 28, 2019
@liaimi changed the title from "add offline prediction flow" to "make predict_iter support running through multiple batches" on Jan 28, 2019
```diff
-        # only return the first batch since there is only one
-        return input, context
+        if batch_size is not None:
+            return it
```

This breaks Task.predict, and thus `pytext predict-py`, since callers assume a 2-tuple is returned. Also, the if-predicate is always true because batch_size gets defaulted to len(ds) above.
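
To make the failure mode concrete, here is a hedged stand-in mirroring the reviewed logic (the helper below is hypothetical, not pytext's code): once batch_size is defaulted to len(ds), the `is not None` check can never be false, so the iterator is always returned and any caller that unpacks a 2-tuple breaks.

```python
# Hypothetical stand-in for the reviewed logic; not the real pytext code.
def get_predict_iter_sketch(ds, batch_size=None):
    if batch_size is None:
        batch_size = len(ds)  # "defaulted to len(ds) above", per the review
    it = iter([("input", "context")])  # stand-in for the real batch iterator
    if batch_size is not None:  # always true after the defaulting above
        return it
    # Unreachable: the 2-tuple return path that Task.predict relies on.
    input, context = next(it)
    return input, context


result = get_predict_iter_sketch(ds=[1, 2, 3])
# A Task.predict-style caller that unpacks a 2-tuple now gets an iterator:
#     input, context = result
# raises ValueError at runtime instead of yielding one (input, context) batch.
```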
