This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

batch packing for LM #413

Closed · wants to merge 1 commit

Conversation

borguz (Contributor) commented Mar 19, 2019

Summary: Take a stream of tokens and pack it into square batches of size batch_size x max_seq_len, with no padding (except in the last batch).

Differential Revision: D14518399
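The packing described above can be sketched as follows. This is a minimal illustration, not the actual PyText implementation from this PR; `pack_batches` is a hypothetical helper, and the choice to leave the final batch ragged (rather than pad it) is an assumption based on the summary.

```python
from typing import Iterable, Iterator, List


def pack_batches(
    tokens: Iterable[int], batch_size: int, max_seq_len: int
) -> Iterator[List[List[int]]]:
    """Pack a flat token stream into batch_size x max_seq_len batches.

    Every batch except possibly the last is a full rectangle, so no
    padding is needed; the final batch may be ragged (hypothetical
    handling -- the PR may pad or drop it instead).
    """
    buf: List[int] = []
    chunk = batch_size * max_seq_len
    for tok in tokens:
        buf.append(tok)
        if len(buf) == chunk:
            # Emit a full square batch: batch_size rows of max_seq_len tokens.
            yield [buf[i : i + max_seq_len] for i in range(0, chunk, max_seq_len)]
            buf = []
    if buf:
        # Leftover tokens form a final, possibly partial batch.
        yield [buf[i : i + max_seq_len] for i in range(0, len(buf), max_seq_len)]


batches = list(pack_batches(range(10), batch_size=2, max_seq_len=3))
# First batch is a full 2x3 rectangle; the last batch is ragged.
```

Because only the last batch can be partial, no pad tokens are wasted on interior batches, which is the point of this packing scheme for LM training.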

@facebook-github-bot added the "CLA Signed" label on Mar 19, 2019
borguz added a commit to borguz/pytext-1 that referenced this pull request Mar 20, 2019
Summary:
Pull Request resolved: facebookresearch#413

Take a stream of tokens and pack it into square batches of size batch_size x max_seq_len, with no padding (except in the last batch).

Differential Revision: D14518399

fbshipit-source-id: 83b2af1eceb1cb9b196015c87bf25ad208d89a1c
fbshipit-source-id: 9bad7891a764549283897992cec657892fa340e2
facebook-github-bot (Contributor) commented:

This pull request has been merged in 19e6274.
