
Static grad scaler #6135

Merged: 3 commits into fea/graph/refine_outputs_regst_num on Sep 2, 2021
Conversation

leaves-zwx (Contributor)

No description provided.

@leaves-zwx leaves-zwx closed this Sep 2, 2021
@leaves-zwx leaves-zwx reopened this Sep 2, 2021
@strint strint merged commit 9e31008 into fea/graph/refine_outputs_regst_num Sep 2, 2021
@strint strint deleted the static_grad_scaler branch September 2, 2021 05:10
oneflow-ci-bot added a commit that referenced this pull request Sep 2, 2021
* dirty fast implementation of nn.Graph pipeline outputs buffer size

* add debug profile range and fix bug of buffer

* support consistent_tensor.to(copy=True)

* first add

* add api

* auto format by CI

* refactor graph outputs buffer num

* refactor _consistent_tensor_to

* pass test

* avoid gc id re-use

* use consistent to for copy

* let cur_rank_phy_tensor be function::Empty(...) if !parallel_id.has_value()

* auto format by CI

* no OF_ENV_BARRIER when sync vm

* auto format by CI

* auto format by CI

* StaticGradScaler (#6135)

* refine grad scaler

Co-authored-by: chengtbf <472491134@qq.com>
Co-authored-by: Xinqi Li <lixinqi0703106@163.com>
Co-authored-by: oneflow-ci-bot <ci-bot@oneflow.org>
Co-authored-by: oneflow-ci-bot <69100618+oneflow-ci-bot@users.noreply.github.com>
Co-authored-by: Li Xinqi <lixinqi2010@gmail.com>
Co-authored-by: leaves-zwx <kunta0932@gmail.com>
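For context, the PR title refers to a static (constant-factor) gradient scaler, commonly used in mixed-precision training: the loss is multiplied by a fixed factor before the backward pass so small gradients stay representable in reduced precision, and gradients are divided by the same factor before the optimizer step. The sketch below is a framework-agnostic illustration of the idea only; the class and method names are hypothetical and do not reproduce OneFlow's actual implementation or API.

```python
class StaticGradScaler:
    """Illustrative static gradient scaler (hypothetical, not OneFlow's API).

    Scales the loss by a fixed factor and unscales gradients by the same
    factor, so reduced-precision gradients stay in a representable range.
    Unlike a dynamic scaler, the factor never changes during training.
    """

    def __init__(self, scale_factor: float):
        if scale_factor <= 0:
            raise ValueError("scale_factor must be positive")
        self.scale_factor = scale_factor

    def scale_loss(self, loss: float) -> float:
        # Applied before the backward pass: amplifies small gradient values.
        return loss * self.scale_factor

    def unscale_grad(self, grad: float) -> float:
        # Applied before the optimizer step: restores the true gradient.
        return grad / self.scale_factor


# Powers of two are a common choice because scaling by them is exact
# in binary floating point, so the round trip loses no precision.
scaler = StaticGradScaler(2.0 ** 10)  # 1024
scaled = scaler.scale_loss(0.5)
restored = scaler.unscale_grad(scaled)
```

Because the factor is constant, a static scaler has no overflow-detection or step-skipping logic; picking a scale that is large enough to preserve small gradients but small enough to avoid overflow is left to the user.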
leaves-zwx added a commit that referenced this pull request Sep 3, 2021
2 participants