
[autoparallel] Patch meta information of torch.tanh() and torch.nn.Dropout #2773


Merged

merged 61 commits into hpcaitech:main on Feb 22, 2023

Conversation

Cypher30
Contributor

@Cypher30 Cypher30 commented Feb 16, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234
Resolved #2630
Resolved #2631

📝 What does this PR do?

Summarize your work here.
If you have any plots/diagrams/screenshots/tables, please attach them here.

In this PR, I patch the meta information of torch.tanh() and torch.nn.Dropout. I also rework all the meta information generators in auto_parallel/meta_profiler/meta_registry/activation.py so that they follow the same template, which gives us cleaner code. We could consider refactoring more of the code in meta_registry in the same way in the future.

As before, the test is not supported on torch 1.11.0, so I attach the results here:

[Screenshot: test results, 2023-02-16 22:10]

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

Cypher30 and others added 30 commits July 14, 2022 16:07
@Cypher30 Cypher30 added Run Build and Test auto-parallel related to the auto-parallel feature labels Feb 16, 2023
@Cypher30 Cypher30 changed the title from [autoparallel] Patch meta information of torch.tanh() to [autoparallel] Patch meta information of torch.tanh() and torch.nn.Dropout on Feb 16, 2023
@github-actions
Contributor

The code coverage for the changed files is 45%.

Click me to view the complete report
Name                                                                                   Stmts   Miss  Cover
----------------------------------------------------------------------------------------------------------
colossalai/auto_parallel/meta_profiler/meta_registry/activation.py                        29     15    48%
tests/test_auto_parallel/test_tensor_shard/test_metainfo/test_activation_metainfo.py      44     25    43%
----------------------------------------------------------------------------------------------------------
TOTAL                                                                                     73     40    45%

1 similar comment

@github-actions
Contributor

The code coverage for the changed files is 44%.

Click me to view the complete report
Name                                                                                   Stmts   Miss  Cover
----------------------------------------------------------------------------------------------------------
colossalai/auto_parallel/meta_profiler/meta_registry/activation.py                        28     15    46%
tests/test_auto_parallel/test_tensor_shard/test_metainfo/test_activation_metainfo.py      44     25    43%
----------------------------------------------------------------------------------------------------------
TOTAL                                                                                     72     40    44%

@YuliangLiu0306 YuliangLiu0306 merged commit fcc4097 into hpcaitech:main Feb 22, 2023
Labels
auto-parallel related to the auto-parallel feature
Development

Successfully merging this pull request may close these issues.

[FEATURE]: Patch meta information of torch.nn.Dropout
[FEATURE]: Patch meta information of torch.tanh()
2 participants