
Backport transposes optimization to v0.3.0 #3994

Merged
soumith merged 2 commits into pytorch:v0.3.0 on Dec 4, 2017

Conversation

dzhulgakov
Collaborator

Pretty much only node() changes. With this change there are no more annoying nested transposes.

Tests from fb-universe pass (modulo renumberings)
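For illustration (not part of the PR), a minimal PyTorch snippet showing the nested-transpose pattern this change eliminates: two stacked permutes whose permutations compose to the identity are a no-op.

```python
import torch

x = torch.randn(2, 3, 4)

# Two stacked transposes; before this change the exported graph kept
# both as nested Transpose nodes.
y = x.permute(2, 0, 1).permute(1, 2, 0)

# [2, 0, 1] followed by [1, 2, 0] composes to the identity permutation,
# so the optimizer can drop the pair entirely.
assert torch.equal(y, x)
```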

anderspapitto and others added 2 commits December 2, 2017 14:49
* Optimizer: optimize transposes in variety of circumstances (#3509)

* Optimizer: Optimize transposes in variety of circumstances

- No-op transposes
- Consecutive transposes (fuse them)
- Transposes into Gemm (fuse them into transA/transB parameter)

* touch up out of date comment
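A rough Python sketch of the three rules above, applied to a toy straight-line IR. The real pass is the C++ ONNX peephole optimizer in the PyTorch JIT; the node and attribute representation here is illustrative only, and it assumes each node consumes the previous node's output (the real pass checks uses in the graph).

```python
# Illustrative re-implementation of the three transpose rules on a toy IR;
# the actual pass operates on PyTorch's JIT graph in C++.

def compose(p, q):
    # Applying perm p, then perm q, equals applying perm [p[i] for i in q].
    return [p[i] for i in q]

def optimize(nodes):
    """nodes: list of ('Transpose', perm) or ('Gemm', attrs) tuples,
    assumed to form a straight-line chain feeding one another."""
    out = []
    for node in nodes:
        if node[0] == 'Transpose':
            perm = node[1]
            # Rule 2: fuse consecutive transposes into one.
            if out and out[-1][0] == 'Transpose':
                perm = compose(out.pop()[1], perm)
            # Rule 1: drop a transpose whose perm is the identity.
            if perm == list(range(len(perm))):
                continue
            out.append(('Transpose', perm))
        elif node[0] == 'Gemm' and out and out[-1] == ('Transpose', [1, 0]):
            # Rule 3: a 2-D [1, 0] transpose feeding Gemm's first input
            # becomes the transA attribute (transB is the analogous case).
            out.pop()
            attrs = dict(node[1])
            attrs['transA'] = 1 - attrs.get('transA', 0)
            out.append(('Gemm', attrs))
        else:
            out.append(node)
    return out

chain = [('Transpose', [1, 0]), ('Transpose', [1, 0]),  # cancels out
         ('Transpose', [1, 0]), ('Gemm', {})]           # folds into transA
assert optimize(chain) == [('Gemm', {'transA': 1})]
```

The Gemm rule is sound because ONNX Gemm computes `alpha * op(A) @ op(B) + beta * C`, where `op` transposes its argument when the corresponding transA/transB attribute is set, so an explicit 2-D transpose feeding input A is interchangeable with flipping transA.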
@pytorchbot
Collaborator

@dzhulgakov, thanks for your PR! We identified @zdevito to be a potential reviewer.

@dzhulgakov
Collaborator Author

@soumith @ezyang to merge if it's not too late

soumith merged commit fca8826 into pytorch:v0.3.0 on Dec 4, 2017
@soumith
Member

soumith commented Dec 4, 2017

Thanks @dzhulgakov !

soumith pushed a commit that referenced this pull request Dec 4, 2017
* Optimizer: optimize transposes in variety of circumstances (#3509)

* Backporting optimizer changes
peterjc123 pushed a commit to peterjc123/pytorch that referenced this pull request Dec 4, 2017
* Optimizer: optimize transposes in variety of circumstances (pytorch#3509)

* Backporting optimizer changes
soumith added the 0.3 label on Feb 3, 2018
wuhuikx pushed a commit to wuhuikx/pytorch that referenced this pull request Jan 30, 2020
* Optimizer: optimize transposes in variety of circumstances (pytorch#3509)

* Backporting optimizer changes