Replace matmul(v2) with fused_matmul during oneDNN fuse passes #49515
Merged: qingqing01 merged 58 commits into PaddlePaddle:develop from Silv3S:remove_extra_matmul_attrs on Feb 3, 2023
Commits (58)
- 8cf612a — replace matmul with matmul_v2 in fuse passes (Silv3S)
- 1a877be — Remove fusion logic from matmul (Silv3S)
- 71da01e — Merge branch 'develop' into clean_matmuls (Silv3S)
- e47c071 — removing fusion methods (Silv3S)
- 2d863cd — add proper name (Silv3S)
- 02fbcb9 — adjust namespaces (Silv3S)
- 7df1a54 — clean attrs in python tests (Silv3S)
- c7b4e05 — delete checkpoint and restore matmul version (Silv3S)
- c98f126 — remove unused code (Silv3S)
- 53f6ae5 — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- a301add — matmul and reshape/transpose fuses migrated (Silv3S)
- 3cff939 — split MatmulOneDNN headers (Silv3S)
- cc26096 — fuse activation and eltwise_add (Silv3S)
- a8c3774 — add fuse_activation (Silv3S)
- edb6615 — matmul_transpose_reshape/reshape_transpose_matmul (Silv3S)
- 2bf6d5a — matmul + elementwise_add (fused) (Silv3S)
- b0bf97e — activation temporary modification (Silv3S)
- 773f3c1 — Merge branch 'develop' into remove_extra_matmul_attrs (Silv3S)
- aee35ec — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- cac16fb — merge newest develop (Silv3S)
- 1cfa75b — revert dependency from other PR (Silv3S)
- b2e9baa — Merge branch 'remove_extra_matmul_attrs' of https://github.com/Silv3S… (Silv3S)
- 5eb838c — remove dependency from other PR (Silv3S)
- dc3cf8e — revert pbtxt (Silv3S)
- 5833035 — remove placeholders from matmul_v2 (Silv3S)
- 5d71310 — add description in OPMaker (Silv3S)
- 8195aa4 — remove matmul_v2_op.h and all dependencies (Silv3S)
- 9e1c32b — remove dims changing in base op (Silv3S)
- c7df785 — add possibility to fuse already fused_matmul (Silv3S)
- 946bf04 — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- 415202e — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- 52d43ca — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- 8687536 — restart broken CI (Silv3S)
- 4bd883f — Empty-Commit (Silv3S)
- 35ad305 — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- efaacbf — revert matmul_utils.h (Silv3S)
- 60b2eec — codestyle (Silv3S)
- 3d266dd — adjust imports (Silv3S)
- 39edccd — add pbtxt file (Silv3S)
- 25659f4 — Merge branch 'PaddlePaddle:develop' into remove_extra_matmul_attrs (Silv3S)
- 0dc52e0 — 100% matmul unit tests coverage (Silv3S)
- a00c10e — trigger CI with minimal changes to develop (Silv3S)
- 274f399 — adjust changes to develop (Silv3S)
- bd57ba6 — add fused_matmul op (Silv3S)
- a825670 — inherit base ops (Silv3S)
- dbbeacc — add "v2" (Silv3S)
- 2276699 — move OPMaker (Silv3S)
- 7f741b9 — Gradually add fused_matmul files (Silv3S)
- 2fa89a8 — second batch of fused_matmul changes (Silv3S)
- bef0d26 — split infershapes of matmul_v2 and fused_matmul (Silv3S)
- d61c27b — 2023 (Silv3S)
- 862671a — Merge branch 'develop' into remove_extra_matmul_attrs (Silv3S)
- f4b945a — inherit fused_matmul from matmul_v2 (Silv3S)
- b8dc210 — Update paddle/phi/backends/onednn/onednn_reuse.h (Silv3S)
- 1ce346e — Update paddle/phi/kernels/fusion/onednn/fused_matmul_kernel.cc (Silv3S)
- 51e704a — resolve conflicts (Silv3S)
- 5cbb217 — Merge branch 'develop' into remove_extra_matmul_attrs (Silv3S)
- f0e2abe — codestyle (Silv3S)
Silv3S File filter
Filter by extension
Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
There are no files selected for viewing
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
Oops, something went wrong.
Add this suggestion to a batch that can be applied as a single commit.
This suggestion is invalid because no changes were made to the code.
Suggestions cannot be applied while the pull request is closed.
Suggestions cannot be applied while viewing a subset of changes.
Only one suggestion per line can be applied in a batch.
Add this suggestion to a batch that can be applied as a single commit.
Applying suggestions on deleted lines is not supported.
You must change the existing code in this line in order to create a valid suggestion.
Outdated suggestions cannot be applied.
This suggestion has been applied or marked resolved.
Suggestions cannot be applied from pending reviews.
Suggestions cannot be applied on multi-line comments.
Suggestions cannot be applied while the pull request is queued to merge.
Suggestion cannot be applied right now. Please check back later.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
There are some differences in the broadcast/computational logic between `matmul_v1` and `matmul_v2` (see the referenced code). Are there any problems when replacing one with the other directly?
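For context, `matmul_v2` follows NumPy-style batch broadcasting (batch dimensions are right-aligned and size-1 dimensions stretch). Below is a minimal pure-Python sketch of that shape rule; the function names are illustrative, not Paddle API. Whether `matmul_v1` applies exactly the same rule is the question raised here.

```python
def broadcast_batch_dims(a_shape, b_shape):
    """Broadcast the batch (all-but-last-two) dims of a batched matmul,
    NumPy-style: right-align the shapes and let 1 stretch to match."""
    a_batch, b_batch = list(a_shape[:-2]), list(b_shape[:-2])
    n = max(len(a_batch), len(b_batch))
    a_batch = [1] * (n - len(a_batch)) + a_batch  # pad shorter shape with 1s
    b_batch = [1] * (n - len(b_batch)) + b_batch
    out = []
    for x, y in zip(a_batch, b_batch):
        if x != y and x != 1 and y != 1:
            raise ValueError(f"incompatible batch dims {x} and {y}")
        out.append(max(x, y))
    return out

def matmul_out_shape(a_shape, b_shape):
    """Output shape of a @ b for operands with >= 2 dims."""
    if a_shape[-1] != b_shape[-2]:
        raise ValueError("inner dims do not match")
    return broadcast_batch_dims(a_shape, b_shape) + [a_shape[-2], b_shape[-1]]

print(matmul_out_shape([2, 3, 4], [4, 5]))     # [2, 3, 5]
print(matmul_out_shape([1, 3, 4], [7, 4, 5]))  # [7, 3, 5]
```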
Hi, I don't think that would be a problem. For some time now the oneDNN versions of both `matmul` and `matmul_v2` have been using the same code (#44640, #48162). The declared attributes of the two ops are mostly the same; the only differences are that `alpha` is not supported in `matmul_v2`, and `transpose` is renamed to `trans`. The goal of this PR is just to extract the fusion logic from the base op and base kernel. The fused op is a superset: it declares all the extra attributes in its OPMaker, and the fused kernel implements the fusion logic.
I've adjusted all fuse pass unit tests to work with `fused_matmul`. This PR has also been checked in our internal validation and did not report any accuracy drop.
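The extraction described above can be pictured as a toy graph rewrite: during a oneDNN fuse pass, a `matmul_v2` node (or an already-fused `fused_matmul` node, per commit c7df785) is swapped for a `fused_matmul` node that carries the extra fusion attributes. This is an illustrative pure-Python sketch under that assumption, not the actual PaddlePaddle pass infrastructure; the helper names and dict-based IR are hypothetical.

```python
# Toy IR node: just an op type plus an attribute dict (hypothetical IR).
def make_node(op_type, **attrs):
    return {"op": op_type, "attrs": dict(attrs)}

def fuse_matmul_activation(node, activation):
    """Sketch of one fuse-pass step: replace matmul_v2 (or an already
    fused matmul) with fused_matmul, attaching the fusion attribute.
    Mirrors the idea that fused_matmul is a superset of matmul_v2:
    all base attributes are carried over unchanged."""
    if node["op"] not in ("matmul_v2", "fused_matmul"):
        return node  # pass does not apply to other ops
    fused = make_node("fused_matmul", **node["attrs"])
    fused["attrs"]["fuse_activation"] = activation
    return fused

node = make_node("matmul_v2", trans_x=False, trans_y=True)
node = fuse_matmul_activation(node, "relu")
print(node["op"])                        # fused_matmul
print(node["attrs"]["fuse_activation"])  # relu
print(node["attrs"]["trans_y"])          # base attrs preserved: True
```

Applying the pass again to the resulting `fused_matmul` node still works, which is the point of allowing already-fused matmuls as pass inputs.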