Self attention with relative positional embedding #7356

Open
@vgrau98

Description

Is your feature request related to a problem? Please describe.
Enable relative positional embeddings in the attention block.

Describe the solution you'd like
A constructor argument specifying which type of relative positional embedding to use (or None to disable it).

Describe alternatives you've considered
NA

Additional context
This should be useful for network implementations such as SAM (https://arxiv.org/abs/2304.02643, #6357).
PR #7346 has been opened with a suggestion for decomposed relative positional embedding (https://arxiv.org/abs/2112.01526).
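
For context, here is a minimal PyTorch sketch of the decomposed variant referenced above, following https://arxiv.org/abs/2112.01526 and the pattern used in SAM's image encoder. The bias added to the attention logits factorizes over axes: q·R_h[Δh] + q·R_w[Δw], with one learned table per spatial axis. Names such as `get_rel_pos`, `add_decomposed_rel_pos`, `rel_pos_h`, and `rel_pos_w` are illustrative, not necessarily the identifiers used in #7346.

```python
import torch
import torch.nn.functional as F


def get_rel_pos(q_size: int, k_size: int, rel_pos: torch.Tensor) -> torch.Tensor:
    """Gather the learned relative-position table for all (query, key) pairs."""
    max_rel_dist = 2 * max(q_size, k_size) - 1
    if rel_pos.shape[0] != max_rel_dist:
        # Interpolate the table if it was trained at a different resolution.
        rel_pos = F.interpolate(
            rel_pos.permute(1, 0).unsqueeze(0),  # (1, head_dim, L)
            size=max_rel_dist,
            mode="linear",
        ).squeeze(0).permute(1, 0)
    # Relative coordinates, scaled when q and k resolutions differ,
    # then shifted to non-negative indices into the table.
    q_coords = torch.arange(q_size)[:, None] * max(k_size / q_size, 1.0)
    k_coords = torch.arange(k_size)[None, :] * max(q_size / k_size, 1.0)
    rel_coords = (q_coords - k_coords) + (k_size - 1) * max(q_size / k_size, 1.0)
    return rel_pos[rel_coords.long()]  # (q_size, k_size, head_dim)


def add_decomposed_rel_pos(
    attn: torch.Tensor,       # (B * num_heads, q_h * q_w, k_h * k_w) logits
    q: torch.Tensor,          # (B * num_heads, q_h * q_w, head_dim) queries
    rel_pos_h: torch.Tensor,  # (2 * H - 1, head_dim) learned table, height axis
    rel_pos_w: torch.Tensor,  # (2 * W - 1, head_dim) learned table, width axis
    q_size: tuple,
    k_size: tuple,
) -> torch.Tensor:
    """Add the decomposed bias q*R_h[dh] + q*R_w[dw] to the attention logits."""
    q_h, q_w = q_size
    k_h, k_w = k_size
    Rh = get_rel_pos(q_h, k_h, rel_pos_h)  # (q_h, k_h, head_dim)
    Rw = get_rel_pos(q_w, k_w, rel_pos_w)  # (q_w, k_w, head_dim)

    B, _, dim = q.shape
    r_q = q.reshape(B, q_h, q_w, dim)
    rel_h = torch.einsum("bhwc,hkc->bhwk", r_q, Rh)  # bias along the height axis
    rel_w = torch.einsum("bhwc,wkc->bhwk", r_q, Rw)  # bias along the width axis

    # Broadcast each axis bias over the other axis, then flatten back.
    return (
        attn.view(B, q_h, q_w, k_h, k_w)
        + rel_h[:, :, :, :, None]
        + rel_w[:, :, :, None, :]
    ).view(B, q_h * q_w, k_h * k_w)
```

In an attention forward pass this would be applied between the scaled `q @ k.transpose(-2, -1)` and the softmax, e.g. `attn = add_decomposed_rel_pos(attn, q, self.rel_pos_h, self.rel_pos_w, (H, W), (H, W))`. The constructor argument suggested above could then select between None, this decomposed variant, and any future variants.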
