[ExecuTorch][Weight Sharing][XNNPACK] load named data map data for xnnpack #9294


Merged: 5 commits merged into main on Mar 15, 2025

Conversation

pytorchbot (Collaborator)

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #9152 by @mcr229
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/8/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/8/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/7/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/8/orig
@diff-train-skip-merge

mcr229 added 2 commits March 13, 2025 23:59
…ager

Pull Request resolved: #9151

We enable backends to return named data by adding a NamedDataStoreOutput field to the preprocess result. This is a fully backwards-compatible (BC) change: backends with an existing preprocess implementation see no difference unless they explicitly opt in.

To leverage the new NamedDataStore, backend developers can initialize a NamedDataStore() within preprocess, call add_named_data() on the store, and return NamedDataStore.get_named_data_store_output() in the preprocess result, like so:

```
def preprocess(
    exported_program: ExportedProgram,
    compile_specs: List[CompileSpec],
) -> PreprocessResult:
    named_data_store = NamedDataStore()

    for node in exported_program.graph.nodes:
        # Add any blob the backend wants serialized, keyed by a string name.
        # "data_bytes" and "processed_bytes" below are placeholders for the
        # backend's actual payloads.
        named_data_store.add_named_data("name", data_bytes)

    return PreprocessResult(
        processed_bytes=processed_bytes,
        debug_handle_map={},
        data_store_output=named_data_store.get_named_data_store_output(),
    )
```


Under the hood, the data store output is embedded in the LoweredBackendModule (serializing a LoweredBackendModule by itself with a named_data_store_output is still a TODO). Via the EdgeProgramManager path, however, we add the named data store outputs to the edge_program_manager's named data store, which keeps track of all the named data returned by backends.
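
For context, here is a minimal sketch of the user-facing lowering flow inside which this bookkeeping happens. The toy module, shapes, and file name are placeholders; the named-data tracking itself is internal to the lowering APIs, not something the caller interacts with:

```
import torch

from executorch.backends.xnnpack.partition.xnnpack_partitioner import XnnpackPartitioner
from executorch.exir import to_edge_transform_and_lower


class SmallLinear(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(8, 8)

    def forward(self, x):
        return self.linear(x)


exported = torch.export.export(SmallLinear(), (torch.randn(1, 8),))

# XNNPACK's preprocess runs during lowering; any NamedDataStoreOutput it
# returns is collected by the edge program manager's named data store.
edge = to_edge_transform_and_lower(exported, partitioner=[XnnpackPartitioner()])

executorch_program = edge.to_executorch()
with open("model.pte", "wb") as f:
    f.write(executorch_program.buffer)
```

Nothing in this flow changes for users; only backends that opt into the NamedDataStore produce additional named data in the resulting program.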
ghstack-source-id: 271732049
@exported-using-ghexport

Differential Revision: [D70451660](https://our.internmc.facebook.com/intern/diff/D70451660/)
…npack

Pull Request resolved: #9152

If data is serialized into the NamedDataMap, then we overload getConstantDataPtr to retrieve the data from the named data map. This is done in a backwards-compatible way: if no data is serialized into the named data map, we still load the data from the flatbuffer payload.
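
As a rough illustration of that fallback, here is a Python sketch of the lookup logic. The actual change lives in the XNNPACK backend's C++ runtime around getConstantDataPtr; the names and containers below are placeholders, not the real runtime API:

```
def get_constant_data_ptr(tensor_name, named_data_map, flatbuffer_constants):
    # New path: prefer weights serialized into the named data map.
    if named_data_map is not None and tensor_name in named_data_map:
        return named_data_map[tensor_name]
    # Old path: fall back to the constant data embedded in the flatbuffer
    # payload, so PTE files produced before the AoT change still load.
    return flatbuffer_constants[tensor_name]
```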

Since the runtime change here is being made before the AoT changes, all CI on this diff by itself should verify that the changes made here are backwards compatible.

Note: we do not address runtime memory usage at this point. The WeightCache will be implemented in the next diff, so if we load via the same key across different methods, we still pack twice and allocate two instances of the packed weights.
ghstack-source-id: 271732048
@exported-using-ghexport

Differential Revision: [D70315209](https://our.internmc.facebook.com/intern/diff/D70315209/)

pytorch-bot bot commented Mar 14, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/9294

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Mar 14, 2025
Base automatically changed from gh/mcr229/7/orig to main March 14, 2025 23:27
… named data map (#9295)

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #9153 by @mcr229
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/9/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/9/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/8/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/9/orig
@diff-train-skip-merge

Co-authored-by: Max Ren <maxren@meta.com>

This PR needs a release notes: label

If your changes are user facing and intended to be a part of release notes, please use a label starting with release notes:.

If not, please add the topic: not user facing label.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "topic: not user facing"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

…ap (#9296)

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #9154 by @mcr229
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/10/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/10/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/9/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/10/orig
@diff-train-skip-merge

---------

Co-authored-by: Max Ren <maxren@meta.com>
@SS-JIA requested review from lucylq and swolchok as code owners March 15, 2025 02:31
This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #9155 by @mcr229
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/11/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/11/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/10/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/11/orig
@diff-train-skip-merge

---------

Co-authored-by: Max Ren <maxren@meta.com>
@SS-JIA merged commit 23fe285 into main Mar 15, 2025
74 of 76 checks passed
@SS-JIA deleted the gh/mcr229/8/orig branch March 15, 2025 02:36
@SS-JIA restored the gh/mcr229/8/orig branch March 15, 2025 02:36
DannyYuyang-quic pushed a commit to CodeLinaro/executorch that referenced this pull request Apr 2, 2025
…npack (pytorch#9294)

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: pytorch#9152 by @mcr229
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/8/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/8/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/mcr229/7/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/mcr229/8/orig
@diff-train-skip-merge

---------

Co-authored-by: Max Ren <maxren@meta.com>
@SS-JIA deleted the gh/mcr229/8/orig branch April 16, 2025 20:55
Labels
CLA Signed
4 participants