
Support exporting QNN models with Python wheels out-of-the-box #9474

Open
@jathu

Description

As part of #9019, we recently added CoreML export support "out of the box" for the executorch pip package: #9483. Developers can now export to the CoreML backend just by pip installing executorch; previously this required cloning the ExecuTorch repo and building everything from source. We anticipate this will encourage ET adoption for CoreML.

We want to provide the same developer experience when exporting models to Android on Qualcomm chips.

Currently, this requires developers to manually download the QNN SDK, clone the ExecuTorch repo, and build everything from source. As with CoreML, we should bundle this requirement into ExecuTorch. Given that the QNN SDK is currently only distributed as a zip file, here are some potential next steps:

  • [Option 1] [Preferred] Ideally, Qualcomm would provide the SDK through pip so we can include it as a dependency. This is what we currently do with CoreML.
  • [Option 2] [Preferred] While we wait for Qualcomm, we can temporarily bundle the library into executorch. This will require permission from Qualcomm.
  • [Option 3] [Not ideal] At the very least, we should automate downloading the SDK in the build script.
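Option 3 could start as a small pre-build helper along these lines. This is only a sketch: `QNN_SDK_ROOT` is the environment variable the Qualcomm backend setup docs already use, but the function name, default directory, and the download step are hypothetical, since the SDK zip sits behind Qualcomm's developer portal and has no stable public URL to hardcode here.

```python
import os
import pathlib


# Environment variable the Qualcomm backend setup conventionally uses to
# locate the QNN SDK.
QNN_SDK_ENV = "QNN_SDK_ROOT"


def resolve_qnn_sdk(default_dir: str = ".qnn-sdk") -> pathlib.Path:
    """Return the QNN SDK root, preferring an existing QNN_SDK_ROOT.

    Falls back to preparing default_dir for an automated download
    (the fetch-and-unzip step itself is left as a placeholder).
    """
    env_root = os.environ.get(QNN_SDK_ENV)
    if env_root and pathlib.Path(env_root).is_dir():
        # Developer already has an SDK; reuse it, as the manual flow does today.
        return pathlib.Path(env_root)

    # Placeholder for the automated-download step: fetch the SDK zip from
    # Qualcomm's portal and extract it into default_dir, then export
    # QNN_SDK_ROOT so the rest of the build picks it up.
    target = pathlib.Path(default_dir).resolve()
    target.mkdir(parents=True, exist_ok=True)
    os.environ[QNN_SDK_ENV] = str(target)
    return target
```

A build script would call this once before invoking CMake, so subsequent steps see a consistent `QNN_SDK_ROOT` regardless of whether the SDK was pre-installed or fetched on the fly.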

cc @larryliu0820 @cccclai @winskuo-quic @shewu-quic @cbilgin @lucylq

Metadata

Labels

module: build/install (Issues related to the cmake and buck2 builds, and to installing ExecuTorch)
partner: qualcomm (For backend delegation, kernels, demo, etc. from the 3rd-party partner, Qualcomm)

Projects

Status: Backlog
