Add documentation. (#5562)
Summary:
https://docs-preview.pytorch.org/pytorch/executorch/5562/extension-tensor.html

Pull Request resolved: #5562

Reviewed By: mergennachin

Differential Revision: D63286312

fbshipit-source-id: c4192d7c1af2dd7b559eaa1ce3550f34e64918e4
shoumikhin authored and facebook-github-bot committed Sep 24, 2024
1 parent 3e79ea4 commit 99ee547
Showing 3 changed files with 449 additions and 6 deletions.
21 changes: 15 additions & 6 deletions docs/source/extension-module.md
In the [Running an ExecuTorch Model in C++ Tutorial](running-a-model-cpp-tutoria

## Example

Let's see how we can run the `SimpleConv` model generated from the [Exporting to ExecuTorch tutorial](./tutorials/export-to-executorch-tutorial) using the `Module` and [`TensorPtr`](extension-tensor.md) APIs:

```cpp
#include <executorch/extension/module/module.h>
#include <executorch/extension/tensor/tensor.h>

using namespace ::executorch::extension;

// Create a Module.
Module module("/path/to/model.pte");

// Wrap the input data with a Tensor.
float input[1 * 3 * 256 * 256];
auto tensor = from_blob(input, {1, 3, 256, 256});

// Perform an inference.
const auto result = module.forward(tensor);

// Check for success or failure.
if (result.ok()) {
  // Retrieve the output data.
  const auto output = result->at(0).toTensor().const_data_ptr<float>();
}
```

```cpp
assert(module.is_method_loaded("forward"));
```
Note: the `Program` is loaded automatically before any `Method` is loaded. Subsequent attempts to load them have no effect if a previous attempt was successful.

You can also force-load the "forward" method with a convenience syntax:

```cpp
const auto error = module.load_forward();

assert(module.is_method_loaded("forward"));
```
### Querying for Metadata
Get a set of method names that a Module contains using the `method_names()` function:
Use [ExecuTorch Dump](sdk-etdump.md) to trace model execution. Create an instance of the `ETDumpGen` class and pass it to the `Module` constructor:
```cpp
#include <fstream>
#include <memory>

#include <executorch/extension/module/module.h>
#include <executorch/devtools/etdump/etdump_flatcc.h>

using namespace ::executorch::extension;

Module module("/path/to/model.pte", Module::LoadMode::MmapUseMlock, std::make_unique<ETDumpGen>());

```
