[jit][edge] Make flatbuffer_serializer print correct type strings. (pytorch#71935)

Summary:
Pull Request resolved: pytorch#71935

The flatbuffer serializer today prints type strings that depend on the build platform: for example, "DynamicType" is exported when C10_MOBILE is defined. Since this is not the intended behavior, we should export the correct type name to reduce confusion for users.
ghstack-source-id: 147821109
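
For illustration, the following is a minimal standalone sketch of the problem (a toy model only; Type, StaticType, Value and the printed strings are hypothetical stand-ins, not the real c10 classes). On a mobile build the type-erased dynamic type reports a generic name from annotation_str(), so a serializer has to ask for the fully annotated static type explicitly:

```
// Toy model only -- NOT the real c10 API. It mimics the issue described in
// the summary: a type-erased "DynamicType" prints a generic name, while the
// fully annotated static type prints the real annotation (e.g. "List[int]").
#include <iostream>
#include <memory>
#include <string>
#include <utility>

struct Type {
  virtual ~Type() = default;
  virtual std::string annotation_str() const = 0;
};

// Fully annotated static type, e.g. "List[int]".
struct StaticType : Type {
  explicit StaticType(std::string ann) : ann_(std::move(ann)) {}
  std::string annotation_str() const override { return ann_; }
  std::string ann_;
};

// Type-erased stand-in used on mobile builds; the annotation detail is lost.
struct DynamicType : Type {
  std::string annotation_str() const override { return "DynamicType"; }
};

// Value holder mimicking the two accessors seen in the diff below:
// type() returns the platform default, while type<StaticType>() always
// resolves to the fully annotated type.
struct Value {
  std::shared_ptr<StaticType> static_type;
  bool mobile_build;

  std::shared_ptr<Type> type() const {
    if (mobile_build) {
      return std::make_shared<DynamicType>(); // platform-dependent string
    }
    return static_type;
  }

  template <typename T>
  std::shared_ptr<T> type() const {
    return static_type; // always the correct annotation
  }
};

int main() {
  Value v{std::make_shared<StaticType>("List[int]"), /*mobile_build=*/true};
  std::cout << v.type()->annotation_str() << "\n";             // DynamicType
  std::cout << v.type<StaticType>()->annotation_str() << "\n"; // List[int]
  return 0;
}
```

In the actual change below, the serializer requests the static type via type<c10::Type>() in listToFB and dictToFB, and asserts that no DynamicType reaches the schema and type-table paths.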

Test Plan:
```
buck run fbcode/mode/dbg //arvr/firmware/silicon/turing:test_torch -c pt.has_backtraces=1 -c turing.min_runtime=1 -c turing.dsp_op=1 -c turing.model_file=test1.ptl

Downloaded 0/66 artifacts, 0.00 bytes, 100.0% cache miss (for updated rules)
Building: finished in 38.2 sec (100%) 345/345 jobs, 36/345 updated
  Total time: 38.2 sec
BUILD SUCCEEDED
Conv:  input [1, 32, 4, 4] residuals [1] weights [4, 4, 1, 1, 2, 32] nlu_params [4, 128] in_ch 32 out_ch 32 groups 1 kernel  stride  padding  upsample 0 op_type 0 act_type 0
--tensor: 0x7ffdd461c6e8
        device: cpu
        is_quantized: 0
        contiguous: 1
        layout: Strided
        dtype: int
        itemsize: 4
        data_ptr: 0x7f781a0a2c10
        dim: 4
        size: [1, 32, 4, 4]
        stride: [512, 16, 4, 1]
dump data/size: 0x7f781a0a2c10/512
        0       00000004
        1       00000004
        2       00000004
        3       00000004
        4       00000004
        5       00000004
        6       00000004
        7       00000004
        8       00000004
        9       00000004
        10      00000004
        11      00000004
        12      00000004
        13      00000004
        14      00000004
        15      00000004
```

Reviewed By: qihqi

Differential Revision: D33826292

fbshipit-source-id: 3c579d89d31fe8d0df5ea6915746aa70da7e3d5c
(cherry picked from commit 9723a84)
zhxchen17 authored and pytorchmergebot committed Jan 27, 2022
1 parent 1407939 commit b486797
Showing 1 changed file with 5 additions and 2 deletions.
torch/csrc/jit/serialization/flatbuffer_serializer.cpp (5 additions, 2 deletions)
@@ -146,6 +146,7 @@ flatbuffers::Offset<jit::mobile::serialization::Schema> FlatbufferSerializer::
   return_vec.reserve(returns.size());
   for (const auto& arg : args) {
     int index = storeIValueAndGetIndex(fbb, arg.default_value());
+    TORCH_INTERNAL_ASSERT(arg.type()->kind() != c10::DynamicType::Kind);
     arg_vec.emplace_back(CreateArg(
         fbb,
         fbb.CreateSharedString(arg.name()),
@@ -155,6 +156,7 @@ flatbuffers::Offset<jit::mobile::serialization::Schema> FlatbufferSerializer::
 
   for (const auto& ret : returns) {
     int index = storeIValueAndGetIndex(fbb, ret.default_value());
+    TORCH_INTERNAL_ASSERT(ret.type()->kind() != c10::DynamicType::Kind);
     return_vec.emplace_back(CreateArg(
         fbb,
         fbb.CreateSharedString(ret.name()),
@@ -207,6 +209,7 @@ flatbuffers::Offset<mobile::serialization::Function> FlatbufferSerializer::
 
   for (const TypePtr& t : code.types_) {
     auto type_str = t->annotation_str();
+    TORCH_INTERNAL_ASSERT(t->kind() != c10::DynamicType::Kind);
     if (type_str.find(torch_prefix) == 0) {
       TORCH_CHECK(
           type_str.find(class_prefix) == 0,
@@ -351,7 +354,7 @@ flatbuffers::Offset<mobile::serialization::List> FlatbufferSerializer::listToFB(
   return CreateList(
       fbb,
       fbb.CreateVector(items),
-      fbb.CreateSharedString(list.type()->annotation_str()));
+      fbb.CreateSharedString(list.type<c10::Type>()->annotation_str()));
 }
 
 flatbuffers::Offset<mobile::serialization::Dict> FlatbufferSerializer::dictToFB(
@@ -372,7 +375,7 @@ flatbuffers::Offset<mobile::serialization::Dict> FlatbufferSerializer::dictToFB(
       fbb,
      fbb.CreateVector(keys),
       fbb.CreateVector(values),
-      fbb.CreateSharedString(ivalue.type()->annotation_str()));
+      fbb.CreateSharedString(ivalue.type<c10::Type>()->annotation_str()));
 }
 
 flatbuffers::Offset<mobile::serialization::ObjectType> FlatbufferSerializer::
