This repository was archived by the owner on Aug 7, 2024. It is now read-only.
Summary:
We use the dispatching mechanism to implement `mm` for `Float8Tensor`.
## TODO
- [x] Hook the amax_buffer onto `Float8Tensor` so it is filled under dispatch
- [x] Update emulate path
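The TODO items above can be sketched without PyTorch at all: the essential pattern is a wrapper tensor that carries an optional amax buffer, which the `mm` override fills as a side effect of dispatch. This is a minimal, torch-free illustration; the names (`Float8TensorSketch`, `to_float8`, `mm`) and the 1-D "matmul" are stand-ins, not the real float8_experimental API.

```python
# Torch-free sketch of dispatch-time amax filling. The wrapper holds a
# low-precision payload plus an optional amax buffer; the mm override
# writes max(abs(x)) into that buffer before computing, mirroring the
# "fill amax_buffer under dispatch" TODO item.

class Float8TensorSketch:
    def __init__(self, data, scale, amax_buffer=None):
        self._data = data                # "low precision" payload (plain floats here)
        self._scale = scale
        self._amax_buffer = amax_buffer  # filled under dispatch, if present


def to_float8(values, scale, amax_buffer=None):
    # Quantization is elided; we just wrap the values.
    return Float8TensorSketch(list(values), scale, amax_buffer)


def mm(a: Float8TensorSketch, b: Float8TensorSketch):
    # Side effect first: fill each operand's amax buffer, as the dispatch
    # handler would, so the caller sees the observed max-abs value.
    for t in (a, b):
        if t._amax_buffer is not None:
            t._amax_buffer[0] = max(abs(x) for x in t._data)
    # Emulated matmul on 1-D payloads: dot product, output in higher precision.
    return sum(x * y for x, y in zip(a._data, b._data)) * a._scale * b._scale
```

After a call like `mm(to_float8([1.0, -3.0], 0.5, buf), to_float8([2.0, 2.0], 1.0))`, the caller's `buf` holds `3.0`, which is the behavior the first TODO item asks `Float8Tensor` to provide.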
## Note
Vasiliy already started this in #28. Some things have changed since then: we now output in higher precision by default. However, I still need to replicate the amax_buffer filling here and store it on the `Float8Tensor` that is passed in.
Corresponding core changes, needed to get as far as possible in compile for `aot_eager`, are in pytorch/pytorch#111735. The debug output below shows the inner attributes `_data` and `_scale` carrying a different `FakeTensorMode` object than the one being checked against:
``` Shell
Checking against fake_mode=<torch._subclasses.fake_tensor.FakeTensorMode object at 0x7f4c13cd1bd0>
attr=_data
attr_fake_mode=<torch._subclasses.fake_tensor.FakeTensorMode object at 0x7f4c13c34d00>
attr=_scale
attr_fake_mode=<torch._subclasses.fake_tensor.FakeTensorMode object at 0x7f4c13c34d00>
```
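The debug output above corresponds to an identity check inside AOTAutograd: every inner attribute of a wrapper subclass must carry the *same* `FakeTensorMode` object (`is`, not `==`) as the mode being checked against. A torch-free sketch of that invariant, with illustrative stand-in class names:

```python
# Illustration of the fake-mode invariant that trips the assertion below:
# all inner attributes must reference the identical FakeTensorMode object.

class FakeTensorModeSketch:
    """Stand-in for torch._subclasses.fake_tensor.FakeTensorMode."""


class FakeInner:
    """Stand-in for an inner fake tensor (e.g. _data or _scale)."""
    def __init__(self, fake_mode):
        self.fake_mode = fake_mode


def check_attrs(attrs, fake_mode):
    # Mirrors: assert all(getattr(x, attr).fake_mode is fake_mode for attr in attrs)
    return all(inner.fake_mode is fake_mode for inner in attrs.values())


mode_outer = FakeTensorModeSketch()
mode_inner = FakeTensorModeSketch()  # a *different* mode object, as in the log

# All attributes share the checked mode: passes.
ok = check_attrs(
    {"_data": FakeInner(mode_outer), "_scale": FakeInner(mode_outer)},
    mode_outer,
)

# Attributes were fakeified under a different mode (the 0x...d00 vs
# 0x...bd0 situation in the log): the assertion would fire.
bad = check_attrs(
    {"_data": FakeInner(mode_inner), "_scale": FakeInner(mode_inner)},
    mode_outer,
)
```

In the log, `_data` and `_scale` report mode `0x7f4c13c34d00` while the check is against `0x7f4c13cd1bd0`, so the `aot_eager` run hits exactly the failing branch sketched here.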
### Current Compile Progress
- backend = "eager_only", full_graph = False: ✅
- backend = "eager_only", full_graph = True: ❌
``` Shell
E torch._dynamo.exc.Unsupported: call_function UserDefinedObjectVariable(to_float8) [TensorVariable(), TensorVariable(), ConstantVariable(dtype), TensorVariable()] {}
```
- backend = "aot_eager", full_graph = False: ❌
``` Shell
File "/home/drisspg/meta/pytorch/torch/_functorch/aot_autograd.py", line 4187, in convert
assert all(getattr(x, attr).fake_mode is fake_mode for attr in attrs)
torch._dynamo.exc.BackendCompilerFailed: backend='aot_eager' raised:
AssertionError:
```
Pull Request resolved: #128
Reviewed By: bdhirsh, y-sq
Differential Revision: D50901900
Pulled By: drisspg
fbshipit-source-id: 64626bc652b70bfbabff2ab26e999324d1463e1d