Are there any plans to include an integer GEMM API in the future? 8-bit or 16-bit integer GEMM has become fairly standard for inference of quantized neural networks.

Here are some examples in other libraries:

Thanks!

@guillaumekln Thanks for your interest in BLIS. Unfortunately, we don't have the resources at this time to investigate integer support for the gemm operation. We may look into this more seriously at some point in the future, but I'm not sure when that would be.