fix instantiation bug #155
Conversation
I am happy with the current state of this, but now it is breaking the […]. While this error could be solved by dropping all […]
Codecov Report

```diff
@@            Coverage Diff             @@
##           master     #155      +/-   ##
==========================================
+ Coverage   81.57%   81.78%   +0.20%
==========================================
  Files          25       25
  Lines        2133     2146      +13
==========================================
+ Hits         1740     1755      +15
+ Misses        393      391       -2
```

View full report in Codecov by Sentry.
I think I'm definitely in favour of keeping alpha and beta as subtypes of […]. I can't seem to convince myself that we really need to promote the type of alpha and beta, though. Somehow it seems like this is something that could be handled at the level of functions, and not the macro. In that case we would at the very least still support polynomials with scalar factors that are subtypes of […].

Am I right in thinking the reason we promote alpha is to correctly deduce the output number type of the newly allocated tensors? I think this could also be achieved by calling […].

Finally, the functions that change from index tuples to labels are used in TBLIS, and might be useful in other backends that have chosen characters as labels instead of integers. I'm fine with removing them here, but I just wanted to mention that.
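The alternative suggested here, deducing the output number type from the inputs rather than promoting alpha itself, could be sketched as follows. This is a hypothetical illustration, not the actual TensorOperations API; `output_scalartype` is an invented helper name.

```julia
# Hypothetical sketch: deduce the scalar type of the newly allocated
# output tensor from the input eltype and the scalar factor alpha,
# without ever converting (promoting) alpha itself.
output_scalartype(TA::Type, α) = Base.promote_op(*, TA, typeof(α))

A = rand(Float64, 2, 2)
α = 1 + 0im                            # a Complex{Int} scalar factor
T = output_scalartype(eltype(A), α)    # Complex{Float64}
C = similar(A, T)                      # allocate output with the promoted type
```

The point of the sketch is that the promotion happens only when choosing the storage type of the allocated output, so alpha can remain whatever subtype it originally was.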
Yes, this is how it is actually done for […].

Regarding the label functions, that is a good remark. I was just trying to improve the coverage numbers and noticed these were not used here, not even by the CUDA extension, but I didn't check TensorOperationsTBLIS. Maybe we should also add the tests for TensorOperationsTBLIS to TensorOperations itself.
I might try to just add some tests for those functions even without TBLIS: there are some simple unit tests, and they should also be the "inverse operation" of the functions that convert labels to indices, which we can test for. I also think it might be worth considering changing the implementation of the instantiation to allow for the TC without changing alpha; it seems a bit cleaner and more robust.
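A unit test of that "inverse operation" property could look like the sketch below. The converter names and the integer-to-character convention are invented for illustration; the actual functions in TensorOperations may differ.

```julia
using Test

# Hypothetical converters between integer index tuples and character labels,
# assuming index 1 maps to 'a', 2 to 'b', and so on.
indices2labels(inds::Tuple) = ntuple(i -> Char('a' - 1 + inds[i]), length(inds))
labels2indices(labs::Tuple) = ntuple(i -> Int(labs[i] - 'a' + 1), length(labs))

@testset "labels/indices round trip" begin
    inds = (1, 3, 2)
    @test labels2indices(indices2labels(inds)) == inds
    labs = ('a', 'c', 'b')
    @test indices2labels(labels2indices(labs)) == labs
end
```

Testing the round trip in both directions checks that each function is a left and right inverse of the other without pinning down a particular label convention.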
OK, DynamicPolynomials support is restored with a different implementation, as discussed. The label information is also back. I will merge if all lights turn green and there are no other comments, and I will also tag a new version right away so that the bugs are eliminated and TensorKit can be updated.
Fixes bugs with scalar factors appearing in the denominator, e.g. […]
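A minimal example of the kind of expression this concerns might be the following, assuming the standard `@tensor` macro; the exact failing case is not shown in the excerpt above.

```julia
using TensorOperations

A = rand(2, 2)
B = rand(2, 2)

# A scalar factor in the denominator: the macro must treat `/ 2` as a
# scalar coefficient on the contraction, not as a tensor operation.
@tensor C[i, j] := A[i, k] * B[k, j] / 2
```

If the division is mishandled during instantiation, the scalar type of the allocated output (or the coefficient itself) can come out wrong, which is the class of bug this PR targets.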