This issue is based on a conversation I had with @ricardoV94 about whether it would be possible to add an integration Op. Based on that conversation, a Quad Op with gradients should be possible to implement in PyTensor and JAX by directly wrapping scipy.integrate.quad (disclaimer: I am still not clear on the nomenclature around vjp, jvp, push forward, pull back, gradient, etc., but everything needed seems to be covered in that thread).
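As a rough illustration of why gradients are tractable here (not the eventual Op API, and the function names below are purely hypothetical): for an integral parameterised by `theta`, the derivative with respect to `theta` is itself an integral, via differentiation under the integral sign, so it can also be evaluated with scipy.integrate.quad:

```python
# Hedged sketch: gradient of a parameterised definite integral via the
# Leibniz rule (differentiation under the integral sign). This is what a
# Quad Op's grad/vjp would compute under the hood; names are illustrative.
import numpy as np
from scipy.integrate import quad


def integral(theta):
    # I(theta) = integral_0^1 exp(theta * x) dx
    val, _abserr = quad(lambda x: np.exp(theta * x), 0.0, 1.0)
    return val


def integral_grad(theta):
    # dI/dtheta = integral_0^1 x * exp(theta * x) dx
    # (the integrand is differentiated w.r.t. theta, then integrated)
    val, _abserr = quad(lambda x: x * np.exp(theta * x), 0.0, 1.0)
    return val


if __name__ == "__main__":
    theta = 0.5
    # Closed forms for this test integrand:
    #   I(theta)      = (exp(theta) - 1) / theta
    #   dI/dtheta     = (exp(theta) * (theta - 1) + 1) / theta**2
    print(integral(theta), (np.exp(theta) - 1) / theta)
    print(integral_grad(theta), (np.exp(theta) * (theta - 1) + 1) / theta**2)
```

A PyTensor or JAX wrapper would apply the same idea symbolically: the forward pass calls quad on the integrand, and the backward pass calls quad on the integrand's derivative with respect to each parameter, scaled by the incoming cotangent.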
Numba will be trickier, as usual, because of Numba's spotty coverage of scipy. SciPy uses QUADPACK, written in Fortran, to do the actual computation. I'm fairly sure this can be overloaded, but it would take some tinkering.