Support for hessp in tensor.optimize.minimize #1472

Open
@jessegrabowski

Description

Currently, we don't allow the `hessp` argument in `minimize`, which is a real limitation.

We don't allow it because we currently have a single inner function that computes everything the user requests: value, grad, and hess. To avoid wasted work, we wrap that inner function in an LRU cache. Calls for hess always end up as cache hits because of how the scipy algorithms work: they call value and grad (which are allowed to be fused), then evaluate the hessian at the same point.
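A minimal sketch of that calling pattern, using plain numpy/scipy stand-ins for the compiled inner function (the quadratic objective, the dict cache, and the counters are all illustrative, not the actual implementation). The fused call computes value, grad, and hess in one shot and stores the hessian; scipy's later `hess` call at the same point is then always a cache hit:

```python
import numpy as np
from scipy.optimize import minimize

calls = {"hess": 0, "hess_hits": 0}
_cache = {}

def value_and_grad(x):
    # Stand-in for the single fused inner function: one call computes
    # value, grad, and hess, and stashes the hessian for later.
    key = x.tobytes()
    val = np.sum(x**2)
    grad = 2 * x
    hess = 2 * np.eye(x.size)  # hessian of sum(x**2)
    _cache[key] = hess
    return val, grad

def hess(x):
    # scipy only asks for the hessian at points where value/grad were
    # already evaluated, so this lookup always hits.
    calls["hess"] += 1
    key = x.tobytes()
    if key in _cache:
        calls["hess_hits"] += 1
        return _cache[key]
    return 2 * np.eye(x.size)  # never reached in this run

res = minimize(value_and_grad, np.array([3.0, -2.0]), jac=True,
               hess=hess, method="trust-ncg")
```

Running this, every `hess` call is a cache hit, which is exactly what makes the single-inner-function-plus-cache design cheap for the full hessian.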

I experimented with putting `hessp` into the size-1 LRU cache as well, but it never produces cache hits. There will be a call to value_and_grad at some point, followed by a call to `hessp` at the same point but with a different value of p. If p is included in the cache key, the lookup always misses; if it is excluded, the cached product would be wrong. So the caching strategy doesn't work for `hessp`.
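The failure mode can be demonstrated with a small scipy example (again with illustrative numpy stand-ins and counters, not the real inner function). The Newton-CG inner loop calls `hessp` at a fixed x with a fresh direction p on every call, so a cache keyed on (x, p) never hits:

```python
import numpy as np
from scipy.optimize import minimize

stats = {"calls": 0, "misses": 0}
_cache = {}

def fun(x):
    return np.sum(x**2)

def grad(x):
    return 2 * x

def hessp(x, p):
    # p must be part of the cache key, since the output depends on it --
    # and because scipy passes a different p on every call, we never hit.
    stats["calls"] += 1
    key = (x.tobytes(), p.tobytes())
    if key not in _cache:
        stats["misses"] += 1
        _cache[key] = 2 * p  # hessian of sum(x**2) is 2*I, so H @ p = 2*p
    return _cache[key]

res = minimize(fun, np.array([3.0, -2.0]), jac=grad,
               hessp=hessp, method="trust-ncg")
```

Every `hessp` call is a miss, so the cache buys nothing for the hessian-vector product.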

@ricardoV94 says we are allowed to have two inner functions. So if the user asks for `hessp`, we should compile a second inner function that computes it, and use that function inside the perform method.
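A rough sketch of that two-inner-function shape, with plain Python callables standing in for the compiled PyTensor functions (the class name, objective, and caching details are hypothetical, just to show how the pieces would fit inside perform):

```python
import numpy as np
from functools import lru_cache
from scipy.optimize import minimize

class MinimizeOpSketch:
    """Illustrative stand-in for the Op: two inner functions, one cached
    fused value-and-grad, and one separate, uncached hessp."""

    def __init__(self):
        # Inner function 1: fused value-and-grad, behind a size-1 LRU
        # cache (keyed on the raw bytes of x, which are hashable).
        self._value_and_grad = lru_cache(maxsize=1)(self._raw_value_and_grad)
        # Inner function 2: separate hessp, with no cache -- as discussed
        # above, a cache keyed on (x, p) would never hit.
        self._hessp = self._raw_hessp

    @staticmethod
    def _raw_value_and_grad(key):
        x = np.frombuffer(key)
        return np.sum(x**2), 2 * x

    @staticmethod
    def _raw_hessp(x, p):
        return 2 * p  # hessian of sum(x**2) is 2*I, so H @ p = 2*p

    def perform(self, x0):
        return minimize(
            lambda x: self._value_and_grad(x.tobytes()),
            x0, jac=True, hessp=self._hessp, method="trust-ncg",
        )

res = MinimizeOpSketch().perform(np.array([3.0, -2.0]))
```

The second compiled function costs one extra compile, but it lets `hessp` skip the cache machinery entirely instead of polluting it with guaranteed misses.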
