Hopty

A demonstration of various optimization methods. Currently implemented:

  • Newton's method
  • Gradient descent
  • Pattern search
  • Coordinate descent

To run tests:

$ cabal configure
$ cabal build
$ cabal haddock  # to build documentation
$ ./dist/build/testhopty/testhopty

Examples

Newton's method

Calculate e very inefficiently, by finding the maximum point of f(x) = x^(1/x) (unfortunately, we have to find the derivatives by hand):

*Numeric.Optimization.Hopty> let f x = x**(1/x)  -- = e^((1/x)log(x))
*Numeric.Optimization.Hopty> let f' x = ((1/x^2) - (log x)/x^2) * f x
*Numeric.Optimization.Hopty> let f'' x = ((1/x^2) - (log x)/x^2) * f' x + ((-2/x^3) - 1/x^3 + 2*(log x)/x^3) * f x
*Numeric.Optimization.Hopty> newton f' f'' 1
2.7182818284590446

This is actually inaccurate: the last two decimal places are incorrect.
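For reference, Newton's method finds a critical point of f by finding a root of f' via the iteration x ← x − f'(x)/f''(x). A minimal sketch, assuming a signature like the library's newton :: (Double -> Double) -> (Double -> Double) -> Double -> Double; the function and its tolerance below are hypothetical, not hopty's actual code:

```haskell
-- Hypothetical sketch of Newton's method: to locate a critical point of f,
-- find a root of f' by iterating x <- x - f'(x) / f''(x).
newtonSketch :: (Double -> Double) -> (Double -> Double) -> Double -> Double
newtonSketch f' f'' = go
  where
    go x
      | abs dx < 1e-12 = x'      -- stop once the Newton step is negligible
      | otherwise      = go x'
      where
        dx = f' x / f'' x
        x' = x - dx

main :: IO ()
main = print (newtonSketch (\x -> 2 - 2 * x) (const (-2)) 0)  -- maximum of 2x - x^2 is at x = 1
```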

Gradient descent

Find the minimum point of the Rosenbrock function; the result should be (1, 1):

*Numeric.Optimization.Hopty> -- rosen [x, y] = (1-x)^2 + 100*(y-x^2)^2
*Numeric.Optimization.Hopty> let gradrosen [x, y] = [2*(x-1) - 400*x*(y-x^2), 200*(y-x^2)]
*Numeric.Optimization.Hopty> graddes 0.005 gradrosen [-0.5, 0.5]
[0.9999999999999577,0.9999999999999153]
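Under the hood, fixed-rate gradient descent just steps against the gradient repeatedly. A rough sketch of the idea (the iteration cap and tolerance here are assumptions, not the library's actual stopping rule):

```haskell
-- Hypothetical sketch of gradient descent with a fixed learning rate:
-- step against the gradient until it (nearly) vanishes.
graddesSketch :: Double -> ([Double] -> [Double]) -> [Double] -> [Double]
graddesSketch rate grad = go (1000000 :: Int)  -- iteration cap (an assumption)
  where
    go 0 x = x
    go n x
      | maximum (map abs g) < 1e-12 = x
      | otherwise = go (n - 1) (zipWith (\xi gi -> xi - rate * gi) x g)
      where g = grad x

main :: IO ()
main = print (graddesSketch 0.1 (\[x, y] -> [2 * x, 2 * y]) [1, 1])  -- minimize x^2 + y^2
```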

Pattern search

This is easier to use than gradient descent: we don't need to supply the gradient! Also, it's about five times faster.

*Numeric.Optimization.Hopty> let rosenbrock [x, y] = (1-x)^2 + 100*(y-x^2)^2
*Numeric.Optimization.Hopty> patternsearch 100 rosenbrock [-0.5, 0.5]
[0.9999999999973368,0.9999999999946714]
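A pattern (compass) search needs only function values: probe each coordinate direction, move to an improving probe, and halve the step otherwise. A sketch under assumed conventions (reading the first argument as the initial step size; the tolerance and helper names are invented here):

```haskell
import Data.List (minimumBy)
import Data.Ord (comparing)

-- Hypothetical sketch of a compass pattern search: evaluate f at x +/- step
-- along each axis; move to the best improving probe, else shrink the step.
patternSketch :: Double -> ([Double] -> Double) -> [Double] -> [Double]
patternSketch step0 f = go step0
  where
    go step x
      | step < 1e-12 = x
      | f best < f x = go step best
      | otherwise    = go (step / 2) x
      where
        probes = [nudge i d x | i <- [0 .. length x - 1], d <- [step, -step]]
        best   = minimumBy (comparing f) probes
    nudge i d x = take i x ++ [x !! i + d] ++ drop (i + 1) x

main :: IO ()
main = print (patternSketch 1 (\[x, y] -> (x - 3)^2 + (y + 1)^2) [0, 0])  -- minimum at (3, -1)
```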

Coordinate descent

Our coordinate descent algorithm internally performs a pattern search one coordinate at a time. For functions that aren't additively separable, it tends to zig-zag and converge very slowly. For the Rosenbrock function, it takes 45 times longer than a multivariate pattern search.

*Numeric.Optimization.Hopty> let rosenbrock [x, y] = (1-x)^2 + 100*(y-x^2)^2
*Numeric.Optimization.Hopty> coorddes 100 rosenbrock [-0.5, 0.5]
[0.9999999999973368,0.9999999999946714]
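The zig-zag behaviour comes from minimizing along one axis at a time. A sketch of the idea (the per-coordinate search, sweep cap, and tolerance here are assumptions, not hopty's actual implementation):

```haskell
-- Hypothetical sketch of coordinate descent: run a 1-D pattern search
-- along each coordinate in turn, sweeping until nothing moves.
coorddesSketch :: Double -> ([Double] -> Double) -> [Double] -> [Double]
coorddesSketch step0 f = sweep (1000 :: Int)  -- sweep cap (an assumption)
  where
    sweep 0 x = x
    sweep n x
      | x' == x   = x
      | otherwise = sweep (n - 1) x'
      where x' = foldl lineMin x [0 .. length x - 1]
    -- 1-D pattern search along coordinate i.
    lineMin x i = go step0 x
      where
        go step y
          | step < 1e-12 = y
          | f up < f y   = go step up
          | f dn < f y   = go step dn
          | otherwise    = go (step / 2) y
          where
            up = nudge i step y
            dn = nudge i (-step) y
    nudge i d y = take i y ++ [y !! i + d] ++ drop (i + 1) y

main :: IO ()
main = print (coorddesSketch 1 (\[x, y] -> (x - 2)^2 + (y - 5)^2) [0, 0])  -- separable, so it converges quickly
```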

Issues and limitations

  • Precision is not handled well (i.e., not at all), and arbitrary-precision calculations are not supported.
  • No unit tests yet
  • Poor documentation
  • No detection for when convergence/divergence is too slow
