Implementation of negative weights, Sep-CMA, VD-CMA. Sep and VD CMA achieving linear time complexity. #349
Have you run the CMAES tests locally? When I run `./ensmallen_tests "[CMAESTest]"`, almost all tests fail.
I have successfully run all tests locally, but only in a separated manner (`./ensmallen_tests {test_case}`), not all at once.
The current VD update's performance is not perfect. Running the logistic regression test case repeatedly, 1-2 out of every 10 runs fail with accuracy around 90-94%. I think this is due to the new scale of the parameters c1, cmu, and csigma in the paper compared to the old ones. I am still looking for a way around this.
I have added a learning-rate rescaling function to the VD update, as in the paper. Everything should be fine now with the VD update. The problem with the titanic test still remains.
This is my implementation of the main CMAES class. Its purpose is to store the default parameters of the algorithm once the optimizer is created, and to make it easier to add new variants, since I split the main algorithm into several parts. Because many improvements to CMA-ES use the same set of parameters as the original algorithm, and many formulas look the same, this new CMAES class reduces code duplication.
Apart from the renewed class, I also added:
- Negative weights policy: a recent improvement over the default CMA-ES weighting (more details at https://arxiv.org/pdf/1604.00772.pdf, page 31).
- VD-CMA update policy: a linear-time covariance matrix update as in "VD-CMA: Linear Time/Space Comparison-based Natural Gradient Optimization". The covariance matrix is restricted to the form C = D (I + v v^T) D, where D is a diagonal matrix and v is a vector.
- Sep-CMA update policy: a linear-time update that adapts only the diagonal of the covariance matrix, as in Raymond Ros et al., "A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity".
I also planned to implement the IPOP restart strategy and the Cholesky update policy, but my GSoC time is running out; I will make another pull request in the near future.
Also, I have attached the benchmark results of the CMAES variants on a logistic regression task with the gisette dataset:
Benchmark.pdf