Description
This is an MWE I reduced from a bigger problem (finding ML estimates of a logistic regression from summary statistics). `optimize` always finds the same minimizer (as expected, since the function is convex; I also compared against GLM.jl). But `BFGS()` without autodiff reports non-convergence, while with autodiff it reports convergence. `ConjugateGradient()` reports non-convergence both with and without autodiff.
```julia
using Optim
using StatsFuns

N0 = [1179, 2986, 581352, 2464, 3334, 5618, 2954, 496, 503]
N1 = [656, 563, 9354, 545, 468, 720, 678, 162, 190]

X = [1.0 -0.333333 -0.333333 -0.333333 -0.333333;
     1.0 -0.333333  0.666667  0.666667 -0.333333;
     1.0 -0.333333  0.666667 -0.333333  0.666667;
     1.0 -0.333333  0.666667 -0.333333 -0.333333;
     1.0 -0.333333 -0.333333 -0.333333  0.666667;
     1.0  0.666667 -0.333333 -0.333333  0.666667;
     1.0  0.666667 -0.333333  0.666667 -0.333333;
     1.0 -0.333333 -0.333333  0.666667 -0.333333;
     1.0  0.666667 -0.333333 -0.333333 -0.333333]

# log-likelihood of the logistic regression, from the summary counts
function ℓ_logistic_regression(X, N0, N1, β)
    Xβ = X*β
    sum(-N0.*Xβ - (N0+N1).*log1pexp.(-Xβ))
end

# starting point: least-squares fit to the empirical logits
function init_logistic_regression(X, N0, N1)
    X \ logit.(N1./(N0+N1))
end

Optim.optimize(β -> -ℓ_logistic_regression(X, N0, N1, β),
               init_logistic_regression(X, N0, N1),
               BFGS(), Optim.Options(autodiff = true)) # change the last line for various results
```
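Since the discrepancy is between finite-difference and forward-mode gradients, it may be useful that ∇ℓ has a simple closed form, `∇ℓ(β) = X' * (N1 - (N0+N1) .* logistic.(X*β))`, which either backend can be checked against. A sketch (the helper name `∇ℓ_logistic_regression` and the toy check data are mine):

```julia
using StatsFuns  # logistic, log1pexp

# Closed-form gradient of ℓ_logistic_regression with respect to β
function ∇ℓ_logistic_regression(X, N0, N1, β)
    X' * (N1 - (N0 + N1) .* logistic.(X*β))
end

# quick central-difference sanity check on a tiny made-up problem
ℓ(X, N0, N1, β) = sum(-N0.*(X*β) - (N0+N1).*log1pexp.(-(X*β)))
Xt = [1.0 0.5; 1.0 -0.5]; N0t = [10.0, 20.0]; N1t = [5.0, 8.0]; βt = [0.1, 0.2]
g = ∇ℓ_logistic_regression(Xt, N0t, N1t, βt)
h = 1e-6
gfd = [(ℓ(Xt, N0t, N1t, βt + h*e) - ℓ(Xt, N0t, N1t, βt - h*e)) / 2h
       for e in ([1.0, 0.0], [0.0, 1.0])]
maximum(abs.(g .- gfd))  # should be close to zero
```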
e.g.

```
julia> Optim.optimize(β -> -ℓ_logistic_regression(X, N0, N1, β),
                      init_logistic_regression(X, N0, N1),
                      BFGS())
Results of Optimization Algorithm
 * Algorithm: BFGS
 * Starting Point: [-1.7194691162922449,-0.2770378660884508, ...]
 * Minimizer: [-1.8220199098825927,-0.25189789184931655, ...]
 * Minimum: 5.882142e+04
 * Iterations: 12
 * Convergence: false
   * |x - x'| < 1.0e-32: false
   * |f(x) - f(x')| / |f(x)| < 1.0e-32: false
   * |g(x)| < 1.0e-08: false
   * f(x) > f(x'): true
 * Reached Maximum Number of Iterations: false
 * Objective Function Calls: 51
 * Gradient Calls: 51
```
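Note what the report is actually checking: `Convergence: false` here only means `|g(x)| < 1.0e-8` was not met, since the `x` and `f` tolerances of `1.0e-32` are effectively unreachable. If this is purely a tolerance question, loosening the gradient tolerance should flip the report. A minimal sketch on a toy convex objective, assuming the `g_tol` keyword of `Optim.Options` (the stand-in objective `f` is mine, not the likelihood above):

```julia
using Optim

# toy convex objective standing in for the negative log-likelihood
f(β) = sum(abs2, β .- 1)

# BFGS with finite-difference gradients, but a looser gradient tolerance
res = Optim.optimize(f, zeros(2), BFGS(), Optim.Options(g_tol = 1e-6))
Optim.converged(res)  # reports convergence once |g| drops below g_tol
```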