add option to specify that objective gradient is constant
- seems that it can give a speedup in rare cases
- closes coin-or#597
svigerske committed Aug 15, 2022
1 parent d41a1eb commit 6378c1b
Showing 4 changed files with 32 additions and 2 deletions.
5 changes: 5 additions & 0 deletions ChangeLog.md
@@ -7,6 +7,11 @@ More detailed information about incremental changes can be found in the
 
 ## 3.14
 
+### 3.14.10 (2022-xx-yy)
+
+- Added option grad_f_constant to specify that the objective function is linear.
+  If set, the objective gradient is requested from the NLP only once (sketch below). [#597]
+
 ### 3.14.9 (2022-07-21)
 
 - Fixed mapping of meta data for variable bounds, e.g., variable names,
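A minimal sketch of the new option in an ipopt.opt options file, as referenced by the ChangeLog entry above. Ipopt reads this file from its working directory at startup; the option name is real, the '#' comments are standard options-file syntax:

    # ipopt.opt -- declare the objective linear so that Ipopt evaluates
    # its gradient only once and reuses the cached value afterwards
    grad_f_constant yes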
8 changes: 8 additions & 0 deletions doc/options.dox
@@ -373,6 +373,14 @@ Possible values: yes, no
 Possible values: yes, no
 </blockquote>
 
+\anchor OPT_grad_f_constant
+<strong>grad_f_constant</strong>: Indicates whether to assume that the objective function is linear
+<blockquote>
+Activating this option will cause Ipopt to ask for the gradient of the objective function only once from the NLP and reuse this information later. The default value for this string option is "no".
+
+Possible values: yes, no
+</blockquote>
+
 \anchor OPT_jac_c_constant
 <strong>jac_c_constant</strong>: Indicates whether to assume that all equality constraints are linear
 <blockquote>
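For C++ users, a hedged sketch of enabling the option documented above programmatically through IpoptApplication. MyLinearObjectiveNLP is a hypothetical user-defined TNLP with a linear objective, not part of this commit; the IpoptApplication calls are the standard Ipopt interface:

    #include "IpIpoptApplication.hpp"

    using namespace Ipopt;

    int main()
    {
       // MyLinearObjectiveNLP is a hypothetical TNLP implementation whose
       // eval_grad_f returns the same coefficient vector at every iterate.
       SmartPtr<TNLP> nlp = new MyLinearObjectiveNLP();

       SmartPtr<IpoptApplication> app = IpoptApplicationFactory();

       // Same effect as "grad_f_constant yes" in ipopt.opt: the gradient is
       // requested from the NLP once and the scaled copy is reused afterwards.
       app->Options()->SetStringValue("grad_f_constant", "yes");

       if( app->Initialize() != Solve_Succeeded )
          return 1;

       return app->OptimizeTNLP(nlp) == Solve_Succeeded ? 0 : 1;
    }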
16 changes: 14 additions & 2 deletions src/Algorithm/IpOrigIpoptNLP.cpp
@@ -89,6 +89,12 @@ void OrigIpoptNLP::RegisterOptions(
       "If test is activated and an invalid number is detected, "
       "the matrix is written to output with print_level corresponding to J_MORE_DETAILED; "
       "so beware of large output!");
+   roptions->AddBoolOption(
+      "grad_f_constant",
+      "Indicates whether to assume that the objective function is linear",
+      false,
+      "Activating this option will cause Ipopt to ask for the gradient of the objective function "
+      "only once from the NLP and reuse this information later.");
    roptions->AddBoolOption(
       "jac_c_constant",
       "Indicates whether to assume that all equality constraints are linear",
@@ -142,6 +148,7 @@ bool OrigIpoptNLP::Initialize(
    options.GetEnumValue("hessian_approximation_space", enum_int, prefix);
    hessian_approximation_space_ = HessianApproximationSpace(enum_int);
 
+   options.GetBoolValue("grad_f_constant", grad_f_constant_, prefix);
    options.GetBoolValue("jac_c_constant", jac_c_constant_, prefix);
    options.GetBoolValue("jac_d_constant", jac_d_constant_, prefix);
    options.GetBoolValue("hessian_constant", hessian_constant_, prefix);
@@ -507,7 +514,12 @@ SmartPtr<const Vector> OrigIpoptNLP::grad_f(
 {
    SmartPtr<Vector> unscaled_grad_f;
    SmartPtr<const Vector> retValue;
-   if( !grad_f_cache_.GetCachedResult1Dep(retValue, &x) )
+   const Vector* dep = NULL;
+   if( !grad_f_constant_ )
+   {
+      dep = &x;
+   }
+   if( !grad_f_cache_.GetCachedResult1Dep(retValue, dep) )
    {
       grad_f_evals_++;
       unscaled_grad_f = x_space_->MakeNew();
@@ -519,7 +531,7 @@
       ASSERT_EXCEPTION(success && IsFiniteNumber(unscaled_grad_f->Nrm2()), Eval_Error,
                        "Error evaluating the gradient of the objective function");
       retValue = NLP_scaling()->apply_grad_obj_scaling(ConstPtr(unscaled_grad_f));
-      grad_f_cache_.AddCachedResult1Dep(retValue, &x);
+      grad_f_cache_.AddCachedResult1Dep(retValue, dep);
    }
 
    return retValue;
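The NULL-dependency trick in grad_f() above is what makes the reuse work: with grad_f_constant set, the cache entry is stored without a dependency and therefore never goes stale. Below is a standalone toy sketch of that idea; Ipopt's actual CachedResults class compares TaggedObject tags rather than raw addresses, so this only illustrates the pattern:

    #include <iostream>

    // Toy one-dependency cache mirroring the pattern in grad_f(): a result
    // stored with a NULL dependency is reused unconditionally, while one
    // stored with a real dependency is reused only for that same object.
    class OneDepCache
    {
    public:
       bool GetCachedResult1Dep(double& result, const void* dep) const
       {
          if( !have_ || (dep_ != NULL && dep_ != dep) )
             return false;
          result = result_;
          return true;
       }

       void AddCachedResult1Dep(double result, const void* dep)
       {
          have_ = true;
          result_ = result;
          dep_ = dep;
       }

    private:
       bool        have_ = false;
       double      result_ = 0.0;
       const void* dep_ = NULL;
    };

    int main()
    {
       OneDepCache cache;
       double x1 = 1.0, x2 = 2.0, g;

       // "Constant" case: stored with NULL dependency, hits for any iterate.
       cache.AddCachedResult1Dep(3.0, NULL);
       std::cout << cache.GetCachedResult1Dep(g, &x1)   // 1 (hit)
                 << cache.GetCachedResult1Dep(g, &x2)   // 1 (hit)
                 << std::endl;
    }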
5 changes: 5 additions & 0 deletions src/Algorithm/IpOrigIpoptNLP.hpp
@@ -533,6 +533,11 @@ class IPOPTLIB_EXPORT OrigIpoptNLP: public IpoptNLP
     */
    bool check_derivatives_for_naninf_;
 
+   /** Flag indicating if we need to ask for the objective
+    *  gradient only once
+    */
+   bool grad_f_constant_;
+
    /** Flag indicating if we need to ask for equality constraint
     *  Jacobians only once
     */
