
Scaling #15

Merged
merged 11 commits from Scaling into dev on Jun 25, 2019

Conversation

YoungFaithful
Owner

  • Scaling included for better numerical properties
  • Transmission lines adjusted, as some lines were defined twice with opposite start and end nodes
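The scaling bullet can be sketched as follows. This is a minimal, hypothetical Python analogue, not the actual package code (which is Julia/JuMP): the `scale` dictionary, the factor values, and the helper names are all assumptions. The idea is that each variable is posed in scaled units so the model's coefficients stay in a well-conditioned range, and optimal values are multiplied back afterwards.

```python
# Hypothetical sketch of per-variable scaling for numerical conditioning.
# `scale` maps a variable name to its factor; the model is formulated in
# scaled units and the optimal values are multiplied back afterwards.

scale = {"CAP": 1e3, "COST": 1e6}  # assumed factors, e.g. MW -> GW, EUR -> MEUR

def to_scaled(name, value, scale):
    """Convert a physical-unit value into scaled model units."""
    return value / scale[name]

def to_unscaled(name, value, scale):
    """Convert a scaled model value back into physical units."""
    return value * scale[name]

raw = 2500.0                       # e.g. 2500 MW of capacity
s = to_scaled("CAP", raw, scale)   # 2.5 in model units
assert to_unscaled("CAP", s, scale) == raw
```

Solvers generally behave better when the matrix coefficients span only a few orders of magnitude, which is what the per-variable factors buy here.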

@YoungFaithful
Owner Author

The test failures are expected, because:

  • transmission: duplicated lines are now merged into equivalent single lines, and the other lines are set up
  • simple, seasonal: storage holds 1e-09 values instead of exact 0 due to the better scaling
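The second bullet is the classic way scaling breaks exact-zero comparisons: tiny residuals like 1e-09 survive where the unscaled model returned 0. A tolerance-based comparison, sketched here in Python (the actual test set is a Julia JLD2 file), avoids the spurious failure:

```python
import math

stored = 1e-09     # residual value left in storage by the scaled model
reference = 0.0    # exact zero recorded in the old test set

# An exact comparison flags a spurious failure; a tolerance-based one passes.
assert stored != reference
assert math.isclose(stored, reference, abs_tol=1e-6)
```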

Collaborator

@holgerteichgraeber left a comment


  • What are the computational savings from the scaling?
  • You mention that testing fails due to some lines and scaling. Would it be possible to calculate a new jld2 test set with this PR, so that this is fixed?

examples/workflow_example_cep.jl (outdated, resolved)
@@ -39,7 +40,8 @@ function OptVariable(cep::OptModelCEP,
end
end
end
OptVariable(round.(jumparray.data;digits=round_sigdigits),jumparray.axes...; axes_names=axes_names, type=type)
unscaled_data=jumparray.data*scale[variable] #Unscale the jumparray data based on the scaling parameters in Dictionary scale
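The quoted line multiplies the raw solver data by that variable's scale factor to restore physical units. A small Python analogue of just this step (the variable name and factor are assumptions for illustration, not values from the package):

```python
# Python analogue of the unscaling step above (names and factors assumed).
# The solver returns values in scaled model units; multiplying by the scale
# factor restores the physical units before the results are stored.

scale = {"CAP": 1e3}                # assumed scaling factor for variable CAP
solver_data = [0.5, 1.25, 2.5]      # scaled optimal values from the solver

unscaled_data = [v * scale["CAP"] for v in solver_data]
# -> [500.0, 1250.0, 2500.0]
```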
Collaborator


I am not entirely sure I follow. You unscale the optimal variables here and store them. Where do you scale them beforehand?

Collaborator


And does that mean that if I just take the optimization results as-is from the JuMP model, the variable units would not be correct?

Owner Author


The JuMP variables and equations are scaled within the problem formulation (one cannot scale them separately as in GAMS; the idea is to give you more control over what is actually passed to the optimizer).

The JuMP variables themselves are scaled, and the unscaling happens afterwards.
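Folding the scaling into the formulation (rather than handing the solver a scale attribute, as GAMS does) means each constraint coefficient absorbs the factors. A hedged Python sketch of one such constraint; the variable names, factors, and the 500 EUR/MW coefficient are all assumed for illustration:

```python
# Sketch: scaling folded into the problem formulation (all names assumed).
# Original constraint: COST == 500 * CAP   (EUR and MW)
# Scaled model:        COST_s * scale["COST"] == 500 * CAP_s * scale["CAP"]

scale = {"CAP": 1e3, "COST": 1e6}            # assumed per-variable factors

coeff = 500 * scale["CAP"] / scale["COST"]   # 0.5: the coefficient the solver sees

cap_s = 2.5                                  # scaled optimum (2500 MW physical)
cost_s = coeff * cap_s                       # 1.25 in scaled units

# Unscaling recovers the physical-unit relationship exactly.
assert cost_s * scale["COST"] == 500 * (cap_s * scale["CAP"])
```

This is also why reading variables straight off the JuMP model yields scaled numbers: the physical units only return once the values are multiplied back.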

Collaborator


Ok, I see. Let's walk through this in person together. I think this is a great change in terms of computational time reduction, but it needs to be documented well, because the units no longer hold. Make sure to document this in the function header and also wherever scale is applied: what the reason for using scale is, and where it is unscaled.

@YoungFaithful
Owner Author

* What are the computational savings from the scaling?

  • The computational saving is more than 10x (the model did not converge after 20 h before, and now solves in approx. 2 h)

* You mention that testing fails due to some lines and scaling. Would it be possible to calculate a new jld2 test set with this PR, so that this is fixed?

  • I thought they were already updated, but somehow the error still existed, so I updated them again just now.

@holgerteichgraeber
Collaborator

* The computational saving is more than 10x (the model did not converge after 20 h before, and now solves in approx. 2 h)

That's a significant improvement, awesome!

Collaborator

@holgerteichgraeber left a comment


Enhanced documentation looks great.
The tests still fail.

Collaborator

@holgerteichgraeber left a comment


Passes the test, nice.

@YoungFaithful merged commit fafbf1b into dev on Jun 25, 2019
@YoungFaithful deleted the Scaling branch on June 25, 2019 at 18:30