JuMP fix vs constraint issue #34

Open
Libbum opened this issue Nov 15, 2019 · 1 comment
Labels: help wanted

Comments

Libbum commented Nov 15, 2019

Closing #21 and #32 required a few fudge factors, and I can't pin down why they are necessary. Either Ipopt isn't as well suited to solving these problems as CONOPT, or we have a problem in how JuMP handles constraints.

This issue has been occurring for more than a year. I've finally managed to work around it, but I'm not happy that it has come to this point.

Please have a read of this discussion on Discourse to get a feel for the steps already taken.

The workaround was simply to make things function when they didn't by swapping a JuMP.fix (which pins the variable via its bounds) for an equality @constraint (which adds a constraint row), and vice versa. As an example, v2016R2 looks like this:

if typeof(scenario) <: OptimalPriceScenario
    # Optimal price scenario: constrain Tₐₜ and fix K.
    @constraint(model, vars.Tₐₜ[1] == config.tatm₀)
    JuMP.fix(vars.K[1], config.k₀; force=true)
else
    # All other scenarios: fix Tₐₜ and constrain K.
    JuMP.fix(vars.Tₐₜ[1], config.tatm₀; force=true)
    @NLconstraint(model, vars.K[1] == config.k₀)
end

If either of these is swapped back, the solver gets into an infinite loop: it reaches some region of convergence but is never satisfied with the result. This has had to be done, to varying degrees, for all currently implemented versions. Working with some interplay of K and Tₐₜ has generally been best, although that conclusion comes purely from gruelling trial and error.

I've given up chasing this, but if you're interested in helping out and know a little more about the inner workings of JuMP, please let me know.

Libbum added the help wanted label on Nov 15, 2019
Libbum commented Dec 2, 2019

#37 identified that this issue is confined to Ipopt's default linear solver, MUMPS.

Until this issue is resolved, MUMPS will be treated as a fallback solver, with HSL MA97 as the default.
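
For reference, a minimal sketch of what selecting the solver might look like with JuMP and Ipopt.jl (the build_model name and use_hsl keyword are illustrative only, not this package's API; linear_solver is Ipopt's standard option, and the exact way of setting it has shifted between JuMP versions):

using JuMP, Ipopt

# Prefer HSL MA97 when a licensed, HSL-enabled Ipopt build is available;
# otherwise fall back to the bundled MUMPS solver.
function build_model(; use_hsl::Bool = true)
    model = Model(Ipopt.Optimizer)
    set_optimizer_attribute(model, "linear_solver", use_hsl ? "ma97" : "mumps")
    return model
end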

HSL solvers are far more efficient than MUMPS anyhow, although they do require a license. Academic licenses are free though, and that's where this package's target audience sits anyway, so while I'm not a fan of complicating the license chain, I'd much prefer this package not rip my hair out.

From now on I'll be developing with an HSL-first approach, although Travis will make sure MUMPS does not lag behind (tests on Travis will not include the licensed HSL build).

Any further progress on this issue should therefore focus on tweaking linear solver settings when MUMPS is in use.
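
If anyone wants to pick this up, Ipopt exposes its MUMPS-specific knobs as ordinary string options. A minimal sketch (the values here are illustrative starting points, not recommendations; mumps_pivtol and mumps_mem_percent are documented in Ipopt's options reference):

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "linear_solver", "mumps")

# MUMPS-specific Ipopt options: the pivot tolerance and the percentage of
# extra working memory MUMPS may allocate. Adjusting these is one avenue
# for attacking the convergence loop described above.
set_optimizer_attribute(model, "mumps_pivtol", 1e-4)
set_optimizer_attribute(model, "mumps_mem_percent", 2000)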
