
Prepend points are not being included in ranking of "global search" #40

Open
jmcastro2109 opened this issue Sep 18, 2024 · 4 comments

@jmcastro2109
Hi,

I am using the prepend_points option to add a candidate for the optimum found in a previous run.

println("Model: Baseline")

BEST_POINT = [2.0023715563522275, 0.47616619564356855, 4.0064103584405935, 9.991268276769539, 2.0070308283954614, 0.7000687884505741]
P = MinimizationProblem(x -> gmm_expanded(x,[]), [0.0,-0.98,0,0.0,0.1,0.0], [10,0.98,30,30,10.0,1.0])
local_method = NLoptLocalMethod(NLopt.LN_NELDERMEAD)
multistart_method = TikTak(2500)
p_expanded = multistart_minimization(multistart_method, local_method, P; use_threads = false, prepend_points = [BEST_POINT])
p_expanded.location, p_expanded.value

println("Estimation Results:", p_expanded.location)
println("Estimation Results:", p_expanded.value) 

However, after the 2500 Sobol points and the prepended point are evaluated, the local phase does not start from the correct point. That is,

the prepended point is evaluated here:

Parameters:[2.0023715563522275, 0.47616619564356855, 4.0064103584405935, 9.991268276769539, 2.0070308283954614, 0.7000687884505741]
Criterion Function:2.4557462126908754e-5

but when the local phase starts, it starts from another visited point:

Parameters:[1.7097586400102227, 0.17207853362685677, 8.172369551469059, 7.004253780801954, 4.9741088445582955, 0.8102656679075574]
Criterion Function:16.682058405554365

@tpapp (Owner) commented Sep 19, 2024

I looked at the code and it looks OK; can you please give me an MWE (minimal working example)?

@jmcastro2109 (Author) commented Sep 20, 2024

This code seems to reproduce the problem:

using MultistartOptimization, NLopt

Point = [0.0,0.1]

function f(x, grad)
    # NLopt passes an empty gradient vector for derivative-free methods
    # such as LN_NELDERMEAD, so `grad` can be ignored here. (Defining
    # `val` unconditionally also avoids an UndefVarError if a
    # gradient-based method were ever used.)
    val = x[1]^2 + x[2]^2
    println("Param:", x)
    println("val:", val)
    return val
end


P = MinimizationProblem(x -> f(x,[]), [0.0,0.0], [100.0,100.0])
local_method = NLoptLocalMethod(NLopt.LN_NELDERMEAD)
multistart_method = TikTak(25)
p_expanded = multistart_minimization(multistart_method, local_method, P; use_threads = false, prepend_points = [Point])
p_expanded.location, p_expanded.value

It prints:

Param:[9.375, 46.875]
val:2285.15625
Param:[59.375, 96.875]
val:12910.15625
Param:[84.375, 21.875]
val:7597.65625
Param:[34.375, 71.875]
val:6347.65625
Param:[46.875, 9.375]
val:2285.15625
Param:[96.875, 59.375]
val:12910.15625
Param:[71.875, 34.375]
val:6347.65625
Param:[21.875, 84.375]
val:7597.65625
Param:[15.625, 15.625]
val:488.28125
Param:[65.625, 65.625]
val:8613.28125
Param:[90.625, 40.625]
val:9863.28125
Param:[40.625, 90.625]
val:9863.28125
Param:[28.125, 28.125]
val:1582.03125
Param:[78.125, 78.125]
val:12207.03125
Param:[53.125, 3.125]
val:2832.03125
Param:[3.125, 53.125]
val:2832.03125
Param:[4.6875, 26.5625]
val:727.5390625
Param:[54.6875, 76.5625]
val:8852.5390625
Param:[79.6875, 1.5625]
val:6352.5390625
Param:[29.6875, 51.5625]
val:3540.0390625
Param:[42.1875, 14.0625]
val:1977.5390625
Param:[92.1875, 64.0625]
val:12602.5390625
Param:[67.1875, 39.0625]
val:6040.0390625
Param:[17.1875, 89.0625]
val:8227.5390625
Param:[23.4375, 7.8125]
val:610.3515625
Param:[0.0, 0.1]
val:0.010000000000000002
Param:[6.603902043912098, 6.66163707083106]
val:87.988930669057
Param:[11.556828576846172, 6.66163707083106]
val:177.93769521807894
Param:[6.603902043912098, 11.657864873954356]
val:179.5173356249652
Param:[11.556828576846172, 1.6654092677077639]
val:136.33387478357523
Param:[6.603902043912097, 1.6654092677077648]
val:46.38511023455328
Param:[4.127438777445059, 0.0]
val:17.035750861557165
Param:[0.0, 4.996227803123296]
val:24.962292260702238
Param:[0.0, 0.0]
val:0.0
Param:[0.0, 0.0]
val:0.0
Param:[4.127438777445059, 0.0]
val:17.035750861557165
Param:[3.0955790830837944, 0.0]
val:9.582609859625904
Param:[0.0, 0.0]
val:0.0
Param:[4.300861384506421, 1.4336204615021404]
val:20.552676276376097
Param:[7.526507422886237, 1.4336204615021404]
val:58.70358161439923
Param:[4.300861384506421, 2.5088358076287456]
val:24.791665758378667
Param:[1.0752153461266047, 2.5088358076287456]
val:7.450345150186335
Param:[0.0, 3.046443480692048]
val:9.28081788105108
Param:[1.0752153461266047, 1.4336204615021404]
val:3.2113556681837645
Param:[0.0, 0.8960127884388378]
val:0.8028389170459415
Param:[0.0, 1.9712281345654428]
val:3.8857403585023556
Param:[0.0, 0.3584051153755352]
val:0.1284542267273507
Param:[0.0, 0.0]
val:0.0
Param:[0.0, 0.0]
val:0.0
Param:[0.02343750000000002, 0.1328125000000001]
val:0.01818847656250003
Param:[0.041015625000000035, 0.1328125000000001]
val:0.01932144165039066
Param:[0.02343750000000002, 0.2324218750000002]
val:0.054569244384765715
Param:[0.041015625000000035, 0.03320312500000003]
val:0.0027847290039062548
Param:[0.04980468750000004, 0.0]
val:0.0024805068969726606
Param:[0.03222656250000002, 0.0]
val:0.0010385513305664076
Param:[0.027832031250000014, 0.0]
val:0.0007746219635009773
Param:[0.054199218750000035, 0.0]
val:0.0029375553131103555
Param:[0.04650878906250003, 0.0]
val:0.0021630674600601222
Param:[0.0245361328125, 0.0]
val:0.0006020218133926392
Param:[0.01190185546874998, 0.0]
val:0.0001416541635990138
Param:[0.0, 0.0]
val:0.0
Param:[0.0, 0.0]
val:0.0
Param:[0.0, 0.0]
val:0.0
([0.0, 0.0], 0.0)

As you can see, the "local phase", does not start from [0.0,0.1], it starts from some other point.

@tpapp (Owner) commented Sep 23, 2024

But note the lines

Param:[0.0, 0.1]
val:0.010000000000000002

in your output: the prepended point is visited right after the Sobol starting points are generated, so it does enter the ranking.
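
For context, this is consistent with the TikTak algorithm as I understand it: prepended points take part in the evaluation and ranking step, but each subsequent local search is started from a convex combination of the next-ranked candidate and the best point found so far, not from the candidate itself. A rough illustrative sketch (not the package's actual code; the weight schedule below is approximate):

```julia
# Illustrative sketch of the TikTak blending step; NOT the package's
# exact implementation, and the θ schedule here is approximate.
function tiktak_start(candidate, best_so_far, k, N)
    # weight on the incumbent best point grows as the search progresses
    θ = clamp(sqrt(k / N), 0.1, 0.995)
    (1 - θ) .* candidate .+ θ .* best_so_far
end

# Even when the prepended point [0.0, 0.1] is the incumbent best, the
# k-th local search starts from a blend of it and the k-th candidate:
tiktak_start([9.375, 46.875], [0.0, 0.1], 1, 25)
```

This would explain why the local phase in the log above starts near, but not exactly at, the prepended point.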

@jmcastro2109 (Author)
Yes, I figured that out. However, I expected the local optimizer to be applied to the prepended point itself, rather than to a linear combination of the prepended point and the best starting point.
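
One possible workaround for that expectation, sketched with NLopt.jl directly (the bounds and objective repeat the MWE above): polish the prepended point with the same local method yourself, then keep the better of that result and the multistart result.

```julia
using NLopt

# Sketch of a workaround: run the local method directly from the
# prepended point, then compare with the multistart result.
opt = Opt(:LN_NELDERMEAD, 2)
opt.lower_bounds = [0.0, 0.0]
opt.upper_bounds = [100.0, 100.0]
opt.min_objective = (x, grad) -> x[1]^2 + x[2]^2
minf, minx, ret = optimize(opt, [0.0, 0.1])  # starts exactly at the prepended point

# Keep whichever is better, e.g.:
# best = minf < p_expanded.value ? (minx, minf) :
#        (p_expanded.location, p_expanded.value)
```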
