It seems the trend is to use things like LabelledArrays.jl for parameter vectors so that you get both x[i] and x.a access to values, which makes writing the objective function much much easier. Do you have any thoughts on if this is something you'd want to support and, if so, how we might go about doing so?
Apologies for the delayed reply. I have been thinking about this and related issues (notably tpapp/TransformVariables.jl#13) for a while. I run into these issues all the time myself, but I do not yet have an easy solution.
I didn't test it, but I am sure that TikTak could be made to work with anything that is an `::AbstractVector`. The problem is the local optimizer: e.g. NLopt calls a foreign library, so at some point the parameters need to be converted to a flat vector of `Float64`s; but if the objective requires labelled or component arrays, they then need to be reconstituted from that flat vector.
The workaround I would suggest at the moment is making a wrapper (a callable struct) that has the information to reconstruct whatever is needed from a flat vector.
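A minimal sketch of that workaround, assuming the parameters can be represented as a `NamedTuple` (the names `FlatWrapper` and `objective` are illustrative, not part of any package's API):

```julia
# Hypothetical callable struct: stores the objective and the parameter
# names, and reconstructs a NamedTuple from the flat vector the local
# optimizer passes in.
struct FlatWrapper{F,K}
    f::F     # objective taking a NamedTuple of parameters
    keys::K  # parameter names, in the order they appear in the flat vector
end

# Reconstitute the labelled parameters, then evaluate the wrapped objective.
function (w::FlatWrapper)(x::AbstractVector)
    θ = NamedTuple{w.keys}(Tuple(x))
    w.f(θ)
end

# usage: the objective is written with θ.a / θ.b access, but the local
# optimizer only ever sees a plain Vector{Float64}
objective(θ) = (θ.a - 1)^2 + (θ.b + 2)^2
wrapped = FlatWrapper(objective, (:a, :b))
wrapped([1.0, -2.0])  # same as objective((a = 1.0, b = -2.0))
```

The same pattern generalizes to LabelledArrays or ComponentArrays: the wrapper just needs to hold whatever axis/label information is required to rebuild the structured array from the flat one.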
https://github.com/SciML/LabelledArrays.jl
https://github.com/jonniedie/ComponentArrays.jl
https://github.com/invenia/ParameterHandling.jl