Question on the use of the Update! method and is_same_except() #212
Comments
Not sure what the problem might be. Can you provide an MWE demonstrating that? Using MLJModelInterface:

```julia
import MLJModelInterface as MMI

mutable struct Classifier <: MMI.Probabilistic
    x::Int
    y::Int
end

model = Classifier(1, 2)
model2 = deepcopy(model)
model2.y = 7
@assert MMI.is_same_except(model, model2, :y)
```

Or, if you suspect some other problem, a more self-contained MWE would be helpful.
For example, using this
gives false, but I have only changed epochs. For a simpler example that does not need LaplaceRedux, consider this:
It's due to the fact that one of the fields has a Flux chain in it. If I remove it, I get true.
Thanks, this helps me see the problem:

```julia
julia> c = Flux.Chain(Dense(2, 3))

julia> c == deepcopy(c)
false
```

Unfortunately, MLJ was not designed with this kind of behaviour in mind for hyperparameter values. This has occurred once before and a hack was introduced, the trait ...

Another possible resolution is for you to explicitly add an overloading ...

In any case, make sure neither ...
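The failure mode can be reproduced without Flux at all: any mutable struct without a custom `==` falls back to identity comparison, so a `deepcopy` compares unequal. Below is a minimal, package-free sketch of a recursive field-by-field comparison; the helper name `_structural_eq` and the `Layer` type are hypothetical stand-ins, not part of any package.

```julia
# A mutable struct, like Flux.Chain, falls back to `===` for `==`,
# so a deepcopy compares unequal.
mutable struct Layer
    W::Matrix{Float64}
end

# Hypothetical helper: compare two values field-by-field, recursively,
# instead of relying on `==` (which is `===` for mutable structs).
function _structural_eq(a, b)
    typeof(a) === typeof(b) || return false
    if fieldcount(typeof(a)) == 0 || a isa AbstractArray || a isa Number
        return a == b
    end
    return all(_structural_eq(getfield(a, f), getfield(b, f))
               for f in fieldnames(typeof(a)))
end

l = Layer([1.0 2.0; 3.0 4.0])
l2 = deepcopy(l)
@assert !(l == l2)            # default `==` is identity for mutable structs
@assert _structural_eq(l, l2) # structural comparison succeeds
```

The recursion matters: comparing only the top-level fields with `==` would hit the same identity fallback one level down.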
Couldn't something like this replace the default is_same_except function?

with a helper function

It should work for every MLJ model that wraps a Flux model.
Great progress. I think your test for equality of Chains is not correct, for it will not behave as expected for nested chains, like ...

I suggest you just overload locally, and we not add complexity to MLJModelInterface for this one corner case. There is probably a more generic way to handle this, maybe by fixing ...
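To see the nested-chain pitfall concretely, here is a package-free sketch; `Chain_` and `Dense_` are mock stand-ins for the Flux types, and `shallow_eq`/`deep_eq` are hypothetical names for the two approaches.

```julia
# Mock stand-ins for Flux types (mutable, so `==` falls back to identity).
mutable struct Dense_; W::Matrix{Float64}; end
mutable struct Chain_; layers::Tuple; end

# Naive test: compare the `layers` fields directly. Tuples compare
# elementwise, but each element still uses identity comparison.
shallow_eq(a::Chain_, b::Chain_) = a.layers == b.layers

# Recursive test: descend into nested chains explicitly.
deep_eq(a, b) = a == b
deep_eq(a::Dense_, b::Dense_) = a.W == b.W
deep_eq(a::Chain_, b::Chain_) = length(a.layers) == length(b.layers) &&
    all(deep_eq(x, y) for (x, y) in zip(a.layers, b.layers))

c = Chain_((Dense_([1.0 2.0]), Chain_((Dense_([3.0 4.0]),))))  # nested chain
c2 = deepcopy(c)
@assert !shallow_eq(c, c2)  # fails: inner structs compared by identity
@assert deep_eq(c, c2)      # recursion reaches the leaf arrays
```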
Indeed, I just found out that the models don't pass the test if optimiser = Adam() is included in the struct. How should I handle this case? Should I always add it to the exceptions?
Can you please provide some more detail? I don't see any problem at my end:

```julia
julia> using Optimisers, Flux

julia> import MLJModelInterface as MMI

julia> model = NeuralNetworkClassifier();

julia> model2 = deepcopy(model);

julia> MMI.is_same_except(model, model2)
true

julia> model2.optimiser = Adam(42)
Adam(42.0, (0.9, 0.999), 1.0e-8)

julia> MMI.is_same_except(model, model2)
false

julia> model.optimiser = Adam(42)
Adam(42.0, (0.9, 0.999), 1.0e-8)

julia> MMI.is_same_except(model, model2)
true
```

Are you perhaps using Flux.jl optimisers instead of Optimisers.jl optimisers?
Yes, I think this is the issue, because
gives me false. Looks like the tutorial I read to write the training loop is outdated: Flux now prefers optimisers from the Optimisers.jl package, but the documentation available online is a confusing mix of old and new rules...
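The difference can likely be reproduced without either package: Optimisers.jl rules are plain immutable structs of numbers, while the legacy Flux optimisers are mutable and so compare by identity. The types below are mocks written under that assumption, not the real `Adam` definitions.

```julia
# Mock of an Optimisers.jl-style rule: immutable, plain numeric field.
struct RuleAdam
    eta::Float64
end

# Mock of a legacy Flux-style optimiser: mutable, so `==` is identity.
mutable struct LegacyAdam
    eta::Float64
end

# Two separately constructed immutable values with the same bits are equal...
@assert RuleAdam(0.001) == RuleAdam(0.001)
# ...but two separately constructed mutable objects are not.
@assert LegacyAdam(0.001) != LegacyAdam(0.001)
```

This is why `is_same_except` passes when the `optimiser` field holds an Optimisers.jl rule but fails when it holds a legacy mutable optimiser.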
Well, MLJFlux now definitely requires only Optimisers.jl optimisers. If any of the MLJ/MLJFlux docs are out of date in this respect, please point them out.
Ah, but it was not the official documentation; it was, I think, a Medium page or something like that. Anyway, I think I have fixed the update loop. If you don't mind, I would like to keep this issue open for a bit longer, just in case I encounter another problem; otherwise I will close it myself. OK? Thank you.
Happy to support your work on an MLJ interface, and thanks for your persistence.
Hi, I was trying to implement the update method for LaplaceRedux but I am having a problem.

This is the model:

This is the fit function that I have written:

And now follows the incomplete update function that I was trying. I have removed the loop part, since it's not important:

The issue is that if I try to rerun the model changing only the number of epochs, is_same_except still gives me
even though :epochs is listed as an exception.

So what is the correct way to implement is_same_except? Thank you.
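As a rough sketch of the pattern in question, an MLJ-style update typically compares the new model against a copy of the old one and warm-restarts when only `epochs` changed. Everything below is a pure-Julia mock for illustration: `MockClassifier`, this `is_same_except`, and the epoch arithmetic are hypothetical, not the MLJModelInterface implementation.

```julia
mutable struct MockClassifier
    epochs::Int
    eta::Float64
end

# Minimal re-implementation: true if m1 and m2 agree on every field
# not listed in `exceptions`.
function is_same_except(m1::T, m2::T, exceptions::Symbol...) where T
    all(f in exceptions || getfield(m1, f) == getfield(m2, f)
        for f in fieldnames(T))
end

function update(model, old_model, old_fitresult)
    if is_same_except(model, old_model, :epochs) && model.epochs >= old_model.epochs
        # warm restart: pretend each extra epoch adds 1 to the fitresult
        return old_fitresult + (model.epochs - old_model.epochs)
    end
    return model.epochs  # cold restart: "retrain" from scratch
end

old = MockClassifier(10, 0.1)
@assert is_same_except(MockClassifier(15, 0.1), old, :epochs)   # only epochs differ
@assert !is_same_except(MockClassifier(15, 0.2), old, :epochs)  # eta differs too
@assert update(MockClassifier(15, 0.1), old, 100) == 105        # warm: 100 + 5
@assert update(MockClassifier(15, 0.2), old, 100) == 15         # cold restart
```

Note that this only behaves as intended when every hyperparameter field compares sensibly with `==`; a field holding a mutable object (such as a Flux chain) will defeat the field-by-field check, which is the problem discussed in this thread.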