Feed known DE in neural DE #122
So, I figured the problem doesn't really have anything to do with feeding a known series into the neural DE.

I know this is a lot to read, but any help would be very much appreciated. This first part (the setup) is fine.

So, as mentioned above, the first problem has to do with using … I can easily solve this by wrapping … The second, more important, problem is that if I try to use …

By the way, I'm on Julia 1.3, and packages are: [email protected], [email protected], [email protected], [email protected].
Hey,
The adjoint right now requires using in-place differential equations (Xref: SciML/SciMLSensitivity.jl#113). This can be fixed, but it's an extreme edge case where you want to use a Number or StaticArray in something where it makes sense to use an adjoint. That doesn't mean it doesn't make sense, but that's why it hasn't been prioritized. The fact that … The second one... I gotta run it to see.
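For readers unfamiliar with the distinction, here is a minimal sketch of the two ODE definition styles being contrasted; the function names and dynamics are illustrative, not from the thread:

```julia
using OrdinaryDiffEq

# Out-of-place form: returns du. Convenient for scalars and StaticArrays,
# but (per the comment above) not supported by the adjoint at the time.
f_oop(u, p, t) = 1.01f0 .* u

# In-place form: writes the derivative into a preallocated buffer du.
# This is the form the adjoint requires.
function f_iip!(du, u, p, t)
    du .= 1.01f0 .* u
    return nothing
end

prob = ODEProblem(f_iip!, [0.5f0], (0f0, 1f0))
sol = solve(prob, Tsit5())
```

The in-place form avoids allocating a new derivative array at every step, which is also why it is generally preferred for larger systems.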
Tracker just has funny semantics sometimes. That's why we are trying to phase out its use, though right now you do indeed need it for SDEs. You just have to be careful that scalar indexing produces a …

```julia
using OrdinaryDiffEq, DiffEqFlux, DiffEqSensitivity, Flux

# Problem setup
u0 = Float32(1/2)
datasize = 30
t_span = (t_min, t_max) = (0f0, 1f0)
t_range = range(t_min, t_max, length = datasize)
dt = (t_max - t_min)/datasize

# True dynamics and training data
true_ode(u, p, t) = Float32(1.01)*u
prob = ODEProblem{false}(true_ode, u0, t_span)
sol = solve(prob, Tsit5(); dense = true)
sim = sol(t_range)

# Neural network and its flattened parameter vector
nn = Chain(Dense(1, 16, softplus),
           Dense(16, 1))
p_nn, re_nn = Flux.destructure(nn)

function odeN(par)
    function dudt(u, p, t)
        return re_nn(p)(u)
    end
    prob = ODEProblem{false}(dudt, u0, t_span, par)
    return Array(concrete_solve(prob, Euler(), [u0], par; sensealg = TrackerAdjoint(), dt = dt, saveat = t_range))
end

loss = θ -> sum(abs2, sim .- odeN(θ)[:])
l = loss(p_nn)
cb = (θ, l) -> (println("Current loss: ", l); false)
loss(p_nn)
DiffEqFlux.sciml_train(loss, p_nn, ADAM(0.1), maxiters = 300, cb = cb)
```

Let me know if you need any more help. Now to go implement SDE adjoints so we can stop recommending TrackerAdjoint 👍
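To make the "funny semantics" of scalar indexing concrete, here is a small illustrative snippet (my own, not from the thread) showing why `Tracker.collect` comes up:

```julia
using Tracker

v = Tracker.param([1.0f0, 2.0f0])

# Scalar indexing into a TrackedArray yields a tracked scalar,
# not a plain Float32, so naive array construction from those
# scalars can silently break gradient tracking.
x = v[1]

# Tracker.collect rebuilds an array from tracked scalars while
# keeping the gradient information intact.
w = Tracker.collect([v[1], v[2]])
```

This is the kind of care the comment above is asking for when mixing scalar indexing with `TrackerAdjoint()`.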
First of all, thank you for all of the work that went into this!

I've been trying to get the gist of DiffEqFlux, and I'm having a hard time solving a seemingly trivial problem: I'm trying to solve a neural SDE in which one of the inputs of the neural network is the solution to a given DE. Basically I have two series (Z, K), generated jointly, where Z is independent of K. What I want is for the neural network to learn the evolution of K given that it knows exactly the evolution of Z.

In the following MWE I'll be using `sensealg = TrackerAdjoint()`, because the actual problem is for an SDE which, if I understood correctly from the docs, leaves me only that option. However, this gives me the following `MethodError` (I leave out the stacktrace, which is very unwieldy): …

I tried playing around with `Tracker.collect`, or even with different `sensealg`s, but different errors pop up at different times and I'm not sure what I'm doing wrong and where.
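For context, the pattern being asked about (feeding a known series into a neural DE) can be sketched as follows. This is a minimal illustration assuming the known series Z is available as a dense ODE solution that can be interpolated at any t; all names and dynamics here are hypothetical:

```julia
using OrdinaryDiffEq, DiffEqFlux, Flux

# Known dynamics for Z, solved densely so Z(t) can be interpolated.
z_ode(z, p, t) = -0.5f0 * z
z_prob = ODEProblem{false}(z_ode, 1f0, (0f0, 1f0))
Z = solve(z_prob, Tsit5(); dense = true)

# Neural network that sees both the state K and the known input Z(t).
nn = Chain(Dense(2, 16, softplus), Dense(16, 1))
p_nn, re_nn = Flux.destructure(nn)

# dK/dt is learned as a function of (K, Z(t)); the output shape
# matches the 1-element state vector.
function dKdt(u, p, t)
    return re_nn(p)([u[1], Z(t)])
end

k_prob = ODEProblem{false}(dKdt, [0.5f0], (0f0, 1f0), p_nn)
sol = solve(k_prob, Tsit5(); saveat = 0.1f0)
```

The key design choice is interpolating the dense solution `Z` inside the derivative function rather than passing the raw sampled series, so the solver can query Z at whatever time points it needs.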