It hasn't been "banned"; it's just deprecated, because that functionality became a fully documented package of its own, Optimization.jl:
┌ Warning: sciml_train is being deprecated in favor of direct usage of Optimization.jl. Please consult the Optimization.jl
documentation for more details. Optimization.jl's PolyOpt solver is the polyalgorithm of sciml_train
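Not part of the original thread, but for anyone landing here: a minimal migration sketch from `sciml_train` to Optimization.jl, assuming a scalar-valued `loss(p)`, an initial parameter vector `p0`, and a `callback` function as in the linked post (those names are illustrative; the API calls are Optimization.jl's):

```julia
using Optimization, OptimizationOptimisers

# What used to be `DiffEqFlux.sciml_train(loss, p0, opt; cb = ..., maxiters = ...)`
# is now an explicit OptimizationFunction + OptimizationProblem:
adtype  = Optimization.AutoZygote()                       # choose the AD backend
optf    = OptimizationFunction((p, _) -> loss(p), adtype) # second argument is unused hyperparameters
optprob = OptimizationProblem(optf, p0)

# Note the keyword is `callback`, no longer `cb`.
result = solve(optprob, OptimizationOptimisers.Adam(0.05);
               maxiters = 300, callback = callback)
result.u   # the optimized parameters
```

If you were using `lower_bounds`/`upper_bounds` from the post, my understanding is that they become `lb`/`ub` keyword arguments of `OptimizationProblem` and require a solver that supports box constraints; plain Adam does not.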
I am a total newbie to this package...
I tried to run the code in this link: https://sebastiancallh.github.io/post/neural-ode-weather-forecast/
Of course, I changed some of the code to fit the new version of Julia.
Question: has sciml_train been banned?
In this part, I am not sure whether the lower_bounds and upper_bounds parameters are wrong.
Here is my code after the changes:
Here are the error messages:
┌ Warning: sciml_train is being deprecated in favor of direct usage of Optimization.jl. Please consult the Optimization.jl documentation for more details. Optimization.jl's PolyOpt solver is the polyalgorithm of sciml_train
└ @ DiffEqFlux C:\Users\a li\.julia\packages\DiffEqFlux\Em1Aj\src\train.jl:6
ERROR: MethodError: no method matching length(::Nothing)
Closest candidates are:
length(::Union{Base.KeySet, Base.ValueIterator}) at abstractdict.jl:58
length(::CUDA.CUSPARSE.CuSparseMatrix) at C:\Users\a li\.julia\packages\CUDA\DfvRa\lib\cusparse\array.jl:223
length(::Union{NNlib.BatchedAdjoint{T, S}, NNlib.BatchedTranspose{T, S}} where {T, S}) at C:\Users\a li\.julia\packages\NNlib\0QnJJ\src\batched\batchedadjtrans.jl:58
...
Stacktrace:
[1] sciml_train(::var"#loss#53"{Matrix{Float64}, var"#predict#52"{NeuralODE{Lux.Chain{NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{Lux.Dense{true, typeof(swish), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Lux.Dense{true, typeof(swish), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Lux.Dense{true, typeof(identity), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}}}}, Nothing, Nothing, Tuple{Float64, Float64}, Tuple{Tsit5{typeof(OrdinaryDiffEq.trivial_limiter!), typeof(OrdinaryDiffEq.trivial_limiter!), Static.False}}, Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol, Symbol}, NamedTuple{(:saveat, :abstol, :reltol), Tuple{Vector{Float64}, Float64, Float64}}}}, Vector{Float64}}}, ::Nothing, ::Flux.Optimise.Optimiser, ::Nothing; lower_bounds::Vector{Float64}, upper_bounds::Vector{Float64}, cb::Function, callback::Function, maxiters::Int64, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ DiffEqFlux C:\Users\a li\.julia\packages\DiffEqFlux\Em1Aj\src\train.jl:9
[2] train_one_round(node::NeuralODE{Lux.Chain{NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{Lux.Dense{true, typeof(swish), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Lux.Dense{true, typeof(swish), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Lux.Dense{true, typeof(identity), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}}}}, Nothing, Nothing, Tuple{Float64, Float64}, Tuple{Tsit5{typeof(OrdinaryDiffEq.trivial_limiter!), typeof(OrdinaryDiffEq.trivial_limiter!), Static.False}}, Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol, Symbol}, NamedTuple{(:saveat, :abstol, :reltol), Tuple{Vector{Float64}, Float64, Float64}}}}, θ::Nothing, y::Matrix{Float64}, opt::Flux.Optimise.Optimiser, maxiters::Int64, y0::Vector{Float64}; kwargs::Base.Pairs{Symbol, var"#28#30"{Vector{Any}, Vector{Any}}, Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#28#30"{Vector{Any}, Vector{Any}}}}})
@ Main .\REPL[59]:10
[3] train(θ::Nothing, maxiters::Int64, lr::Float64)
@ Main .\REPL[38]:13
[4] train()
@ Main .\REPL[38]:2
[5] top-level scope
@ REPL[60]:1
[6] top-level scope
@ C:\Users\a li\.julia\packages\CUDA\DfvRa\src\initialization.jl:52
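A guess at the root cause of the error itself, beyond the deprecation warning: the stack trace shows `θ::Nothing` being passed into training, and `length(::Nothing)` is exactly what fails. Lux layers, unlike the old Flux ones in the original post, do not store their parameters, so they must be created explicitly. A sketch, assuming `node` is the NeuralODE built in the post:

```julia
using Lux, ComponentArrays, Random

rng = Random.default_rng()
# Lux models are stateless: Lux.setup creates the parameters and state
# that in the old Flux-based tutorial lived inside the layers themselves.
p, st = Lux.setup(rng, node)
p = ComponentArray(p)   # flatten to a vector the optimizer can work with

# Then pass this `p` as θ instead of `nothing`, e.g.
# train_one_round(node, p, y, opt, maxiters, y0; kwargs...)
```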