When either `nx = 0` or `nv = 0`, we have zero-dimensional arrays in the REN. This is fine for the forward pass, but raises an error on back-propagation due to inconsistent dimensions. This is related to JuliaLang/julia#28866.

A minimal example is as follows.
```julia
using Flux
using Random
using RobustNeuralNetworks

"""Test that backpropagation runs and parameters change"""
batches = 10
nu, nx, nv, ny = 4, 5, 0, 2
γ = 10

ren_ps = LipschitzRENParams{Float64}(nu, nx, nv, ny, γ)
model = DiffREN(ren_ps)

# Dummy data
us = randn(nu, batches)
ys = randn(ny, batches)
data = [(us[:, k], ys[:, k]) for k in 1:batches]

# Dummy loss function just for testing
function loss(m, u, y)
    x0 = init_states(m, size(u, 2))
    x1, y1 = m(x0, u)
    return Flux.mse(y1, y) + sum(x1 .^ 2)
end

# Debug batch updates
opt_state = Flux.setup(Adam(0.01), model)
gs = Flux.gradient(loss, model, us, ys)
Flux.update!(opt_state, model, gs[1])
```
The error is caused by broadcasting addition with bias vectors in the evaluation of the REN. We need to find a work-around.
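One possible work-around (a sketch only, not a fix adopted by the library) is to guard the bias broadcast so it is skipped entirely when the bias vector is empty. The helper `add_bias` below is hypothetical, introduced just to illustrate the idea on plain arrays:

```julia
using Flux  # already a dependency of the example above

# Hypothetical guard: skip the broadcast when the bias is empty
# (e.g. nv = 0), so the pullback never sees a zero-dimensional
# broadcast of the kind that triggers the error.
add_bias(x, b) = isempty(b) ? x : x .+ b

x = randn(0, 10)   # zero-row activations, as produced when nv = 0
b = randn(0)

# With the guard in place, the forward pass and the gradient both run
g = Flux.gradient(v -> sum(add_bias(x, v)), b)
```

Whether this is acceptable inside the REN evaluation depends on how hot the code path is; an alternative would be to special-case `nx = 0` / `nv = 0` at construction time instead.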