
added additional loss against data for NNODE #666

Merged (53 commits) on Apr 4, 2023

Conversation

AstitvaAggarwal (Contributor)

Added an option for an additional loss against data for NNODE (used PhysicsInformedNN as a reference). Work in progress: I'm getting some errors and would greatly appreciate any help. Thanks.
(Trying to solve issue #640, but NNODE doesn't allow additional losses against surrogate model predictions, nor does it allow parameter estimation, so this PR could help solve such ODE problems.)
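
For readers skimming the thread, here is a minimal sketch of how the proposed option could be used. The additional_loss(phi, θ) signature is the one used in this PR; the data, network, and optimizer below are made-up placeholders.

```julia
using NeuralPDE, Lux, OptimizationOptimisers

# placeholder "measurements" of the ODE solution at times t_
t_ = collect(0.0:0.1:1.0)
u_ = exp.(-t_)

# extra data-fitting term, mirroring the docstring example added in this PR
function additional_loss(phi, θ)
    return sum(sum(abs2, [phi(t, θ) for t in t_] .- u_)) / length(u_)
end

chain = Lux.Chain(Lux.Dense(1, 16, tanh), Lux.Dense(16, 1))
alg = NNODE(chain, OptimizationOptimisers.Adam(0.05); additional_loss = additional_loss)
```

The resulting alg is then passed to solve for an ODEProblem as with any other NNODE configuration.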

@xtalax (Member) commented Mar 30, 2023

This looks good; can you write some tests to ensure it's working? We need to test with each strategy. We will need to do documentation too. Where in the paper is this additional loss mentioned?

@AstitvaAggarwal (Contributor, Author) commented Mar 30, 2023

In the paper, the loss has additional terms L_gls and L_constr (introduction, eqs. 3 to 8). L_gls is defined against cell density predictions from a surrogate MLP that takes the measured data u at (x, t); L_constr performs parameter estimation for the growth, time delay, and diffusion terms (the parameter functions are functions of the solution itself, which leads to issue #572).
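
Loosely, in equation form (a hedged restatement of the comment above, not the paper's exact notation: L_physics stands for the usual PINN residual loss, and a plain mean-squared data misfit stands in for the paper's generalized L_gls term):

```latex
L_{\mathrm{total}}(\theta) = L_{\mathrm{physics}}(\theta) + L_{\mathrm{gls}}(\theta) + L_{\mathrm{constr}}(\theta),
\qquad
L_{\mathrm{gls}}(\theta) \approx \frac{1}{N} \sum_{i=1}^{N} \left( u_\theta(x_i, t_i) - u_i \right)^2
```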

@xtalax (Member) commented Mar 30, 2023

Ah I see it, yep this looks like it implements what's missing, good job 👍

xtalax linked an issue Mar 30, 2023 that may be closed by this pull request
AstitvaAggarwal marked this pull request as ready for review April 2, 2023 08:54
@ChrisRackauckas (Member)

The test failure looks real.

src/ode_solve.jl Outdated
Comment on lines 298 to 311
function generate_loss(strategy::WeightedIntervalTraining, phi, f, autodiff::Bool, tspan, p,
                       batch)
    minT = tspan[1]
    maxT = tspan[2]

    # normalize the user-supplied interval weights
    weights = strategy.weights ./ sum(strategy.weights)

    N = length(weights)
    samples = strategy.samples

    # width of each of the N equal sub-intervals of tspan
    difference = (maxT - minT) / N

    data = Float64[]
    for (index, item) in enumerate(weights)
        # draw ~samples * item points uniformly within the index-th sub-interval
        temp_data = rand(1, trunc(Int, samples * item)) .* difference .+ minT .+
Member

Rebase issue? Rebase to the new master.
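
As context for the excerpt above: it splits tspan into length(weights) equal sub-intervals and draws roughly samples * weights[i] random points from the i-th one. A small standalone sketch of that arithmetic (the concrete weights and sample count are made up for illustration):

```julia
weights = [0.7, 0.2, 0.1]    # relative emphasis per sub-interval
samples = 200                # total number of training points requested

w = weights ./ sum(weights)                                     # normalized, as in the excerpt
points_per_interval = [trunc(Int, samples * wi) for wi in w]    # roughly [140, 40, 20]
```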

@ChrisRackauckas (Member)

Rebase this onto the new master so that it's not dependent on any other PR (it's now merged).

(user defined function); also now has a common OptimizationFunction object definition (instead of one for each TrainingStrategy). Also updated docs (might need further editing).
Future scope: we can add weighted loss in NNODE, and even losses for parameter estimation for inverse problems.
src/ode_solve.jl Outdated
Comment on lines 421 to 426
# additional loss
additional_loss = alg.additional_loss

# Creates OptimizationFunction Object from total_loss
function total_loss(θ, _)
    L2_loss = generate_loss(strategy, phi, f, autodiff, tspan, p, batch)(θ, phi)
Member
Suggested change
-    # additional loss
-    additional_loss = alg.additional_loss
-    # Creates OptimizationFunction Object from total_loss
-    function total_loss(θ, _)
-        L2_loss = generate_loss(strategy, phi, f, autodiff, tspan, p, batch)(θ, phi)
+    inner_f = generate_loss(strategy, phi, f, autodiff, tspan, p, batch)
+    additional_loss = alg.additional_loss
+    # Creates OptimizationFunction Object from total_loss
+    function total_loss(θ, _)
+        L2_loss = inner_f(θ, phi)
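
For readers following the review, a hedged sketch of how the hoisted inner_f and the optional additional loss could fit together inside total_loss. The nothing-check and the way the two terms are summed are assumptions for illustration, not necessarily the exact code as merged.

```julia
inner_f = generate_loss(strategy, phi, f, autodiff, tspan, p, batch)
additional_loss = alg.additional_loss

# total objective handed to Optimization.jl: physics residual plus optional data term
function total_loss(θ, _)
    L2_loss = inner_f(θ, phi)
    if additional_loss !== nothing
        L2_loss = L2_loss + additional_loss(phi, θ)
    end
    return L2_loss
end
```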

src/ode_solve.jl Outdated
Comment on lines 434 to 445
opt_algo = if strategy isa QuadratureTraining
    Optimization.AutoForwardDiff()
elseif strategy isa StochasticTraining
    Optimization.AutoZygote()
elseif strategy isa WeightedIntervalTraining
    Optimization.AutoZygote()
else
    # by default GridTraining choice of Optimization
    # if adding new training algorithms we can extend this
    # if-elseif-else block for choices of optimization algos
    Optimization.AutoZygote()
end
Member
Suggested change
-opt_algo = if strategy isa QuadratureTraining
-    Optimization.AutoForwardDiff()
-elseif strategy isa StochasticTraining
-    Optimization.AutoZygote()
-elseif strategy isa WeightedIntervalTraining
-    Optimization.AutoZygote()
-else
-    # by default GridTraining choice of Optimization
-    # if adding new training algorithms we can extend this
-    # if-elseif-else block for choices of optimization algos
-    Optimization.AutoZygote()
-end
+opt_algo = if strategy isa QuadratureTraining
+    Optimization.AutoForwardDiff()
+else
+    Optimization.AutoZygote()
+end
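
A hedged sketch of how the simplified opt_algo would typically feed into Optimization.jl. OptimizationFunction(f, adtype) is standard Optimization.jl usage, but the exact call site in this PR may differ; init_params, opt, and maxiters below stand in for the solver's actual arguments.

```julia
opt_algo = strategy isa QuadratureTraining ? Optimization.AutoForwardDiff() :
           Optimization.AutoZygote()

optf = Optimization.OptimizationFunction(total_loss, opt_algo)
optprob = Optimization.OptimizationProblem(optf, init_params)
res = Optimization.solve(optprob, opt; maxiters = maxiters)
```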

@test sol.errors[:l2] < 0.5
@test sol.errors[:l2]
Member

?

Contributor Author

typo my bad

src/ode_solve.jl Outdated
@@ -3,7 +3,7 @@ abstract type NeuralPDEAlgorithm <: DiffEqBase.AbstractODEAlgorithm end
 """
 ```julia
 NNODE(chain, opt=OptimizationPolyalgorithms.PolyOpt(), init_params = nothing;
-      autodiff=false, batch=0, kwargs...)
+      autodiff=false, batch=0,additional_loss=nothing,kwargs...)
Member

Suggested change
-      autodiff=false, batch=0,additional_loss=nothing,kwargs...)
+      autodiff=false, batch=0,additional_loss=nothing,
+      kwargs...)

src/ode_solve.jl Outdated
Comment on lines 28 to 33
example:
ts = [t for t in 1:100]
(u_, t_) = (analytical_func(ts), ts)
function additional_loss(phi, θ)
    return sum(sum(abs2, [phi(t, θ) for t in t_] .- u_)) / length(u_)
end
Member

This isn't in the right spot. Make an example section.

Contributor Author

Under this argument's description, or along with the lower example?
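
One possible shape for such an example section (a hedged, self-contained sketch; the toy problem, network size, and optimizer are illustrative choices, not the docstring as merged):

```julia
using NeuralPDE, OrdinaryDiffEq, Lux, OptimizationOptimisers

# toy ODE: u' = cos(2πt), u(0) = 0, with analytic solution u(t) = sin(2πt) / (2π)
linear(u, p, t) = cos(2pi * t)
prob = ODEProblem(linear, 0.0, (0.0, 1.0))

# data for the additional loss to fit against (sampled here from the analytic solution)
ts = collect(0.0:0.01:1.0)
(u_, t_) = (sin.(2pi .* ts) ./ (2pi), ts)
function additional_loss(phi, θ)
    return sum(sum(abs2, [phi(t, θ) for t in t_] .- u_)) / length(u_)
end

chain = Lux.Chain(Lux.Dense(1, 16, tanh), Lux.Dense(16, 1))
alg = NNODE(chain, OptimizationOptimisers.Adam(0.05); additional_loss = additional_loss)
sol = solve(prob, alg, verbose = false, maxiters = 2000)
```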

src/ode_solve.jl Outdated
Comment on lines 277 to 278

return loss
Member

Suggested change (removes the blank line before the return):
-
-    return loss
+    return loss

src/ode_solve.jl Outdated (resolved)
src/ode_solve.jl Outdated
Comment on lines 293 to 294

return loss
Member

Suggested change (removes the blank line before the return):
-
-    return loss
+    return loss

ChrisRackauckas merged commit 5e506c7 into SciML:master Apr 4, 2023

Successfully merging this pull request may close these issues.

Recreate the examples from the BINN paper with NNODE
3 participants