functional inverse problem #572
julia> prob = NeuralPDE.discretize(pdesys, discretization)
ERROR: MethodError: no method matching nameof(::Term{Real, Base.ImmutableDict{DataType, Any}})
Closest candidates are:
nameof(::Sym) at C:\Users\Luffy\.julia\packages\SymbolicUtils\vnuIf\src\types.jl:144
nameof(::ModelingToolkit.AbstractSystem) at C:\Users\Luffy\.julia\packages\ModelingToolkit\tMgaW\src\systems\abstractsystem.jl:139
nameof(::DataType) at C:\Users\Luffy\AppData\Local\Programs\Julia-1.7.2\share\julia\base\reflection.jl:223
...
Stacktrace:
[1] (::NeuralPDE.var"#40#41")(argument::Term{Real, Base.ImmutableDict{DataType, Any}})
@ NeuralPDE .\none:0
[2] iterate
@ .\generator.jl:47 [inlined]
[3] collect(itr::Base.Generator{Vector{Term{Real, Base.ImmutableDict{DataType, Any}}}, NeuralPDE.var"#40#41"})
@ Base .\array.jl:724
[4] get_vars(indvars_::Vector{Num}, depvars_::Vector{Num})
@ NeuralPDE C:\Users\Luffy\.julia\packages\NeuralPDE\iNhvg\src\symbolic_utilities.jl:353
[5] symbolic_discretize(pde_system::PDESystem, discretization::PhysicsInformedNN{GridTraining{Float64}, Nothing, Vector{NeuralPDE.Phi{Chain{NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{Dense{true, typeof(tanh_fast), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Dense{true, typeof(tanh_fast), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Dense{true, typeof(identity), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}}}}, NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}}}}}, typeof(NeuralPDE.numeric_derivative), Bool, typeof(additional_loss), Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}})
@ NeuralPDE C:\Users\Luffy\.julia\packages\NeuralPDE\iNhvg\src\discretize.jl:420
[6] discretize(pde_system::PDESystem, discretization::PhysicsInformedNN{GridTraining{Float64}, Nothing, Vector{NeuralPDE.Phi{Chain{NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{Dense{true, typeof(tanh_fast), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Dense{true, typeof(tanh_fast), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}, Dense{true, typeof(identity), typeof(Lux.glorot_uniform), typeof(Lux.zeros32)}}}}, NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}}}}}, typeof(NeuralPDE.numeric_derivative), Bool, typeof(additional_loss), Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}})
@ NeuralPDE C:\Users\Luffy\.julia\packages\NeuralPDE\iNhvg\src\discretize.jl:669
[7] top-level scope
@ REPL[106]:1
[8] top-level scope
@ C:\Users\Luffy\.julia\packages\CUDA\tTK8Y\src\initialization.jl:52
I'm not sure this case will parse. It probably needs a few changes to be supported, and I can't think of a quick workaround either, so I'll say it's unsupported right now. I'm going to post about how the parsing and codegen in this repo should be changed; that would make this easy to support, but it won't happen "soon", so I'd just write the PINN out by hand here if you need it. This is a good thing to target, though, and the code you posted is probably what the front end should be.
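For anyone who lands here, writing the PINN out by hand roughly means building the loss yourself instead of going through `NeuralPDE.discretize`. The sketch below is a minimal, hypothetical example: the toy problem `u'(x) = p(u(x))` with `u(0) = 1`, the network sizes, the finite-difference step, and the plain gradient-descent loop are all illustrative assumptions, not anything from this issue. One network approximates the solution `u` and a second approximates the unknown parameter function `p` as a function of `u`, which is exactly the structure the symbolic front end can't parse yet.

```julia
# Hand-written PINN sketch for an inverse problem where the unknown
# parameter p is a function of the solution u (assumed toy ODE: u' = p(u)).
using Lux, Zygote, ComponentArrays, Random

rng = Random.default_rng()
u_net = Chain(Dense(1, 16, tanh), Dense(16, 1))  # approximates u(x)
p_net = Chain(Dense(1, 16, tanh), Dense(16, 1))  # approximates p(u)

ps_u, st_u = Lux.setup(rng, u_net)
ps_p, st_p = Lux.setup(rng, p_net)
# Pack both parameter sets into one flat vector so we can optimize jointly.
ps = ComponentArray(u = ComponentArray(ps_u), p = ComponentArray(ps_p))

xs = reshape(collect(range(0f0, 1f0; length = 50)), 1, :)
h  = 1f-4  # finite-difference step for u'

function loss(ps)
    u(x) = first(u_net(x, ps.u, st_u))
    p(v) = first(p_net(v, ps.p, st_p))
    ux   = (u(xs .+ h) .- u(xs .- h)) ./ (2h)  # central difference for u'
    res  = ux .- p(u(xs))                      # residual of u' = p(u)
    bc   = u(reshape([0f0], 1, 1)) .- 1f0      # boundary condition u(0) = 1
    sum(abs2, res) + sum(abs2, bc)
end

for i in 1:200
    g = Zygote.gradient(loss, ps)[1]
    ps .-= 1f-2 .* g  # plain gradient descent; swap in Optimisers.jl as needed
end
```

In practice you would add a data-fitting term (the analogue of `additional_loss`) so the measurements actually constrain `p`, and use an optimizer like Adam via Optimisers.jl or Optimization.jl rather than the fixed-step loop above.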
I'll just move on if it's not supported yet.
But yeah, a very good feature to have.
I have an inverse problem where the parameter function is a function of the solution. What is the correct way to implement it?