Is there anything we can do to improve load speeds when typing `using RobustNeuralNetworks` in the REPL? Are we unnecessarily loading large parts of Flux.jl or Zygote.jl that we could get away with ignoring?
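For reference, one way to see where the load time goes is Julia's built-in timing macros (a sketch, run in a fresh REPL session; `@time_imports` requires Julia 1.8+):

```julia
julia> @time using RobustNeuralNetworks   # total load time

julia> @time_imports using RobustNeuralNetworks   # per-dependency breakdown
```

The `@time_imports` breakdown makes it easy to spot heavy transitive dependencies such as Flux.jl or Zygote.jl.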
A lot of the load time seems to come from loading Flux.jl, which we don't actually need. We could load just the essential parts like NNlib.jl and Functors.jl, and write gradient rules with ChainRulesCore.jl, which we'll switch to in #94 anyway.
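Writing gradient rules with ChainRulesCore.jl means defining `rrule` methods directly, so any AD backend (Zygote included) can use them without the package itself depending on Zygote. A minimal sketch, using a hypothetical `myrelu` function for illustration:

```julia
using ChainRulesCore

# Hypothetical example function; not part of RobustNeuralNetworks.
myrelu(x::Real) = max(zero(x), x)

# Custom reverse-mode rule: returns the primal value and a pullback
# that maps the output cotangent ȳ to input cotangents.
function ChainRulesCore.rrule(::typeof(myrelu), x::Real)
    y = myrelu(x)
    myrelu_pullback(ȳ) = (NoTangent(), ȳ * (x > zero(x)))
    return y, myrelu_pullback
end
```

ChainRulesCore.jl is deliberately lightweight, so depending on it adds very little load time compared to a full Zygote.jl dependency.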
We also don't need to load MatrixEquations.jl: the version of `ContractingRENParams` that takes in a linear system is rarely used, and could be added back later as a package extension if required.
On second thought, removing Flux.jl and Zygote.jl sounds like a bad idea: it would require re-implementing utilities like `glorot_normal`, which should have a standard interface across all ML libraries.
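To illustrate the point, even a basic replacement would look something like the sketch below (simplified; Flux's actual `glorot_normal` also handles RNG arguments, a gain factor, and fan-in/fan-out for convolutional weight arrays):

```julia
using Random

# Minimal Glorot/Xavier normal initialiser for a 2-D weight matrix,
# dims = (fan_out, fan_in). Roughly what Flux.glorot_normal provides.
function glorot_normal(rng::AbstractRNG, dims::Integer...)
    std = sqrt(2 / sum(dims))               # 2 / (fan_in + fan_out)
    return randn(rng, Float32, dims...) .* Float32(std)
end
glorot_normal(dims::Integer...) = glorot_normal(Random.default_rng(), dims...)
```

Maintaining such re-implementations and keeping them consistent with the ecosystem's conventions is exactly the duplication we'd want to avoid.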
It also only brings package load time down from 2.4 s to about 1.7 s on my MacBook Pro on Julia v1.9. That's minimal overhead anyway, so let's leave this alone.