There's no longer any need to write separate `gpu` and `cpu` methods for all of our types. As long as they are all functors (declared with `@functor`), Flux.jl handles device transfer for them generically, as outlined here.
With this approach, all fields of `explicit` get loaded onto the GPU, but `Flux.trainable` still returns an empty named tuple, so those fields won't be trained on. This is exactly the behaviour we want.
Note that we already use `@functor ...` for the two base parameter types `DirectRENParams` and `DirectLBDNParams`, so those are fine.
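For reference, a minimal sketch of the pattern described above, assuming Flux's `@functor` and `Flux.trainable` API. `ExplicitParams` here is a hypothetical stand-in type for illustration, not the actual parameter structs in this package:

```julia
using Flux
using Flux: @functor

# Hypothetical stand-in for an explicit-parameter struct: its fields should
# move with gpu()/cpu(), but should NOT receive gradient updates.
struct ExplicitParams
    A::Matrix{Float32}
    b::Vector{Float32}
end

# Declaring the type a functor lets Flux recurse into its fields,
# so gpu(m)/cpu(m) convert A and b automatically.
@functor ExplicitParams

# Returning an empty named tuple hides all fields from the optimiser,
# while gpu/cpu transfer (which uses the functor machinery) still works.
Flux.trainable(::ExplicitParams) = NamedTuple()

m = ExplicitParams(rand(Float32, 2, 2), rand(Float32, 2))
m_gpu = gpu(m)  # a no-op on CPU-only machines; moves fields when a GPU is available
```

This is the split the issue relies on: device movement is driven by `@functor`, while trainability is controlled independently by `trainable`, so the two can disagree on purpose.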
Remaining tasks:

- Add GPU support for all REN and LBDN models, including `SandwichFC`.
- Check whether running GPU tests in the CI is actually possible.