I'm sure this is expected, but just to track it: loading Flux on a Julia build that includes JuliaLang/julia#36059 isn't possible because of a CUDAnative error:
```
ERROR: LoadError: LoadError: LoadError: UndefVarError: AddrSpacePtr not defined
Stacktrace:
 [1] getproperty(::Module, ::Symbol) at ./Base.jl:26
 [2] top-level scope at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/device/cuda/wmma.jl:52
 [3] include at ./Base.jl:369 [inlined]
 [4] include(::String) at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/CUDAnative.jl:1
 [5] top-level scope at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/device/cuda.jl:15
 [6] include at ./Base.jl:369 [inlined]
 [7] include(::String) at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/CUDAnative.jl:1
 [8] top-level scope at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/CUDAnative.jl:75
 [9] include at ./Base.jl:369 [inlined]
 [10] include_package_for_output(::String, ::Array{String,1}, ::Array{String,1}, ::Array{String,1}, ::Array{Pair{Base.PkgId,UInt64},1}, ::Tuple{UInt64,UInt64}, ::String) at ./loading.jl:1219
 [11] top-level scope at none:1
 [12] eval at ./boot.jl:340 [inlined]
 [13] eval(::Expr) at ./client.jl:446
 [14] top-level scope at none:1
in expression starting at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/device/cuda/wmma.jl:2
in expression starting at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/device/cuda.jl:14
in expression starting at /Users/ian/.julia/packages/CUDAnative/e0IdN/src/CUDAnative.jl:1
```
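For context (my reading of the trace, not something confirmed here): the failure happens inside `getproperty(::Module, ::Symbol)`, i.e. a plain symbol lookup in CUDAnative's wmma.jl, so it can be reproduced without Flux at all. A minimal sketch, assuming the missing binding is the one in Core:

```julia
# Minimal sketch (assumption: CUDAnative's wmma.jl looks up Core.AddrSpacePtr).
# On a Julia build that includes JuliaLang/julia#36059 this lookup fails the same way:
Core.AddrSpacePtr
# ERROR: UndefVarError: AddrSpacePtr not defined
```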
I tried to load CUDA (`using CUDA`) today and got the same error (`ERROR: LoadError: LoadError: LoadError: UndefVarError: AddrSpacePtr not defined`). I am running the latest Julia (v1.5.0) and added the latest CUDA (v0.1.0) on a Linux machine. Is this still expected?
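For reference, a quick way to check which versions actually resolved in the environment (standard Pkg calls; nothing here is specific to this issue, and the package names are just the ones from the report above):

```julia
# Inspect and update the active environment to see which CUDA/Flux versions resolved.
using Pkg
Pkg.status()     # lists resolved versions of CUDA, CUDAnative, Flux, ...
Pkg.update()     # may pull a newer CUDA.jl if the compat bounds allow it
# If an old CUDA.jl keeps being resolved, a clean environment can isolate the conflict:
# Pkg.activate("cuda-test"); Pkg.add("CUDA")
```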
Assuming it'll be fixed when Flux moves to CUDA.jl (FluxML/Flux.jl#1204).
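For what it's worth, a hypothetical sketch of what that should look like from the user side once Flux depends on CUDA.jl; `gpu` is Flux's existing helper, so the surface API shouldn't change:

```julia
# Hypothetical usage sketch once Flux's GPU backend is CUDA.jl (FluxML/Flux.jl#1204).
# `gpu` is a no-op when no functional GPU is available.
using Flux

m = Dense(10, 2) |> gpu          # move the layer's parameters to the GPU
x = rand(Float32, 10) |> gpu     # move the input data to the GPU
y = m(x)                         # forward pass, dispatched to the GPU backend
```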