
Fix 173 #341
Merged 33 commits, Nov 16, 2017
219b9f4
Initial support for infix ~ (#173).
yebai Aug 28, 2017
cb5b054
Merge branch 'master' of github.com:yebai/Turing.jl
yebai Aug 28, 2017
ffb9407
separate model wrapper call in sample.jl test
xukai92 Oct 3, 2017
2a9466d
update ForwardDiff.Dual signature
xukai92 Oct 3, 2017
1431b24
Merge branch 'master' into Fix-173
yebai Oct 3, 2017
d54e9da
Merge branch 'Fix-173' of github.com:yebai/Turing.jl into Fix-173
yebai Oct 3, 2017
008f41e
sample.jl test passed
xukai92 Oct 4, 2017
80df8fe
fix some tests for Julia 0.6
xukai92 Oct 4, 2017
74c7c5f
remove typealias for 0.6
xukai92 Oct 4, 2017
6d7feac
make some functions 0.6-ish
xukai92 Oct 4, 2017
b14035e
make abstract 0.6-ish
xukai92 Oct 4, 2017
551f122
Merge branch 'Fix-173' of https://github.com/yebai/Turing.jl into Fix…
xukai92 Oct 4, 2017
e038605
change some deprecated functions
xukai92 Oct 4, 2017
80b4db6
Update .travis.yml
yebai Oct 5, 2017
ac74f77
Update appveyor.yml
yebai Oct 5, 2017
06520fc
Update appveyor.yml
yebai Oct 5, 2017
1d8139f
fix test
xukai92 Oct 5, 2017
8c74ec6
Deprecations on package loading fixed
xukai92 Oct 9, 2017
d976b45
fix deprecations
xukai92 Oct 9, 2017
aecd4e7
implement callbacks for inner function
xukai92 Oct 10, 2017
f0fa67a
fix model type bug
xukai92 Oct 19, 2017
ced0eb8
Fix type
xukai92 Oct 22, 2017
1776c1f
update Dual in benchmark
xukai92 Oct 23, 2017
7401353
update Dual constructor
xukai92 Oct 23, 2017
4f461c0
Bump up required Julia version to 0.6
yebai Oct 24, 2017
36eeb26
Disable deprecated warning messages for `consume/produce`.
yebai Nov 9, 2017
d8b7ed0
Remove duplicate definition of produce.
yebai Nov 9, 2017
848e36a
Merge branch 'master' into Fix-173
yebai Nov 12, 2017
841df5c
fix floor, tanh, abs, log
xukai92 Nov 15, 2017
b895169
fix logpdf warning and bug
xukai92 Nov 15, 2017
a7bf48a
fix vec assume init
xukai92 Nov 15, 2017
fa57fc2
Merge branch 'master' into Fix-173
yebai Nov 16, 2017
3fd77af
Travis: Allow `Benchmarking` test to fail.
yebai Nov 16, 2017
3 changes: 2 additions & 1 deletion .travis.yml
@@ -8,7 +8,7 @@ addons:
- g++-5
language: julia
julia:
- 0.5
- 0.6
os:
- linux
- osx
@@ -36,6 +36,7 @@ matrix:
allow_failures:
- env: GROUP=Test
os: osx
- env: GROUP=Bench
- env: GROUP=LDA
- env: GROUP=MOC
- env: GROUP=SV
2 changes: 1 addition & 1 deletion REQUIRE
@@ -1,4 +1,4 @@
julia 0.5
julia 0.6

Stan
Distributions 0.11.0
4 changes: 2 additions & 2 deletions appveyor.yml
@@ -1,11 +1,11 @@
environment:
matrix:
- JULIA_URL: "https://julialang-s3.julialang.org/bin/winnt/x64/0.5/julia-0.5.0-win64.exe"
- JULIA_URL: "https://julialang-s3.julialang.org/bin/winnt/x64/0.6/julia-0.6.0-win64.exe"
MINGW_DIR: mingw64
# MINGW_URL: https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/5.3.0/threads-win32/seh/x86_64-5.3.0-release-win32-seh-rt_v4-rev0.7z/download
MINGW_URL: http://mlg.eng.cam.ac.uk/hong/x86_64-5.3.0-release-win32-seh-rt_v4-rev0.7z
MINGW_ARCHIVE: x86_64-5.3.0-release-win32-seh-rt_v4-rev0.7z
- JULIA_URL: "https://julialang-s3.julialang.org/bin/winnt/x86/0.5/julia-0.5.0-win32.exe"
- JULIA_URL: "https://julialang-s3.julialang.org/bin/winnt/x86/0.6/julia-0.6.0-win32.exe"
MINGW_DIR: mingw32
# MINGW_URL: https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win32/Personal%20Builds/mingw-builds/5.3.0/threads-win32/dwarf/i686-5.3.0-release-win32-dwarf-rt_v4-rev0.7z/download
MINGW_URL: http://mlg.eng.cam.ac.uk/hong/i686-5.3.0-release-win32-dwarf-rt_v4-rev0.7z
4 changes: 2 additions & 2 deletions benchmarks/optimization.jl
@@ -290,7 +290,7 @@ optRes *= "realpart(): \n"

using ForwardDiff: Dual

ds = [Dual{10,Float64}(rand()) for i = 1:1000]
ds = [Dual{Void,Float64,10}(rand()) for i = 1:1000]

t_map = @elapsed for i = 1:1000 map(d -> d.value, ds) end
t_list = @elapsed for i = 1:1000 Float64[ds[i].value for i = 1:length(ds)] end
@@ -304,7 +304,7 @@ optRes *= "Constructing Dual numbers: \n"

dps = zeros(44); dps[11] = 1;

t_dualnumbers = @elapsed for _ = 1:(44*2000*5) ForwardDiff.Dual(1.1, dps...) end
t_dualnumbers = @elapsed for _ = 1:(44*2000*5) ForwardDiff.Dual{Void, Float64, 44}(1.1, dps) end

optRes *= "44*2000*5 times: $t_dualnumbers\n"

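The benchmark edits above track ForwardDiff's new type-parameter order: the old `Dual{N,V}` (chunk size, then value type) became `Dual{Tag,V,N}`, with a tag parameter first — `Void` here is Julia 0.6's name for what is now `Nothing`. A minimal sketch of the new layout, written against a current ForwardDiff release rather than the 2017 one this PR targets:

```julia
using ForwardDiff: Dual, Partials, value, partials

# Dual{Tag, ValueType, ChunkSize}: the tag parameter comes first
# in the new signature (Nothing plays the role Void played on 0.6).
d = Dual{Nothing,Float64,2}(1.5, Partials((1.0, 0.0)))

value(d)     # real part: 1.5
partials(d)  # seed vector: (1.0, 0.0)
```

The tag slot exists so ForwardDiff can detect perturbation confusion in nested differentiation; code like this benchmark that builds duals by hand just passes the empty tag.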
2 changes: 1 addition & 1 deletion example-models/nips-2017/gmm.model.jl
@@ -28,4 +28,4 @@ p = [ 0.2, 0.2, 0.2, 0.2, 0.2]
# μ = [ 0, 1, 2, 3.5, 4.25] + 0.5 * collect(0:4)
μ = [ 0, 1, 2, 3.5, 4.25] + 2.5 * collect(0:4)
s = [-0.5, -1.5, -0.75, -2, -0.5]
σ = exp(s)
σ = exp.(s)
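Several of the example-model edits in this PR are the same mechanical change: Julia 0.6 deprecated calling scalar functions such as `exp` directly on arrays, so `exp(s)` becomes the broadcast form `exp.(s)`. A small sketch:

```julia
s = [-0.5, -1.5, -0.75]

# The dot applies exp elementwise; exp(s) on an array is deprecated in 0.6.
σ = exp.(s)

# On a scalar the dot is harmless: exp.(x) and exp(x) agree,
# which is why the sv.model.jl lines below broadcast over h[t] / 2.
σ[1] == exp(-0.5)   # true
```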
4 changes: 2 additions & 2 deletions example-models/nips-2017/sv.model.jl
@@ -5,9 +5,9 @@
μ ~ Cauchy(0, 10)
h = tzeros(Real, T)
h[1] ~ Normal(μ, σ / sqrt(1 - ϕ^2))
y[1] ~ Normal(0, exp(h[1] / 2))
y[1] ~ Normal(0, exp.(h[1] / 2))
for t = 2:T
h[t] ~ Normal(μ + ϕ * (h[t-1] - μ) , σ)
y[t] ~ Normal(0, exp(h[t] / 2))
y[t] ~ Normal(0, exp.(h[t] / 2))
end
end
2 changes: 1 addition & 1 deletion example-models/nips-2017/sv.sim.jl
@@ -16,7 +16,7 @@ for t in 2:T
end
y = Vector{Float64}(T);
for t in 1:T
y[t] = rand(Normal(0, exp(h[t] / 2)));
y[t] = rand(Normal(0, exp.(h[t] / 2)));
end


2 changes: 1 addition & 1 deletion example-models/nuts-paper/lr_helper.jl
@@ -7,7 +7,7 @@ readlrdata() = begin
open("lr_nuts.data") do f
while !eof(f)
raw_line = readline(f)
data_str = filter(str -> length(str) > 0, split(raw_line, r"[ ]+")[1:end-1])
data_str = Iterators.filter(str -> length(str) > 0, split(raw_line, r"[ ]+")[1:end-1])
data = map(str -> parse(str), data_str)
x = cat(1, x, data[1:end-1]')
y = cat(1, y, data[end] - 1) # turn {1, 2} to {0, 1}
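The switch from `filter` to `Iterators.filter` reflects another 0.6 deprecation: the generic `filter` over arbitrary iterators moved into the `Iterators` module and became lazy, so callers that need an actual array must `collect` the result — which is exactly what the `io.jl` changes in this PR do. Sketch:

```julia
fields = ["12", "", "7", ""]

# Iterators.filter is lazy: it yields elements on demand
# instead of allocating a filtered array up front.
lazy = Iterators.filter(str -> length(str) > 0, fields)

# Materialize with collect when an indexable array is required.
nonempty = collect(lazy)   # ["12", "7"]
```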
2 changes: 1 addition & 1 deletion example-models/nuts-paper/sv_nuts.jl
@@ -14,7 +14,7 @@ y = readsvdata()
s[1] ~ Exponential(1/100)
for i = 2:N
s[i] ~ Normal(log(s[i-1]), τ)
s[i] = exp(s[i])
s[i] = exp.(s[i])
dy = log(y[i] / y[i-1]) / s[i]
dy ~ TDist(ν)
end
2 changes: 1 addition & 1 deletion example-models/sgld-paper/lr_helper.jl
@@ -8,7 +8,7 @@ readlrdata() = begin
open("a9a2000.data") do f
while !eof(f)
raw_line = readline(f)
data_str = filter(str -> length(str) > 0, split(raw_line, r"[ ]+")[1:end-1])
data_str = Iterators.filter(str -> length(str) > 0, split(raw_line, r"[ ]+")[1:end-1])
data = map(str -> parse(str), data_str)
x_tmp = zeros(Int32, d)
x_tmp[data[2:end]] = 1
4 changes: 2 additions & 2 deletions example-models/stan-models/normal-mixture.model.jl
@@ -8,7 +8,7 @@ using ForwardDiff: Dual

theta ~ Uniform(0, 1)

mu = tzeros(Dual, 2)
mu = Vector{Real}(2) # mu is sampled by HMC
for i = 1:2
mu[i] ~ Normal(0, 10)
end
@@ -25,6 +25,6 @@ using ForwardDiff: Dual
# logtheta_p = map(yᵢ -> [log(theta) + logpdf(Normal(mu[1], 1.0), yᵢ), log(1 - theta) + logpdf(Normal(mu[2], 1.0), yᵢ)], y)
# map!(logtheta_pᵢ -> logtheta_pᵢ - logsumexp(logtheta_pᵢ), logtheta_p) # normalization
# for i = 1:N
# k[i] ~ Categorical(exp(logtheta_p[i]))
# k[i] ~ Categorical(exp.(logtheta_p[i]))
# end
end
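Replacing `tzeros(Dual, 2)` with `Vector{Real}(2)` gives the HMC sampler an abstractly typed container: the same slots can hold plain `Float64` values on an ordinary pass and `ForwardDiff.Dual` numbers on a gradient pass. A sketch of the idea — note `Vector{Real}(2)` is 0.6 syntax, which current Julia spells `Vector{Real}(undef, 2)`:

```julia
# An element type of Real admits any numeric subtype, which is what
# AD needs when it swaps Float64 values for Dual numbers in place.
mu = Vector{Real}(undef, 2)   # Julia 0.6 wrote Vector{Real}(2)
mu[1] = 0.3                   # a Float64 fits
mu[2] = 1//2                  # so does any other Real subtype

all(x -> x isa Real, mu)      # true
```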
4 changes: 2 additions & 2 deletions src/Turing.jl
@@ -55,8 +55,8 @@ global const CACHERANGES = 0b01
# Sampler abstraction #
#######################

abstract InferenceAlgorithm
abstract Hamiltonian <: InferenceAlgorithm
abstract type InferenceAlgorithm end
abstract type Hamiltonian <: InferenceAlgorithm end

doc"""
Sampler{T}
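The `abstract` keyword change is part of the 0.6 syntax overhaul: bare `abstract InferenceAlgorithm` became the block form `abstract type InferenceAlgorithm end` (just as `typealias` became `const`, seen in `container.jl` below). For example:

```julia
# Julia 0.5:  abstract InferenceAlgorithm
# Julia 0.6+: explicit block form with `end`.
abstract type InferenceAlgorithm end
abstract type Hamiltonian <: InferenceAlgorithm end

Hamiltonian <: InferenceAlgorithm   # true — subtyping is unchanged
```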
6 changes: 3 additions & 3 deletions src/core/ad.jl
@@ -55,12 +55,12 @@ gradient(vi::VarInfo, model::Function, spl::Union{Void, Sampler}) = begin
vals = getval(vi, vns[i])
if vns[i] in vn_chunk # for each variable to compute gradient in this round
for i = 1:l
vi[range[i]] = ForwardDiff.Dual{CHUNKSIZE, Float64}(realpart(vals[i]), SEEDS[dim_count])
vi[range[i]] = ForwardDiff.Dual{Void, Float64, CHUNKSIZE}(realpart(vals[i]), SEEDS[dim_count])
dim_count += 1 # count
end
else # for other variables (no gradient in this round)
for i = 1:l
vi[range[i]] = ForwardDiff.Dual{CHUNKSIZE, Float64}(realpart(vals[i]))
vi[range[i]] = ForwardDiff.Dual{Void, Float64, CHUNKSIZE}(realpart(vals[i]))
end
end
end
@@ -83,7 +83,7 @@ gradient(vi::VarInfo, model::Function, spl::Union{Void, Sampler}) = begin
end

verifygrad(grad::Vector{Float64}) = begin
if any(isnan(grad)) || any(isinf(grad))
if any(isnan.(grad)) || any(isinf.(grad))
dwarn(0, "Numerical error has been found in gradients.")
dwarn(1, "grad = $(grad)")
false
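`verifygrad` gets the same broadcasting treatment as `exp`: `isnan` and `isinf` lost their vectorized array methods in 0.6, so the dot form builds the elementwise `Bool` array that `any` then reduces. Sketch:

```julia
grad = [0.5, NaN, -1.0]

# isnan.(grad) is a Bool vector; any() reduces it to a single flag.
any(isnan.(grad))   # true  — a NaN was found
any(isinf.(grad))   # false — no ±Inf entries (isinf(NaN) is false)
```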
86 changes: 66 additions & 20 deletions src/core/compiler.jl
@@ -7,8 +7,8 @@ macro VarName(ex::Union{Expr, Symbol})
# return: (:x,[1,2],6,45,3)
s = string(gensym())
if isa(ex, Symbol)
_ = string(ex)
return :(Symbol($_), Symbol($s))
ex_str = string(ex)
return :(Symbol($ex_str), Symbol($s))
elseif ex.head == :ref
_2 = ex
_1 = ""
@@ -143,9 +143,9 @@ Example:
```julia
@model gauss() = begin
s ~ InverseGamma(2,3)
m ~ Normal(0,sqrt(s))
1.5 ~ Normal(m, sqrt(s))
2.0 ~ Normal(m, sqrt(s))
m ~ Normal(0,sqrt.(s))
1.5 ~ Normal(m, sqrt.(s))
2.0 ~ Normal(m, sqrt.(s))
return(s, m)
end
```
@@ -154,7 +154,7 @@ macro model(fexpr)
# Compiler design: sample(fname_compiletime(x,y), sampler)
# fname_compiletime(x=nothing,y=nothing; data=data,compiler=compiler) = begin
# ex = quote
# fname_runtime(;vi=VarInfo,sampler=nothing) = begin
# fname_runtime(vi::VarInfo,sampler::Sampler) = begin
# x=x,y=y
# # pour all variables in data dictionary, e.g.
# k = data[:k]
@@ -168,6 +168,7 @@


dprintln(1, fexpr)
fexpr = translate(fexpr)

fname = fexpr.args[1].args[1] # Get model name f
fargs = fexpr.args[1].args[2:end] # Get model parameters (x,y;z=..)
@@ -178,31 +179,32 @@
# ==> f(x,y;)
# f(x,y; c=1)
# ==> unchanged

if (length(fargs) == 0 || # e.g. f()
isa(fargs[1], Symbol) || # e.g. f(x,y)
fargs[1].head == :kw) # e.g. f(x,y=1)
insert!(fargs, 1, Expr(:parameters))
end

dprintln(1, fname)
dprintln(1, fargs)
# dprintln(1, fargs)
dprintln(1, fbody)

# Remove positional arguments from inner function, e.g.
# f((x,y; c=1)
# ==> f(; c=1)
# f(x,y;)
# ==> f(;)
fargs_inner = deepcopy(fargs)[1:1]
# fargs_inner = deepcopy(fargs)[1:1]

# Add keyword arguments, e.g.
# f(; c=1)
# ==> f(; c=1, :vi=VarInfo(), :sample=nothing)
# f(;)
# ==> f(; :vi=VarInfo(), :sample=nothing)
push!(fargs_inner[1].args, Expr(:kw, :vi, :(Turing.VarInfo())))
push!(fargs_inner[1].args, Expr(:kw, :sampler, :(nothing)))
dprintln(1, fargs_inner)
# push!(fargs_inner[1].args, Expr(:kw, :vi, :(Turing.VarInfo())))
# push!(fargs_inner[1].args, Expr(:kw, :sampler, :(nothing)))
# dprintln(1, fargs_inner)

# Modify fbody, so that we always return VarInfo
fbody_inner = deepcopy(fbody)
@@ -234,19 +236,31 @@

dprintln(1, fbody_inner)

fname_inner = Symbol("$(fname)_model")
fname_inner_str = "$(fname)_model"
fname_inner = Symbol(fname_inner_str)
fdefn_inner = Expr(:(=), fname_inner,
Expr(:function, Expr(:call, fname_inner))) # fdefn = :( $fname() )
push!(fdefn_inner.args[2].args[1].args, fargs_inner...) # set parameters (x,y;data..)
# push!(fdefn_inner.args[2].args[1].args, fargs_inner...) # set parameters (x,y;data..)

push!(fdefn_inner.args[2].args[1].args, :(vi::Turing.VarInfo))
push!(fdefn_inner.args[2].args[1].args, :(sampler::Union{Void,Turing.Sampler}))

push!(fdefn_inner.args[2].args, deepcopy(fbody_inner)) # set function definition
dprintln(1, fdefn_inner)

fdefn_inner_callback_1 = parse("$fname_inner_str(vi::Turing.VarInfo)=$fname_inner_str(vi,nothing)")
fdefn_inner_callback_2 = parse("$fname_inner_str(sampler::Turing.Sampler)=$fname_inner_str(Turing.VarInfo(),nothing)")
fdefn_inner_callback_3 = parse("$fname_inner_str()=$fname_inner_str(Turing.VarInfo(),nothing)")

compiler = Dict(:fname => fname,
:fargs => fargs,
:fbody => fbody,
:dvars => Set{Symbol}(), # data
:pvars => Set{Symbol}(), # parameter
:fdefn_inner => fdefn_inner)
:fdefn_inner => fdefn_inner,
:fdefn_inner_callback_1 => fdefn_inner_callback_1,
:fdefn_inner_callback_2 => fdefn_inner_callback_2,
:fdefn_inner_callback_3 => fdefn_inner_callback_3)

# Outer function definition 1: f(x,y) ==> f(x,y;data=Dict())
fargs_outer = deepcopy(fargs)
@@ -263,14 +277,34 @@ macro model(fexpr)

fdefn_outer = Expr(:function, Expr(:call, fname, fargs_outer...),
Expr(:block, Expr(:return, fname_inner)))


unshift!(fdefn_outer.args[2].args, :(Main.eval(fdefn_inner_callback_3)))
unshift!(fdefn_outer.args[2].args, :(Main.eval(fdefn_inner_callback_2)))
unshift!(fdefn_outer.args[2].args, :(Main.eval(fdefn_inner_callback_1)))
unshift!(fdefn_outer.args[2].args, :(Main.eval(fdefn_inner)))
unshift!(fdefn_outer.args[2].args, quote
# Check fargs, data
eval(Turing, :(_compiler_ = deepcopy($compiler)))
fargs = Turing._compiler_[:fargs];
fdefn_inner = Turing._compiler_[:fdefn_inner];
fdefn_inner.args[2].args[1].args[1] = gensym((fdefn_inner.args[2].args[1].args[1]))
fargs = Turing._compiler_[:fargs];

# Copy the expr of function definition and callbacks
fdefn_inner = Turing._compiler_[:fdefn_inner];
fdefn_inner_callback_1 = Turing._compiler_[:fdefn_inner_callback_1];
fdefn_inner_callback_2 = Turing._compiler_[:fdefn_inner_callback_2];
fdefn_inner_callback_3 = Turing._compiler_[:fdefn_inner_callback_3];

# Add gensym to function name
fname_inner_with_gensym = gensym((fdefn_inner.args[2].args[1].args[1]));

# Change the name of inner function definition to the one with gensym()
fdefn_inner.args[2].args[1].args[1] = fname_inner_with_gensym
fdefn_inner_callback_1.args[1].args[1] = fname_inner_with_gensym
fdefn_inner_callback_1.args[2].args[2].args[1] = fname_inner_with_gensym
fdefn_inner_callback_2.args[1].args[1] = fname_inner_with_gensym
fdefn_inner_callback_2.args[2].args[2].args[1] = fname_inner_with_gensym
fdefn_inner_callback_3.args[1].args[1] = fname_inner_with_gensym
fdefn_inner_callback_3.args[2].args[2].args[1] = fname_inner_with_gensym

# Copy data dictionary
for k in keys(data)
if fdefn_inner.args[2].args[2].args[1].head == :line
@@ -295,7 +329,7 @@ macro model(fexpr)
if _k != nothing
_k_str = string(_k)
dprintln(1, _k_str, " = ", _k)
_ = quote
data_check_ex = quote
if haskey(data, keytype(data)($_k_str))
if nothing != $_k
Turing.dwarn(0, " parameter "*$_k_str*" found twice, value in data dictionary will be used.")
Expand All @@ -305,7 +339,7 @@ macro model(fexpr)
data[keytype(data)($_k_str)] == nothing && Turing.derror(0, "Data `"*$_k_str*"` is not provided.")
end
end
unshift!(fdefn_outer.args[2].args, _)
unshift!(fdefn_outer.args[2].args, data_check_ex)
end
end
unshift!(fdefn_outer.args[2].args, quote data = copy(data) end)
Expand All @@ -331,3 +365,15 @@ getvsym(expr::Expr) = begin
end
curr
end


translate!(ex::Any) = ex
translate!(ex::Expr) = begin
if (ex.head === :call && ex.args[1] === :(~))
ex.head = :macrocall; ex.args[1] = Symbol("@~")
else
map(translate!, ex.args)
end
ex
end
translate(ex::Expr) = translate!(deepcopy(ex))
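The `translate` pass added above is the heart of the infix-`~` support from #173: it walks the model body and rewrites every `lhs ~ rhs` call into an `@~` macrocall before the rest of the compiler runs. A self-contained demo, re-declaring the same two functions from the diff so it runs on its own:

```julia
# Same logic as the compiler's translate!/translate pair.
translate!(ex) = ex
function translate!(ex::Expr)
    if ex.head === :call && ex.args[1] === :(~)
        ex.head = :macrocall          # x ~ D  becomes  @~ x D
        ex.args[1] = Symbol("@~")
    else
        map(translate!, ex.args)      # recurse into sub-expressions
    end
    ex
end
translate(ex::Expr) = translate!(deepcopy(ex))

ex = translate(:(m ~ Normal(0, 1)))
ex.head        # :macrocall — the infix call was rewritten
ex.args[1]     # Symbol("@~")
ex.args[2:3]   # the original operands, :m and :(Normal(0, 1))
```

Mutating `head` and `args[1]` in place like this is cheap, and `translate` works on a `deepcopy` so the caller's original expression is untouched.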
8 changes: 4 additions & 4 deletions src/core/container.jl
@@ -5,7 +5,7 @@ Data structure for particle filters
- consume(pc::ParticleContainer): return incremental likelihood
"""

typealias Particle Trace
const Particle = Trace

type ParticleContainer{T<:Particle}
model :: Function
@@ -16,7 +16,7 @@ type ParticleContainer{T<:Particle}
# conditional :: Union{Void,Conditional} # storing parameters, helpful for implementing rejuvenation steps
conditional :: Void # storing parameters, helpful for implementing rejuvenation steps
n_consume :: Int # helpful for rejuvenation steps, e.g. in SMC2
ParticleContainer(m::Function,n::Int) = new(m,n,Array{Particle,1}(),Array{Float64,1}(),0.0,nothing,0)
ParticleContainer{T}(m::Function,n::Int) where {T} = new(m,n,Array{Particle,1}(),Array{Float64,1}(),0.0,nothing,0)
end

(::Type{ParticleContainer{T}}){T}(m) = ParticleContainer{T}(m, 0)
@@ -127,7 +127,7 @@ end
function weights(pc :: ParticleContainer)
@assert pc.num_particles == length(pc)
logWs = pc.logWs
Ws = exp(logWs-maximum(logWs))
Ws = exp.(logWs-maximum(logWs))
logZ = log(sum(Ws)) + maximum(logWs)
Ws = Ws ./ sum(Ws)
return Ws, logZ
@@ -196,5 +196,5 @@ getsample(pc :: ParticleContainer) = begin
w = pc.logE
Ws, z = weights(pc)
s = map((i)->getsample(pc, i, Ws[i]), 1:length(pc))
return exp(w), s
return exp.(w), s
end
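The broadcasted `exp.` in `weights` implements the standard log-sum-exp trick: subtract the maximum log-weight before exponentiating so the particle weights never underflow. Recreated in isolation (using the `.-` broadcast that current Julia requires where the 0.6 code could write `logWs-maximum(logWs)`):

```julia
logWs = [-1000.0, -1001.0, -999.5]   # raw particle log-weights

m    = maximum(logWs)
Ws   = exp.(logWs .- m)      # shift first: the largest term is exp(0) = 1
logZ = log(sum(Ws)) + m      # log normalizing constant, computed stably
Ws   = Ws ./ sum(Ws)         # self-normalized weights

sum(Ws)                      # 1.0, up to rounding
```

Without the shift, `exp(-1000.0)` underflows to `0.0` and every weight collapses; after the shift the ratios between particles are preserved exactly.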
6 changes: 3 additions & 3 deletions src/core/io.jl
@@ -17,14 +17,14 @@ getjuliatype(s::Sample, v::Symbol, cached_syms=nothing) = begin
# NOTE: cached_syms is used to cache the filter entiries in svalue. This is helpful when the dimension of model is huge.
if cached_syms == nothing
# Get all keys associated with the given symbol
syms = collect(filter(k -> search(string(k), string(v)*"[") != 0:-1, keys(s.value)))
syms = collect(Iterators.filter(k -> search(string(k), string(v)*"[") != 0:-1, keys(s.value)))
else
syms = filter(k -> search(string(k), string(v)) != 0:-1, cached_syms)
syms = collect((Iterators.filter(k -> search(string(k), string(v)) != 0:-1, cached_syms)))
end
# Map to the corresponding indices part
idx_str = map(sym -> replace(string(sym), string(v), ""), syms)
# Get the indexing component
idx_comp = map(idx -> filter(str -> str != "", split(string(idx), [']','['])), idx_str)
idx_comp = map(idx -> collect(Iterators.filter(str -> str != "", split(string(idx), [']','[']))), idx_str)

# Deal with v is really a symbol, e.g. :x
if length(idx_comp) == 0