GradientNumber Promotion Error on Julia v0.5 #88

Closed
papamarkou opened this issue Jan 19, 2016 · 8 comments

@papamarkou
Contributor

The following example fails:

import ForwardDiff

f(x::Vector) = exp(x)
g = ForwardDiff.gradient(f)

g([3.5, 4.1])

The error I get is the following:

julia> g([3.5, 4.1])
ERROR: MethodError: `convert` has no method matching convert(::Type{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}}, ::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},1})
This may have arisen from a call to the constructor ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}(...),
since type constructors fall back to convert methods.
Closest candidates are:
  ForwardDiff.GradientNumber{N,T,C}(::Any, ::Any)
  call{T}(::Type{T}, ::Any)
  convert{T<:Real}(::Type{T<:Real}, ::Complex{T<:Real})
  ...
 in _calc_gradient at /Users/theodore/.julia/v0.4/ForwardDiff/src/api/gradient.jl:86
 in g at /Users/theodore/.julia/v0.4/ForwardDiff/src/api/gradient.jl:54
@KristofferC
Collaborator

f needs to return a scalar, no?

@jrevels
Member

jrevels commented Jan 19, 2016

f here is a function from a Vector to a Vector, so you need to use ForwardDiff.jacobian instead of ForwardDiff.gradient.

We should probably have a better error message for this, since it seems to be a common mistake.
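For illustration, here is what the example might look like with the Jacobian API instead (a minimal sketch, assuming the same closure-returning API used above, where ForwardDiff.jacobian(f) returns a function that evaluates the Jacobian at a point):

import ForwardDiff

f(x::Vector) = exp(x)        # Vector -> Vector, so differentiate with jacobian
j = ForwardDiff.jacobian(f)  # returns a function, analogous to ForwardDiff.gradient(f)

j([3.5, 4.1])                # 2x2 Matrix; the diagonal holds exp of each input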

@jrevels jrevels closed this as completed Jan 19, 2016
@papamarkou
Contributor Author

Sorry about this @jrevels, @KristofferC, you are right, I should have used the jacobian. The thing is that I actually need the gradient, but in a different context/example. I tried to construct a simpler example focusing on the initial error message I was getting, and I obviously failed!

I will try to come up with a better example related to what I am trying to do and what error message I get.

@papamarkou
Contributor Author

Ok, attempt 2 to construct a relevant example. This is similar to what I am trying to do:

import ForwardDiff

f(x::Vector) = sum(1+exp(x))
g = ForwardDiff.gradient(f)

g([3.5, 4.1])

This is the error I get:

julia> g([3.5, 4.1])
ERROR: no promotion exists for Int64 and ForwardDiff.GradientNumber{N,T,C}
 [inlined code] from promotion.jl:160
 in .+ at arraymath.jl:119
 [inlined code] from arraymath.jl:141
 in f at none:1
 in _calc_gradient at /Users/theodore/.julia/v0.5/ForwardDiff/src/api/gradient.jl:86
 in g at /Users/theodore/.julia/v0.5/ForwardDiff/src/api/gradient.jl:54
 in eval at /Applications/Julia-0.5.0-dev-e55c22a793.app/Contents/Resources/julia/lib/julia/sys.dylib
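A possible workaround on the affected Julia version (a sketch, not something from this thread) is to keep the Int64 literal out of the arithmetic on the dual-number array, since sum(1 + exp(x)) == length(x) + sum(exp(x)):

import ForwardDiff

# Hypothetical rewrite: the integer contribution is added after the sum,
# so no Int64 needs to be promoted against GradientNumber elements.
f2(x::Vector) = length(x) + sum(exp(x))
g2 = ForwardDiff.gradient(f2)

g2([3.5, 4.1])   # same gradient as for sum(1 + exp(x))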

@jrevels
Member

jrevels commented Jan 19, 2016

Hmm. Looks like v0.5-specific breakage. It seems to work on v0.4:

julia> import ForwardDiff

julia> f(x::Vector) = sum(1+exp(x))
f (generic function with 1 method)

julia> g = ForwardDiff.gradient(f)
g (generic function with 1 method)

julia> g([3.5, 4.1])
2-element Array{Float64,1}:
 33.1155
 60.3403

@jrevels jrevels changed the title from "Not possible to differentiate exp() (or log())" to "GradientNumber Promotion Error on Julia v0.5" Jan 19, 2016
@jrevels jrevels reopened this Jan 19, 2016
@papamarkou
Contributor Author

Ah, yes, it works on v0.4, I hadn't noticed :)

@KristofferC
Collaborator

I don't think the problem is with promotion but with type inference; it may be the same as #75.

@jrevels
Member

jrevels commented Jun 15, 2016

This now works, either due to advances in Julia v0.5 or #102.

@jrevels jrevels closed this as completed Jun 15, 2016