
Allow user to specify loss function using LossFunctions.jl #10

Closed
Tracked by #6
dahong67 opened this issue Oct 2, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@dahong67
Owner

dahong67 commented Oct 2, 2023

It would be nice to also allow users to pass in losses from LossFunctions.jl: https://github.com/JuliaML/LossFunctions.jl

Currently, we define the following method signature:

gcp(X::Array, r, func=(x, m) -> (m - x)^2, grad=(x, m) -> ForwardDiff.derivative(m -> func(x, m), m), lower=-Inf) =
_gcp(X, r, func, grad, lower, (;))

The simplest way to support LossFunctions.jl is probably to define an additional method signature as follows:

gcp(X::Array, r, loss::SupervisedLoss, lower=-Inf) = _gcp(X, r, ???, ???; lower)

where the ??? are what need to be filled in.
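
One rough sketch of how they might be filled in (not necessarily the final implementation) is to use value and deriv from LossFunctions.jl, which evaluate a loss and its derivative at a single (data, model) pair. The argument order below, with the model value treated as the "output" and the data entry as the "target", is an assumption that should be double-checked against the LossFunctions.jl docs:

using LossFunctions: SupervisedLoss, value, deriv

# Sketch: adapt a SupervisedLoss to the (func, grad) pair that _gcp expects.
# Here x is a data entry and m is the corresponding model value, so m is
# treated as the "output" and x as the "target". The (output, target)
# argument order for value/deriv is an assumption; verify against the docs.
gcp(X::Array, r, loss::SupervisedLoss, lower=-Inf) =
    _gcp(X, r, (x, m) -> value(loss, m, x), (x, m) -> deriv(loss, m, x), lower, (;))

With something along these lines, a user could then call, e.g., gcp(X, r, HuberLoss()) to fit with a robust loss from LossFunctions.jl without writing the gradient by hand (again, assuming the argument order above is correct).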

@alexmul1114, to start you may want to read through the LossFunctions.jl docs and try some examples. The following parts will be particularly relevant (using the v0.10 docs, since the docs for the current version currently appear to be missing the docstrings):

@dahong67
Owner Author

dahong67 commented Oct 9, 2023

Closed by #11

@dahong67 dahong67 closed this as completed Oct 9, 2023
@dahong67 dahong67 added the enhancement New feature or request label Oct 9, 2023