Build function efficient multiple returns #46
If you make an array of arrays of expressions, it'll output that array of arrays. I think we might want to try and let a tuple of expressions do that. @shashi, what do you think on this one? I think we'd just generate the full expressions and let CSE reduce it.
Thanks Chris. But would just returning an array of arrays be enough to avoid repeating computation, and to only compute, for instance, the gradient if the Hessian is not needed? To be clear, I want to know whether there is a way to symbolically and automatically generate something like what is mentioned in the Optim.jl documentation.
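The Optim.jl pattern being referenced is, roughly, a single `fg!` callback that computes the objective and its gradient together, so shared work is done once per point. A minimal hand-written sketch of that shape (the objective `f(x) = exp(x[1] * x[2])` is a made-up example, not code from this thread):

```julia
# Sketch of the fg!-style callback Optim.jl's `only_fg!` expects:
# compute objective and gradient in one pass, reusing shared work.
# Here f(x) = exp(x[1] * x[2]) (made up); the gradient reuses the
# exp term already needed for the objective value.
function fg!(F, G, x)
    t = exp(x[1] * x[2])      # shared subexpression
    if G !== nothing
        G[1] = x[2] * t
        G[2] = x[1] * t
    end
    if F !== nothing
        return t              # objective value
    end
    return nothing
end

x = [1.0, 2.0]
G = similar(x)
Fval = fg!(true, G, x)
```

With Optim.jl one would hand this to the optimizer via `Optim.only_fg!(fg!)`; it is shown standalone here.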
Oh, a doubly-mutating function? Yeah, you could make a function that does that.
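A "doubly-mutating" function in this sense fills both the gradient and the Hessian in place while returning the objective. A hand-written sketch, using the made-up objective `f(x) = x[1]^2 * x[2]` for illustration:

```julia
# Sketch of a doubly-mutating function: one call fills the gradient G
# and the Hessian H in place and returns the objective value, so the
# shared term `a = x[1]^2` is computed only once.
function fgh!(G, H, x)
    a = x[1]^2                 # shared between f and the gradient
    G[1] = 2 * x[1] * x[2]
    G[2] = a
    H[1, 1] = 2 * x[2]
    H[1, 2] = 2 * x[1]
    H[2, 1] = 2 * x[1]
    H[2, 2] = 0.0
    return a * x[2]            # the objective itself
end

x = [3.0, 4.0]
G = zeros(2)
H = zeros(2, 2)
fval = fgh!(G, H, x)
```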
Hi Chris, sorry for taking so long to answer back; I wanted to have an MWE that reflected my questions. The matter is not exactly to have a doubly-mutating function. Consider the following example, inspired by the one in the documentation of DiffResults.jl.
One can see that many of the operations used in f can be reused in g.
Suppose one is running a gradient descent algorithm with a line search: given an input vector x, one will need functions for both the objective and its gradient. Question 1: Is there a better way to do this, independent of whether the functions are in-place or not?
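In the gradient-descent-with-line-search setting, the point is that the objective value and gradient at the current iterate can be computed once and reused across all backtracking trials. A minimal sketch (Armijo backtracking on a made-up quadratic, not code from this thread):

```julia
# Gradient descent with Armijo backtracking, reusing the value and
# gradient at the current point instead of recomputing them per trial.
f(x) = sum(abs2, x)            # made-up objective
grad(x) = 2 .* x

function descend(x; iters = 20, c = 1e-4)
    for _ in 1:iters
        fx = f(x)              # computed once per outer iteration
        g  = grad(x)           # reused by every backtracking trial
        t  = 1.0
        # Armijo condition: shrink the step until enough decrease.
        while f(x .- t .* g) > fx - c * t * sum(abs2, g)
            t /= 2
        end
        x = x .- t .* g
    end
    return x
end

xstar = descend([5.0, -3.0])
```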
Question 2: It seems to me that, when I run the generated code, LLVM does not realize that there are multiple operations that could be reused. I might be wrong about that, because I am not very familiar with LLVM. If indeed LLVM does not reuse the operations, is there functionality that could be implemented in build_function itself that would allow for that? Question 3 (this is actually probably a bug): When building the function
the output is not what I expect; that seems like a bug, or I might be doing something wrong.
We can do Q2 symbolically instead of relying on LLVM.
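Doing it symbolically means walking the expression tree and binding each subexpression to a temporary before code generation, so repeated subexpressions are computed once. A toy sketch of that idea over plain Julia `Expr`s (this is an illustration of the technique, not the actual Symbolics/ModelingToolkit implementation):

```julia
# Toy common-subexpression elimination over Julia Expr trees: every
# compound subexpression is bound once to a gensym'd temporary, and
# repeated subexpressions reuse the same temporary.
function cse(ex)
    seen  = Dict{Any,Symbol}()   # subexpression => its temporary
    binds = Expr[]               # emitted bindings, in order
    walk(e) = begin
        e isa Expr || return e   # symbols and literals pass through
        args = map(walk, e.args)
        e2 = Expr(e.head, args...)
        haskey(seen, e2) && return seen[e2]   # already bound: reuse
        s = gensym("t")
        seen[e2] = s
        push!(binds, :($s = $e2))
        return s
    end
    body = walk(ex)
    return Expr(:block, binds..., body)
end

x = 1.0; y = 2.0
lowered = cse(:(exp(x * y) + x * y))   # x * y is bound only once
val = eval(lowered)
```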
Also, doing CSE is essential for producing readable code after reduction.
I was looking into how to implement tuple returns, and I was about to do a PR for it.
Yeah, I think I didn't really have that in mind.
You can definitely define something like that.
Yeah, that could have been an idea. Why did you decide to merge the PR?
It is often the case (for instance in optimization) that one needs to calculate several functions, such as the gradient and the Hessian, that share some calculations. Is there a way in ModelingToolkit to use symbolic computation and build_function to generate functions that work that way?
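Concretely, the request is for one generated function that returns the objective, gradient, and Hessian while sharing intermediate results. A hand-written stand-in for what a CSE-aware `build_function` could ideally emit, for the made-up objective `f(x) = exp(x[1] + x[2])`:

```julia
# Hand-written stand-in for what a CSE-aware build_function might emit:
# the shared term `t` is computed once and reused by the value, the
# gradient, and the Hessian. f(x) = exp(x[1] + x[2]) is made up.
function f_grad_hess(x)
    t = exp(x[1] + x[2])       # shared by all three outputs
    f = t
    g = [t, t]                 # gradient of exp(x1 + x2)
    h = [t t; t t]             # Hessian of exp(x1 + x2)
    return f, g, h
end

fval, g, h = f_grad_hess([1.0, 1.0])
```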