JOSS review #327
Thank you for the review! I will address these points individually below.
To point you to the evidence that led to this vague statement: ProbNum is implemented in NumPy, which severely limits its performance, and the implementation is focused on generality rather than performance (disclaimer: I am one of the early main contributors to that software project). ProbDiffEq contains benchmarks, such as this one, which show a comparison to diffrax similar to the one we show against DifferentialEquations.jl; I therefore thought that "similar performance" might be sufficiently well-founded (disclaimer: I worked closely with the main developer of ProbDiffEq for many years). Is this sufficient evidence to keep this part of the paper as it is, or should I soften the statement or add a sentence to support it? I would also like to back it up much more strongly with a benchmark, but cross-programming-language benchmarks are somewhat non-trivial and I am not sure how quickly I could get around to creating one, so if we can resolve this part of the issue without such a benchmark, that would be my strong preference for now.
I updated the README and extended the text to the following:
Does this add the desired information and resolve your concern? If you have thoughts on how to improve it, please do let me know.
Thank you for catching this! For the citations I used DocumenterCitations.jl, which I found to produce very pleasant results on the documentation website, but I completely missed how it gets rendered in the REPL. There is an open issue here that relates to this question, and essentially DocumenterCitations.jl is not well-equipped to render citations in the REPL at the current time. The way I see it, there is no way I could keep the current numbered references (e.g. [4] and [5] as shown here) with a shared "References" page (here) while also having perfectly readable REPL documentation. I see two ways forward: I could change the entry such that in the REPL the citation gets rendered as
but then it also gets shown as [Krämer et al.] in the documentation, and the numbers no longer align with the References section. Or I could keep this part as it is. Personally, I would prefer to focus on the online documentation, at the cost of some imperfections in the REPL, but I would be very happy about your input here. What would be the best way to resolve this issue to a level that is satisfactory to you?
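To make the trade-off concrete, here is a minimal sketch of the usual DocumenterCitations.jl citation syntax inside a docstring; the citation key `kraemer2021` and the function name are hypothetical, made up for this example:

```julia
"""
    my_solver(prob)

A probabilistic ODE solver; see [kraemer2021](@cite) for background.
"""
function my_solver end
```

On the HTML documentation site, `[kraemer2021](@cite)` is resolved to a numbered citation (e.g. "[4]") linking to the shared References page, but the REPL help mode has no such resolution step and shows something close to the raw link text instead, which is the readability issue discussed above.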
Good catch! I updated the code to use
I provide guidelines for third parties wishing to contribute to the software, report issues or problems, or seek support in the Contributing section of the README. As the section is quite short, I thought this would be the most visible location. Is this satisfactory to resolve this part of the issue? Thank you again for taking the time to review the package so carefully and for your valuable feedback; I appreciate it a lot.
I would be fine if you explained this a bit like above. Maybe you can cite some of the benchmarks of OrdinaryDiffEq.jl etc.
👍 Please remember to merge the PR and update the paper branch accordingly.
That sounds fine. Please open an issue in your repo so that people know that this is a known issue (linking to the issue you mentioned above).
👍
Ok, let's keep it as it is. Thank you for your contribution to the open-source software ecosystem. Feel free to close this issue once you have merged the corresponding PR.
Thank you for the fast and helpful reply! I updated the text in the paper and, in particular, mentioned that both packages include benchmarks that compare to scipy, as those are currently the main evidence for the "similar performance" claim. I believe this resolves the remaining open points, so I will close this issue (of course, if you have more comments, please re-open it). Thank you again for taking the time to review the package and for providing such helpful comments!
This is part of the review openjournals/joss-reviews#7048
Thanks a lot for your contributions 🙂 Please find below some comments based on the review checklist:
`EK1(order=3)` vs. `EK1(order=4)`, not only `prob` vs. `prob2`. Could you please comment on that in the tutorial?
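For illustration, here is a minimal sketch of the kind of comparison the review comment asks the tutorial to address, assuming the ProbNumDiffEq.jl-style solver interface used elsewhere in this thread; the test ODE and initial condition are made up for this example:

```julia
using ProbNumDiffEq

# A simple linear test ODE, du/dt = -u (hypothetical example problem).
f(du, u, p, t) = (du .= -u)
prob = ODEProblem(f, [1.0], (0.0, 10.0))

# Compare two orders of the same EK1 solver on the same problem,
# rather than (only) two different problems with the same solver.
sol3 = solve(prob, EK1(order=3))
sol4 = solve(prob, EK1(order=4))
```

Varying only the solver order while holding the problem fixed isolates the effect of the order on accuracy and uncertainty estimates, which is the distinction the reviewer is asking the tutorial to make explicit.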