New benchmarking examples #266
Conversation
Implemented the 1D and multiple-trajectories cases and wrote corresponding tests.
Raise an error if the inputs for PDE cases contain a tuple, and write corresponding tests. Modified conftest.py to return an additional t in data_3d_random_pde and data_5d_random_pde.
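The tuple check described above can be sketched as follows; `validate_pde_input` and its error message are hypothetical stand-ins for the actual validation added in the branch, shown only to illustrate the shape of the check and its test.

```python
import pytest


def validate_pde_input(x):
    # Hypothetical validator mirroring the check described in the commit:
    # PDE cases should receive a single array, not a tuple of arrays.
    if isinstance(x, tuple):
        raise TypeError("PDE inputs may not be tuples; pass a single array")
    return x


def test_tuple_input_raises():
    # Corresponding test in the pytest style used by the repo.
    with pytest.raises(TypeError):
        validate_pde_input((1, 2))
```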
Log-normal contour plot of RMSE between chaotic level and noise level for X and X dot.
…into lanyue_work
…and avoids integrating the models for speed. It is now quite fast, and the plan is to use this to look at syntactic complexity.
…with syntax so far, which is surprising, but I need to make sure it is plotting properly.
…ngs. Made a new script that loops over all the polynomial systems in the dysts database and fits them (without noise). Also wrote a script to load all the ODE functions as strings and parse them to extract the model coefficients. Needs some tuning, but so far it performs well compared to the true coefficients.
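The string-parsing step above can be sketched with SymPy; the expression below is illustrative rather than one taken from the dysts database.

```python
import sympy as sp

# Hedged sketch: parse an ODE right-hand side given as a string and
# extract the coefficient of each monomial term.
rhs = sp.parse_expr("0.5*x + 2.0*x*y - 1.0*y")
coeffs = rhs.as_coefficients_dict()

x, y = sp.symbols("x y")
# Maps each term to its coefficient, e.g. x -> 0.5, x*y -> 2.0, y -> -1.0.
print({term: float(c) for term, c in coeffs.items()})
```

Comparing these extracted values term-by-term against the fitted model coefficients gives the kind of check described in the commit.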
This example looks really great to me! I dug into the details a little and pushed a small change just now. These are just suggestions; feel free to revert anything back to the last version if you like. I think it is ready to merge with master when you are. The main change I made was to speed up the Pareto sweeps by calculating […]. I also included an option […].

For the MIOSR algorithm, I did encounter "GurobiError: Model too large for size-limited license; visit https://www.gurobi.com/free-trial for a full license." I was able to acquire an academic license following the instructions at https://www.gurobi.com/features/academic-named-user-license/, and everything works after copying the license into the right gurobi directory in my anaconda site-packages. The MIOSR runs do take a long time, and I think they will hog resources if run in parallel. I didn't rerun the sweeps or save any results. I'd suggest (if you have the patience) running all the sweeps one at a time on a cluster machine so the runtime can be estimated without competing for resources.

But everything looks really nice to me, and I think it is a really helpful example to include. Nice work!
PS: I pushed once more fixing the order of the […].
…ning all the results at once so we don't need multiple notebooks.
…for all the optimizers except MIOSR. Need to update the plotting notebook to use the new data. Made some small changes to the run_all script so that all of the optimizers can run the weak form.
…ents were being reshuffled even when weak form was false.
…into lanyue_work
…umbers. The issue is that the true_coefficients matrix was reordered for the weak library, but the error matrices were also reordered, so they didn't match again. Now the true_coefficients are left alone. Reran the STLSQ results, which look much more reasonable now.
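The reordering bug above can be illustrated with a minimal NumPy sketch (the variable names here are hypothetical, chosen to mirror the description): permuting both the true and fitted matrices re-shuffles the rows past each other, while leaving `true_coefficients` alone and undoing the permutation on one side restores the match.

```python
import numpy as np

true_coefficients = np.array([[1.0, 0.0], [0.0, -2.0], [0.5, 0.5]])
perm = np.array([2, 0, 1])            # row ordering imposed by the weak library
fitted = true_coefficients[perm]      # fit returns rows in the weak ordering

# Wrong: permuting both sides shuffles rows past each other.
bad_error = np.abs(fitted[perm] - true_coefficients[perm])

# Right: leave true_coefficients alone and invert the permutation on one side.
inverse = np.argsort(perm)
good_error = np.abs(fitted[inverse] - true_coefficients)

print(good_error.max())  # 0.0: rows line up again
print(bad_error.max())   # nonzero: rows are mismatched
```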
…linting and pytest errors; this is ready for a merge.
Recreated bug described in #266, which only arises in gurobipy 10.0.0. Verified tests pass in 10.0.1 to 10.0.3
New benchmarking examples
Fixes dynamicslab#303. Demonstrated that the bug described in dynamicslab#266 only arises in gurobipy 10.0.0. Verified tests pass in 10.0.1 through 10.0.3.
Hi all,
Lanyue Zhang and I have been working on getting a big benchmark paper + example done, and I think the branch is pretty close to ready for a merge.
Almost all of the changes are in the examples/16_noise_robustness/ folder. There are three Jupyter notebooks there, plus a utils.py file for some extra functions. One of the notebooks reproduces all the figures in the paper, another does some visualization and generates plots, and the last performs an example hyperparameter scan using the ensembling and other fancy functionality.
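The hyperparameter scan mentioned above can be sketched in miniature; `fit_model` below is a hypothetical stand-in for the actual SINDy fit (here just a hard-thresholded least-squares solve), used only to show the sweep-and-score loop against known coefficients.

```python
import numpy as np

def fit_model(theta, x_dot, threshold):
    # Stand-in for the real fit: least squares followed by hard thresholding.
    xi, *_ = np.linalg.lstsq(theta, x_dot, rcond=None)
    xi[np.abs(xi) < threshold] = 0.0
    return xi

rng = np.random.default_rng(0)
theta = rng.normal(size=(200, 5))                 # candidate function library
true_xi = np.array([1.0, 0.0, -0.5, 0.0, 2.0])    # known sparse coefficients
x_dot = theta @ true_xi + 0.01 * rng.normal(size=200)

# Sweep the sparsity threshold and record RMSE against the true coefficients.
for threshold in [0.0, 0.1, 0.3]:
    xi = fit_model(theta, x_dot, threshold)
    rmse = np.sqrt(np.mean((xi - true_xi) ** 2))
    print(f"threshold={threshold:.1f}  rmse={rmse:.4f}")
```

The notebooks in examples/16_noise_robustness/ do the same kind of sweep over the real optimizers and noise levels.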
I also changed gitlab -> github in the .pre-commit file, since it looks like the issue was that flake8 moved from GitLab to GitHub very recently (https://jira.mongodb.org/browse/PYTHON-3531). This appears to have resolved the issue. However, there is now a pytest issue related to sphinx and/or ipython (jupyter/nbconvert#528), and I'm not sure how to fix it.
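The gitlab -> github change above amounts to repointing the flake8 hook's repo URL in `.pre-commit-config.yaml`; a sketch of the relevant stanza (the `rev` pin shown is illustrative, not necessarily the one in the branch):

```yaml
repos:
  - repo: https://github.com/pycqa/flake8   # previously https://gitlab.com/pycqa/flake8
    rev: 6.0.0                              # illustrative pin; keep the branch's rev
    hooks:
      - id: flake8
```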