Releases: automl/ParameterImportance
Bugfix for bokeh-plots
Compatibility With SMAC 0.12.1 and 0.12.2
1.1.1
Major changes
- Add support for SMAC 0.12.1 and 0.12.2
- Update random-forest arguments to match the latest SMAC requirements
Compatibility With SMAC 0.12.0
Major changes
- Add support for SMAC 0.12.0
- Drop support for SMAC < 0.12.0
Minor changes
- Fix and update examples
Add bokeh and fix compatibility with new SMAC
1.0.7
Major changes
- Add interactive bokeh-plots for evaluators
Interface changes
- Add function `plot_bokeh` to evaluators; it returns a bokeh plot
Minor changes
- Change the method used to shorten parameter names on plots
- Add pandas and bokeh to requirements
Bugfixes
- Support SMAC 0.11.x
- Add traj-alljson format for unambiguously readable trajectories
- Fix #112 smac-facade import error
Fix SMAC>0.11.x support
Enable smac-support for version > 0.8.0
Mainly enables SMAC support for versions > 0.9.0; also adds small features
Increase logs and fix label
Improve logging and verbosity control; output 'cost' instead of 'quality'
Beta release of PIMP
Supported Evaluation Methods
Ablation (via Surrogates)
Ablation is a local method that determines parameter importance between two given configurations: it examines which parameters contributed most within a local part of the configuration space. It proceeds iteratively, changing in each round one parameter of the starting configuration to the value it takes in the target configuration. The parameter whose change yields the highest improvement is kept as that round's most important parameter. The resulting order ranks the parameters by importance, and the percentage of improvement indicates how much influence each parameter has.
In PIMP we implemented an efficient variant of ablation that replaces costly algorithm runs with cheap-to-evaluate surrogate models.
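Below is a minimal sketch of this surrogate-based ablation loop. The `predict` interface and `ToySurrogate` are illustrative assumptions for this sketch, not PIMP's actual API.

```python
def ablation_path(source, target, surrogate):
    """Greedy ablation from a source to a target configuration.

    source/target: dicts mapping parameter name -> value.
    surrogate:     any object with predict(config_dict) -> predicted cost
                   (a hypothetical stand-in for PIMP's surrogate model).
    Returns [(parameter, improvement), ...] in order of importance.
    """
    current = dict(source)
    remaining = [p for p in source if source[p] != target[p]]
    path = []
    while remaining:
        base_cost = surrogate.predict(current)
        best_param, best_gain = None, float("-inf")
        for p in remaining:
            candidate = dict(current)
            candidate[p] = target[p]  # flip one parameter to its target value
            gain = base_cost - surrogate.predict(candidate)
            if gain > best_gain:
                best_param, best_gain = p, gain
        current[best_param] = target[best_param]  # keep the best flip for later rounds
        remaining.remove(best_param)
        path.append((best_param, best_gain))
    return path


class ToySurrogate:
    """Illustrative surrogate: a quadratic cost in which 'x' matters most."""
    def predict(self, cfg):
        return (cfg["x"] - 3) ** 2 + 0.1 * (cfg["y"] - 1) ** 2


print(ablation_path({"x": 0, "y": 0}, {"x": 3, "y": 1}, ToySurrogate()))
# ~ [('x', 9.0), ('y', 0.1)]: 'x' is flipped first and explains most improvement
```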
Forward Selection
Forward selection is an iterative method. In each iteration it constructs models that consider only subsets of the available parameters and keeps the one parameter that results in the lowest prediction error for the next round. The resulting order ranks the parameters by importance.
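A sketch of the idea, assuming configurations in a matrix `X` and observed costs `y`; the random-forest model and the cross-validated error used here are stand-ins, not PIMP's exact implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score


def forward_selection(X, y, param_names):
    """Rank parameters by greedy forward selection.

    X: (n_configs, n_params) array of configurations, y: observed costs.
    In each round, try adding each unused parameter to the current subset
    and keep the one that minimizes cross-validated prediction error.
    """
    selected, remaining, ranking = [], list(range(len(param_names))), []
    while remaining:
        best_idx, best_err = None, np.inf
        for i in remaining:
            model = RandomForestRegressor(n_estimators=50, random_state=0)
            # neg_mean_squared_error -> flip the sign to get an error to minimize
            err = -cross_val_score(model, X[:, selected + [i]], y,
                                   scoring="neg_mean_squared_error", cv=5).mean()
            if err < best_err:
                best_idx, best_err = i, err
        selected.append(best_idx)
        remaining.remove(best_idx)
        ranking.append((param_names[best_idx], best_err))
    return ranking
```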
Influence Models
Influence models aim to learn a linear model and deem those parameters most important that receive the highest weights in it. However, the method does not necessarily consider all parameters: in a forward step it adds only those that improve the linear model's performance. Additionally, it performs one (or more) backward steps, in which it checks whether parameters have become unimportant due to conditionalities in the parameter space.
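A rough sketch of this forward/backward scheme, using scikit-learn's `LinearRegression`; the tolerance `eps` and the cross-validation setup are illustrative choices, not PIMP's implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score


def cv_error(X, y, cols):
    """Cross-validated MSE of a linear model on a parameter subset."""
    if not cols:
        return float(np.var(y))  # empty model: predict the mean
    scores = cross_val_score(LinearRegression(), X[:, cols], y,
                             scoring="neg_mean_squared_error", cv=5)
    return -scores.mean()


def influence_model(X, y, param_names, eps=1e-6):
    selected, remaining = [], list(range(len(param_names)))
    err, improved = cv_error(X, y, selected), True
    while improved and remaining:
        improved = False
        # forward step: add the parameter that reduces prediction error the most
        new_err, best = min((cv_error(X, y, selected + [i]), i) for i in remaining)
        if new_err < err - eps:
            selected.append(best)
            remaining.remove(best)
            err, improved = new_err, True
        # backward step: drop any parameter whose removal no longer hurts
        for i in list(selected):
            if len(selected) > 1:
                reduced = [j for j in selected if j != i]
                red_err = cv_error(X, y, reduced)
                if red_err <= err + eps:
                    selected.remove(i)
                    remaining.append(i)
                    err = red_err
    if not selected:
        return []
    # importance ~ absolute weights of the final linear model
    weights = np.abs(LinearRegression().fit(X[:, selected], y).coef_)
    return sorted(zip((param_names[i] for i in selected), weights),
                  key=lambda t: -t[1])
```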
fANOVA
fANOVA is an efficient parameter importance method, leveraging random forest models fit on the data already gathered by Bayesian optimization. fANOVA is able to quantify the importance both of single hyperparameters and of interactions between hyperparameters.
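The following toy sketch conveys the underlying idea only: it fits a random forest surrogate and estimates, by Monte Carlo marginalization, the fraction of predicted-cost variance attributable to a single hyperparameter. The real fANOVA computes these marginals exactly and efficiently from the trees' structure; everything here is a simplified illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def single_param_importance(model, X, dim, n_grid=20):
    """Fraction of predicted-cost variance explained by one hyperparameter.

    Approximates fANOVA's first-order term: fix dimension `dim` at each grid
    value, average the surrogate's prediction over all other dimensions, and
    compare the variance of that marginal to the total prediction variance.
    """
    total_var = model.predict(X).var()
    grid = np.linspace(X[:, dim].min(), X[:, dim].max(), n_grid)
    marginal = []
    for v in grid:
        X_mod = X.copy()
        X_mod[:, dim] = v  # fix this parameter, average over the rest
        marginal.append(model.predict(X_mod).mean())
    return np.var(marginal) / total_var


# Usage: fit a forest on configurations X and observed costs y, then rank dims.
# X, y = ...  (data gathered during Bayesian optimization)
# rf = RandomForestRegressor(n_estimators=100).fit(X, y)
# scores = {d: single_param_importance(rf, X, d) for d in range(X.shape[1])}
```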