This release adds support for Python 3.8 and updates the installation requirements so that
newer versions of the pandas and scikit-optimize packages are supported.
- Added GitHub Actions.
- Issue #210: Integrate Scikit-Optimize for benchmarking.
In this release BTB includes two new tuners, GCP and GCPEi, which use a GaussianProcessRegressor
meta-model from sklearn.gaussian_process, applying copulas.univariate.Univariate transformations
to the input data and reverting them afterwards for the predictions. A conceptual sketch of this
transformation follows the issue list below.
- Issue #15: Implement a GaussianCopulaProcessRegressor.
- Issue #205: Separate datasets from MLChallenge.
- Issue #208: Implement GaussianCopulaProcessMetaModel.
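The rough idea can be sketched as follows. This is an illustrative reconstruction, not the actual
GCP meta-model code: it only assumes the copulas Univariate API (fit, cumulative_distribution,
percent_point) and scikit-learn's GaussianProcessRegressor, with random toy data standing in for
real tuning results.

```python
import numpy as np
from copulas.univariate import Univariate
from sklearn.gaussian_process import GaussianProcessRegressor

# Toy data standing in for hyperparameter configurations and their scores.
X = np.random.uniform(size=(30, 2))
y = np.random.uniform(size=30)

# Fit one univariate distribution per input column and map it through its CDF.
x_transformers = [Univariate() for _ in range(X.shape[1])]
X_trans = np.empty_like(X)
for i, univariate in enumerate(x_transformers):
    univariate.fit(X[:, i])
    X_trans[:, i] = univariate.cumulative_distribution(X[:, i])

# Do the same for the scores, so predictions can be mapped back afterwards.
y_transformer = Univariate()
y_transformer.fit(y)
y_trans = y_transformer.cumulative_distribution(y)

# Train the Gaussian Process meta-model on the transformed data.
model = GaussianProcessRegressor()
model.fit(X_trans, y_trans)

# Transform new candidates, predict, and revert the output transformation.
X_new = np.random.uniform(size=(5, 2))
X_new_trans = np.column_stack([
    univariate.cumulative_distribution(X_new[:, i])
    for i, univariate in enumerate(x_transformers)
])
predictions = y_transformer.percent_point(
    np.clip(model.predict(X_new_trans), 0.001, 0.999)
)
```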
With this release we fix the AX.optimize tuning function by casting the values of the
hyperparameters to the type of value that they represent; a sketch of the idea follows the
issue below.
- Issue #201: Fix AX.optimize malfunction.
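The fix boils down to converting each proposed value back to the declared type of its
hyperparameter before it is used. A hypothetical illustration (the helper and the type mapping
below are illustrative only and not part of BTB):

```python
def cast_proposal(proposal, param_types):
    """Cast each proposed value to the Python type its hyperparameter expects."""
    return {name: param_types[name](value) for name, value in proposal.items()}

# Ax may propose 100.0 for an integer range; casting restores the intended type.
cast_proposal(
    {'n_estimators': 100.0, 'min_samples_split': 0.25},
    {'n_estimators': int, 'min_samples_split': float},
)
```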
With this release we integrate a new tuning library, SMAC, with our benchmarking process. A new
leaderboard including this library has been generated. The following two tuners from this library
have been added; a minimal SMAC4HPO example is sketched after the issue list below.
- SMAC4HPO: Bayesian optimization using a Random Forest model from pyrfr.
- HB4AC: Uses Successive Halving for proposals.
- Renamed btb_benchmark/tuners to btb_benchmark/tuning_functions.
- Ready-to-use tuning functions are available from btb_benchmark/tuning_functions.
- Issue #195: Integrate SMAC for benchmarking.
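For reference, a minimal SMAC4HPO run looks roughly like this. It assumes the SMAC3 0.x API that
was current at the time of this release (Scenario, SMAC4HPO, ConfigSpace) and a toy objective; it
is a sketch, not the btb_benchmark wrapper itself.

```python
from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import UniformFloatHyperparameter
from smac.facade.smac_hpo_facade import SMAC4HPO
from smac.scenario.scenario import Scenario

def objective(config):
    # Toy objective that the Random Forest surrogate will model.
    return (config['x'] - 0.3) ** 2

config_space = ConfigurationSpace()
config_space.add_hyperparameter(UniformFloatHyperparameter('x', 0.0, 1.0))

scenario = Scenario({
    'run_obj': 'quality',      # minimize the returned cost
    'runcount-limit': 50,      # number of objective evaluations
    'cs': config_space,
    'deterministic': 'true',
})

smac = SMAC4HPO(scenario=scenario, tae_runner=objective)
incumbent = smac.optimize()    # best configuration found
```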
With this release we integrate a new tuning library, Ax, with our benchmarking process. A new
leaderboard including this library has been generated.
- Issue #194: Integrate Ax for benchmarking.
This version adds a new functionality which allows running the benchmarking framework on a
Kubernetes cluster. By doing this, the benchmarking process can be distributed across the cluster,
which reduces the time necessary to generate a new leaderboard.
- btb_benchmark.kubernetes.run_dask_function: Run a Dask function inside a pod using the given config.
- btb_benchmark.kubernetes.run_on_kubernetes: Start a Dask cluster using dask-kubernetes and run a function.
- Documentation updated.
- Jupyter notebooks with examples on how to run the benchmarking process and how to run it on Kubernetes.
This release brings a new benchmark framework with a public leaderboard.
As part of our benchmarking efforts, we will run the framework at every release and make the
results public. In each run we compare BTB to other tuner and optimizer libraries, and we are
constantly adding new libraries to the comparison. If you have suggestions for a tuner library we
should include in our comparison, please contact us via email at [email protected].
- Issue #159: Implement more MLChallenges and generate a public leaderboard.
- Issue #180: Update BTB Benchmarking module.
- Issue #182: Integrate HyperOPT with benchmarking.
- Issue #184: Integrate Dask to benchmarking.
This release improves BTBSession error handling and allows Tunables with a cardinality equal
to 1 to be scored with BTBSession. We also provide new documentation for this version of BTB.
Improved documentation, unittests and integration tests.
- Issue #164: Improve documentation for v0.3.5+.
- Issue #166: Wrong error raised by BTBSession on too many errors.
- Issue #170: Tuner has no scores attribute until record is run once.
- Issue #175: BTBSession crashes when record is not performed.
- Issue #176: BTBSession fails to select a proper Tunable when normalized_scores becomes None.
With this release we improve BTBSession by making the attributes that are not intended to be
public or modified by the user private, and by improving its documentation.
Improved docstrings, unittests and the public interface of BTBSession.
- Issue #162: Fix session according to the comments given on PR 156.
With this release we introduce a BTBSession class. This class represents the process of selecting
and tuning several tunables until the best possible configuration for a specific scorer is found.
We have also improved and fixed some minor bugs around the code, described in the issues below.
A minimal usage sketch follows the issue list.
New BTBSession that makes BTB more user friendly.
Improved unittests, removed old dependencies, added more MLChallenges and fixed an issue with
the bound methods.
- Issue #145: Implement BTBSession.
- Issue #155: Set default to None for CategoricalHyperParam is not possible.
- Issue #157: Metamodel _MODEL_KWARGS_DEFAULT becomes mutable.
- Issue #158: Remove mock dependency from the package.
- Issue #160: Add more Machine Learning Challenges and more estimators.
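A minimal usage sketch, based on the BTB documentation of this era; the scikit-learn model and
dataset are just examples, and the exact tunable specification keys may differ slightly:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

from btb import BTBSession

data = load_iris()

# One tunable (a model) described by its hyperparameter specification.
tunables = {
    'random_forest': {
        'n_estimators': {'type': 'int', 'default': 10, 'range': [10, 200]},
        'max_depth': {'type': 'int', 'default': 3, 'range': [2, 20]},
    },
}

def scorer(tunable_name, proposal):
    # BTBSession passes the tunable name and a dict of proposed hyperparameter values.
    model = RandomForestClassifier(**proposal)
    return cross_val_score(model, data.data, data.target).mean()

session = BTBSession(tunables, scorer)
best_proposal = session.run(20)   # run 20 tuning iterations and return the best proposal
```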
Fix a bug where creating an instance of a Tuner resulted in an error.
Improve unittests to use spec_set in order to detect errors while mocking an object.
- Issue #153: Bug with tuner logger message that prevents creating the Tuner.
With this release we add the new benchmark challenge MLChallenge, which allows users to perform
benchmarking over datasets with machine learning estimators, and also some new features to make
the workflow easier.
- New MLChallenge challenge that allows performing cross-validation over datasets and machine learning estimators.
- New from_dict function for the Tunable class, in order to instantiate it from a dictionary that contains information about its hyperparameters (see the sketch after the issue list below).
- New default value for each hyperparameter type.
- Issue #68: Remove btb.tuning.constants module.
- Issue #120: Tuner repr not helpful.
- Issue #121: HyperParameter repr not helpful.
- Issue #141: Implement proper logging for the tuning section.
- Issue #150: Implement Tunable from_dict.
- Issue #151: Add default value for hyperparameters.
- Issue #152: Support None as a choice in CategoricalHyperParameters.
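A short sketch of the new from_dict entry point, assuming the documented type/default/range
specification format (the hyperparameter names below are just an example):

```python
from btb.tuning import Tunable

# Hypothetical specification for two integer hyperparameters.
tunable = Tunable.from_dict({
    'n_estimators': {'type': 'int', 'default': 10, 'range': [10, 500]},
    'max_depth': {'type': 'int', 'default': 3, 'range': [2, 20]},
})
```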
With this release we introduce a benchmark module for BTB which allows users to perform a
benchmark over a series of challenges.
- New benchmark module.
- New submodule named challenges to work together with the benchmark module.
- Issue #139: Implement a Benchmark for BTB.
With this release we introduce an improved BTB that has a major reorganization of the project,
with emphasis on an easier way of interacting with BTB and an easy way of developing, testing and
contributing new acquisition functions, metamodels, tuners and hyperparameters.
The new major reorganization comes with the btb.tuning module. This module provides everything
needed for the tuning process and comes with three new additions: Acquisition, Metamodel and
Tunable. There is also an update to the Hyperparameters and Tuners. These changes are meant
to help developers and contributors easily develop, test and contribute new Tuners.
There is a slightly new way of using BTB, as the new Tunable class is introduced and is meant
to be the only object required to instantiate a Tuner. This Tunable class represents a
collection of HyperParams that need to be tuned as a whole, at once. Now, in order to create a
Tuner, a Tunable instance must first be created with the hyperparameters of the objective
function, as sketched below.
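A minimal sketch of the new workflow, assuming the btb.tuning API introduced in this release;
score_model is a placeholder standing in for the user's objective function:

```python
from btb.tuning import Tunable
from btb.tuning.hyperparams import FloatHyperParam, IntHyperParam
from btb.tuning.tuners import GPTuner

# Collect the hyperparameters of the objective function into a single Tunable.
tunable = Tunable({
    'n_estimators': IntHyperParam(min=10, max=500),
    'learning_rate': FloatHyperParam(min=0.01, max=1.0),
})
tuner = GPTuner(tunable)

def score_model(n_estimators, learning_rate):
    # Placeholder objective; in practice this would train and evaluate a model.
    return -abs(n_estimators - 100) * learning_rate

for _ in range(10):
    proposal = tuner.propose()        # dict of proposed hyperparameter values
    score = score_model(**proposal)
    tuner.record(proposal, score)     # feed the result back to the tuner
```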
- New Hyperparameters that allow an easier interaction for the final user.
- New Tunable class that manages a collection of Hyperparameters.
- New Tuner class that is a Python mixin requiring Acquisition and Metamodel as parents. It also now works with a single Tunable object.
- New Acquisition class, meant to implement an acquisition function to be inherited by a Tuner.
- New Metamodel class, meant to implement everything that a certain model needs and to be inherited by the Tuner.
- Reorganization of the selection module to follow an API similar to tuning.
- Issue #131: Reorganize the project structure.
- Issue #133: Implement Tunable class to control a list of hyperparameters.
- Issue #134: Implementation of Tuners for the new structure.
- Issue #140: Reorganize selectors.
- Issue #115: HyperParameter subclass instantiation not working properly
- Issue #62: Test for None in HyperParameter.cast instead of HyperParameter.__init__
- Issue #98: Categorical hyperparameters do not support None as input
- Issue #89: Fix the computation of avg_rewards in BestKReward
- Issue #84: Error in GP tuning when only one parameter is present
- Issue #96: Fix pickling of HyperParameters
- Issue #98: Fix implementation of the GPEi tuner
- Updated documentation
- Issue #94: Fix unicode param_type causing an error on Python 2.
- Issue #74: ParamTypes.STRING tunables do not work
- New Recommendation module
- New HyperParameter types
- Improved documentation and examples
- Fully tested Python 2.7, 3.4, 3.5 and 3.6 compatibility
- HyperParameter copy and deepcopy support
- Replace print statements with logging
- Integrated with Travis-CI
- Exhaustive unit testing
- New implementation of HyperParameter
- Tuner builds a grid of real values instead of indices
- Resolve Issue #29: Make args explicit in __init__ methods
- Resolve Issue #34: Make all imports explicit
- Fix error from mixing string/numerical hyperparameters
- Inverse transform for categorical hyperparameter returns single item
- Issue #47: Add missing requirements in v0.1.1 setup.py
- Issue #46: Error on v0.1.1: 'GP' object has no attribute 'X'
- First release.