This repository has been archived by the owner on Jan 9, 2024. It is now read-only.
Hyperparameter tuning will involve the following:
An object that enables 'combinations' to be tried. This object will be serialized much like any other and, when deserialized, will be used to manage the sklearn base hyperparameter tuners. 'Combinations' are specific configurations of the foreshadow pipeline that the user determines: for instance, specifying that two different concrete transformers should be tried, or that 3 different hyperparameter settings should be tried for a given concrete transformer.
Additionally, because of sklearn parameter tuning limitations, we will tune only one of the following at a given time:
1. The pipeline (the number and type of steps; for instance, if the intent changes for a given column, the preprocessor pipeline for that column will change).
2. Concrete/smart transformer decisions. Note: some concrete transformers may change the number of columns, and we should be able to handle this smartly.
3. Hyperparameters for 2.
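As a hypothetical sketch of how such 'combinations' could map onto sklearn's existing search machinery: a `Pipeline` step can itself be swapped via `set_params`, so a list-of-dicts `param_grid` can express both kinds of combination — trying two different concrete transformers, and trying several hyperparameter settings for one transformer. All names below are illustrative, not foreshadow's actual API.

```python
# Sketch: expressing 'combinations' as an sklearn parameter grid.
# Each dict in param_grid is one conditional sub-space.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import RobustScaler, StandardScaler

pipe = Pipeline([("scale", StandardScaler()), ("clf", LogisticRegression())])

param_grid = [
    # Combination A: swap between two concrete transformers.
    {"scale": [StandardScaler(), RobustScaler()]},
    # Combination B: one concrete transformer, 3 hyperparameter settings.
    {
        "scale": [RobustScaler()],
        "scale__quantile_range": [(25.0, 75.0), (10.0, 90.0), (5.0, 95.0)],
    },
]

X, y = make_classification(n_samples=100, random_state=0)
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)  # evaluates 2 + 3 = 5 candidate configurations
```

The list-of-dicts form is also what makes conditional tuning possible: hyperparameters in the second dict only apply when that transformer is selected.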
@adithyabsk add more details here and especially any concerns.
Recall from the sequence diagram that, at a given time, only the intents, engineerer, or preprocessor can be tuned while the rest are held static. That portion will be a separate issue, as this ticket only focuses on manual tuning by the user.
Estimate: 4 days
@adithyabsk will show a POC and see if we can do conditional tuning so that 2. and 3. can be combined.
TunerWrapper will be an interface between a Tuner Class and a Foreshadow object.
The tuner class may be our own subclass of BaseSearchCV or an sklearn one; at first, they will all be sklearn tuners, which impose additional constraints we need to abide by.
This tuning works through the estimator's .score() method and through get_params / set_params to reconfigure the foreshadow object.
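To make that contract concrete, here is a minimal sketch of the interface an sklearn tuner relies on: `get_params` / `set_params` (inherited from `BaseEstimator`) to reconfigure each candidate, and `score()` to evaluate it. `ToyEstimator` and its `shift` parameter are purely illustrative stand-ins for a Foreshadow object.

```python
# Sketch: the estimator contract BaseSearchCV-style tuners depend on.
import numpy as np
from sklearn.base import BaseEstimator
from sklearn.model_selection import GridSearchCV

class ToyEstimator(BaseEstimator):
    def __init__(self, shift=0.0):
        self.shift = shift  # exposed automatically via get_params()

    def fit(self, X, y):
        return self

    def predict(self, X):
        return X[:, 0] + self.shift

    def score(self, X, y):
        # Tuners maximize this; here, negative mean absolute error.
        return -np.mean(np.abs(self.predict(X) - y))

X = np.arange(10.0).reshape(-1, 1)
y = X[:, 0] + 2.0  # the true shift is 2.0

search = GridSearchCV(ToyEstimator(), {"shift": [0.0, 1.0, 2.0]}, cv=2)
search.fit(X, y)
```

Any object satisfying this contract — including a TunerWrapper around a Foreshadow object — can be dropped into the stock sklearn tuners unchanged.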