.Last.tune.result contains intermediate results #613

Closed
simonpcouch opened this issue Feb 17, 2023 · 2 comments · Fixed by #623
Comments

@simonpcouch (Contributor) commented:
library(tidymodels)

svm_mod <- svm_rbf(mode = "regression", cost = tune()) %>%
  set_engine("kernlab")

set.seed(7898)
data_folds <- vfold_cv(mtcars, v = 2)

rec <- recipe(mpg ~ ., data = mtcars) %>%
  step_bs(disp, deg_free = 1)

cars_res <- tune_bayes(svm_mod, preprocessor = rec,
                        resamples = data_folds)
#> → 1 | warning: some 'x' values beyond boundary knots may cause ill-conditioned bases
#> There were issues with some computations   1: x15

cars_res
#> # Tuning results
#> # 2-fold cross-validation 
#> # A tibble: 22 × 5
#>    splits          id    .metrics          .notes           .iter
#>    <list>          <chr> <list>            <list>           <int>
#>  1 <split [16/16]> Fold1 <tibble [10 × 5]> <tibble [0 × 3]>     0
#>  2 <split [16/16]> Fold2 <tibble [10 × 5]> <tibble [5 × 3]>     0
#>  3 <split [16/16]> Fold1 <tibble [2 × 5]>  <tibble [0 × 3]>     1
#>  4 <split [16/16]> Fold2 <tibble [2 × 5]>  <tibble [1 × 3]>     1
#>  5 <split [16/16]> Fold1 <tibble [2 × 5]>  <tibble [0 × 3]>     2
#>  6 <split [16/16]> Fold2 <tibble [2 × 5]>  <tibble [1 × 3]>     2
#>  7 <split [16/16]> Fold1 <tibble [2 × 5]>  <tibble [0 × 3]>     3
#>  8 <split [16/16]> Fold2 <tibble [2 × 5]>  <tibble [1 × 3]>     3
#>  9 <split [16/16]> Fold1 <tibble [2 × 5]>  <tibble [0 × 3]>     4
#> 10 <split [16/16]> Fold2 <tibble [2 × 5]>  <tibble [1 × 3]>     4
#> # … with 12 more rows
#> 
#> There were issues with some computations:
#> 
#>   - Warning(s) x15: some 'x' values beyond boundary knots may cause ill-conditioned b...
#> 
#> Run `show_notes(.Last.tune.result)` for more information.

.Last.tune.result
#> # Tuning results
#> # 2-fold cross-validation 
#> # A tibble: 2 × 4
#>   splits          id    .metrics         .notes          
#>   <list>          <chr> <list>           <list>          
#> 1 <split [16/16]> Fold1 <tibble [2 × 5]> <tibble [0 × 3]>
#> 2 <split [16/16]> Fold2 <tibble [2 × 5]> <tibble [1 × 3]>
#> 
#> There were issues with some computations:
#> 
#>   - Warning(s) x1: some 'x' values beyond boundary knots may cause ill-conditioned b...
#> 
#> Run `show_notes(.Last.tune.result)` for more information.

Created on 2023-02-17 with reprex v2.0.2
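
For illustration (not part of the original reprex), comparing the two objects makes the discrepancy concrete: cars_res keeps the initial grid plus every iteration, while .Last.tune.result only holds the resamples from the final internal tune_grid() call. A minimal check, assuming the reprex objects above are still in scope:

# Illustrative comparison (not from the original report): the returned
# object keeps all iterations, the stashed one does not.
nrow(cars_res)            # 22 rows: 2 folds x 11 (initial grid + 10 iterations)
nrow(.Last.tune.result)   # 2 rows: 2 folds from the last tune_grid() call only
unique(cars_res$.iter)    # 0 through 10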

@simonpcouch (Contributor, Author) commented:
This happens because tune_bayes() iteratively calls tune_grid() as part of the Gaussian process search, and that function "resets" the last result on each call:

tune/R/tune_grid.R, lines 298 to 307 at c74dfe4:

res <-
  tune_grid_workflow(
    object,
    resamples = resamples,
    grid = grid,
    metrics = metrics,
    pset = param_info,
    control = control
  )
.stash_last_result(res)
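
For context, here is a minimal sketch (not tune's actual internals; the helper names are hypothetical) of how a "last result" stash behaves when it is backed by a single environment slot and called repeatedly:

# Hypothetical illustration of the stash mechanism: one slot in an
# environment, overwritten on every call.
last_result_env <- new.env(parent = emptyenv())

stash_last_result <- function(x) {
  assign("result", x, envir = last_result_env)
  invisible(x)
}

get_last_result <- function() {
  get("result", envir = last_result_env)
}

# tune_bayes() effectively does this through its repeated tune_grid() calls,
# so only the most recent intermediate object survives:
stash_last_result("initial grid results")
stash_last_result("iteration 1 results")
stash_last_result("iteration 2 results")
get_last_result()
#> [1] "iteration 2 results"

One possible remedy (not necessarily what #623 implements) would be for tune_bayes() to stash its combined result object again after the search loop finishes, so that .Last.tune.result matches the object the function returns.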

@github-actions (bot) commented:

This issue has been automatically locked. If you believe you have found a related problem, please file a new issue (with a reprex: https://reprex.tidyverse.org) and link to this issue.

@github-actions github-actions bot locked and limited conversation to collaborators Mar 14, 2023