
[python] No evaluation result shown during training even setting verbose=2 #5028

Closed

shiyu1994 opened this issue Feb 23, 2022 · 6 comments
Labels: bug

@shiyu1994 (Collaborator)

Description

With the latest master branch, evaluation results are not shown during training even when verbose=2.

Reproducible example

The following code won't show any RMSE values during training.

import numpy as np
import lightgbm as lgb

def fobj(preds, train_data):
    # Custom L2 objective: gradient = residuals, hessian = ones
    labels = train_data.get_label()
    return preds - labels, np.ones_like(labels)

def test():
    np.random.seed(123)
    num_data = 10000
    num_feature = 100
    train_X = np.random.randn(num_data, num_feature)
    train_y = np.mean(train_X, axis=-1)
    valid_X = np.random.randn(num_data, num_feature)
    valid_y = np.mean(valid_X, axis=-1)
    weights = np.random.rand(num_data)
    train_data = lgb.Dataset(train_X, train_y, weight=weights)  # commenting out the weights gives the same output
    valid_data = lgb.Dataset(valid_X, valid_y)
    params = {
        "verbose": 2,
        "metric": "rmse",
        "learning_rate": 0.2,
        "num_trees": 20,
    }
    booster = lgb.train(train_set=train_data, valid_sets=[valid_data], valid_names=["valid"], params=params, fobj=fobj)

if __name__ == "__main__":
    test()

Environment info

LightGBM version or commit hash:
Latest master branch

Command(s) you used to install LightGBM:
Installed from source.
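
A quick way to confirm which build is actually installed (a minimal sketch; PyPI releases print a plain version such as 3.3.2, while builds from master typically carry a dev suffix):

import lightgbm
print(lightgbm.__version__)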

@shiyu1994 added the bug label Feb 23, 2022
@StrikerRUS (Collaborator) commented Feb 23, 2022

To narrow down the problem: version 3.3.2 from PyPI (which is actually 3.3.1) correctly outputs the per-iteration evaluation results:

D:\Miniconda3\lib\site-packages\lightgbm\engine.py:177: UserWarning: Found `num_trees` in params. Will use it instead of argument
  _log_warning(f"Found `{alias}` in params. Will use it instead of argument")
[LightGBM] [Warning] Using self-defined objective function
[LightGBM] [Debug] Dataset::GetMultiBinFromAllFeatures: sparse rate 0.000000
[LightGBM] [Debug] init for col-wise cost 0.000150 seconds, init for row-wise cost 0.002715 seconds
[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004773 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 25500
[LightGBM] [Info] Number of data points in the train set: 10000, number of used features: 100
[LightGBM] [Warning] Using self-defined objective function
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[1]	valid's rmse: 0.100043
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 6
[2]	valid's rmse: 0.099099
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[3]	valid's rmse: 0.0982311
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[4]	valid's rmse: 0.0974867
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[5]	valid's rmse: 0.0965613
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[6]	valid's rmse: 0.0957191
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[7]	valid's rmse: 0.0949163
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 6
[8]	valid's rmse: 0.0940159
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[9]	valid's rmse: 0.0932777
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[10]	valid's rmse: 0.0924858
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[11]	valid's rmse: 0.0917661
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[12]	valid's rmse: 0.0909356
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[13]	valid's rmse: 0.0901323
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[14]	valid's rmse: 0.0894671
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[15]	valid's rmse: 0.0888048
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[16]	valid's rmse: 0.0881257
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[17]	valid's rmse: 0.0874723
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[18]	valid's rmse: 0.0868133
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[19]	valid's rmse: 0.0862182
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[20]	valid's rmse: 0.0856057

Output for the nightly build (note the missing per-iteration evaluation results):

D:\Miniconda3\lib\site-packages\lightgbm\engine.py:138: UserWarning: Found `num_trees` in params. Will use it instead of argument
  _log_warning(f"Found `{alias}` in params. Will use it instead of argument")
[LightGBM] [Warning] Using self-defined objective function
[LightGBM] [Debug] Dataset::GetMultiBinFromAllFeatures: sparse rate 0.000000
[LightGBM] [Debug] init for col-wise cost 0.000029 seconds, init for row-wise cost 0.002843 seconds
[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004850 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 25500
[LightGBM] [Info] Number of data points in the train set: 10000, number of used features: 100
[LightGBM] [Warning] Using self-defined objective function
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 6
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 6
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7

@StrikerRUS (Collaborator) commented Feb 23, 2022

OK, I got it. The problem is the dropped support for some optional arguments of the train() function. Users should now provide the log_evaluation callback directly instead of relying on the indirect callback initialization via the default verbose_eval=True argument (removed on master in #4878).

The "fix" is just to refactor the last line in the original code snipped in the following way:

...

    callbacks = [lgb.log_evaluation()]  # with the default period=1, logs evaluation results every iteration
    booster = lgb.train(train_set=train_data, valid_sets=[valid_data], valid_names=["valid"], params=params, fobj=fobj, callbacks=callbacks)
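
For reference, the callback's period parameter controls the logging frequency. A minimal sketch of the same call (period=5 prints results every fifth iteration):

    callbacks = [lgb.log_evaluation(period=5)]
    booster = lgb.train(train_set=train_data, valid_sets=[valid_data], valid_names=["valid"], params=params, fobj=fobj, callbacks=callbacks)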

@StrikerRUS (Collaborator)

I believe this issue can be closed.

@shiyu1994 (Collaborator, Author)

@StrikerRUS Thanks for your explanation, I'll close this.

@arnwas commented Aug 18, 2023

This should be added to the docs in an easy-to-find way.

@github-actions bot locked as resolved and limited conversation to collaborators Nov 22, 2023