Utilities for easy use of custom losses in CatBoost, LightGBM, XGBoost. The idea sounds simple, but each library expects custom objectives in a slightly different form, so unifying them took real work.
Install this via pip (or your favourite package manager):

```shell
pip install boost-loss
```
```python
import numpy as np
from numpy.typing import NDArray

from boost_loss import LossBase


class L2Loss(LossBase):
    def loss(self, y_true: NDArray, y_pred: NDArray) -> NDArray:
        return (y_true - y_pred) ** 2 / 2

    def grad(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # dL/dy_pred
        return -(y_true - y_pred)

    def hess(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # d^2L/dy_pred^2
        return np.ones_like(y_true)
```
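Gradient boosting fits each new tree against the first- and second-order derivatives of the loss with respect to the raw predictions, which is why `grad` and `hess` must be supplied alongside `loss`. Once defined, the loss can be attached to an estimator: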
```python
import lightgbm as lgb
from sklearn.datasets import load_diabetes

from boost_loss import apply_custom_loss

# load_boston was removed in scikit-learn 1.2; load_diabetes is a drop-in
# regression dataset for this example.
X, y = load_diabetes(return_X_y=True)
apply_custom_loss(lgb.LGBMRegressor(), L2Loss()).fit(X, y)
```
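The same wrapper is meant to cover the other supported libraries as well. A minimal sketch, assuming `apply_custom_loss` accepts the scikit-learn-style estimators from xgboost and catboost the same way it accepts `LGBMRegressor` (check the API reference for the exact supported types):

```python
import xgboost as xgb
from catboost import CatBoostRegressor

from boost_loss import apply_custom_loss

# X, y and L2Loss as defined in the snippets above.
# Hypothetical usage mirroring the LightGBM example; the exact set of
# estimator types apply_custom_loss accepts may differ.
apply_custom_loss(xgb.XGBRegressor(), L2Loss()).fit(X, y)
apply_custom_loss(CatBoostRegressor(verbose=0), L2Loss()).fit(X, y)
```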
Built-in losses are available.[^1]

```python
from boost_loss.regression import LogCoshLoss
```
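Log-cosh behaves like squared error for small residuals and like absolute error for large ones, so it is robust to outliers while remaining twice differentiable. A usage sketch, assuming the built-in losses plug into `apply_custom_loss` exactly like user-defined ones (constructor arguments, if any, are not shown):

```python
import lightgbm as lgb
from sklearn.datasets import load_diabetes

from boost_loss import apply_custom_loss
from boost_loss.regression import LogCoshLoss

X, y = load_diabetes(return_X_y=True)
apply_custom_loss(lgb.LGBMRegressor(), LogCoshLoss()).fit(X, y)
```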
### `torch.autograd` Loss[^2]
```python
import torch

from boost_loss.torch import TorchLossBase


class L2LossTorch(TorchLossBase):
    def loss_torch(self, y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
        return (y_true - y_pred) ** 2 / 2
```
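Here only the loss itself is written; the gradient and hessian are obtained through `torch.autograd` rather than derived by hand. A usage sketch, assuming a `TorchLossBase` subclass is accepted wherever a `LossBase` is:

```python
import lightgbm as lgb
from sklearn.datasets import load_diabetes

from boost_loss import apply_custom_loss

X, y = load_diabetes(return_X_y=True)
# L2LossTorch as defined above; grad/hess come from automatic differentiation.
apply_custom_loss(lgb.LGBMRegressor(), L2LossTorch()).fit(X, y)
```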
Thanks go to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
34j 💻 🤔 📖
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind are welcome!
[^1]: Inspired by [orchardbirds/bokbokbok](https://github.com/orchardbirds/bokbokbok).
[^2]: Inspired by [TomerRonen34/treeboost_autograd](https://github.com/TomerRonen34/treeboost_autograd).