Added The Binary Expected_Calibration_Error (ECE) Metric #3132
base: master
Conversation
Thanks for the PR @Zekrom-7780!
I left a few comments, but I haven't yet fully checked the code and correctness.

Do I have to add new tests and documentation for the new metric? I wasn't sure whether they were required.

Yes, we need to add new tests, provide a docstring, and include the metric in the docs (I'll explain how to do that once we get to this point; nothing complicated).
The main question about the tests will be what to use as the reference for our implementation. In most cases, for example, we use sklearn to compute the expected value.
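For illustration, one way to build such a reference could be to lean on sklearn's `calibration_curve` with uniform binning and recover the per-bin weights with the same binning rule. This is a sketch, not part of the PR; the helper name `reference_ece` is hypothetical, and it assumes sklearn bins `y_prob` with `np.searchsorted` over uniform bin edges:

```python
import numpy as np
from sklearn.calibration import calibration_curve

def reference_ece(y_true: np.ndarray, y_prob: np.ndarray, n_bins: int = 10) -> float:
    """Hypothetical sklearn-based reference for binary ECE (uniform bins)."""
    # prob_true: empirical frequency of positives per non-empty bin
    # prob_pred: mean predicted probability per non-empty bin
    prob_true, prob_pred = calibration_curve(
        y_true, y_prob, n_bins=n_bins, strategy="uniform"
    )
    # Recover per-bin sample counts with the same uniform binning
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    binids = np.searchsorted(bins[1:-1], y_prob)
    counts = np.bincount(binids, minlength=n_bins)
    # Align weights with the non-empty bins returned by calibration_curve
    weights = counts[counts > 0] / len(y_prob)
    return float(np.sum(weights * np.abs(prob_true - prob_pred)))
```

A metric's accumulated result could then be compared against `reference_ece` on random inputs inside the unit tests.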
Will be doing the MultiClassECE in a new pull request, since that feels like the better thing to do.

Wouldn't it be possible to have a single class for both binary and multi-class ECE?
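For context, the quantity under discussion, binary ECE over equal-width confidence bins, can be sketched as follows. This is illustrative only; the function name `binary_ece` and its exact signature are not from the PR:

```python
import torch

def binary_ece(y_prob: torch.Tensor, y_true: torch.Tensor, n_bins: int = 10) -> float:
    """Expected Calibration Error for binary classification (sketch).

    y_prob: predicted probability of the positive class, shape (N,)
    y_true: ground-truth labels in {0, 1}, shape (N,)
    """
    # Assign each prediction to one of n_bins equal-width bins over [0, 1]
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    bin_ids = torch.bucketize(y_prob, edges[1:-1])
    ece = 0.0
    for b in range(n_bins):
        mask = bin_ids == b
        if mask.any():
            avg_conf = y_prob[mask].mean()          # mean predicted probability in the bin
            accuracy = y_true[mask].float().mean()  # empirical frequency of positives
            # Weight the gap by the fraction of samples falling in this bin
            ece += mask.float().mean().item() * abs(avg_conf - accuracy).item()
    return ece
```

A multi-class variant would replace the positive-class probability with the top-1 confidence and the label match, which is one argument for a single parametrized class covering both cases.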
@@ -0,0 +1,57 @@
import torch
First, let's put it into ignite/metrics/ExpectedCalibrationError.py
instead of ignite/contrib/metrics/ExpectedCalibrationError.py
def update(self, output):
    y_pred, y = output

    assert y_pred.dim() == 2 and y_pred.shape[1] == 2, "This metric is for binary classification."
Let's use the following way to raise errors instead of assert:
if not (y_pred.dim() == 2 and y_pred.shape[1] == 2):
    raise ValueError("This metric is for binary classification")
To check whether the input is binary, we previously did something like this:
ignite/ignite/metrics/accuracy.py
Lines 51 to 67 in 4dc4e04
def _check_binary_multilabel_cases(self, output: Sequence[torch.Tensor]) -> None:
    y_pred, y = output
    if not torch.equal(y, y**2):
        raise ValueError("For binary cases, y must be comprised of 0's and 1's.")
    if not torch.equal(y_pred, y_pred**2):
        raise ValueError("For binary cases, y_pred must be comprised of 0's and 1's.")

def _check_type(self, output: Sequence[torch.Tensor]) -> None:
    y_pred, y = output
    if y.ndimension() + 1 == y_pred.ndimension():
        num_classes = y_pred.shape[1]
        if num_classes == 1:
            update_type = "binary"
            self._check_binary_multilabel_cases((y_pred, y))
self.corrects = torch.tensor([], device=self.device)

def update(self, output):
    y_pred, y = output
We usually call .detach on both to stop grad computation, like here:
ignite/ignite/metrics/accuracy.py
Line 231 in 4dc4e04
y_pred, y = output[0].detach(), output[1].detach()
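Pulling the review comments together, an update() incorporating both suggestions (ValueError instead of assert, and .detach on the inputs) might look like the sketch below. The class name, the confidences/corrects buffers, and the assumption that y_pred holds raw logits of shape (N, 2) are illustrative, not taken from the PR:

```python
import torch

class BinaryECESketch:
    """Illustrative accumulator combining the review suggestions.

    Assumes y_pred contains raw logits of shape (N, 2); this is a sketch,
    not the PR's actual implementation.
    """

    def __init__(self, device: str = "cpu") -> None:
        self.device = device
        self.confidences = torch.tensor([], device=device)
        self.corrects = torch.tensor([], device=device)

    def update(self, output) -> None:
        # Stop gradient tracking on both tensors, as suggested in the review
        y_pred, y = output[0].detach(), output[1].detach()
        # Raise ValueError instead of using assert
        if not (y_pred.dim() == 2 and y_pred.shape[1] == 2):
            raise ValueError("This metric is for binary classification")
        # Probability of the positive class (assumes y_pred holds logits)
        probs = torch.softmax(y_pred, dim=1)[:, 1]
        self.confidences = torch.cat([self.confidences, probs])
        self.corrects = torch.cat([self.corrects, (y == 1).float()])
```

The accumulated confidences and corrects tensors can then be binned in compute() to obtain the ECE.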
@vfdev-5 made the changes, ready for comments.
Will do it in the next commits if you want, actually I just wanted to understand how does the …
@Zekrom-7780 any updates on this PR?
@vfdev-5 I'll create the … Then I'll move on to writing the unit tests, where I'll need your help, since I haven't written any as such.
…gnite into Issue_1009. The merge is necessary since I diverged from the main branch.
@vfdev-5, made the changes.
Fixes #1009

Description: Added the binary ECE (Expected Calibration Error) metric.

Reviewers: cc @vfdev-5

P.S. Will be doing the MultiClassECE in a new pull request, since it feels that it is the better thing to do.