Normalize top-1 metrics to [0, 1] (#1394)
* align metrics

* change check

* delete if
kprokofi authored Nov 30, 2022
1 parent 59e853a commit a8ca77d
Showing 1 changed file with 4 additions and 0 deletions.
```diff
@@ -129,7 +129,11 @@ def evaluate(self, results, metric="accuracy", metric_options=None, logger=None)
             metrics.remove("class_accuracy")
             self.class_acc = True
 
+        # compute top-k metrics from mmcls and align them in [0,1] range
         eval_results = super().evaluate(results, metrics, metric_options, logger)
+        for k in metric_options['topk']:
+            eval_results[f'accuracy_top-{k}'] /= 100
+            assert 0 <= eval_results[f'accuracy_top-{k}'] <= 1
 
         # Add Evaluation Accuracy score per Class - it can be used only for multi-class dataset.
         if self.class_acc:
```
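The change rests on the fact that mmcls's `evaluate` reports top-k accuracy as a percentage in `[0, 100]`, so dividing by 100 aligns it with metrics reported in `[0, 1]`. A standalone sketch of that normalization logic (the helper name `normalize_topk` and the sample values are illustrative, not part of the commit):

```python
def normalize_topk(eval_results, topk=(1, 5)):
    """Rescale mmcls-style 'accuracy_top-k' entries from [0, 100] to [0, 1]."""
    for k in topk:
        key = f"accuracy_top-{k}"
        eval_results[key] /= 100
        # mirror the commit's sanity check on the normalized range
        assert 0 <= eval_results[key] <= 1, f"{key} out of [0, 1] range"
    return eval_results


# hypothetical example values for illustration
results = normalize_topk({"accuracy_top-1": 87.5, "accuracy_top-5": 99.0})
print(results)
```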
