Conversation

@scumatteo
Contributor

If the top-k metric is called with num_classes < k, it returns 0.
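A minimal pure-Python sketch of the behavior this PR describes (the actual change is in `avalanche/evaluation/metrics/topk_acc.py` and uses torch; the function name and signature here are illustrative only):

```python
def topk_accuracy(scores, targets, k):
    """Top-k accuracy over a batch.

    scores: list of per-class score lists (one list per sample).
    targets: list of int class labels.
    Returns 0.0 when num_classes < k, as proposed in this PR,
    instead of failing when k exceeds the number of classes.
    """
    num_classes = len(scores[0])
    if num_classes < k:
        return 0.0  # PR behavior: mask the error with 0
    correct = 0
    for row, target in zip(scores, targets):
        # indices of the k highest-scoring classes for this sample
        topk = sorted(range(num_classes), key=lambda c: row[c], reverse=True)[:k]
        if target in topk:
            correct += 1
    return correct / len(targets)
```

With 2 classes and k=1 this behaves as usual; with k=5 it silently returns 0.0 rather than raising.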

@coveralls

Pull Request Test Coverage Report for Build 5077487805

  • 1 of 2 (50.0%) changed or added relevant lines in 1 file are covered.
  • 2 unchanged lines in 1 file lost coverage.
  • Overall coverage increased (+0.1%) to 72.217%

Changes missing coverage:
  • avalanche/evaluation/metrics/topk_acc.py — 1 of 2 changed/added lines covered (50.0%)

Files with coverage reduction:
  • avalanche/evaluation/metrics/amca.py — 2 new missed lines (75.0%)

Totals:
  • Change from base Build 5065785914: +0.1%
  • Covered Lines: 15749
  • Relevant Lines: 21808

💛 - Coveralls

@AntonioCarta
Collaborator

It doesn't make a lot of sense to return 0 to mask an error, especially at the metric level. IMO the plugin (not the metric) should not emit values when num_classes < k. If called anyway, the metric should still fail.
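The alternative suggested here can be sketched as follows (function and class names are hypothetical, not Avalanche's actual API): the metric fails loudly, and the plugin layer decides whether to emit a value at all.

```python
def topk_accuracy(scores, targets, k):
    """Metric-level computation: fail loudly on an invalid configuration."""
    num_classes = len(scores[0])
    if num_classes < k:
        # suggested behavior: raise instead of silently returning 0
        raise ValueError(f"k={k} exceeds num_classes={num_classes}")
    correct = 0
    for row, target in zip(scores, targets):
        topk = sorted(range(num_classes), key=lambda c: row[c], reverse=True)[:k]
        if target in topk:
            correct += 1
    return correct / len(targets)


class TopkAccuracyPlugin:
    """Plugin-level guard: skip emitting the metric when num_classes < k."""

    def __init__(self, k):
        self.k = k

    def emit(self, scores, targets):
        # Return None (emit nothing) rather than a misleading 0.
        if len(scores[0]) < self.k:
            return None
        return topk_accuracy(scores, targets, self.k)
```

The division of responsibility is the point: the metric stays a correct mathematical function that rejects invalid input, while the decision to suppress values for small-class streams lives in the plugin.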


3 participants