
Add Entropy metric #3210

Merged
merged 11 commits on Mar 22, 2024
Conversation

@kzkadc (Contributor) commented Mar 20, 2024

Description: adds an Entropy metric.
Entropy is often used to evaluate the uncertainty of classification predictions.
It is computed as $H = -\sum_c p_c \log p_c$, where $p_c$ is the predicted probability of the $c$-th class.
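For illustration, the formula above can be sketched in NumPy. This is not the PR's implementation, just the definition applied to a batch of predicted probabilities; the `eps` guard is an assumption to avoid `log(0)`:

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy H = -sum_c p_c log p_c per prediction.

    p: array of shape (n_samples, n_classes), rows summing to 1.
    eps guards against log(0) for zero probabilities.
    """
    p = np.asarray(p, dtype=np.float64)
    return -np.sum(p * np.log(p + eps), axis=-1)

# A uniform distribution maximizes entropy; a one-hot one minimizes it.
uniform = entropy([[0.5, 0.5]])    # ~log(2) ≈ 0.6931
confident = entropy([[1.0, 0.0]])  # ~0.0
```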

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)

@github-actions github-actions bot added docs module: metrics Metrics module labels Mar 20, 2024
ignite/metrics/entropy.py (review thread, resolved)
assert not ent._sum_of_entropies.requires_grad


@pytest.mark.distributed
Collaborator

For the distributed config tests, could you please rewrite them using the new testing formalism that we are trying to adopt? Here is an example of the code to take inspiration from:

@pytest.mark.usefixtures("distributed")
class TestDistributed:
    @pytest.mark.parametrize("average", [False, "macro", "weighted", "micro"])
    @pytest.mark.parametrize("n_epochs", [1, 2])
    def test_integration_multiclass(self, average, n_epochs):
        ...

Here is a PR showing how to migrate from the old code to the new one:
https://github.com/pytorch/ignite/pull/3208/files#diff-c56c264ef288f88e5738e9ad22de66dffd4c58d2e656eb62e8dbaa678672317d

Thanks!

ignite/metrics/entropy.py (review thread, resolved)
@vfdev-5 (Collaborator) left a comment

@kzkadc thanks for the PR, LGTM now.

Updating the distributed config tests to the new style can be done in a follow-up PR if you would like.
Let's see if CI passes and merge this PR.

ignite/metrics/entropy.py (review thread, resolved)
@vfdev-5 (Collaborator) commented Mar 22, 2024

@vfdev-5 vfdev-5 enabled auto-merge (squash) March 22, 2024 14:36
@vfdev-5 vfdev-5 disabled auto-merge March 22, 2024 14:55
@vfdev-5 (Collaborator) commented Mar 22, 2024

@kzkadc this failure is related: https://github.com/pytorch/ignite/actions/runs/8391979913/job/22983553399?pr=3210#step:9:345

         y_preds = idist.all_gather(y_preds)
        y_true = idist.all_gather(y_true)
    
        assert "entropy" in engine.state.metrics
        res = engine.state.metrics["entropy"]
    
>       true_res = np_entropy(y_preds.numpy())
E       TypeError: can't convert xla:0 device type tensor to numpy. Use Tensor.cpu() to copy the tensor to host memory first.

tests/ignite/metrics/test_entropy.py:102: TypeError
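The failure above is the standard PyTorch restriction that a tensor resident on an accelerator device (here `xla:0`) must be copied to host memory before NumPy conversion. A minimal sketch of the fix the error message suggests, shown with a CPU tensor since no XLA device is assumed here:

```python
import torch

def to_host_numpy(t: torch.Tensor):
    # .cpu() copies an accelerator-resident tensor (e.g. xla:0, cuda:0)
    # to host memory; .numpy() then views that host buffer. detach()
    # drops any autograd history first.
    return t.detach().cpu().numpy()

y_preds = torch.rand(4, 3)  # stand-in for the gathered predictions
arr = to_host_numpy(y_preds)
```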

@vfdev-5 vfdev-5 merged commit c3845ba into pytorch:master Mar 22, 2024
14 of 18 checks passed
@kzkadc kzkadc mentioned this pull request Mar 23, 2024