
Support for Loss Functions (Symmetric Loss function) #71

Open
s0nicboOm opened this issue Nov 1, 2022 · 6 comments
Labels
area/ml (Machine Learning based changes) · enhancement (New feature or request) · hacktoberfest

Comments

@s0nicboOm
Contributor

s0nicboOm commented Nov 1, 2022

Summary

Introduce Symmetric Loss functions for the ML model.

We are ultimately looking to support loss functions beyond those in PyTorch, and to give users the flexibility to plug in their own custom loss function. This issue is not only about adding more loss functions; it also asks for a better interface that lets users bring in a custom loss function that integrates with the models we have today.
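As a rough illustration of what "plugging in" could look like (the model name and keyword argument below are hypothetical, not today's numalogic API), a user-defined callable with the usual (input, target) -> Tensor signature could be passed directly instead of a string key:

import torch
import torch.nn.functional as F


def weighted_smooth_l1(input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # Hypothetical custom loss: smooth L1, scaled up for large residuals.
    residual = torch.abs(input - target)
    per_elem = F.smooth_l1_loss(input, target, reduction="none")
    return torch.mean(per_elem * (1.0 + residual))


# Hypothetical usage -- today the models only accept string keys such as "mse":
# model = SomeAutoencoder(seq_len=12, loss_fn=weighted_smooth_l1)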

Use Cases


Message from the maintainers:

If you wish to see this enhancement implemented please add a 👍 reaction to this issue! We often sort issues this way to know what to prioritize.

s0nicboOm added the labels enhancement (New feature or request), good first issue (Good for newcomers), help wanted (Extra attention is needed), and area/ml (Machine Learning based changes) on Nov 1, 2022
@ab93
Member

ab93 commented Nov 2, 2022

Can you please explain why we would need this? @s0nicboOm

@s0nicboOm
Contributor Author

Can you please explain why we would need this? @s0nicboOm

A symmetric loss function is one that weights positive and negative outliers equally. Currently we use absolute error (L1) and MSE.

There is an opportunity here to introduce more such loss functions.
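For concreteness, "symmetric" here means the loss depends only on the magnitude of the residual, so over- and under-prediction are penalized equally. A quick check (just a sketch, not project code) with the losses we already support:

import torch
import torch.nn.functional as F

target = torch.zeros(4)
over = torch.full((4,), 2.0)    # predictions 2 units above the target
under = torch.full((4,), -2.0)  # predictions 2 units below the target

# Symmetric losses give the same value for residuals +r and -r.
for fn in (F.l1_loss, F.mse_loss, F.huber_loss):
    assert torch.isclose(fn(over, target), fn(under, target))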

@haripriyajk
Contributor

Hey, I would like to try resolving this issue.

@rum1887

rum1887 commented Oct 8, 2023

Hi, if I have understood it right, you want me to add support for symmetric loss functions in the _init_criterion method here: https://github.com/numaproj/numalogic/blob/main/numalogic/models/vae/base.py

from typing import Callable

import torch.nn.functional as F


def _init_criterion(loss_fn: str) -> Callable:
    # Map a loss-function name to the corresponding torch.nn.functional callable.
    if loss_fn == "huber":
        return F.huber_loss
    if loss_fn == "l1":
        return F.l1_loss
    if loss_fn == "mse":
        return F.mse_loss
    raise ValueError(f"Unsupported loss function provided: {loss_fn}")

like the sigmoid loss from torch
@vigith @s0nicboOm

@rum1887

rum1887 commented Oct 8, 2023

torch.nn.functional has 21 loss functions, of which 6 are symmetric; 3 are already supported:

  1. L1 Loss (l1_loss)
  2. Mean Squared Error Loss (mse_loss)
  3. Huber Loss (huber_loss)

Need to add the following

  1. Poisson Negative Log Likelihood Loss (poisson_nll_loss)
  2. Gaussian Negative Log Likelihood Loss (gaussian_nll_loss)
  3. Smooth L1 Loss (smooth_l1_loss)
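A minimal sketch of what that extension could look like, keeping the existing if/else style (the new string keys are made up for illustration; note that gaussian_nll_loss takes an extra variance argument, so it would likely need a thin wrapper rather than being returned directly):

from typing import Callable

import torch.nn.functional as F


def _init_criterion(loss_fn: str) -> Callable:
    if loss_fn == "huber":
        return F.huber_loss
    if loss_fn == "l1":
        return F.l1_loss
    if loss_fn == "mse":
        return F.mse_loss
    # Candidates from the list above (hypothetical keys).
    if loss_fn == "smooth_l1":
        return F.smooth_l1_loss
    if loss_fn == "poisson_nll":
        return F.poisson_nll_loss
    raise ValueError(f"Unsupported loss function provided: {loss_fn}")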

@s0nicboOm
Contributor Author

s0nicboOm commented Oct 12, 2023

torch.nn.functional has 21 loss functions, of which 6 are symmetric; 3 are already supported:

  1. L1 Loss (l1_loss)
  2. Mean Squared Error Loss (mse_loss)
  3. Huber Loss (huber_loss)

Need to add the following

  1. Poisson Negative Log Likelihood Loss (poisson_nll_loss)
  2. Gaussian Negative Log Likelihood Loss (gaussian_nll_loss)
  3. Smooth L1 Loss (smooth_l1_loss)

Hi @rum1887 !

We are ultimately looking to support loss functions beyond those in PyTorch, and to give users the flexibility to plug in their own custom loss function. This issue is not only about adding more loss functions; it also asks for a better interface that lets users bring in a custom loss function that integrates with the models we have today.

Take the example of a custom loss function that a user wants to use. In that case we are limited by the if-else structure and cannot offer that flexibility to the user. So we do need support for custom loss functions, which involves designing a new interface. Hope that makes sense. Also, note that loss_fn is used in two places today: 1) training and 2) calculating the reconstruction/prediction error.
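One possible shape for that interface (purely a sketch; the class and function names are hypothetical) is to accept either a registered string key or any user-supplied callable, and to reuse the resolved callable at both call sites:

from typing import Callable, Union

import torch
import torch.nn.functional as F

_KNOWN_LOSSES = {"huber": F.huber_loss, "l1": F.l1_loss, "mse": F.mse_loss}


def resolve_loss(loss_fn: Union[str, Callable]) -> Callable:
    # Accept either a registered key or a custom callable with an
    # (input, target) -> Tensor signature.
    if callable(loss_fn):
        return loss_fn
    try:
        return _KNOWN_LOSSES[loss_fn]
    except KeyError:
        raise ValueError(f"Unsupported loss function provided: {loss_fn}") from None


class SketchModel:
    # Hypothetical model showing the two call sites mentioned above.
    def __init__(self, loss_fn: Union[str, Callable] = "mse"):
        self.criterion = resolve_loss(loss_fn)

    def training_loss(self, recon: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        return self.criterion(recon, x)

    def reconstruction_error(self, recon: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        return self.criterion(recon, x)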

Thanks for the question. Let me add it to the issue summary for better clarification.
