
Understand loss.shape #3

Open
BAAAL00 opened this issue Jan 7, 2025 · 1 comment
@BAAAL00 (Collaborator) commented Jan 7, 2025

No description provided.

@BAAAL00 BAAAL00 self-assigned this Jan 7, 2025
@markorn1612 (Collaborator) commented
see: https://pytorch.org/docs/stable/generated/torch.nn.SmoothL1Loss.html

reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction. Default: 'mean'

We used the default, 'mean', so the output is a scalar value; this results in loss.shape being torch.Size([]).

If we want the loss for each element, we should use reduction='none' instead, which results in loss.shape being the same as the shape of input and target.
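A minimal sketch of the difference, using dummy input/target tensors of shape (4, 3) (the shapes here are illustrative, not taken from the project's code):

```python
import torch
import torch.nn as nn

# Dummy tensors standing in for model output and ground truth
input = torch.randn(4, 3)
target = torch.randn(4, 3)

# Default reduction='mean': the per-element losses are averaged,
# so the result is a scalar and loss.shape is torch.Size([])
loss_mean = nn.SmoothL1Loss(reduction='mean')(input, target)

# reduction='none': no reduction is applied, so the loss has one
# entry per element and matches the shape of input and target
loss_none = nn.SmoothL1Loss(reduction='none')(input, target)

print(loss_mean.shape)  # torch.Size([])
print(loss_none.shape)  # torch.Size([4, 3])
```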
