Thanks for sharing your great work!
I have read some of the existing questions and your answers about the losses, but I would like to ask again to make sure.
Among the losses in your code, I see expressions of a similar form, such as
neg_loss = 2.0/self.alpha * torch.log(1 + torch.sum(torch.exp(self.alpha * (neg_pair - base))))
in both the SemiHard loss and the DistWeighted loss. As far as I know, according to the original papers, those losses are linear in the distance (in your code, the similarity), assuming they correspond to the FaceNet and "Sampling Matters" losses.
Are these actually your own 'Weight' loss, which assumes the data follows a mixture-of-Gaussians distribution?
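To make the distinction I am asking about concrete, here is a minimal sketch (my own illustration, not code from your repository) contrasting a linear hinge-style negative loss, as I understand it from the original papers, with the smooth log-sum-exp form quoted above. The names neg_pair, base, and alpha are placeholders following your snippet.

```python
import torch

# Minimal sketch (my own illustration, not the repository's code) contrasting
# the two formulations. `neg_pair` is assumed to be a 1-D tensor of cosine
# similarities to negative samples, `base` a margin/threshold, and `alpha`
# a scaling hyper-parameter, following the names in the snippet above.

def linear_neg_loss(neg_pair, base):
    # Hinge-style loss, linear in the similarity once it exceeds the margin,
    # which is how I read the original FaceNet / "Sampling Matters" papers.
    return torch.sum(torch.clamp(neg_pair - base, min=0.0))

def logsumexp_neg_loss(neg_pair, base, alpha):
    # Smooth log-sum-exp form quoted from the repository: harder negatives
    # (larger similarity) contribute more heavily inside the sum.
    return 2.0 / alpha * torch.log(1 + torch.sum(torch.exp(alpha * (neg_pair - base))))

neg_pair = torch.tensor([0.2, 0.5, 0.8])
print(linear_neg_loss(neg_pair, base=0.5))             # tensor(0.3000)
print(logsumexp_neg_loss(neg_pair, base=0.5, alpha=2.0))
```

Is the second form what you intended throughout, rather than the linear one?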
Thanks :D