L2 Norm Test #2

Open

avn3r-dn opened this issue Oct 2, 2019 · 0 comments

avn3r-dn commented Oct 2, 2019

Hello, thanks for your work.

I see that on the test data you apply an L2 norm to the features. Is that correct? Wouldn't that throw off the threshold value, since it was learned in an unnormalized squared-Euclidean (d^2) space rather than in a normalized cosine space?
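For concreteness, here is a minimal sketch of the mismatch I mean (hypothetical PyTorch example; the shapes and values are made up):

```python
import torch
import torch.nn.functional as F

feats = torch.randn(4, 128)                    # raw (unnormalized) embeddings
d2_raw = torch.cdist(feats, feats) ** 2        # squared Euclidean, unbounded scale

feats_n = F.normalize(feats, p=2, dim=1)       # L2-normalized (unit-length) embeddings
d2_norm = torch.cdist(feats_n, feats_n) ** 2   # now 2 - 2*cos_sim, bounded in [0, 4]

# A threshold tuned on d2_raw lives on a different scale than d2_norm,
# so applying it after normalization seems like it would be off.
print(d2_raw.max().item(), d2_norm.max().item())
```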

Also, I was trying to implement the BallClustering loss myself. Is it safe to assume it is similar to CenterLoss, but instead of the loss being the distance to the centroid, we use the two squared-Euclidean (d^2) constraints to compute the loss? As parameters for the loss I have the centroids (num_classes, num_features) and b_hat (a scalar). It doesn't produce better results than Softmax + Triplet on my datasets. Can you help me figure out what's missing, or share your BallClustering loss so I can compare it against my implementation? A sketch of my attempt is below.
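This is roughly what I have so far (my own sketch, not your code; I'm assuming Lsim pulls each feature inside a ball of radius b_hat around its class centroid, and Ldis pushes centroids of different classes at least 2 * b_hat apart):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BallClusteringLoss(nn.Module):
    """My attempt at the BallClustering loss (my assumptions, not the paper's code)."""

    def __init__(self, num_classes, num_features, alpha=1.0):
        super().__init__()
        self.centroids = nn.Parameter(torch.randn(num_classes, num_features))
        self.b_hat = nn.Parameter(torch.tensor(1.0))  # learned ball radius
        self.alpha = alpha

    def forward(self, feats, labels):
        # Lsim: hinge on the squared distance of each feature to its own
        # centroid, penalizing anything outside a ball of radius b_hat.
        own = self.centroids[labels]                      # (B, D)
        d2_own = ((feats - own) ** 2).sum(dim=1)          # (B,)
        l_sim = F.relu(d2_own - self.b_hat ** 2).mean()

        # Ldis: hinge on the squared distances between distinct centroids,
        # penalizing any pair closer than 2 * b_hat.
        d2_cen = torch.cdist(self.centroids, self.centroids) ** 2  # (C, C)
        off_diag = ~torch.eye(d2_cen.size(0), dtype=torch.bool, device=d2_cen.device)
        l_dis = F.relu((2 * self.b_hat) ** 2 - d2_cen[off_diag]).mean()

        return self.alpha * l_sim + l_dis
```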

In the paper you mention the loss is simply L = alpha * Lsim + Ldis, with no mention of Softmax. Is it safe to assume you combine both, Softmax + BallClustering, into a single loss?
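In other words, is the total objective something like the following (reusing my sketch above; the linear head and dummy tensors are just for illustration)?

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

feats = torch.randn(32, 128)               # dummy embeddings from the backbone
labels = torch.randint(0, 10, (32,))       # dummy class labels
head = nn.Linear(128, 10)                  # hypothetical softmax classification head

ball = BallClusteringLoss(num_classes=10, num_features=128, alpha=1.0)
total_loss = F.cross_entropy(head(feats), labels) + ball(feats, labels)
total_loss.backward()
```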

Thanks.
