
What do 'low_order' and 'high_order' represent? #5

Open
lygsbw opened this issue Jul 16, 2021 · 3 comments

Comments

@lygsbw

lygsbw commented Jul 16, 2021

Hello, I would like to know what 'low_order' and 'high_order' represent in the 'similarity' function in models.py, and how 'high_k' should be set.

I also wonder which part of the code implements the L_cos term from the paper.

Thank you very much!

@ChengyueGongR
Owner

ChengyueGongR commented Jul 16, 2021

Hi,
The 'low_order' and 'high_order' in 'sim_loss' represent multi-scale representations obtained with different pooling scales. Currently, we do not include the cosine similarity calculation; it can be implemented in a very simple way:
import torch.nn.functional as F

def abs_cos(x, y):
    # Mean absolute pairwise cosine similarity: rows of x and y are
    # L2-normalized, so x @ y^T yields all pairwise cosines.
    return (F.normalize(x) @ F.normalize(y).transpose(-2, -1)).abs().mean()
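
For concreteness, a usage sketch: the shapes, the pooling scales, and the average-pooling step below are illustrative assumptions, not the repository's actual 'sim_loss' code.

import torch
import torch.nn.functional as F

# Hypothetical patch features: (batch, num_patches, dim).
h = torch.randn(8, 196, 384)

# Multi-scale summaries via average pooling over the patch axis,
# using two made-up pooling scales.
low_order = F.avg_pool1d(h.transpose(1, 2), kernel_size=2).transpose(1, 2)
high_order = F.avg_pool1d(h.transpose(1, 2), kernel_size=4).transpose(1, 2)

# Mean absolute cosine similarity between the two scales.
loss = abs_cos(low_order.flatten(0, 1), high_order.flatten(0, 1))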

@xwan6266

Dear Authors,
Thanks for your nice work. I also have some questions about the similarity function part.
If I understand correctly, this function corresponds to the patch-wise contrastive loss described in your paper. However, the code only considers one positive pair against two negative pairs computed from multi-scale representations with different pooling scales, which differs from the loss function described in the paper (see the sketch after this comment).
My question is: is this because this kind of operation performs better in practice, or are there other reasons?
Many thanks.
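
For reference, a minimal sketch of what "one positive pair against two negative pairs" could look like, assuming an InfoNCE-style formulation; the function name, temperature, and structure are one reading's assumptions, not the repository's code.

import torch
import torch.nn.functional as F

def one_pos_two_neg_loss(anchor, positive, neg1, neg2, tau=0.1):
    # Hypothetical: cross-entropy over one positive and two negative
    # cosine-similarity logits, scaled by a made-up temperature tau.
    def sim(a, b):
        return F.cosine_similarity(a, b, dim=-1) / tau
    logits = torch.stack([sim(anchor, positive),
                          sim(anchor, neg1),
                          sim(anchor, neg2)], dim=1)
    targets = torch.zeros(anchor.size(0), dtype=torch.long)  # positive at index 0
    return F.cross_entropy(logits, targets)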

@hkhanuja

hkhanuja commented Apr 6, 2022

Dear Authors,
As @xwan6266 correctly pointed out, the similarity loss function mentioned in the paper is different from the one in your implementation. Is there some reason behind that? Can you please explain?
@ChengyueGongR
