
Why are the pretrained weights bigger than the finetuned weights? #106

Closed
eeyrw opened this issue Oct 6, 2022 · 2 comments

Comments


eeyrw commented Oct 6, 2022

I downloaded both the pretrained and the finetuned weights, but the pretrained checkpoint is about 3x larger than the finetuned one. What is the difference between them?
When I load the pretrained weights, the console prints:
reshape position embedding from 196 to 576.
Does this mean the two checkpoints correspond to different model configurations?
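
One way to see where the size difference comes from is to load both checkpoints and compare their parameter counts and keys. A minimal sketch, assuming standard PyTorch `.pth` checkpoints; the file names below are placeholders for the downloaded weights:

```python
import torch

# Load a checkpoint on CPU; the checkpoint is assumed to either be a
# state_dict directly or wrap one under a "model" key.
def load_state(path):
    ckpt = torch.load(path, map_location="cpu")
    return ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt

pretrained = load_state("model_base_pretrained.pth")  # placeholder file name
finetuned = load_state("model_base_finetuned.pth")    # placeholder file name

def num_params(state):
    return sum(v.numel() for v in state.values() if torch.is_tensor(v))

print("pretrained parameters:", num_params(pretrained))
print("finetuned  parameters:", num_params(finetuned))

# Keys present only in the pretrained checkpoint (e.g. momentum copies, queues)
extra_keys = sorted(set(pretrained) - set(finetuned))
print(len(extra_keys), "extra keys, e.g.:", extra_keys[:10])
```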

LiJunnan1992 (Contributor) commented:

  1. The pretrained weights contain additional parameters, such as the momentum model used for contrastive learning.
  2. Yes, the input resolution is different between pretraining and finetuning; the 196 → 576 message corresponds to the patch grid growing from 14×14 to 24×24 (see the sketch below).
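
The "reshape position embedding from 196 to 576" message refers to interpolating the ViT position embedding from a 14×14 patch grid to a 24×24 grid when the checkpoint is loaded at the larger finetuning resolution. A minimal sketch of that kind of interpolation (not the repo's exact code; the shapes and bicubic mode are assumptions):

```python
import torch
import torch.nn.functional as F

def interpolate_pos_embed(pos_embed, new_num_patches):
    # pos_embed: [1, 1 + old_num_patches, dim]; the first token is [CLS]
    cls_token, patch_pos = pos_embed[:, :1], pos_embed[:, 1:]
    dim = patch_pos.shape[-1]
    old_size = int(patch_pos.shape[1] ** 0.5)   # e.g. 14 for 196 patches
    new_size = int(new_num_patches ** 0.5)      # e.g. 24 for 576 patches
    # Treat the patch positions as a 2D grid and resize it bicubically.
    patch_pos = patch_pos.reshape(1, old_size, old_size, dim).permute(0, 3, 1, 2)
    patch_pos = F.interpolate(patch_pos, size=(new_size, new_size),
                              mode="bicubic", align_corners=False)
    patch_pos = patch_pos.permute(0, 2, 3, 1).reshape(1, new_size * new_size, dim)
    return torch.cat([cls_token, patch_pos], dim=1)

old = torch.randn(1, 1 + 196, 768)      # pretraining-size position embedding
new = interpolate_pos_embed(old, 576)   # resized for the finetuning input
print(new.shape)                        # torch.Size([1, 577, 768])
```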


eeyrw commented Oct 11, 2022

I see. Thank you.

eeyrw closed this as completed Oct 11, 2022