ITM Loss Stuck at 0.63 #200
Hi, I am trying to replicate and pretrain BLIP for distillation purposes. I am using Flickr30K + COCO, and my ITM loss gets stuck at 0.63; on an initial look, all of the ITM predictions are 1. Is this a dataset-size issue or a batch issue? I've tried a smaller learning rate, a larger model, and more, but nothing seems to work.

Comments
It seems to predict 0s for all of the ITM labels no matter what.
So what is the solution to this problem? I am encountering the same problem.
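One possible explanation, not confirmed in this thread: 0.63 is almost exactly the entropy of the label distribution the ITM head sees. In BLIP-style pretraining, each image contributes one matched pair and two hard-negative pairs, so labels are roughly 1/3 positive and 2/3 negative. A classifier that collapses and ignores its input minimizes cross-entropy by always predicting those marginals, landing at the label entropy rather than at ln(2) ≈ 0.693. A quick sanity check of that number:

```python
import math

# Assumed BLIP-style ITM batch composition: per image, 1 matched pair
# and 2 hard-negative pairs, i.e. ~1/3 positive labels.
pos_frac = 1.0 / 3.0

# A collapsed classifier that always outputs the label marginals attains
# a cross-entropy equal to the entropy of the label distribution.
collapsed_loss = -(pos_frac * math.log(pos_frac)
                   + (1 - pos_frac) * math.log(1 - pos_frac))
print(round(collapsed_loss, 4))  # ≈ 0.6365 — matches the stuck value
```

If the loss plateaus at exactly this value, the ITM head has likely collapsed to one class, which is consistent with both reports above (all-1 or all-0 predictions); the usual suspects are the hard-negative mining not working (negatives too easy or contrastive similarities uninformative early on) rather than dataset size.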
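To confirm collapse directly during training, one can log what fraction of pairs the ITM head calls "matched" in each batch. A minimal sketch, assuming two-way logits of shape [batch, 2] taken from the model's ITM head (the function and inputs here are hypothetical, not from the BLIP codebase):

```python
# Collapse check: report the fraction of pairs predicted as class 1
# (matched). Healthy training should hover near the positive-label
# fraction (~1/3 in a BLIP-style batch), not at 0.0 or 1.0.
def itm_prediction_balance(logits):
    """logits: list of (neg_logit, pos_logit) pairs for one batch."""
    predicted_matched = sum(1 for neg, pos in logits if pos > neg)
    return predicted_matched / len(logits)

# Example: a collapsed head that favors class 0 for every pair.
collapsed = [(2.1, -0.3), (1.7, 0.2), (0.9, -1.1)]
print(itm_prediction_balance(collapsed))  # 0.0 — every pair predicted unmatched
```

If this fraction sits at 0.0 or 1.0 from early in training, the problem is the ITM head's inputs (e.g. hard-negative sampling), not the learning rate or model size.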