
Model name in demo #199

Closed

chandrabhuma opened this issue May 11, 2024 · 6 comments

Comments

@chandrabhuma

I have used the demo as given:

Demo

from vlmeval.config import supported_VLM
model = supported_VLM['idefics_9b_instruct']()

Forward Single Image

ret = model.generate(['assets/apple.jpg', 'What is in this image?'])
print(ret) # The image features a red apple with a leaf on it.

Forward Multiple Images

ret = model.generate(['assets/apple.jpg', 'assets/apple.jpg', 'How many apples are there in the provided images? '])
print(ret) # There are two apples in the provided images.
It is working fine. But when I change the model to one other than idefics, it gives a KeyError. Could you please clarify?

@kennymckormick
Member

Hi @chandrabhuma,
Could you please tell me which model you were using when you encountered the error?

@chandrabhuma
Author

from vlmeval.config import supported_VLM
model = supported_VLM['deepseek-ai/deepseek-vl-7b-chat']()

KeyError                                  Traceback (most recent call last)
in <cell line: 3>()
      1 # Demo
      2 from vlmeval.config import supported_VLM
----> 3 model = supported_VLM['deepseek-ai/deepseek-vl-7b-chat']()
      4 # Forward Single Image
      5 ret = model.generate(['/content/fire.jpg', 'What is in this image?'])

KeyError: 'deepseek-ai/deepseek-vl-7b-chat'
This happens not only for deepseek but for several other models as well. Should I change the path of the model?
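For context, the KeyError above is ordinary dict-lookup behavior: a registry keyed by short config names will reject a HuggingFace repo path. A minimal sketch of the failure mode, using a stand-in dict (the keys and constructor below are illustrative, not the real contents of vlmeval's supported_VLM):

```python
from functools import partial

# Stand-in for a registry like vlmeval's supported_VLM: a dict mapping
# short config names to model constructors. The key is illustrative.
def build_model(name):
    return f"<model {name}>"

supported_VLM = {
    "idefics_9b_instruct": partial(build_model, "idefics_9b_instruct"),
}

# A HuggingFace repo path is not a registry key, so lookup raises KeyError.
try:
    supported_VLM["deepseek-ai/deepseek-vl-7b-chat"]()
except KeyError as err:
    print("KeyError:", err)

# A registered short name succeeds.
model = supported_VLM["idefics_9b_instruct"]()
print(model)
```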

@chandrabhuma
Author

Thank you for the prompt response

@chandrabhuma
Author

I have used the same line as given in the demo:
model = supported_VLM['deepseek-ai/deepseek-vl-7b-chat']()

@kennymckormick
Member

Hi @chandrabhuma,
That's not the correct usage. You should pass a model_name that is defined in vlmeval/config.py, not a HuggingFace repo path.
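A friendlier variant of this lookup, sketched under the assumption that supported_VLM is an ordinary dict (the keys below are stand-ins, not the real entries in vlmeval/config.py), is to suggest the nearest registered name on a miss:

```python
import difflib

# Stand-in registry; the real model names live in vlmeval/config.py.
supported_VLM = {"idefics_9b_instruct": None, "deepseek_vl_7b": None}

def resolve_model_name(name):
    """Return `name` if registered; otherwise raise KeyError with a hint."""
    if name in supported_VLM:
        return name
    close = difflib.get_close_matches(name, list(supported_VLM), n=1, cutoff=0.3)
    hint = f"; did you mean {close[0]!r}?" if close else ""
    raise KeyError(f"{name!r} is not a registered model name{hint}")

print(resolve_model_name("idefics_9b_instruct"))
```

This turns an opaque KeyError into an actionable message without changing how valid names behave.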

@chandrabhuma
Author

They are working now. I am checking all the models. Thanks for the quick response and for correcting me.
