Keep getting error: 'VLLM' object has no attribute 'AUTO_MODEL_CLASS' #1953
Comments
You can check whether it is fixed in the latest state of the main branch. There have been many fixes, changes, and improvements since the release of 0.4.2.
Hi, this is fixed on the `main` branch.
Hi, could you provide me with the steps to solve this? I have tried updating the repo a couple of times (I am on main), but I keep getting this error. @haileyschoelkopf
As a temporary workaround, you can skip the condition causing the error by picking your model type yourself in `lm_eval/api/model.py`, inside `def _encode_pair(self, context, continuation):` (around line 299), i.e. setting the `AUTO_MODEL_CLASS` branch manually. In my case, for Llama 3 70B, I changed `if self.AUTO_MODEL_CLASS == transformers.AutoModelForCausalLM:` to `if True` and it worked.
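A self-contained sketch of what that workaround amounts to. Only the `if True` change (replacing the `self.AUTO_MODEL_CLASS == transformers.AutoModelForCausalLM` check) comes from the comment above; the function shape, tokenizer callable, and whitespace handling here are approximations for illustration and may not match the file in your checkout line for line.

```python
from typing import Callable, List, Tuple


def encode_pair_patched(
    tok_encode: Callable[[str], List[int]], context: str, continuation: str
) -> Tuple[List[int], List[int]]:
    # Move trailing whitespace from the context onto the continuation so the
    # pair tokenizes consistently.
    n_spaces = len(context) - len(context.rstrip())
    if n_spaces > 0:
        continuation = context[-n_spaces:] + continuation
        context = context[:-n_spaces]

    # Workaround: unconditionally take the decoder-only (causal-LM) branch
    # instead of reading self.AUTO_MODEL_CLASS, which the vLLM wrapper never
    # sets and which therefore raises the AttributeError in the title.
    if True:
        whole_enc = tok_encode(context + continuation)
        context_enc = tok_encode(context)
        continuation_enc = whole_enc[len(context_enc):]

    return context_enc, continuation_enc
```

This only makes sense for decoder-only models (such as Llama 3 70B); for a seq2seq model you would want the other branch of the original condition instead.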
@malhajar17 have you uninstalled and reinstalled after pulling the latest commits from main?
Having the same issue here. When is the 0.4.3 version scheduled for release?
Yeah, I also fell into this hole. Yes, of course. Where is 0.4.3? Eagerly waiting for it.
v0.4.3 is now released on PyPI!
Hello, I tried to run an eval locally with vLLM, but the same error occurs.
version
As implied, the error occurred during model initialization, where the VLLM wrapper has no attribute 'AUTO_MODEL_CLASS'.
Do I need to set it manually as an argument?
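For context, roughly the kind of local vLLM evaluation being described, via the harness's Python API. The checkpoint and task names are placeholders, and this assumes lm-evaluation-harness was installed with the vllm extra; on v0.4.2 a loglikelihood task like this reaches the `_encode_pair` code path and hits the AttributeError in the title.

```python
# Hypothetical minimal repro; model and task are placeholders.
import lm_eval

results = lm_eval.simple_evaluate(
    model="vllm",
    model_args="pretrained=meta-llama/Meta-Llama-3-70B-Instruct,dtype=auto",
    tasks=["hellaswag"],
)
print(results["results"])
```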