It seems like the `EleutherAI/pythia-800m` tokenizer includes `token_type_ids` values, but these lead to a `ValueError` when evaluating the following code. Here is the stack trace:
```
Traceback (most recent call last):
  File "eval.py", line 76, in <module>
    outputs = model.generate(**inputs, temperature=0.0, max_new_tokens=40)
  File "/om2/user/ericjm/miniconda3/envs/phase-changes/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/om2/user/ericjm/miniconda3/envs/phase-changes/lib/python3.8/site-packages/transformers/generation/utils.py", line 1296, in generate
    self._validate_model_kwargs(model_kwargs.copy())
  File "/om2/user/ericjm/miniconda3/envs/phase-changes/lib/python3.8/site-packages/transformers/generation/utils.py", line 993, in _validate_model_kwargs
    raise ValueError(
ValueError: The following `model_kwargs` are not used by the model: ['token_type_ids'] (note: typos in the generate arguments will also show up in this list)
```
I can get around this error by simply using a tokenizer from another one of the models. This tokenizer, for instance, works:
It seems like the tokenizers are the same for all the models, so this issue is pretty easy to get around, but I just thought I'd report it.
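Another way around it, without swapping tokenizers, is to drop the offending key from the tokenizer output before calling `generate`. A minimal sketch of that idea; the helper name and the stand-in `inputs` dict below are illustrative, not taken from the original report:

```python
# Workaround sketch: remove keys that model.generate() does not accept.
# `strip_unused_kwargs` is a hypothetical helper, and the dict below is a
# stand-in for what tokenizer(prompt, return_tensors="pt") would return.
def strip_unused_kwargs(inputs, unused=("token_type_ids",)):
    """Return a copy of `inputs` without the keys in `unused`."""
    return {k: v for k, v in inputs.items() if k not in unused}

inputs = {
    "input_ids": [[0, 1, 2]],
    "attention_mask": [[1, 1, 1]],
    "token_type_ids": [[0, 0, 0]],  # the key that triggers the ValueError
}
clean = strip_unused_kwargs(inputs)
# The cleaned dict can then be passed along, e.g.:
# outputs = model.generate(**clean, temperature=0.0, max_new_tokens=40)
```

Equivalently, `inputs.pop("token_type_ids", None)` on the tokenizer output achieves the same thing in place.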