
TypeError: transformers.generation.utils.GenerationMixin.generate() argument after ** must be a mapping, not Tensor #178

Open
LSK-1 opened this issue Jun 21, 2024 · 10 comments

Comments


LSK-1 commented Jun 21, 2024

No description provided.
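No reproduction was given, but the traceback in the title comes down to a plain Python rule: `model.generate(**x)` requires `x` to be a mapping (such as the `BatchEncoding` a tokenizer returns), not a bare `torch.Tensor`. A minimal sketch of the failure, using a list as a stand-in for a tensor so it runs without torch:

```python
# Illustration only (not the GLM-4 demo code): ** unpacking in a call
# requires a mapping, which is exactly what the TypeError complains about.
def generate(**kwargs):
    return kwargs

inputs_as_dict = {"input_ids": [1, 2, 3]}  # a mapping: unpacks fine
generate(**inputs_as_dict)

inputs_as_tensor = [1, 2, 3]               # stand-in for a torch.Tensor
try:
    generate(**inputs_as_tensor)           # not a mapping: TypeError
except TypeError as e:
    print(e)
```

So the bug is usually that the tokenizer call produced a raw tensor where the calling code expected a dict of tensors to unpack.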


hongmin118 commented Jun 21, 2024

I get the same error (GLM-4-9B-chat FastApi):

[screenshot of the error]

@KMnO4-zx
Contributor

Please provide more information about the error, for example:

  • What environment: Windows or Linux?
  • What are the fastapi request parameters?


freecow commented Jun 22, 2024

Getting the same error. The environment is the AutoDL image self-llm/GLM-4 recommended on GitHub; the fastapi call is:

```shell
curl -X POST "http://127.0.0.1:6006" \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "你好", "history": []}'
```


hkxxxxx commented Jun 23, 2024

+1, same error, GLM-4-9B-Chat Fastapi:

[screenshot of the error]

@AXYZdong
Contributor

It may be caused by an old transformers version; upgrading transformers should fix it:

`pip install transformers==4.41.2`
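If you are unsure which version a prebuilt image ships, you can compare it against the 4.41.2 suggested above before reinstalling. A small illustrative helper (in a real script you would pass in `transformers.__version__` after `import transformers`; the helper names are mine, not from the repo):

```python
# Illustrative version check against the 4.41.2 threshold suggested above.
def version_tuple(v: str) -> tuple:
    """Parse a simple dotted version string like '4.41.2' into ints."""
    return tuple(int(part) for part in v.split("."))

def new_enough(installed: str, required: str = "4.41.2") -> bool:
    return version_tuple(installed) >= version_tuple(required)

print(new_enough("4.40.0"))  # → False, needs the upgrade
print(new_enough("4.41.2"))  # → True
```

Note this naive parser only handles plain `x.y.z` strings, not dev or release-candidate suffixes.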

@AXYZdong
Contributor

> max_new_tokens (=256)

The maximum number of tokens is whatever you set it to, so an incomplete answer is expected~

@AXYZdong
Contributor

> > It may be caused by an old transformers version; upgrading transformers should fix it:
> >
> > `pip install transformers==4.41.2`
>
> In my environment transformers is already this version, 4.41.2 (I used the community image on AutoDL directly), but I still get the warning: `Both max_new_tokens (=256) and max_length(=272) seem to have been set. max_new_tokens will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)` and the answer is not generated in full: [screenshot]

For fastapi it should no longer raise the error.

@AXYZdong
Contributor

> > max_new_tokens (=256)
> >
> > The maximum number of tokens is whatever you set it to, so an incomplete answer is expected~
>
> Where do I set this? Please point me in the right direction.

See THUDM/ChatGLM3#1215.
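The warning quoted above just means both limits were set and `max_new_tokens` wins. Roughly, the effective stopping point works out as below; this is a simplified sketch of the precedence rule described in the warning, not the actual transformers code:

```python
# Simplified sketch of the precedence in the warning: when both limits
# are set, max_new_tokens takes precedence, and the effective total
# length becomes input_length + max_new_tokens.
def effective_max_length(input_length, max_length=None, max_new_tokens=None):
    if max_new_tokens is not None:
        return input_length + max_new_tokens
    return max_length

# The numbers from the warning are consistent with a 16-token input:
# 16 + max_new_tokens(=256) gives the max_length(=272) it reports.
print(effective_max_length(16, max_length=272, max_new_tokens=256))  # → 272
```

So to get a longer, complete answer, raise `max_new_tokens` in the generate call rather than `max_length`.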


hkxxxxx commented Jun 23, 2024

> > > It may be caused by an old transformers version; upgrading transformers should fix it:
> > >
> > > `pip install transformers==4.41.2`
> >
> > In my environment transformers is already this version, 4.41.2 (I used the community image on AutoDL directly), but I still get the warning about `max_new_tokens (=256)` and `max_length(=272)` both being set, and the answer is not generated in full: [screenshot]
>
> For fastapi it should no longer raise the error.

Yes, I reinstalled transformers==4.41.2 in my fastapi project and it now runs successfully. Thanks!

@hongmin118

`pip install transformers==4.41.2` worked for me too, thanks!
