
Error when running the model #23

Open
wzf2020 opened this issue Jun 7, 2024 · 2 comments

Comments


wzf2020 commented Jun 7, 2024

Scenario: running ./run.sh --model llama2-7b --arch soc produces two errors.

Error 1:
The script complains that the library chat.cpython-310-x86_64-linux-gnu.so is missing.
Fix: the board is not x86, so the aarch64 build of the library is needed. Edit run_demo.sh to use the library name that your board's build actually produced.
Suggestion: update run_demo.sh, or add a note to the FAQ telling users to adjust the library name to match their own build output.
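One way to avoid hard-coding the library name (a hypothetical sketch, not something already in the repo's scripts) is to ask the running Python interpreter for its extension-module suffix, which already encodes the Python version and CPU architecture:

```shell
# Hypothetical sketch: derive the expected chat library name from the
# running interpreter instead of hard-coding the x86_64 name.
# On an aarch64 board with Python 3.8 this prints something like
# chat.cpython-38-aarch64-linux-gnu.so; on an x86_64 PC, the x86_64 variant.
SUFFIX=$(python3 -c 'import sysconfig; print(sysconfig.get_config_var("EXT_SUFFIX"))')
echo "expected library: chat${SUFFIX}"
```

A run_demo.sh that loaded `chat${SUFFIX}` instead of a fixed filename would work unchanged on both the PC and the board.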

Error 2: the following error is raised:

Traceback (most recent call last):
  File "python_demo/pipeline.py", line 216, in <module>
    main(args)
  File "python_demo/pipeline.py", line 197, in main
    model = Llama2(args)
  File "python_demo/pipeline.py", line 14, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/home/linaro/.local/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 676, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.

Fix: upgrade the Transformers library by running pip3 install transformers --upgrade.
Suggestion: mention this in the FAQ so users get a heads-up.
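Before launching pipeline.py, a quick pre-flight check can confirm whether the installed transformers actually exposes LlamaTokenizer (the error above means it does not). This is an illustrative sketch, not part of the demo:

```shell
# Hypothetical pre-flight check: verify that the installed transformers
# package exposes LlamaTokenizer before running the demo. Older versions
# of transformers do not ship this tokenizer class.
python3 - <<'EOF'
try:
    from transformers import LlamaTokenizer  # absent on old versions
    print("LlamaTokenizer available")
except Exception:
    print("LlamaTokenizer missing - run: pip3 install --upgrade transformers")
EOF
```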


ctrlcplusv commented Jun 20, 2024

Hi, I changed it to aarch64, but the chat.gnu.so library still won't build. What is going wrong?

chuxiaoyi2023 (Collaborator) commented

That is a strange problem; it looks like a linking error. Was lib_soc downloaded correctly?

See here for how to download libsophon:
https://github.com/sophgo/LLM-TPU/tree/main/models/Qwen/demo_parallel

After that, try export LD_LIBRARY_PATH=xxx and run again.
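For reference, the export usually looks like the sketch below. The path is an assumption based on libsophon's common install prefix; substitute wherever libsophon actually landed on your board:

```shell
# Hypothetical sketch: make the dynamic loader find the libsophon runtime.
# /opt/sophon/libsophon-current/lib is a common install prefix, but
# adjust it to wherever libsophon was actually unpacked on your board.
export LD_LIBRARY_PATH=/opt/sophon/libsophon-current/lib:${LD_LIBRARY_PATH}
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH}"
```

If chat.so still fails to build or load afterwards, running `ldd` on it will show which shared libraries remain unresolved.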
