Error when running the model #23
Comments
A rather strange problem; it looks like a linking error. Was lib_soc not downloaded properly? Refer to the instructions here for downloading libsophon, then export LD_LIBRARY_PATH=xxx
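A minimal sketch of the suggested environment fix, assuming a hypothetical libsophon install location (the `/opt/sophon/...` path below is an illustration; substitute the directory where your libsophon libraries actually landed):

```shell
# Hypothetical libsophon library directory -- adjust to your actual install path.
LIBSOPHON_DIR=/opt/sophon/libsophon-current/lib
# Prepend it so the dynamic linker can resolve libsophon symbols at load time.
export LD_LIBRARY_PATH="$LIBSOPHON_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

Running `ldd` on the compiled `chat.*.so` afterwards should show the libsophon dependencies resolving instead of "not found".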
Scenario: two errors occur when running ./run.sh --model llama2-7b --arch soc
Error 1:
The error says the library chat.cpython-310-x86_64-linux-gnu.so is missing.
Fix: the board is not x86, so the aarch64 library is needed; edit the library name in the run_demo.sh script to match what your board's build actually produced.
Suggestion: update the run_demo.sh script, or add a note in the FAQ telling users to adjust the library name to match their actual build output.
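The compiled extension's filename encodes the local Python version and CPU architecture, which is why an x86_64-named `.so` cannot load on an aarch64 board. A small sketch showing how the expected name can be derived on the target machine itself (rather than hard-coded in the script):

```python
import platform
import sysconfig

# The interpreter's own extension suffix already encodes the Python
# version and platform, e.g. ".cpython-310-x86_64-linux-gnu.so" on an
# x86_64 host or ".cpython-38-aarch64-linux-gnu.so" on an aarch64 board.
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")

# The name the "chat" extension must have to be importable here.
expected = "chat" + ext_suffix

print(platform.machine(), expected)
```

A run_demo.sh that queried Python this way instead of hard-coding the x86_64 name would work unchanged on both host and SoC builds.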
Error 2: the following error is reported
```
Traceback (most recent call last):
  File "python_demo/pipeline.py", line 216, in <module>
    main(args)
  File "python_demo/pipeline.py", line 197, in main
    model = Llama2(args)
  File "python_demo/pipeline.py", line 14, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/home/linaro/.local/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 676, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.
```
Fix: update the Transformers library by running pip3 install transformers --upgrade.
Suggestion: document this in the FAQ so users get a heads-up.
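A minimal pre-flight check that a FAQ entry could point to, assuming the root cause is simply an outdated transformers release (LlamaTokenizer shipped upstream around v4.28, so older installs raise exactly this ValueError). The helper name `has_llama_tokenizer` is illustrative, not part of the project:

```python
import importlib.util


def has_llama_tokenizer() -> bool:
    """Return True if the installed transformers exposes LlamaTokenizer."""
    if importlib.util.find_spec("transformers") is None:
        return False  # transformers not installed at all
    import transformers
    return hasattr(transformers, "LlamaTokenizer")


if __name__ == "__main__":
    if has_llama_tokenizer():
        print("transformers is new enough")
    else:
        print("upgrade needed: pip3 install transformers --upgrade")
```

Running this before pipeline.py would turn the opaque AutoTokenizer traceback into an actionable message.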