[DIPU] Setting DIPU_PYTHON_DEVICE_AS_CUDA=false on Huawei hardware raises `module 'torch' has no attribute 'xpu'` #804
Labels: DIPU (DIPU related)
Comments
For now, just leave the environment variable alone and it will run; don't set this variable unless you actually need it. That said, although a normal run doesn't require changing it, this feature is indeed broken at the moment. @fandaoyi
Background: running llama2 inference with DIPU on a Huawei Ascend 910B. When importing `torch_dipu`, if I set

export DIPU_PYTHON_DEVICE_AS_CUDA=false

inference fails with the error above. After restoring

export DIPU_PYTHON_DEVICE_AS_CUDA=true

the error disappears. The detailed error output is as follows: