Inference device problem #58
Hi, thanks for your great work.

I tried running inference on an image with the script `inference_on_an_image.py`, and it works: some boxes are detected. But when I ran it again on the same image after replacing lines like

`device = "cuda" if not cpu_only else "cpu"`

with

`device = "cuda:1" if not cpu_only else "cpu"`

to run on a different GPU, the script still runs but no boxes are detected. It works fine with `device = "cuda:0"`, though. What could be the issue?

P/S: I notice that with `device = "cuda:1"`, both GPU 0 and GPU 1 on my machine show some VRAM increase.

Here is the result with `device = "cuda"` or `device = "cuda:0"`:

(screenshot: boxes detected)

And with `device = "cuda:1"`:

(screenshot: no boxes detected)
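A sketch of one common workaround (an assumption on my part, not something confirmed by the maintainers): instead of hard-coding `cuda:1` in the script, expose only the desired physical GPU through `CUDA_VISIBLE_DEVICES` before CUDA is initialized, so the unmodified `device = "cuda"` line resolves to that GPU. This sidesteps the situation where the model sits on `cuda:1` while some tensors are still created on the default device `cuda:0`, which would be consistent with the VRAM increase observed on both GPUs.

```python
import os

# Expose only physical GPU 1 to this process. This must happen before
# torch initializes CUDA, i.e. before any CUDA call is made.
# (Workaround sketch, assuming a two-GPU machine as described above.)
os.environ["CUDA_VISIBLE_DEVICES"] = "1"

import torch

# The single visible GPU now maps to "cuda" (device index 0), so the
# script's original device line can stay unchanged and still run on
# physical GPU 1.
cpu_only = not torch.cuda.is_available()
device = "cuda" if not cpu_only else "cpu"
print(device)
```

Equivalently, the variable can be set on the command line when launching the script, e.g. `CUDA_VISIBLE_DEVICES=1 python inference_on_an_image.py ...`, leaving the source untouched.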