
Inference device problem #58

Open
chau25102001 opened this issue Feb 16, 2024 · 0 comments
Hi, thanks for your great work.
I tried running inference on an image with the script inference_on_an_image.py, and it works fine: some boxes are detected. But when I ran it again on the same image after changing lines like device = "cuda" if not cpu_only else "cpu" to device = "cuda:1" if not cpu_only else "cpu" to run on a different GPU, the code still runs but no boxes are detected. The code works fine with device = "cuda:0", though. What could be the issue?

P.S.: I noticed that when device = "cuda:1", both GPU 0 and GPU 1 on my machine show some VRAM increase.
[screenshot: GPU memory usage]

Here is the result when device = "cuda" or device = "cuda:0":
[screenshot: detection result]

And when device = "cuda:1":
[screenshot: detection result]
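One common cause of this symptom (an assumption on my part, not confirmed in this thread) is that part of the pipeline implicitly allocates tensors on the default CUDA device (cuda:0) while the model weights sit on cuda:1, which would also explain the VRAM increase on both GPUs. A minimal workaround sketch, assuming a two-GPU machine: hide GPU 0 from the process with CUDA_VISIBLE_DEVICES before torch is imported, so the unmodified device = "cuda" resolves to physical GPU 1.

```python
import os

# Workaround sketch (assumes a 2-GPU machine): expose only physical GPU 1
# to this process. This must run before torch is first imported, because
# the CUDA runtime reads the visible-device list at initialization.
os.environ["CUDA_VISIBLE_DEVICES"] = "1"

# import torch
# # Inside this process, physical GPU 1 is now "cuda:0", so the original
# # script line can be left unchanged:
# device = "cuda" if not cpu_only else "cpu"
```

Equivalently, launching the unmodified script with `CUDA_VISIBLE_DEVICES=1 python inference_on_an_image.py ...` avoids editing the code at all.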
