Inferencing queries #1557

Open
SoumyajitMukherjee-droid opened this issue Aug 10, 2022 · 1 comment
Labels: question (Further information is requested)

Comments

@SoumyajitMukherjee-droid

1. Can the model run inference on CPU, or only on GPU?
2. Can it run inference on high-resolution images? If not, what is the maximum image resolution it can handle?
3. Can it perform 3D pose estimation?
4. What is the inference speed in FPS?

@liqikai9 (Collaborator)

1 & 4. The model can run inference on both CPU and GPU. For inference speed (FPS) benchmarks across models, see: https://mmpose.readthedocs.io/en/latest/inference_speed_summary.html (a minimal device-selection sketch follows this list).

  2. For a high-resolution image, first detect the person instances in the image, then use the detected bounding boxes to crop the persons and feed the crops to the top-down model (see the second sketch below).

  3. Yes, MMPose has models for 3D pose estimation; you can find them here: https://github.com/open-mmlab/mmpose/tree/master/configs/body and here: https://github.com/open-mmlab/mmpose/tree/master/configs/hand
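
A minimal sketch of device selection with the MMPose 0.x Python API, assuming a top-down config and checkpoint from the model zoo (the config, checkpoint, and image paths below are placeholders, not required values):

```python
# Minimal sketch, assuming MMPose 0.x; config/checkpoint/image paths are placeholders.
from mmpose.apis import init_pose_model, inference_top_down_pose_model

pose_config = 'configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192.py'
pose_checkpoint = 'hrnet_w32_coco_256x192.pth'  # placeholder: any matching model-zoo checkpoint

# device='cpu' runs inference on the CPU; use 'cuda:0' to run on the first GPU.
pose_model = init_pose_model(pose_config, pose_checkpoint, device='cpu')

# With person_results=None, the whole image is treated as a single bounding box.
pose_results, _ = inference_top_down_pose_model(
    pose_model, 'demo.jpg', person_results=None, format='xyxy')
print(pose_results[0]['keypoints'].shape)  # (num_keypoints, 3): x, y, score
```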
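
And a sketch of the detect-then-crop workflow for high-resolution images, assuming MMDetection is installed alongside MMPose 0.x and a bbox-only COCO detector is used (again, all paths are placeholders):

```python
# Minimal sketch, assuming MMPose 0.x + MMDetection; all paths below are placeholders.
from mmdet.apis import init_detector, inference_detector
from mmpose.apis import init_pose_model, inference_top_down_pose_model

det_model = init_detector('detector_config.py', 'detector.pth', device='cuda:0')
pose_model = init_pose_model('topdown_config.py', 'topdown.pth', device='cuda:0')

img = 'high_res_image.jpg'

# 1) Detect person instances on the full-resolution image.
#    For a bbox-only COCO detector, the result is a per-class list of (n, 5) arrays.
det_results = inference_detector(det_model, img)
person_bboxes = det_results[0]  # class index 0 is 'person' in COCO

# 2) Pass the boxes to the top-down model. It crops each box from the original image
#    and resizes the crop to the network input size (e.g. 256x192), so the full image
#    never needs to fit the pose model's input resolution.
person_results = [{'bbox': bbox} for bbox in person_bboxes]
pose_results, _ = inference_top_down_pose_model(
    pose_model, img, person_results, bbox_thr=0.3, format='xyxy')
```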

@jin-s13 added the question label (Further information is requested) on Aug 18, 2022