
can you public eval code?? #29

Open · liwenssss opened this issue Mar 17, 2022 · 13 comments

@liwenssss

Hi, I want to test my results on your CodaLab, but it seems something is wrong and I can't get the score. Could you release the evaluation code so that I can get my eval results?

@clashroyaleisgood

Hi, I ran into the same problem.

Maybe you can try the evaluation code provided by FreiHAND:

freihand/eval.py, https://github.com/lmb-freiburg/freihand/blob/master/eval.py

The evaluation ground truth has also been released (see the "Update" section of the README):

https://github.com/lmb-freiburg/freihand#evaluate-on-the-dataset

Note that the code needed some modifications in my case:

    # in the mesh/F-score part: build open3d point clouds from the vertex arrays
    # and compute the symmetric point-cloud distances between ground truth and prediction
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(verts)
    d1 = gt.compute_point_cloud_distance(pr)
    d2 = pr.compute_point_cloud_distance(gt)

    # in the HTML report part: read the image as bytes and decode the base64 payload to a string
    data_uri1 = base64.b64encode(open(img_path, 'rb').read())
    data_uri1 = data_uri1.decode("utf-8")  # byte string to string, b'123' to '123'

If you have an import problem with from utils.fh_utils import *, you can try this:

    import sys
    sys.path.append('..')

@liumc14

liumc14 commented Nov 6, 2022

Can I have a look at your pred.py file? Thank you

@clashroyaleisgood

In fact, I didn't write a pred.py myself.
I use code (the model) adapted from https://github.com/SeanChenxy/HandMesh, with the MobRecon model.

Here is the snippet that dumps the results:
https://github.com/clashroyaleisgood/HandMesh/blob/8fbd3a89fa655e095fedcf2c29baec01a9e54666/mobrecon/runner.py#L349-L358
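For context, if I read pred.py correctly, the submission file is just two lists (per-frame 3D joints and vertices) dumped as JSON. A minimal sketch of such a dump helper, assuming xyz_pred_list and verts_pred_list hold one numpy array per evaluation frame:

    import json
    import numpy as np

    def dump_predictions(pred_out_path, xyz_pred_list, verts_pred_list):
        """Write predictions as [xyz_list, verts_list], the layout pred.py seems to expect."""
        xyz_pred_list = [np.asarray(x).tolist() for x in xyz_pred_list]      # (21, 3) per frame
        verts_pred_list = [np.asarray(v).tolist() for v in verts_pred_list]  # (778, 3) per frame
        with open(pred_out_path, 'w') as fo:
            json.dump([xyz_pred_list, verts_pred_list], fo)

    # dump_predictions('pred.json', xyz_pred_list, verts_pred_list)
    # then zip pred.json before uploading it to the CodaLab server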

@liumc14

liumc14 commented Nov 6, 2022

OK, thank you, but may I take the liberty of asking how to get the corresponding MANO parameters for the images in the test set?

@clashroyaleisgood

I'm not sure, but maybe it's in the official eval zip file, FreiHAND_pub_v2_eval.zip,
in the file evaluation_mano.json.

Updated:
Oh! I just found it:

    K, mano, xyz = db_data_anno[idx]

Maybe you can try this.
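In case it helps, db_data_anno in the FreiHAND sample code comes from utils.fh_utils. Roughly (from my reading of the repo, so treat this as a sketch rather than the exact implementation):

    import json
    import os

    def load_db_annotation(base_path, set_name='training'):
        """Roughly what utils.fh_utils.load_db_annotation does: zip K, mano and xyz annotations."""
        K_list = json.load(open(os.path.join(base_path, '%s_K.json' % set_name)))
        mano_list = json.load(open(os.path.join(base_path, '%s_mano.json' % set_name)))
        xyz_list = json.load(open(os.path.join(base_path, '%s_xyz.json' % set_name)))
        return list(zip(K_list, mano_list, xyz_list))

    db_data_anno = load_db_annotation('FreiHAND_pub_v2', 'training')
    K, mano, xyz = db_data_anno[0]  # camera intrinsics, MANO parameters, 21x3 joint coordinates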

@liumc14

liumc14 commented Nov 6, 2022

But that is the mano file in the training directory. The dataset we downloaded has this file for training, but only images plus K.json and scale.json are available in the evaluation directory. What bothers me is how to use these files to predict xyz.json. If you have time, please help me. Thank you.

@clashroyaleisgood

No no no! FreiHAND has already released the evaluation annotations on the official dataset website:
https://lmb.informatik.uni-freiburg.de/resources/datasets/FreihandDataset.en.html#:~:text=Download%20FreiHAND%20Dataset%20v2%20%2D%20Evaluation%20set%20with%20annotations%20(724MB)

The zip file contains _mano, _verts, _xyz, ... for the evaluation set.
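Once that zip (FreiHAND_pub_v2_eval.zip) is extracted, the annotations can be read like any other JSON. A small sketch, assuming the standard evaluation_* file names and folder layout:

    import json

    base = 'FreiHAND_pub_v2_eval'  # extracted evaluation zip (path is an assumption)
    with open(base + '/evaluation_xyz.json') as f:
        xyz_list = json.load(f)    # one 21x3 joint array per evaluation frame
    with open(base + '/evaluation_verts.json') as f:
        verts_list = json.load(f)  # one 778x3 MANO vertex array per frame
    with open(base + '/evaluation_mano.json') as f:
        mano_list = json.load(f)   # one MANO parameter vector per frame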

@liumc14

liumc14 commented Nov 6, 2022

But their pred.py requires you to write your own prediction code. Thank you for your advice. The official repo does provide the template, but I would like to ask how to predict xyz myself:
https://github.com/lmb-freiburg/freihand/blob/master/pred.py#:~:text=%23%20TODO%3A%20Put%20your%20algorithm%20here%2C%20which%20computes%20(metric)%203D%20joint%20coordinates%20and%203D%20vertex%20positions
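For reference, that TODO only asks you to fill in two arrays per image. A minimal sketch, where my_model and its output keys are placeholders for whatever network you use (they are not part of the FreiHAND repo):

    import numpy as np

    def predict_one(img, K, scale, my_model):
        """Fill pred.py's TODO: return metric 3D joints and MANO vertices for one image."""
        out = my_model(img, K)              # your own inference code (placeholder)
        xyz = np.asarray(out['joints3d'])   # (21, 3) joint coordinates, in meters
        verts = np.asarray(out['verts3d'])  # (778, 3) MANO vertices, in meters
        return xyz, verts

    # inside pred.py's loop over evaluation images:
    # xyz, verts = predict_one(img, K, scale, my_model)
    # xyz_pred_list.append(xyz); verts_pred_list.append(verts)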

@clashroyaleisgood

Sorry for not getting the point.
Do you mean how to predict xyz from a single RGB image?
Or how to combine your prediction algorithm with the code snippet you provided?

What HandMesh does is simply copy the code that converts the prediction results to JSON from pred.py into its own code,
so it never needs to call or edit pred.py's pred_template().

@liumc14

liumc14 commented Nov 6, 2022

My question is:
how do I predict xyz from a single RGB image?
If you have any suggestions, please explain them to me. Thank you.

@clashroyaleisgood

There is a lot of research on hand pose prediction,
all trying to make the prediction results more and more accurate.

@liumc14

liumc14 commented Nov 6, 2022

OK, thank you for your suggestion. May I ask: is the model.py in this project trained? Can pose_hand directly predict xyz and verts? What I don't understand is the mano parameter of pose_hand: is it a 61-dimensional MANO parameter? How is it obtained from an image that is to be predicted?
https://github.com/lmb-freiburg/freihand/blob/master/utils/model.py#:~:text=def%20pose_hand(mano%2C%20K%2C%20use_mean_pose%3DTrue)%3A

@clashroyaleisgood

I think this is getting far away from the original issue title, so this is my last reply.

"Is the model.py in this project trained?"

If this means "is the model in this project trained in Python?", the answer is yes, but I don't know which method was used to train this renderer.

"Can pose_hand directly predict xyz and verts?"

Sorry, I really can't understand this question...

"What is the mano parameter? Is it a 61-dimensional MANO parameter?"

You can look at https://github.com/hassony2/manopth
or other online resources to get a better understanding of what a MANO parameter is;
I don't fully understand it myself.
As far as I know, it's a 61-d feature containing 10 shape parameters, 48 rotation parameters, and 3 global_t parameters:
shape describes how fat (or thin) the hand is,
rotation holds the rotation angles of each joint (15*3), plus an additional (3) for the rotation of the whole hand,
and global_t is the wrist joint position in camera coordinates.
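Just to make the 10 + 48 + 3 split concrete, here is a rough, untested sketch using the manopth layer linked above (the mano_root path is an assumption, and I'm relying on manopth returning millimeter-scale outputs):

    import torch
    from manopth.manolayer import ManoLayer

    mano_layer = ManoLayer(mano_root='mano/models', use_pca=False)  # path to the MANO model files is an assumption
    shape    = torch.zeros(1, 10)  # 10 shape (beta) parameters
    pose     = torch.zeros(1, 48)  # 3 global-rotation + 45 joint-rotation parameters (15 joints * 3)
    global_t = torch.zeros(1, 3)   # wrist translation in camera coordinates

    verts, joints = mano_layer(pose, shape)  # manopth outputs are in millimeters
    verts  = verts / 1000.0 + global_t       # convert to meters, then move into the camera frame
    joints = joints / 1000.0 + global_t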

"How is it obtained from an image that is to be predicted?"

I guess it can only be regressed from ground-truth hand vertices.
