
[Feature]: Support print per class AP. #395

Merged
merged 2 commits from coco_eval into main on Jan 26, 2022

Conversation

RangiLyu (Owner)

Support print per class AP

resolve #317
resolve #250
resolve #39

Evaluate annotation type *bbox*
DONE (t=33.63s).
Accumulating evaluation results...
DONE (t=8.08s).
[NanoDet][01-26 20:17:31]INFO:
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.270
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.417
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.280
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.081
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.276
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.451
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.254
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.394
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.419
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.149
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.463
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.676
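The summary above is pycocotools' standard COCOeval output. The per-class numbers printed below can be derived from COCOeval's accumulated precision array. A minimal sketch of that extraction (my own helper, not NanoDet's actual implementation, assuming the standard `eval['precision']` layout of shape `[T, R, K, A, M]`):

```python
import numpy as np

def per_class_ap(precisions: np.ndarray, class_idx: int):
    """Extract (AP50, mAP) for one class from a COCOeval-style precision array.

    `precisions` has shape [T, R, K, A, M]:
      T = IoU thresholds (0.50:0.95), R = recall thresholds,
      K = classes, A = area ranges (index 0 = 'all'),
      M = maxDets settings (index -1 = 100).
    Entries of -1 mark (IoU, recall) points with no valid precision.
    """
    # mAP: average over all IoU thresholds at area='all', maxDets=100
    p = precisions[:, :, class_idx, 0, -1]
    p = p[p > -1]
    m_ap = float(p.mean()) if p.size else float("nan")

    # AP50: IoU-threshold index 0 corresponds to IoU=0.50
    p50 = precisions[0, :, class_idx, 0, -1]
    p50 = p50[p50 > -1]
    ap50 = float(p50.mean()) if p50.size else float("nan")
    return ap50, m_ap
```

With a real evaluation you would call this after `coco_eval.accumulate()`, passing `coco_eval.eval['precision']` and each class index in turn.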

[NanoDet][01-26 20:17:31]INFO:
| class         | AP50   | mAP   | class          | AP50   | mAP   |
|:--------------|:-------|:------|:---------------|:-------|:------|
| person        | 62.8   | 37.8  | bicycle        | 30.0   | 16.1  |
| car           | 38.5   | 21.3  | motorcycle     | 53.8   | 30.6  |
| airplane      | 76.2   | 56.1  | bus            | 64.0   | 52.1  |
| train         | 80.0   | 60.3  | truck          | 35.1   | 21.9  |
| boat          | 27.5   | 13.1  | traffic light  | 23.1   | 10.4  |
| fire hydrant  | 66.7   | 52.4  | stop sign      | 58.0   | 51.3  |
| parking meter | 49.1   | 31.6  | bench          | 23.3   | 14.1  |
| bird          | 32.4   | 19.4  | cat            | 80.6   | 58.0  |
| dog           | 65.2   | 48.2  | horse          | 60.9   | 39.5  |
| sheep         | 53.9   | 30.4  | cow            | 56.9   | 35.3  |
| elephant      | 75.8   | 51.6  | bear           | 80.0   | 61.8  |
| zebra         | 81.0   | 54.9  | giraffe        | 81.7   | 56.9  |
| backpack      | 9.7    | 5.2   | umbrella       | 41.3   | 24.6  |
| handbag       | 6.3    | 2.9   | tie            | 31.4   | 17.6  |
| suitcase      | 31.0   | 17.5  | frisbee        | 55.6   | 37.5  |
| skis          | 25.5   | 10.9  | snowboard      | 31.0   | 16.8  |
| sports ball   | 30.0   | 19.0  | kite           | 44.7   | 26.0  |
| baseball bat  | 28.7   | 12.9  | baseball glove | 39.4   | 19.4  |
| skateboard    | 51.3   | 28.6  | surfboard      | 40.1   | 21.4  |
| tennis racket | 49.9   | 26.9  | bottle         | 24.5   | 13.2  |
| wine glass    | 22.6   | 12.7  | cup            | 28.6   | 18.4  |
| fork          | 22.3   | 12.6  | knife          | 6.7    | 3.6   |
| spoon         | 6.0    | 2.6   | bowl           | 36.8   | 25.2  |
| banana        | 32.6   | 17.7  | apple          | 15.8   | 10.4  |
| sandwich      | 42.5   | 27.3  | orange         | 31.1   | 22.9  |
| broccoli      | 31.0   | 15.3  | carrot         | 28.1   | 13.6  |
| hot dog       | 36.2   | 23.4  | pizza          | 56.7   | 41.6  |
| donut         | 39.9   | 28.5  | cake           | 35.8   | 22.8  |
| chair         | 22.8   | 12.3  | couch          | 54.1   | 36.8  |
| potted plant  | 24.4   | 12.3  | bed            | 62.0   | 43.7  |
| dining table  | 37.2   | 25.5  | toilet         | 71.8   | 53.4  |
| tv            | 64.9   | 45.0  | laptop         | 59.6   | 44.4  |
| mouse         | 57.5   | 37.7  | remote         | 14.8   | 6.9   |
| keyboard      | 57.8   | 36.9  | cell phone     | 26.5   | 18.5  |
| microwave     | 53.9   | 38.7  | oven           | 42.6   | 26.6  |
| toaster       | 30.5   | 17.6  | sink           | 41.4   | 24.9  |
| refrigerator  | 58.5   | 42.6  | book           | 13.5   | 5.7   |
| clock         | 55.0   | 32.3  | vase           | 28.2   | 17.6  |
| scissors      | 25.5   | 18.5  | teddy bear     | 50.6   | 33.3  |
| hair drier    | 0.2    | 0.0   | toothbrush     | 12.2   | 6.4   |
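The two-column layout above can be produced with a small pure-Python formatter. A sketch under the assumption that per-class results arrive as `(name, AP50, mAP)` tuples in percent (function name and signature are hypothetical, not NanoDet's actual code, which additionally pads column widths):

```python
def per_class_table(results, num_cols=2):
    """Render [(name, ap50, map), ...] as a multi-column markdown table
    like the one printed above (values given in percent)."""
    header = "| class | AP50 | mAP " * num_cols + "|"
    sep = "|:------|:-----|:-----" * num_cols + "|"
    lines = [header, sep]
    for i in range(0, len(results), num_cols):
        chunk = list(results[i:i + num_cols])
        # Pad the last row when the class count is not a multiple of num_cols.
        chunk += [None] * (num_cols - len(chunk))
        cells = []
        for item in chunk:
            if item is None:
                cells.append("| | | ")
            else:
                name, ap50, m_ap = item
                cells.append(f"| {name} | {ap50:.1f} | {m_ap:.1f} ")
        lines.append("".join(cells) + "|")
    return "\n".join(lines)
```

Folding the class list into side-by-side columns keeps an 80-class COCO table to 40 rows instead of 80, which is why the log above fits on one screen.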

RangiLyu added the enhancement (New feature or request) label on Jan 26, 2022
RangiLyu added this to In Progress in NanoDet V1.0 Plan on Jan 26, 2022
codecov bot commented Jan 26, 2022

Codecov Report

Merging #395 (35b72b5) into main (8cb9044) will increase coverage by 0.20%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##             main     #395      +/-   ##
==========================================
+ Coverage   74.05%   74.25%   +0.20%     
==========================================
  Files          66       66              
  Lines        4424     4459      +35     
  Branches      752      756       +4     
==========================================
+ Hits         3276     3311      +35     
  Misses        969      969              
  Partials      179      179              
Flag unittests: coverage 74.25% <100.00%> (+0.20%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted files:
  nanodet/data/dataset/coco.py: 72.82% <100.00%> (+0.29%) ⬆️
  nanodet/evaluator/coco_detection.py: 92.20% <100.00%> (+6.16%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 8cb9044...35b72b5.

RangiLyu merged commit 3996f81 into main on Jan 26, 2022
RangiLyu deleted the coco_eval branch on March 16, 2022 at 10:31
Successfully merging this pull request may close these issues.

average precision metric for a single class/category
Compute AP and F1 for each class
Test mAP