forked from PaddlePaddle/PaddleOCR
Merge pull request PaddlePaddle#4474 from tink2123/paddle2onnx
add paddle2onnx for test_tipc
Showing 11 changed files with 472 additions and 194 deletions.
# paddle2onnx Model Conversion and Inference

This section describes how to convert a PaddleOCR model to the ONNX format and run inference with the ONNX Runtime engine.

## 1. Environment Setup

You need to prepare both the Paddle2ONNX model conversion environment and the ONNX inference environment.

### Paddle2ONNX

Paddle2ONNX converts models from the PaddlePaddle format to the ONNX format. Operator export is currently stable for ONNX opsets 9 to 11, and some Paddle operators can be converted to lower ONNX opsets.
For more details, see [Paddle2ONNX](https://github.com/PaddlePaddle/Paddle2ONNX/blob/develop/README_zh.md).

- Install Paddle2ONNX
```
python3.7 -m pip install paddle2onnx
```

- Install ONNX Runtime
```
# Version 1.4.0 is recommended; adjust the version number to match your environment
python3.7 -m pip install onnxruntime==1.4.0
```

## 2. Model Conversion

- Download a Paddle model

There are two ways to obtain a Paddle static-graph model: download a pretrained inference model provided by PaddleOCR from the [model_list](../../doc/doc_ch/models_list.md), or follow the [model export guide](../../doc/doc_ch/inference.md#训练模型转inference模型) to convert trained weights into an inference model.

Taking the ppocr detection model as an example:

```
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar
cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && cd ..
```

- Convert the model

Use Paddle2ONNX to convert the Paddle static-graph model to the ONNX format:

```
paddle2onnx --model_dir=./inference/ch_ppocr_mobile_v2.0_det_infer/ \
--model_filename=inference.pdmodel \
--params_filename=inference.pdiparams \
--save_file=./inference/det_mobile_onnx/model.onnx \
--opset_version=10 \
--enable_onnx_checker=True
```

After the command finishes, the ONNX model is saved under `./inference/det_mobile_onnx/`.
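
When scripting conversions for several models, it can help to assemble the flag list programmatically. A minimal Python sketch; the helper name `build_cli_command` is hypothetical, not part of Paddle2ONNX:

```python
def build_cli_command(tool, options):
    """Assemble a CLI invocation string from an option dict, one --key=value per entry."""
    flags = " ".join(f"--{key}={value}" for key, value in options.items())
    return f"{tool} {flags}"

cmd = build_cli_command("paddle2onnx", {
    "model_dir": "./inference/ch_ppocr_mobile_v2.0_det_infer/",
    "model_filename": "inference.pdmodel",
    "params_filename": "inference.pdiparams",
    "save_file": "./inference/det_mobile_onnx/model.onnx",
    "opset_version": 10,
    "enable_onnx_checker": True,
})
print(cmd)
```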

## 3. ONNX Inference

Taking the detection model as an example, run the following command for ONNX inference:

```
python3.7 ../../tools/infer/predict_det.py --use_gpu=False --use_onnx=True \
--det_model_dir=./inference/det_mobile_onnx/model.onnx \
--image_dir=../../doc/imgs/1.jpg
```

The command prints the predicted detection-box coordinates to the terminal and saves the visualized result under `./inference_results/`:

```
root INFO: 1.jpg [[[291, 295], [334, 292], [348, 844], [305, 847]], [[344, 296], [379, 294], [387, 669], [353, 671]]]
The predict time of ../../doc/imgs/1.jpg: 0.06162881851196289
The visualized image saved in ./inference_results/det_res_1.jpg
```
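
If you need the detection boxes downstream, the bracketed coordinate list in the log line above can be parsed with the standard library. A sketch that assumes the exact log format shown above:

```python
import ast

def parse_det_log_line(line):
    """Extract the image name and the nested box-coordinate list from a
    'root INFO: <image> [[[x, y], ...], ...]' log line."""
    payload = line.split("INFO:", 1)[1].strip()
    name, _, boxes_str = payload.partition(" ")
    return name, ast.literal_eval(boxes_str)

name, boxes = parse_det_log_line(
    "root INFO: 1.jpg [[[291, 295], [334, 292], [348, 844], [305, 847]], "
    "[[344, 296], [379, 294], [387, 669], [353, 671]]]"
)
print(name, len(boxes))  # 1.jpg 2
```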

* Note: ONNX does not yet support variable-length prediction here, so the input must be resized to a fixed shape; the results may therefore differ slightly from direct Paddle inference.
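
As an illustration of such fixed-shape preprocessing, here is a rough sketch that limits the longest side and rounds each dimension to a multiple of 32, a common convention for detection backbones. This is only an assumption for illustration, not PaddleOCR's exact resize logic:

```python
def fixed_resize_shape(h, w, target=960, multiple=32):
    """Scale so the longest side is at most `target`, then round each
    dimension to a multiple of `multiple` (assumed detector convention)."""
    scale = min(1.0, target / max(h, w))
    new_h = max(multiple, int(round(h * scale / multiple)) * multiple)
    new_w = max(multiple, int(round(w * scale / multiple)) * multiple)
    return new_h, new_w

print(fixed_resize_shape(720, 1280))  # (544, 960)
```

Because every input collapses to the same grid, boxes predicted on the resized image must be scaled back by the inverse ratio, which is one source of the small numeric differences noted above.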
# Paddle2ONNX Prediction Function Test

The main program for the Paddle2ONNX prediction function test is `test_paddle2onnx.sh`, which tests the Paddle2ONNX model conversion feature and verifies its correctness.

## 1. Test Summary

Depending on whether quantization was used during training, the models covered by this test fall into `regular models` and `quantized models`. The Paddle2ONNX prediction support for the two categories is summarized below:

| Model type | Device |
| ---- | ---- |
| Regular model | GPU |
| Regular model | CPU |
| Quantized model | GPU |
| Quantized model | CPU |

## 2. Test Procedure
### 2.1 Function Test
First run `prepare.sh` to prepare the data and models, then run `test_paddle2onnx.sh`. The test writes log files with the `paddle2onnx_infer_*.log` suffix to the `test_tipc/output` directory.

```shell
bash test_tipc/prepare.sh ./test_tipc/configs/ppocr_det_mobile_params.txt "paddle2onnx_infer"

# Usage:
bash test_tipc/test_paddle2onnx.sh ./test_tipc/configs/ppocr_det_mobile_params.txt
```
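
The test driver reads the params file line by line with the `func_parser_key`/`func_parser_value` helpers from `common_func.sh`. A minimal Python sketch of that style of `key:value` parsing; the colon-separated format is an assumption inferred from the flags shown in this document:

```python
def parser_key(line):
    """Return the part before the first ':' (mirrors func_parser_key)."""
    return line.split(":", 1)[0].strip()

def parser_value(line):
    """Return the part after the first ':' (mirrors func_parser_value)."""
    return line.split(":", 1)[1].strip()

line = "--opset_version:10"
print(parser_key(line), parser_value(line))  # --opset_version 10
```

Splitting on the first colon only is deliberate: values such as file paths may themselves contain no colon, but URLs or Windows paths would, so a single split keeps the value intact.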

#### Results

The status of each test is written to `test_tipc/output/results_paddle2onnx.log`.
On success, the log contains lines such as:

```
Run successfully with command - paddle2onnx --model_dir=./inference/ch_ppocr_mobile_v2.0_det_infer/ --model_filename=inference.pdmodel --params_filename=inference.pdiparams --save_file=./inference/det_mobile_onnx/model.onnx --opset_version=10 --enable_onnx_checker=True!
Run successfully with command - python test_tipc/onnx_inference/predict_det.py --use_gpu=False --image_dir=./inference/ch_det_data_50/all-sum-510/ --det_model_dir=./inference/det_mobile_onnx/model.onnx 2>&1 !
```

On failure, it contains lines such as:

```
Run failed with command - paddle2onnx --model_dir=./inference/ch_ppocr_mobile_v2.0_det_infer/ --model_filename=inference.pdmodel --params_filename=inference.pdiparams --save_file=./inference/det_mobile_onnx/model.onnx --opset_version=10 --enable_onnx_checker=True!
...
```
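
To triage a long run, the results log can be scanned for failed commands with a few lines of Python. This is a hypothetical helper, not part of the TIPC tooling:

```python
def failed_commands(log_lines):
    """Collect the commands from 'Run failed with command - ...' log lines."""
    prefix = "Run failed with command - "
    return [line[len(prefix):].rstrip("!") for line in log_lines
            if line.startswith(prefix)]

log = [
    "Run successfully with command - paddle2onnx --model_dir=./inference/ch_ppocr_mobile_v2.0_det_infer/!",
    "Run failed with command - paddle2onnx --opset_version=10!",
]
print(failed_commands(log))  # ['paddle2onnx --opset_version=10']
```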

## 3. Further Reading

This document covers function testing only. For a more detailed Paddle2ONNX usage tutorial, see [Paddle2ONNX](https://github.com/PaddlePaddle/Paddle2ONNX).
#!/bin/bash
source test_tipc/common_func.sh

FILENAME=$1

dataline=$(cat ${FILENAME})
lines=(${dataline})
# common params
model_name=$(func_parser_value "${lines[1]}")
python=$(func_parser_value "${lines[2]}")


# parse the paddle2onnx section of the params file
dataline=$(awk 'NR==111, NR==123{print}' $FILENAME)
IFS=$'\n'
lines=(${dataline})

# parse paddle2onnx params
paddle2onnx_cmd=$(func_parser_value "${lines[1]}")
infer_model_dir_key=$(func_parser_key "${lines[2]}")
infer_model_dir_value=$(func_parser_value "${lines[2]}")
model_filename_key=$(func_parser_key "${lines[3]}")
model_filename_value=$(func_parser_value "${lines[3]}")
params_filename_key=$(func_parser_key "${lines[4]}")
params_filename_value=$(func_parser_value "${lines[4]}")
save_file_key=$(func_parser_key "${lines[5]}")
save_file_value=$(func_parser_value "${lines[5]}")
opset_version_key=$(func_parser_key "${lines[6]}")
opset_version_value=$(func_parser_value "${lines[6]}")
enable_onnx_checker_key=$(func_parser_key "${lines[7]}")
enable_onnx_checker_value=$(func_parser_value "${lines[7]}")
# parse onnx inference params
inference_py=$(func_parser_value "${lines[8]}")
use_gpu_key=$(func_parser_key "${lines[9]}")
use_gpu_value=$(func_parser_value "${lines[9]}")
det_model_key=$(func_parser_key "${lines[10]}")
image_dir_key=$(func_parser_key "${lines[11]}")
image_dir_value=$(func_parser_value "${lines[11]}")


LOG_PATH="./test_tipc/output"
mkdir -p ./test_tipc/output
status_log="${LOG_PATH}/results_paddle2onnx.log"


function func_paddle2onnx(){
    IFS='|'
    _script=$1

    # paddle2onnx: build and run the model conversion command
    _save_log_path="${LOG_PATH}/paddle2onnx_infer_cpu.log"
    set_dirname=$(func_set_params "${infer_model_dir_key}" "${infer_model_dir_value}")
    set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
    set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
    set_save_model=$(func_set_params "${save_file_key}" "${save_file_value}")
    set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
    set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
    trans_model_cmd="${paddle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
    eval $trans_model_cmd
    last_status=${PIPESTATUS[0]}
    status_check $last_status "${trans_model_cmd}" "${status_log}"
    # python inference on the exported ONNX model
    set_gpu=$(func_set_params "${use_gpu_key}" "${use_gpu_value}")
    set_model_dir=$(func_set_params "${det_model_key}" "${save_file_value}")
    set_img_dir=$(func_set_params "${image_dir_key}" "${image_dir_value}")
    infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
    eval $infer_model_cmd
    # capture the inference command's own exit status before checking it
    last_status=${PIPESTATUS[0]}
    status_check $last_status "${infer_model_cmd}" "${status_log}"
}


echo "################### run test ###################"

export Count=0
IFS="|"
func_paddle2onnx