
Using the pre-trained PCN model (使用预训练模型PCN) #151

Open
whitelin0321 opened this issue Jun 7, 2024 · 0 comments

Comments

@whitelin0321

Hi, thank you for your work. I hit an error when running PCN inference with a pretrained checkpoint, using the following command:
python tools/inference.py --model_config cfgs/PCN_models/PCN.yaml --model_checkpoint ./weight/PCNnew.pth --pc /data/datasets/suwa_cut_foot_2978_40960/13073/13073_auto_foot.ply --out_pc_root ./inference_foot_50007 --save_vis_img
Loading weights from ./weight/PCNnew.pth...
Traceback (most recent call last):
File "tools/inference.py", line 120, in <module>
main()
File "tools/inference.py", line 108, in main
builder.load_model(base_model, args.model_checkpoint)
File "/root/autodl-tmp/code/PoinTr/tools/../tools/builder.py", line 151, in load_model
base_model.load_state_dict(base_ckpt)
File "/data/miniconda3/envs/repairnet/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2041, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for PCN:
Missing key(s) in state_dict: "first_conv.0.weight", "first_conv.0.bias", "first_conv.1.weight", "first_conv.1.bias", "first_conv.1.running_mean", "first_conv.1.running_var", "first_conv.3.weight", "first_conv.3.bias", "second_conv.0.weight", "second_conv.0.bias", "second_conv.1.weight", "second_conv.1.bias", "second_conv.1.running_mean", "second_conv.1.running_var", "second_conv.3.weight", "second_conv.3.bias", "mlp.0.weight", "mlp.0.bias", "mlp.2.weight", "mlp.2.bias", "mlp.4.weight", "mlp.4.bias", "final_conv.0.weight", "final_conv.0.bias", "final_conv.1.weight", "final_conv.1.bias", "final_conv.1.running_mean", "final_conv.1.running_var", "final_conv.3.weight", "final_conv.3.bias", "final_conv.4.weight", "final_conv.4.bias", "final_conv.4.running_mean", "final_conv.4.running_var", "final_conv.6.weight", "final_conv.6.bias".
Unexpected key(s) in state_dict: "base_model.grouper.input_trans.weight", "base_model.grouper.input_trans.bias", "base_model.grouper.layer1.0.weight", "base_model.grouper.layer1.1.weight", "base_model.grouper.layer1.1.bias", "base_model.grouper.layer2.0.weight", "base_model.grouper.layer2.1.weight", "base_model.grouper.layer2.1.bias", "base_model.grouper.layer3.0.weight", "base_model.grouper.layer3.1.weight", "base_model.grouper.layer3.1.bias", "base_model.grouper.layer4.0.weight", "base_model.grouper.layer4.1.weight", "base_model.grouper.layer4.1.bias", "base_model.pos_embed.0.weight", "base_model.pos_embed.0.bias", "base_model.pos_embed.1.weight", "base_model.pos_embed.1.bias", "base_model.pos_embed.1.running_mean", "base_model.pos_embed.1.running_var", "base_model.pos_embed.1.num_batches_tracked", "base_model.pos_embed.3.weight", "base_model.pos_embed.3.bias", "base_model.input_proj.0.weight", "base_model.input_proj.0.bias", "base_model.input_proj.1.weight", "base_model.input_proj.1.bias", "base_model.input_proj.1.running_mean", "base_model.input_proj.1.running_var", "base_model.input_proj.1.num_batches_tracked", "base_model.input_proj.3.weight", "base_model.input_proj.3.bias", "base_model.encoder.0.norm1.weight", "base_model.encoder.0.norm1.bias", "base_model.encoder.0.attn.qkv.weight", "base_model.encoder.0.attn.proj.weight", "base_model.encoder.0.attn.proj.bias", "base_model.encoder.0.norm2.weight", "base_model.encoder.0.norm2.bias", "base_model.encoder.0.knn_map.0.weight", "base_model.encoder.0.knn_map.0.bias", "base_model.encoder.0.merge_map.weight", "base_model.encoder.0.merge_map.bias", "base_model.encoder.0.mlp.fc1.weight", "base_model.encoder.0.mlp.fc1.bias", "base_model.encoder.0.mlp.fc2.weight", "base_model.encoder.0.mlp.fc2.bias", "base_model.encoder.1.norm1.weight", "base_model.encoder.1.norm1.bias", "base_model.encoder.1.attn.qkv.weight", "base_model.encoder.1.attn.proj.weight", "base_model.encoder.1.attn.proj.bias", 
"base_model.encoder.1.norm2.weight", "base_model.encoder.1.norm2.bias", "base_model.encoder.1.knn_map.0.weight", "base_model.encoder.1.knn_map.0.bias", "base_model.encoder.1.merge_map.weight", "base_model.encoder.1.merge_map.bias", "base_model.encoder.1.mlp.fc1.weight", "base_model.encoder.1.mlp.fc1.bias", "base_model.encoder.1.mlp.fc2.weight", "base_model.encoder.1.mlp.fc2.bias", "base_model.encoder.2.norm1.weight", "base_model.encoder.2.norm1.bias", "base_model.encoder.2.attn.qkv.weight", "base_model.encoder.2.attn.proj.weight", "base_model.encoder.2.attn.proj.bias", "base_model.encoder.2.norm2.weight", "base_model.encoder.2.norm2.bias", "base_model.encoder.2.knn_map.0.weight", "base_model.encoder.2.knn_map.0.bias", "base_model.encoder.2.merge_map.weight", "base_model.encoder.2.merge_map.bias", "base_model.encoder.2.mlp.fc1.weight", "base_model.encoder.2.mlp.fc1.bias", "base_model.encoder.2.mlp.fc2.weight", "base_model.encoder.2.mlp.fc2.bias", "base_model.encoder.3.norm1.weight", "base_model.encoder.3.norm1.bias", "base_model.encoder.3.attn.qkv.weight", "base_model.encoder.3.attn.proj.weight", "base_model.encoder.3.attn.proj.bias", "base_model.encoder.3.norm2.weight", "base_model.encoder.3.norm2.bias", "base_model.encoder.3.knn_map.0.weight", "base_model.encoder.3.knn_map.0.bias", "base_model.encoder.3.merge_map.weight", "base_model.encoder.3.merge_map.bias", "base_model.encoder.3.mlp.fc1.weight", "base_model.encoder.3.mlp.fc1.bias", "base_model.encoder.3.mlp.fc2.weight", "base_model.encoder.3.mlp.fc2.bias", "base_model.encoder.4.norm1.weight", "base_model.encoder.4.norm1.bias", "base_model.encoder.4.attn.qkv.weight", "base_model.encoder.4.attn.proj.weight", "base_model.encoder.4.attn.proj.bias", "base_model.encoder.4.norm2.weight", "base_model.encoder.4.norm2.bias", "base_model.encoder.4.knn_map.0.weight", "base_model.encoder.4.knn_map.0.bias", "base_model.encoder.4.merge_map.weight", "base_model.encoder.4.merge_map.bias", "base_model.encoder.4.mlp.fc1.weight", 
"base_model.encoder.4.mlp.fc1.bias", "base_model.encoder.4.mlp.fc2.weight", "base_model.encoder.4.mlp.fc2.bias", "base_model.encoder.5.norm1.weight", "base_model.encoder.5.norm1.bias", "base_model.encoder.5.attn.qkv.weight", "base_model.encoder.5.attn.proj.weight", "base_model.encoder.5.attn.proj.bias", "base_model.encoder.5.norm2.weight", "base_model.encoder.5.norm2.bias", "base_model.encoder.5.knn_map.0.weight", "base_model.encoder.5.knn_map.0.bias", "base_model.encoder.5.merge_map.weight", "base_model.encoder.5.merge_map.bias", "base_model.encoder.5.mlp.fc1.weight", "base_model.encoder.5.mlp.fc1.bias", "base_model.encoder.5.mlp.fc2.weight", "base_model.encoder.5.mlp.fc2.bias", "base_model.increase_dim.0.weight", "base_model.increase_dim.0.bias", "base_model.increase_dim.1.weight", "base_model.increase_dim.1.bias", "base_model.increase_dim.1.running_mean", "base_model.increase_dim.1.running_var", "base_model.increase_dim.1.num_batches_tracked", "base_model.increase_dim.3.weight", "base_model.increase_dim.3.bias", "base_model.coarse_pred.0.weight", "base_model.coarse_pred.0.bias", "base_model.coarse_pred.2.weight", "base_model.coarse_pred.2.bias", "base_model.mlp_query.0.weight", "base_model.mlp_query.0.bias", "base_model.mlp_query.2.weight", "base_model.mlp_query.2.bias", "base_model.mlp_query.4.weight", "base_model.mlp_query.4.bias", "base_model.decoder.0.norm1.weight", "base_model.decoder.0.norm1.bias", "base_model.decoder.0.self_attn.qkv.weight", "base_model.decoder.0.self_attn.proj.weight", "base_model.decoder.0.self_attn.proj.bias", "base_model.decoder.0.norm_q.weight", "base_model.decoder.0.norm_q.bias", "base_model.decoder.0.norm_v.weight", "base_model.decoder.0.norm_v.bias", "base_model.decoder.0.attn.q_map.weight", "base_model.decoder.0.attn.k_map.weight", "base_model.decoder.0.attn.v_map.weight", "base_model.decoder.0.attn.proj.weight", "base_model.decoder.0.attn.proj.bias", "base_model.decoder.0.norm2.weight", "base_model.decoder.0.norm2.bias", 
"base_model.decoder.0.mlp.fc1.weight", "base_model.decoder.0.mlp.fc1.bias", "base_model.decoder.0.mlp.fc2.weight", "base_model.decoder.0.mlp.fc2.bias", "base_model.decoder.0.knn_map.0.weight", "base_model.decoder.0.knn_map.0.bias", "base_model.decoder.0.merge_map.weight", "base_model.decoder.0.merge_map.bias", "base_model.decoder.0.knn_map_cross.0.weight", "base_model.decoder.0.knn_map_cross.0.bias", "base_model.decoder.0.merge_map_cross.weight", "base_model.decoder.0.merge_map_cross.bias", "base_model.decoder.1.norm1.weight", "base_model.decoder.1.norm1.bias", "base_model.decoder.1.self_attn.qkv.weight", "base_model.decoder.1.self_attn.proj.weight", "base_model.decoder.1.self_attn.proj.bias", "base_model.decoder.1.norm_q.weight", "base_model.decoder.1.norm_q.bias", "base_model.decoder.1.norm_v.weight", "base_model.decoder.1.norm_v.bias", "base_model.decoder.1.attn.q_map.weight", "base_model.decoder.1.attn.k_map.weight", "base_model.decoder.1.attn.v_map.weight", "base_model.decoder.1.attn.proj.weight", "base_model.decoder.1.attn.proj.bias", "base_model.decoder.1.norm2.weight", "base_model.decoder.1.norm2.bias", "base_model.decoder.1.mlp.fc1.weight", "base_model.decoder.1.mlp.fc1.bias", "base_model.decoder.1.mlp.fc2.weight", "base_model.decoder.1.mlp.fc2.bias", "base_model.decoder.1.knn_map.0.weight", "base_model.decoder.1.knn_map.0.bias", "base_model.decoder.1.merge_map.weight", "base_model.decoder.1.merge_map.bias", "base_model.decoder.1.knn_map_cross.0.weight", "base_model.decoder.1.knn_map_cross.0.bias", "base_model.decoder.1.merge_map_cross.weight", "base_model.decoder.1.merge_map_cross.bias", "base_model.decoder.2.norm1.weight", "base_model.decoder.2.norm1.bias", "base_model.decoder.2.self_attn.qkv.weight", "base_model.decoder.2.self_attn.proj.weight", "base_model.decoder.2.self_attn.proj.bias", "base_model.decoder.2.norm_q.weight", "base_model.decoder.2.norm_q.bias", "base_model.decoder.2.norm_v.weight", "base_model.decoder.2.norm_v.bias", 
"base_model.decoder.2.attn.q_map.weight", "base_model.decoder.2.attn.k_map.weight", "base_model.decoder.2.attn.v_map.weight", "base_model.decoder.2.attn.proj.weight", "base_model.decoder.2.attn.proj.bias", "base_model.decoder.2.norm2.weight", "base_model.decoder.2.norm2.bias", "base_model.decoder.2.mlp.fc1.weight", "base_model.decoder.2.mlp.fc1.bias", "base_model.decoder.2.mlp.fc2.weight", "base_model.decoder.2.mlp.fc2.bias", "base_model.decoder.2.knn_map.0.weight", "base_model.decoder.2.knn_map.0.bias", "base_model.decoder.2.merge_map.weight", "base_model.decoder.2.merge_map.bias", "base_model.decoder.2.knn_map_cross.0.weight", "base_model.decoder.2.knn_map_cross.0.bias", "base_model.decoder.2.merge_map_cross.weight", "base_model.decoder.2.merge_map_cross.bias", "base_model.decoder.3.norm1.weight", "base_model.decoder.3.norm1.bias", "base_model.decoder.3.self_attn.qkv.weight", "base_model.decoder.3.self_attn.proj.weight", "base_model.decoder.3.self_attn.proj.bias", "base_model.decoder.3.norm_q.weight", "base_model.decoder.3.norm_q.bias", "base_model.decoder.3.norm_v.weight", "base_model.decoder.3.norm_v.bias", "base_model.decoder.3.attn.q_map.weight", "base_model.decoder.3.attn.k_map.weight", "base_model.decoder.3.attn.v_map.weight", "base_model.decoder.3.attn.proj.weight", "base_model.decoder.3.attn.proj.bias", "base_model.decoder.3.norm2.weight", "base_model.decoder.3.norm2.bias", "base_model.decoder.3.mlp.fc1.weight", "base_model.decoder.3.mlp.fc1.bias", "base_model.decoder.3.mlp.fc2.weight", "base_model.decoder.3.mlp.fc2.bias", "base_model.decoder.3.knn_map.0.weight", "base_model.decoder.3.knn_map.0.bias", "base_model.decoder.3.merge_map.weight", "base_model.decoder.3.merge_map.bias", "base_model.decoder.3.knn_map_cross.0.weight", "base_model.decoder.3.knn_map_cross.0.bias", "base_model.decoder.3.merge_map_cross.weight", "base_model.decoder.3.merge_map_cross.bias", "base_model.decoder.4.norm1.weight", "base_model.decoder.4.norm1.bias", 
"base_model.decoder.4.self_attn.qkv.weight", "base_model.decoder.4.self_attn.proj.weight", "base_model.decoder.4.self_attn.proj.bias", "base_model.decoder.4.norm_q.weight", "base_model.decoder.4.norm_q.bias", "base_model.decoder.4.norm_v.weight", "base_model.decoder.4.norm_v.bias", "base_model.decoder.4.attn.q_map.weight", "base_model.decoder.4.attn.k_map.weight", "base_model.decoder.4.attn.v_map.weight", "base_model.decoder.4.attn.proj.weight", "base_model.decoder.4.attn.proj.bias", "base_model.decoder.4.norm2.weight", "base_model.decoder.4.norm2.bias", "base_model.decoder.4.mlp.fc1.weight", "base_model.decoder.4.mlp.fc1.bias", "base_model.decoder.4.mlp.fc2.weight", "base_model.decoder.4.mlp.fc2.bias", "base_model.decoder.4.knn_map.0.weight", "base_model.decoder.4.knn_map.0.bias", "base_model.decoder.4.merge_map.weight", "base_model.decoder.4.merge_map.bias", "base_model.decoder.4.knn_map_cross.0.weight", "base_model.decoder.4.knn_map_cross.0.bias", "base_model.decoder.4.merge_map_cross.weight", "base_model.decoder.4.merge_map_cross.bias", "base_model.decoder.5.norm1.weight", "base_model.decoder.5.norm1.bias", "base_model.decoder.5.self_attn.qkv.weight", "base_model.decoder.5.self_attn.proj.weight", "base_model.decoder.5.self_attn.proj.bias", "base_model.decoder.5.norm_q.weight", "base_model.decoder.5.norm_q.bias", "base_model.decoder.5.norm_v.weight", "base_model.decoder.5.norm_v.bias", "base_model.decoder.5.attn.q_map.weight", "base_model.decoder.5.attn.k_map.weight", "base_model.decoder.5.attn.v_map.weight", "base_model.decoder.5.attn.proj.weight", "base_model.decoder.5.attn.proj.bias", "base_model.decoder.5.norm2.weight", "base_model.decoder.5.norm2.bias", "base_model.decoder.5.mlp.fc1.weight", "base_model.decoder.5.mlp.fc1.bias", "base_model.decoder.5.mlp.fc2.weight", "base_model.decoder.5.mlp.fc2.bias", "base_model.decoder.5.knn_map.0.weight", "base_model.decoder.5.knn_map.0.bias", "base_model.decoder.5.merge_map.weight", 
"base_model.decoder.5.merge_map.bias", "base_model.decoder.5.knn_map_cross.0.weight", "base_model.decoder.5.knn_map_cross.0.bias", "base_model.decoder.5.merge_map_cross.weight", "base_model.decoder.5.merge_map_cross.bias", "base_model.decoder.6.norm1.weight", "base_model.decoder.6.norm1.bias", "base_model.decoder.6.self_attn.qkv.weight", "base_model.decoder.6.self_attn.proj.weight", "base_model.decoder.6.self_attn.proj.bias", "base_model.decoder.6.norm_q.weight", "base_model.decoder.6.norm_q.bias", "base_model.decoder.6.norm_v.weight", "base_model.decoder.6.norm_v.bias", "base_model.decoder.6.attn.q_map.weight", "base_model.decoder.6.attn.k_map.weight", "base_model.decoder.6.attn.v_map.weight", "base_model.decoder.6.attn.proj.weight", "base_model.decoder.6.attn.proj.bias", "base_model.decoder.6.norm2.weight", "base_model.decoder.6.norm2.bias", "base_model.decoder.6.mlp.fc1.weight", "base_model.decoder.6.mlp.fc1.bias", "base_model.decoder.6.mlp.fc2.weight", "base_model.decoder.6.mlp.fc2.bias", "base_model.decoder.6.knn_map.0.weight", "base_model.decoder.6.knn_map.0.bias", "base_model.decoder.6.merge_map.weight", "base_model.decoder.6.merge_map.bias", "base_model.decoder.6.knn_map_cross.0.weight", "base_model.decoder.6.knn_map_cross.0.bias", "base_model.decoder.6.merge_map_cross.weight", "base_model.decoder.6.merge_map_cross.bias", "base_model.decoder.7.norm1.weight", "base_model.decoder.7.norm1.bias", "base_model.decoder.7.self_attn.qkv.weight", "base_model.decoder.7.self_attn.proj.weight", "base_model.decoder.7.self_attn.proj.bias", "base_model.decoder.7.norm_q.weight", "base_model.decoder.7.norm_q.bias", "base_model.decoder.7.norm_v.weight", "base_model.decoder.7.norm_v.bias", "base_model.decoder.7.attn.q_map.weight", "base_model.decoder.7.attn.k_map.weight", "base_model.decoder.7.attn.v_map.weight", "base_model.decoder.7.attn.proj.weight", "base_model.decoder.7.attn.proj.bias", "base_model.decoder.7.norm2.weight", "base_model.decoder.7.norm2.bias", 
"base_model.decoder.7.mlp.fc1.weight", "base_model.decoder.7.mlp.fc1.bias", "base_model.decoder.7.mlp.fc2.weight", "base_model.decoder.7.mlp.fc2.bias", "base_model.decoder.7.knn_map.0.weight", "base_model.decoder.7.knn_map.0.bias", "base_model.decoder.7.merge_map.weight", "base_model.decoder.7.merge_map.bias", "base_model.decoder.7.knn_map_cross.0.weight", "base_model.decoder.7.knn_map_cross.0.bias", "base_model.decoder.7.merge_map_cross.weight", "base_model.decoder.7.merge_map_cross.bias", "foldingnet.folding1.0.weight", "foldingnet.folding1.0.bias", "foldingnet.folding1.1.weight", "foldingnet.folding1.1.bias", "foldingnet.folding1.1.running_mean", "foldingnet.folding1.1.running_var", "foldingnet.folding1.1.num_batches_tracked", "foldingnet.folding1.3.weight", "foldingnet.folding1.3.bias", "foldingnet.folding1.4.weight", "foldingnet.folding1.4.bias", "foldingnet.folding1.4.running_mean", "foldingnet.folding1.4.running_var", "foldingnet.folding1.4.num_batches_tracked", "foldingnet.folding1.6.weight", "foldingnet.folding1.6.bias", "foldingnet.folding2.0.weight", "foldingnet.folding2.0.bias", "foldingnet.folding2.1.weight", "foldingnet.folding2.1.bias", "foldingnet.folding2.1.running_mean", "foldingnet.folding2.1.running_var", "foldingnet.folding2.1.num_batches_tracked", "foldingnet.folding2.3.weight", "foldingnet.folding2.3.bias", "foldingnet.folding2.4.weight", "foldingnet.folding2.4.bias", "foldingnet.folding2.4.running_mean", "foldingnet.folding2.4.running_var", "foldingnet.folding2.4.num_batches_tracked", "foldingnet.folding2.6.weight", "foldingnet.folding2.6.bias", "increase_dim.0.weight", "increase_dim.0.bias", "increase_dim.1.weight", "increase_dim.1.bias", "increase_dim.1.running_mean", "increase_dim.1.running_var", "increase_dim.1.num_batches_tracked", "increase_dim.3.weight", "increase_dim.3.bias", "reduce_map.weight", "reduce_map.bias".
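For what it's worth, the unexpected keys all carry a base_model. prefix, and names like grouper, encoder.N, decoder.N, and foldingnet match PoinTr's architecture rather than PCN's (first_conv, second_conv, final_conv, ...). That suggests ./weight/PCNnew.pth was saved from a PoinTr run, not a PCN one, so the checkpoint and cfgs/PCN_models/PCN.yaml describe different models. A minimal diagnostic sketch (the strip_prefix helper is hypothetical, not part of the repo):

```python
def strip_prefix(state_dict, prefix="base_model."):
    """Return a copy of state_dict with `prefix` removed from each key."""
    return {k[len(prefix):] if k.startswith(prefix) else k: v
            for k, v in state_dict.items()}

# Stand-in keys taken from the error message above (values omitted);
# in practice you would load them with torch.load(path)["base_model"]
# or similar, depending on how the checkpoint was saved.
ckpt = {
    "base_model.grouper.input_trans.weight": None,
    "base_model.encoder.0.norm1.weight": None,
}
print(sorted(strip_prefix(ckpt)))
# → ['encoder.0.norm1.weight', 'grouper.input_trans.weight']
```

Even after stripping the prefix, the remaining names still belong to PoinTr, so re-prefixing alone will not make the weights load into PCN; the fix is to point --model_checkpoint at a checkpoint actually trained with the PCN config.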

@whitelin0321 changed the title from 使用预训练模型PCN to "Using pre-trained model PCN" on Jun 7, 2024