WARNING: Unused parameter in LoRA state dict #51
If possible, can you provide the LoRA checkpoint you used?
I can't access my computer right now, so I can't provide the LoRA file for the time being. Do you know what might be causing this?
Probably a mistake in the merge script: it expects one format but gets another. I need an example LoRA checkpoint (it doesn't even need to be trained) to be able to fix the script.
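For context, here is a minimal sketch of one way such "Unused parameter" warnings can arise from a key-naming mismatch. The `find_unused` helper, the `blocks.` prefix check, and the extra `rwkv.` prefix are all hypothetical illustrations, not the actual logic of merge_lora_into_ggml.py:

```python
# Hypothetical sketch: the merge script matches LoRA keys by one naming
# scheme, but the checkpoint was saved under another, so every key goes
# unmatched and gets reported as unused.

def find_unused(lora_keys, expected_prefix="blocks."):
    """Return LoRA state-dict keys that the (hypothetical) matcher would not recognize."""
    return [k for k in lora_keys if not k.startswith(expected_prefix)]

# A checkpoint saved with an extra "rwkv." prefix -- one possible mismatch:
checkpoint_keys = [
    "rwkv.blocks.13.att.receptance.lora_A",
    "rwkv.blocks.13.att.receptance.lora_B",
]
for key in find_unused(checkpoint_keys):
    print(f"WARNING: Unused parameter in LoRA state dict {key}")
```

If all keys fail to match, the merge silently becomes a no-op, which is consistent with the symptoms reported below.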
https://huggingface.co/iclgg/rwkv-lora/tree/main |
@iclgg Thanks for the file! Can you please test this version of the merge script? |
It works well! Thank you.

---

Original issue description:
1. Convert an RWKV model checkpoint in PyTorch format to an rwkv.cpp-compatible file using convert_pytorch_to_ggml.py.
2. Get a LoRA checkpoint with https://github.com/Blealtan/RWKV-LM-LoRA.
3. Merge the LoRA checkpoint in PyTorch format (.pth) into the rwkv.cpp model file using merge_lora_into_ggml.py.
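For reference, the merge step folds each LoRA pair into its base weight using the standard LoRA formula, W' = W + (alpha / r) * B @ A. The `merge_lora` function below is a self-contained NumPy sketch of that formula, not the actual implementation in merge_lora_into_ggml.py:

```python
import numpy as np

def merge_lora(W, lora_A, lora_B, alpha, r):
    """Fold a LoRA delta into a base weight matrix: W' = W + (alpha / r) * B @ A."""
    return W + (alpha / r) * (lora_B @ lora_A)

rng = np.random.default_rng(0)
d, r = 8, 2
W = rng.normal(size=(d, d)).astype(np.float32)
A = rng.normal(size=(r, d)).astype(np.float32)  # lora_A has shape (r, d)
B = np.zeros((d, r), dtype=np.float32)          # lora_B is zero-initialized in LoRA training

# With B all zeros, merging changes nothing -- the delta (alpha / r) * B @ A is zero.
merged = merge_lora(W, A, B, alpha=16, r=r)
```

The key point: if a weight's `lora_A`/`lora_B` tensors are skipped as "unused", that weight is simply left equal to the base `W`, so the merged model behaves like the original.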
During the merge, warnings like "Unused parameter in LoRA state dict blocks.13.att.receptance.lora_B" were printed for every LoRA tensor (att.key.lora_A, att.value.lora_A, att.receptance.lora_A, ffn.key.lora_A, ffn.receptance.lora_A, ffn.value.lora_A, and the corresponding lora_B tensors).
The merged model performs poorly and does not reflect the effect of the LoRA at all.
Why does this happen?
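One quick way to confirm this suspicion: if every LoRA parameter was skipped, the merge wrote out an unchanged model, so the output file's contents should match the input's. The `file_digest` helper and the file names below are hypothetical, just to illustrate the check:

```python
import hashlib

def file_digest(path):
    """SHA-256 hex digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical file names -- substitute your own paths:
# if file_digest("model.bin") == file_digest("model-with-lora.bin"):
#     print("merge changed nothing -- the LoRA was not applied")
```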