
Missing non-LoRA key tok_embeddings.weight from base model dict #1110

Open
vasicvuk opened this issue Jun 22, 2024 · 2 comments

@vasicvuk

When running the LoRA single-device fine-tuning recipe, I am getting the following error:

  File "/opt/Projects/torchtune/venv/lib/python3.10/site-packages/recipes/lora_finetune_single_device.py", line 302, in _setup_model
    validate_missing_and_unexpected_for_lora(
  File "/opt/Projects/torchtune/venv/lib/python3.10/site-packages/torchtune/modules/peft/peft_utils.py", line 334, in validate_missing_and_unexpected_for_lora
    raise AssertionError(f"Missing non-LoRA key {k} from base model dict")
AssertionError: Missing non-LoRA key tok_embeddings.weight from base model dict

Is there something I am doing wrong?
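
For context, here is a minimal sketch of what the failing check in `validate_missing_and_unexpected_for_lora` appears to be enforcing. This is an illustration, not torchtune's actual code, and the key names and `LORA_MARKERS` fragments are assumptions: the recipe expects every key missing from the loaded base state dict to be a LoRA adapter parameter, so a missing `tok_embeddings.weight` means the base checkpoint did not supply the token-embedding weights.

```python
# Minimal sketch (not torchtune's implementation) of the check that raises this error:
# after loading the base checkpoint, any key the model expects that is missing from
# the loaded state dict and is not a LoRA adapter parameter is treated as fatal.

LORA_MARKERS = ("lora_a", "lora_b")  # assumed adapter parameter name fragments

def check_missing_keys(missing_keys):
    """missing_keys: keys the model expects but the base checkpoint did not provide."""
    for k in missing_keys:
        # Adapter weights are expected to be absent from the base checkpoint;
        # anything else (e.g. tok_embeddings.weight) must come from it.
        if not any(marker in k for marker in LORA_MARKERS):
            raise AssertionError(f"Missing non-LoRA key {k} from base model dict")

# Reproduces the reported failure mode: the hypothetical adapter key is ignored,
# but the missing token-embedding weight trips the assertion.
try:
    check_missing_keys(["layers.0.attn.q_proj.lora_a.weight", "tok_embeddings.weight"])
except AssertionError as e:
    print(e)  # -> Missing non-LoRA key tok_embeddings.weight from base model dict
```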

@RdoubleA
Contributor

Hi @vasicvuk, which model are you using? Do you mind sharing your config?

@vasicvuk
Author

@RdoubleA phi3, with the config from the repo. I guess there are some extra steps for the weights with phi3?
