
Add soft prompt tuning #398

Merged
merged 3 commits into from
Sep 8, 2021
Conversation

@sdtblck (Contributor) commented Aug 30, 2021

  • Adds a generalized insert_layer function for inserting arbitrary layers into the model
  • Uses it to add soft prompt tuning, but it could also be used for other things (e.g. multimodal few-shot training)
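For illustration, the two bullets above can be sketched as below. This is a minimal, hypothetical sketch: the names (`insert_layer`, `SoftPrompt`) and the representation of the model as a plain list of layer callables are assumptions for the example, not the actual GPT-NeoX implementation.

```python
def insert_layer(layers, new_layer, idx):
    """Return a copy of `layers` with `new_layer` spliced in at position `idx`.

    Illustrative only -- the real PR operates on the model's layer pipeline,
    not a plain Python list.
    """
    if not 0 <= idx <= len(layers):
        raise IndexError(f"idx {idx} out of range for {len(layers)} layers")
    return layers[:idx] + [new_layer] + layers[idx:]


class SoftPrompt:
    """Prepends n_prompt learnable vectors to the token embeddings.

    During soft prompt tuning these vectors are the only trained parameters;
    the rest of the model stays frozen. Zero-initialized here for simplicity.
    """

    def __init__(self, n_prompt, d_model):
        self.prompt = [[0.0] * d_model for _ in range(n_prompt)]

    def __call__(self, embeddings):
        # Concatenate prompt vectors before the token embeddings
        # along the sequence axis.
        return self.prompt + embeddings


def identity(x):
    # Stand-in for the embedding layer / transformer blocks.
    return x


# Splice a soft-prompt layer in right after the (stand-in) embedding layer.
layers = [identity, identity]
layers = insert_layer(layers, SoftPrompt(n_prompt=4, d_model=8), 1)

seq = [[1.0] * 8 for _ in range(10)]  # 10 token embeddings, d_model = 8
for layer in layers:
    seq = layer(seq)
# The sequence length grows by n_prompt: 10 -> 14.
```

Because the new layer only changes the sequence fed to the downstream blocks, the same insertion mechanism generalizes to any layer that transforms the embedding stream, which is what makes the multimodal use case plausible.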

This should work at inference time, but I haven't tested it yet.

We should be aware that soft prompt tuning / fine-tuning will work better if we pad sequences up to the max length rather than packing them (which is what we do for pretraining).

I'll start working on a PR to simplify data loading and add an option to pad sequences instead of packing them.
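The padding-vs-packing distinction can be sketched as follows. This is a simplified illustration, not the actual data-loading code: real packing typically inserts EOS tokens between documents and handles the final partial chunk, both omitted here.

```python
def pack(sequences, max_len):
    """Packing (pretraining style): concatenate all tokens, then split
    into fixed-length chunks, so one chunk may span several documents."""
    flat = [tok for seq in sequences for tok in seq]
    return [flat[i:i + max_len] for i in range(0, len(flat), max_len)]


def pad(sequences, max_len, pad_id=0):
    """Padding (better suited to soft prompt / fine-tuning): each sequence
    stays its own example, truncated or padded out to max_len."""
    return [seq[:max_len] + [pad_id] * (max_len - len(seq))
            for seq in sequences]


docs = [[1, 2, 3], [4, 5], [6]]
packed = pack(docs, max_len=4)   # [[1, 2, 3, 4], [5, 6]]
padded = pad(docs, max_len=4)    # [[1, 2, 3, 0], [4, 5, 0, 0], [6, 0, 0, 0]]
```

With padding, the soft prompt always sits at the start of a single coherent example; with packing, a chunk boundary can fall mid-document, which is fine for pretraining throughput but awkward for prompt tuning.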

@sdtblck requested a review from a team as a code owner August 30, 2021 15:50
@StellaAthena merged commit 0f25f42 into main Sep 8, 2021
@StellaAthena deleted the soft_prompt_2 branch September 8, 2021 06:07