
Add top_k to PromptNode #4158

Closed
tstadel opened this issue Feb 14, 2023 · 4 comments · Fixed by #4159
Comments

tstadel (Member) commented Feb 14, 2023

Is your feature request related to a problem? Please describe.
Setting top_k on PromptNode is quite cumbersome. Depending on which invocation layer you use, you have to set different params on PromptModel.
For HF models:

```yaml
...
  - name: my_prompt_model
    type: PromptModel
    params:
      model_name_or_path: google/flan-t5-large
      model_kwargs:        # PromptModel's own model_kwargs parameter
        model_kwargs:      # nested: forwarded to the underlying HF pipeline
          num_return_sequences: 3
          num_beams: 3
...
```

For OpenAI models:

```yaml
...
  - name: my_prompt_model
    type: PromptModel
    params:
      model_name_or_path: text-davinci-003
      model_kwargs:
        n: 3
        best_of: 3
...
```

PromptNode completely lacks the ability to set top_k. This is a problem if you want to use the same PromptModel with different PromptNodes (i.e., for different use cases) that require a different top_k.

Describe the solution you'd like

  • Add a top_k param to PromptNode
  • Push top_k down to the invocation layers (see the sketch below)
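
A minimal sketch of how that could look from the user's side (the top_k parameter on PromptNode is the proposal here, not existing API; the rest follows the current PromptModel/PromptNode interface):

```python
from haystack.nodes import PromptModel, PromptNode

# One shared PromptModel, reused by several PromptNodes.
prompt_model = PromptModel(model_name_or_path="google/flan-t5-large")

# Proposed: each PromptNode carries its own top_k, which the node pushes
# down to the invocation layer (num_return_sequences/num_beams for HF,
# n/best_of for OpenAI) at invocation time.
qa_node = PromptNode(prompt_model, default_prompt_template="question-answering", top_k=3)
summary_node = PromptNode(prompt_model, default_prompt_template="summarization", top_k=1)
```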

Describe alternatives you've considered

  • Make top_k part of PromptTemplate: it's not the right place, as a PromptTemplate specifies a use case in general, not a concrete instance of a use case. top_k should be part of the instance of a use case and thus part of PromptNode.
sjrl (Contributor) commented Feb 15, 2023

Hey @tstadel, just to let you know: with this PR you would at least be able to specify model_kwargs when initializing a PromptNode, so you would not need to set up a separate PromptModel.
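
Roughly like this (a sketch assuming the PR's model_kwargs pass-through; the kwargs shown are the OpenAI-layer ones from the example above):

```python
from haystack.nodes import PromptNode

# model_kwargs accepted directly on PromptNode: no separate PromptModel
# declaration needed; the kwargs are forwarded to the invocation layer.
node = PromptNode(
    model_name_or_path="text-davinci-003",
    api_key="YOUR_OPENAI_API_KEY",  # placeholder
    model_kwargs={"n": 3, "best_of": 3},
)
```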

sjrl (Contributor) commented Feb 15, 2023

But to add: I see why the additional changes you've made would be necessary to allow the top_k parameter to be pushed to the model at run time (invocation time) of the PromptNode. And this would allow the reuse of the model in other PromptNodes, as you've said.

tstadel (Member, Author) commented Feb 15, 2023

> But to add: I see why the additional changes you've made would be necessary to allow the top_k parameter to be pushed to the model at run time (invocation time) of the PromptNode. And this would allow the reuse of the model in other PromptNodes, as you've said.

I know it's just a small improvement, but it comes in quite handy when you want to quickly switch models across invocation layers, since you don't need to know each invocation layer's specifics (see the sketch below).
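
For example (hypothetical until top_k lands; the point is that the same parameter works regardless of the backend):

```python
from haystack.nodes import PromptNode

# Same top_k, different invocation layers: no need to remember
# num_return_sequences/num_beams (HF) vs. n/best_of (OpenAI).
hf_node = PromptNode("google/flan-t5-large", top_k=3)
openai_node = PromptNode("text-davinci-003", api_key="YOUR_OPENAI_API_KEY", top_k=3)
```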

vblagoje (Member) commented
@sjrl @tstadel, could we call this parameter n? top_k is more related to search/retrieval, no?
