Add `top_k` to PromptNode #4158

Comments
But to add, I see why the additional changes you've made would be necessary to allow the …

I know it's just a small improvement, but it comes in quite handy, especially if you want to quickly change models across invocation layers, since you don't need to know the invocation layer's specifics.
Is your feature request related to a problem? Please describe.

Setting `top_k` on `PromptNode` is quite cumbersome. Depending on which invocation layer you use, you have to set different params on `PromptModel`:
:For hf models:
For OpenAI models:
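The original code snippets for the two backends did not survive extraction. As a rough illustration of the mismatch, the snippet below shows the kind of backend-specific kwargs involved; the parameter names (`num_return_sequences` for Hugging Face's `generate()`, `n` for the OpenAI completions API) are assumptions about what the underlying libraries expect, not the issue's original code:

```python
# Hypothetical illustration of why setting top_k is cumbersome today:
# each invocation layer expects a different kwarg for "number of
# generated answers".

def build_model_kwargs(invocation_layer: str, top_k: int) -> dict:
    """Translate a generic top_k into the backend-specific parameter name."""
    if invocation_layer == "hf":
        # Assumption: Hugging Face generation uses num_return_sequences.
        return {"num_return_sequences": top_k}
    if invocation_layer == "openai":
        # Assumption: the OpenAI completions API uses n.
        return {"n": top_k}
    raise ValueError(f"unknown invocation layer: {invocation_layer}")

print(build_model_kwargs("hf", 3))
print(build_model_kwargs("openai", 3))
```

The user has to know this mapping per backend, which is exactly the friction the issue describes.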
`PromptNode` misses the functionality to set `top_k` completely. This is a problem if you want to use the same `PromptModel` with different `PromptNode`s (i.e. for different use-cases) that require different `top_k` values.
Describe the solution you'd like

- Add a `top_k` param to `PromptNode`
- Pass `top_k` to the invocation layers

Describe alternatives you've considered
- Making `top_k` part of `PromptTemplate`: it's not the right place, as with `PromptTemplate` you specify the use-case in general, but not the instance of a use-case. `top_k` should be part of the instance of a use-case and thus part of `PromptNode`.