Coroutine question #1486
Comments
🤖 Hey @fenss, great to see you diving into the details again! How's everything going?

Thank you for your detailed analysis and question. You're correct in your understanding of how the `inputs` parameter is processed and passed to `output = langchain_object(inputs, return_only_outputs=True)`.

As for your question about whether it's necessary to use `output = langchain_object(inputs, return_only_outputs=True)`: please note that modifying the way inputs are processed could have implications on the overall behavior of the application. It's recommended to thoroughly test any changes to ensure they meet your requirements and do not introduce any unexpected behavior.

As for the issue of requests blocking each other, it's important to note that the synchronous `__call__` path executes them sequentially, while the async `abatch`/`ainvoke` path does not.

I hope this answers your question. If you have any further questions or need more clarification, please let me know.
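The blocking-versus-concurrent behavior discussed here can be illustrated without LangChain at all. Below is a minimal sketch (all names such as `sync_call` and the 0.1 s delay are made up for illustration): a synchronous call chain runs requests one after another, while awaiting coroutines with `asyncio.gather` lets them overlap, analogous to `__call__` versus `ainvoke`/`abatch`.

```python
import asyncio
import time

# Hypothetical stand-in for a chain whose LLM does blocking I/O.
def sync_call(prompt: str) -> str:
    time.sleep(0.1)  # simulates a blocking HTTP round trip (e.g. `requests`)
    return f"echo: {prompt}"

# Async counterpart, analogous to Chain.ainvoke.
async def async_call(prompt: str) -> str:
    await asyncio.sleep(0.1)  # simulates non-blocking I/O
    return f"echo: {prompt}"

def run_sync(prompts):
    start = time.perf_counter()
    results = [sync_call(p) for p in prompts]  # sequential: delays add up
    return results, time.perf_counter() - start

async def run_async(prompts):
    start = time.perf_counter()
    # Concurrent: all delays overlap, analogous to abatch.
    results = await asyncio.gather(*(async_call(p) for p in prompts))
    return results, time.perf_counter() - start

if __name__ == "__main__":
    prompts = ["a", "b", "c"]
    _, t_sync = run_sync(prompts)
    _, t_async = asyncio.run(run_async(prompts))
    print(f"sync: {t_sync:.2f}s, async: {t_async:.2f}s")
```

With three prompts the synchronous path takes roughly three times as long as the concurrent one, which is exactly the sequential behavior reported in this issue.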
I also need the process call to be non-blocking.
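One general way to get non-blocking behavior even when a custom LLM uses the synchronous `requests` package is to push the blocking call onto a worker thread with `asyncio.to_thread`, so concurrent requests don't serialize on the event loop. This is a generic sketch, not langflow's actual code; `blocking_llm` is a made-up stand-in for the custom LLM in this issue.

```python
import asyncio
import time

# Made-up stand-in for a custom LLM that does blocking I/O via `requests`.
def blocking_llm(prompt: str) -> str:
    time.sleep(0.1)  # simulates the blocking HTTP round trip
    return prompt.upper()

async def handle_request(prompt: str) -> str:
    # Run the blocking call in a thread pool so the event loop stays free.
    return await asyncio.to_thread(blocking_llm, prompt)

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(*(handle_request(p) for p in "abc"))
    return results, time.perf_counter() - start

if __name__ == "__main__":
    results, elapsed = asyncio.run(main())
    print(results, f"{elapsed:.2f}s")  # all three calls overlap
```

Because each blocking call runs in its own thread, the three requests overlap instead of taking three full round trips back to back (requires Python 3.9+ for `asyncio.to_thread`).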
Hi @fenss, We hope you're doing well. Just a friendly reminder that if we do not hear back from you within the next 3 days, we will close this issue. If you need more time or further assistance, please let us know. Thank you for your understanding!
Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.
Hi langflow.

I noticed langflow provides the flow execution API: {langflow_host_and_port}/api/v1/process/{flow_id}. There's a param `inputs`, which is actually the LangChain input param. I then created a simple flow containing one normal LLMChain, one simple PromptTemplate, and a custom LLM. This LLM is simply a wrapper around an HTTP request, using the `requests` package in Python.

I called this API ({langflow_host_and_port}/api/v1/process/{flow_id}) in parallel, using `inputs` as a dict, and found that it seems to execute sequentially, meaning the requests block each other. Follow the code:

The `Runnable` instance in the flow graph (https://github.com/logspace-ai/langflow/blob/v0.6.7a5/src/backend/langflow/processing/process.py): since `inputs` is a dict, we end up at `output = langchain_object(inputs, return_only_outputs=True)`, which goes through `__call__` of Chain. If `inputs` were instead a dict wrapped in a list, it would call `abatch` or `ainvoke`, and the requests would not block each other.

Is it necessary to use `output = langchain_object(inputs, return_only_outputs=True)`?