[FEATURE REQUEST]: Stream Results options #50
Comments
Thank you for submitting a feature request, @jonnyjohnson1! What do you think the expected behaviour should be in cases where there are multiple final steps?
My use case right now is to create a stream I can pass to FastAPI's StreamingResponse, so I don't imagine needing a multiple-final-steps option. That said, it would be useful to pass along information (possibly through the callback functions) to notify when a new flow in the chain has started and stopped, and to relay that information back through the API. After some exploration, I believe the latter is possible with the callback functions; I just need to learn more about building an API and passing results from the callback into it. The code isn't pretty at the moment, but I have added a stream parameter option to the OpenAIChat class, which ultimately allows a stream_callback within the functional callback class. I have tried two options with the callback:
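The patch itself isn't included in the thread, so here is a minimal sketch of what a `stream` option plus `stream_callback` might look like. All names below are assumptions rather than llmflows' actual API, and a fake token list stands in for the real OpenAI call so the example runs offline.

```python
from typing import Callable, List, Optional

class OpenAIChat:
    """Rough sketch of a chat wrapper with a stream option (names assumed,
    not the real llmflows class)."""

    def __init__(self, stream: bool = False,
                 stream_callback: Optional[Callable[[str], None]] = None):
        self.stream = stream
        self.stream_callback = stream_callback

    def generate(self, prompt: str) -> str:
        tokens = ["Hello", ", ", "world", "!"]  # fake streamed chunks
        for token in tokens:
            if self.stream and self.stream_callback:
                self.stream_callback(token)  # push each chunk as it arrives
        return "".join(tokens)

received: List[str] = []
chat = OpenAIChat(stream=True, stream_callback=received.append)
result = chat.generate("hi")
```

The callback simply receives each chunk as it is produced, while the method still returns the full completion at the end.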
Result:
I figured out how to do this by changing a few files:
It's used in the OpenAIChat class:
The functional_callback.py addition:
Then I use the llmflow and callback in the main app like this:
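The app code isn't shown in the thread, but one common way to bridge a token callback into FastAPI's StreamingResponse is a worker thread plus a queue: the flow pushes tokens into the queue, and a generator drains it. Everything here is a sketch under that assumption (stdlib only, with a stand-in for the real flow), not the commenter's actual code.

```python
import queue
import threading

def run_flow(stream_callback) -> None:
    # Stand-in for running the flow with a streaming callback wired in;
    # a real flow would invoke stream_callback once per generated token.
    for token in ["streamed ", "tokens ", "here"]:
        stream_callback(token)

def token_stream():
    """Generator a FastAPI endpoint could wrap as
    StreamingResponse(token_stream()). The flow runs in a worker thread
    and pushes tokens through a queue; None marks the end."""
    q = queue.Queue()

    def worker() -> None:
        run_flow(q.put)
        q.put(None)  # sentinel: flow finished

    t = threading.Thread(target=worker)
    t.start()
    while True:
        token = q.get()
        if token is None:
            break
        yield token
    t.join()

chunks = list(token_stream())
```

The queue decouples the flow's pace from the HTTP response, so the endpoint starts sending bytes as soon as the first token arrives.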
Hey @jonnyjohnson1, thank you so much for sharing these! I will have some time to look into this over the weekend. I am currently working on a large update, and I would like to use the opportunity to solve streaming as well.
Hey @jonnyjohnson1, I spent some time looking into this. I think a good solution would be to create a new .run_stream() method and get chunks from any flowstep as they arrive. Meanwhile, as you mentioned, an easier workaround can be updating the callback class and adding a callback that sends chunks over a websocket. This of course makes sense only if you are fine with using websockets in your front-end. Would you like to contribute and take a stab at updating the callback function? If not, I will probably have some extra time again next weekend.
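The callback-class workaround described here could look roughly like the following: a callback object holding a websocket send function (e.g. a bound `websocket.send_text`) and forwarding each event to it. The class and method names are assumptions, not llmflows' actual callback interface, and a plain list stands in for the websocket so the sketch runs offline.

```python
class WebsocketStreamCallback:
    """Hypothetical callback that forwards flow events to a websocket
    send function (names assumed, not the real llmflows interface)."""

    def __init__(self, send) -> None:
        self.send = send

    def on_flowstep_start(self, name: str) -> None:
        self.send(f"[started: {name}]")  # tell the front-end which step runs

    def on_token(self, token: str) -> None:
        self.send(token)  # forward each chunk as it arrives

sent = []
cb = WebsocketStreamCallback(sent.append)
cb.on_flowstep_start("Thought")
cb.on_token("partial text")
```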
You can try the demo here:
A .run_stream() option seems much cleaner. In my flow, I used a Thought, Observation, Action chain, and I needed to know:
I remember, too, that I set up a 5-token cache in my case to look for the tokens as they came through. Letting the callback functions update what stage the flow is on at each step of the way can really help the UI inform the user why they're waiting. For each step in the async demo flow, the callback function choices might look like the following:
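The list of per-step messages isn't preserved in the thread; as an illustration, a callback could map each step of the Thought/Observation/Action chain mentioned above to a user-facing status line. The wording and the `notify` hook are hypothetical.

```python
# Hypothetical per-step status messages so the UI can explain the wait
# (step names taken from the Thought/Observation/Action chain above).
STEP_STATUS = {
    "Thought": "Thinking about the question...",
    "Observation": "Reviewing intermediate results...",
    "Action": "Generating the final answer...",
}

def on_step_start(step_name: str, notify) -> None:
    # Push a user-facing message; fall back to a generic one for
    # steps the table doesn't know about.
    notify(STEP_STATUS.get(step_name, f"Running {step_name}..."))

messages = []
on_step_start("Thought", messages.append)
on_step_start("Action", messages.append)
```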
And, yeah, lemme see, I might be able to get to it Thursday or Friday.
Any update on this?
It's on my to-do list. I am working on a few updates that I will release over the coming weekends, and I will look into this afterwards.
The problem you are trying to solve:
Streaming flow results eliminates the dead time users spend waiting. I know it's not possible for intermediate flow steps, but the final one could be streamed.
Suggested new feature or change:
A streaming option would be nice.
In a flow, the next step has to trigger only after the previous step has been fully generated, but the final step in a flow could be streamed.
Just have to add this parameter in the OpenAI and OpenAIChat class files:

```python
openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=messages,  # the chat history being sent
    stream=True,  # <--- add this line
)
```
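With stream=True, the legacy openai<1.0 ChatCompletion API returns an iterator of chunk dicts, each carrying a partial "delta" instead of a full message. A sketch of assembling the text, using fake chunks shaped like the API's responses so it runs offline:

```python
def collect_stream(chunks) -> str:
    """Assemble streamed text from chat-completion chunks; each chunk in
    the legacy openai<1.0 streaming API carries a partial 'delta'."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))  # first chunk has no content
    return "".join(parts)

# Fake chunks shaped like the streaming responses, so this runs offline;
# with a real call you would iterate the object returned by
# openai.ChatCompletion.create(..., stream=True) instead.
fake_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
]
text = collect_stream(fake_chunks)
```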