Accessing Stream Chunks (Streamed generation) #36

Open
backslash112 opened this issue Jun 29, 2024 · 11 comments

@backslash112

backslash112 commented Jun 29, 2024

  • I'm submitting a ...
    [x] question about how to use this project

  • Summary
    I'm encountering two problems when working with the streaming example:

  1. When running the code from examples/streaming2.ts with stream: true, I get an error: Missing required fields: answerInPoints. What's causing this error and how can I resolve it?
  2. After setting stream: true, how can I access the result chunks? Are there methods similar to for await (const chunk of result) or completion.data.on() that I can use to process the incoming stream? (Similar to How to use stream: true? openai/openai-node#18)

Any guidance on resolving these issues would be greatly appreciated. Thank you!

@backslash112 backslash112 changed the title Streaming issues with examples/streaming2.ts Accessing Stream Chunks Jun 29, 2024
@backslash112 backslash112 changed the title Accessing Stream Chunks Accessing Stream Chunks (Streamed generation) Jun 29, 2024
@dosco
Collaborator

dosco commented Jul 3, 2024

Sorry, I was planning to fix this earlier but was in the middle of our big migration to a monorepo. Looking into this now.

@dosco
Collaborator

dosco commented Jul 4, 2024

Fixed in the latest release.

@taieb-tk

I would like to reopen this issue. Regarding point two, I don't understand how to do it with .chat.

I can see that it is supposed to return a ReadableStream, and I have set stream to true, but I cannot get it to work.

Any examples or ideas @dosco?

@dosco dosco reopened this Aug 23, 2024
@dosco
Collaborator

dosco commented Aug 26, 2024

@taieb-tk have you looked at the streaming1.ts and streaming2.ts examples? stream: true enables streaming with the underlying LLM provider to speed things up; the final fields are not streamed out.

@taieb-tk

taieb-tk commented Sep 1, 2024

@dosco
Yes I did, but I could not get it to work; probably a skill issue on my side.
I tried to just use the following:

const ai = new ax.AxAIOpenAI({
    apiKey: apiKey as string,
});

ai.setOptions({ debug: true });

const response = await ai.chat({
    chatPrompt: conversationHistory,

    config: {
        stream: true
    },
    ...(tools?.length && { functions: normalizeFunctions(tools) }),
});

Not sure what to do with the response in the next step... Could you possibly help me? :)

@taieb-tk

Bump, any help would be appreciated :)

@dosco
Collaborator

dosco commented Oct 14, 2024

The bug is in the lines below, which are wrong. TypeScript should catch this; it's also in the API docs: https://axllm.dev/apidocs/classes/axai/

config: {
    stream: true
},

It should be

ai.setOptions({ debug: true });

const response = await ai.chat({
    chatPrompt: conversationHistory,
    ...(tools?.length && { functions: normalizeFunctions(tools) }),
}, {
    stream: true
});
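
For reference, a rough sketch of consuming that response, assuming (as discussed further down in this thread) that with stream: true the call returns an async iterable of chunk objects; the results[0].content field access is an assumption based on this discussion, not a verified API reference.

// Minimal consumption sketch: assumes the streamed response is an async
// iterable of chat-response chunks, each with a `results` array whose
// entries may carry a partial `content` delta. Field names are assumptions.
const response = await ai.chat({
    chatPrompt: conversationHistory,
}, {
    stream: true
});

if (Symbol.asyncIterator in (response as object)) {
    // Streamed case: iterate the chunks as they arrive.
    for await (const chunk of response as AsyncIterable<any>) {
        const delta = chunk?.results?.[0]?.content ?? '';
        process.stdout.write(delta);
    }
} else {
    // Non-streamed case: the full response comes back at once.
    console.log((response as any).results?.[0]?.content);
}

Guarding on Symbol.asyncIterator keeps the same call site working whether or not streaming is enabled.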

@taieb-tk

The response is supposed to be an AsyncGenerator? Any examples on how to catch that stream?

@taieb-tk

Ahh ok! I read a bit fast, I will try that! Thanks!! :)

@dosco
Collaborator

dosco commented Oct 14, 2024 via email

@taieb-tk

Your answer above solved my problem! Really appreciate the help. Looking forward to continuing to test this 👍
