How to know when completion has ended with Chat #23

Closed
AldeRoberge opened this issue Feb 24, 2024 · 2 comments

AldeRoberge commented Feb 24, 2024

Hello! I hope you are well. Really cool and useful wrapper for Ollama; we use it now that Ollama is available on Windows. I wanted to know if there is a way to tell when the completion has ended (no more tokens are generated) with the Chat class.

public class OllamaClient
{
    private const string OllamaApiUri = "http://localhost:11434";
    private const string OllamaModel = "DaVinci-v1:latest";

    private readonly OllamaApiClient _ollama = new(new Uri(OllamaApiUri))
    {
        SelectedModel = OllamaModel
    };

    private Chat? _chat;

    public async Task Setup(Action<ChatResponseStream> streamer)
    {
        var models = await _ollama.ListLocalModels();
        Console.WriteLine("Found the following available models : ");
        foreach (var model in models) Console.WriteLine(model.Name);

        _chat = _ollama.Chat(streamer);
    }

    /// <summary>
    /// Asks the model to generate a completion based on the input
    /// </summary>
    public async Task PerformInference(ChatRole chatRole, string input)
    {
        if (_chat is null)
            throw new InvalidOperationException("Call Setup before performing inference.");

        await _chat.SendAs(chatRole, input);
    }
    
    // How do we know when the inference is over?
}

I'd like to be able to run some code when the completion is over.

Is there a way to know when the completion has finished?

@awaescher (Owner) commented

The streamer lambda receives ChatResponseStream objects as the server streams its response. There is a Done property that should be set to true on the last chunk.
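
A minimal sketch of what that could look like on the calling side, assuming the OllamaClient class from the snippet above. The TaskCompletionSource plumbing and the ChatRole.User value are just illustrative, and the Message?.Content access assumes ChatResponseStream carries the streamed message text alongside Done:

// Signalled once the final chunk (Done == true) arrives; my own plumbing, not part of the library.
var completionFinished = new TaskCompletionSource();

var client = new OllamaClient();
await client.Setup(stream =>
{
    // Print each streamed token as it arrives (assumes a Message.Content string on the chunk).
    Console.Write(stream.Message?.Content);

    // Done is true on the last chunk of the completion.
    if (stream.Done)
        completionFinished.TrySetResult();
});

await client.PerformInference(ChatRole.User, "Why is the sky blue?");

// Resumes once the completion has ended.
await completionFinished.Task;
Console.WriteLine();
Console.WriteLine("Completion finished.");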

@AldeRoberge (Author) commented

[screenshot]

Really cool stuff, thanks @awaescher!
