Problems testing an identical prompt across different Mistral models: `mistral-large-latest` consistently fails, while the medium models succeed.

Prompt length as calculated by the GPT tokenizer: 2,196 tokens and 11,331 characters.

Note: this exact prompt works consistently with `mistral-medium-2312` and `mistral-medium-latest`.
So far, I have tried increasing the timeout on client instantiation to ~5 minutes to see if that helped (medium works fine on the default timeout). I have also tried both synchronous and streaming connections. Here is my current situation:
Error:

```
Unexpected exception (RemoteProtocolError): peer closed connection without sending complete message body (incomplete chunked read)
```
Relevant code snippet:

```python
stream_response = client.chat_stream(model=model, messages=messages, temperature=TEMPERATURE)
for chunk in stream_response:
    print(chunk)
    summary = summary + chunk.choices[0].delta.content
```
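As a possible workaround while this is investigated: the `RemoteProtocolError` above is raised by the underlying HTTP library when the server closes the connection mid-stream, so one option is to wrap the stream consumption in a retry loop and restart the stream from scratch on failure. The sketch below is not from the original report; `accumulate_stream` is a hypothetical helper, and it also guards against `delta.content` being `None` on some chunks (an assumption about the chunk shape based on the snippet above).

```python
import time


def accumulate_stream(make_stream, retries=3, backoff=2.0):
    """Consume a chunked stream, retrying from scratch if the server
    drops the connection mid-response.

    `make_stream` is a zero-argument callable that opens a fresh stream,
    e.g. lambda: client.chat_stream(model=model, messages=messages).
    """
    for attempt in range(retries):
        summary = ""
        try:
            for chunk in make_stream():
                content = chunk.choices[0].delta.content
                if content:  # delta.content may be None on some chunks
                    summary += content
            return summary
        except Exception:  # e.g. httpx.RemoteProtocolError mid-stream
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))
```

Note that a retry restarts generation from the beginning, so partial output from the failed attempt is discarded rather than stitched together.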