[Bug]: blank response with ollama backend #2694
Comments
Hey there, @travisgu! I'm here to assist you with any issues you're experiencing while using Quivr. I can help you solve bugs, answer questions, and even guide you on how to become a contributor to the project. Let's get started! To resolve the issue of blank chat responses when using the Ollama qwen:72b model in Quivr, ensure that the streamed response tokens are combined into a single response and written to the chat history before it is saved.
Here is an example of how you might update the `save_answer` function:

```python
def save_answer(self, question, response_tokens, streamed_chat_history, save_answer):
    # Combine response tokens into a single response
    response = "".join(response_tokens)
    # Update the chat history with the response
    streamed_chat_history.assistant = response
    # Save the chat history to the database
    if save_answer:
        chat_service.save_chat_history(
            chat_id=question.chat_id,
            question=question.question,
            answer=response,
            prompt=streamed_chat_history.prompt,
            assistant=streamed_chat_history.assistant,
        )
```
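To make the failure mode concrete, here is a minimal, self-contained sketch of the streaming flow. The `stream_llm_response` generator and `ChatHistory` class are hypothetical stand-ins, not Quivr's actual API:

```python
# Hypothetical illustration of collecting streamed tokens before saving.
def stream_llm_response(question: str):
    # Stub generator standing in for the real token stream from the LLM.
    yield from ["Hello", ", ", "world", "!"]

class ChatHistory:
    assistant: str = ""

streamed_chat_history = ChatHistory()
response_tokens = list(stream_llm_response("hi"))

# The crucial step: join the tokens and write them back before saving.
# If this never runs (e.g. the stream aborts or the callback is skipped),
# the stored assistant text stays empty, which matches the symptom here.
streamed_chat_history.assistant = "".join(response_tokens)
print(streamed_chat_history.assistant)  # "Hello, world!"
```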
```
# LOCAL
OLLAMA_API_BASE_URL=http://host.docker.internal:11434 # Uncomment to activate ollama. This is the local url for the ollama api
```
Ensure that the `OLLAMA_API_BASE_URL` line is uncommented and points to your running Ollama instance.
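A quick way to confirm the endpoint is reachable is to query Ollama's `/api/tags` endpoint, which lists the locally pulled models. A minimal sketch; the base URL is an assumption and should match your `OLLAMA_API_BASE_URL`:

```python
import json
from urllib.request import urlopen

# Assumed base URL; adjust to match OLLAMA_API_BASE_URL in your .env.
base_url = "http://host.docker.internal:11434"

# /api/tags lists the models the Ollama server has pulled locally.
with urlopen(f"{base_url}/api/tags") as resp:
    models = json.load(resp)

for model in models.get("models", []):
    print(model["name"])  # e.g. "qwen:72b"
```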
I didn't change any code and I was running Quivr with the Docker image. The URL configuration is correct. I can see the prompt in the Ollama log file, and Ollama returns 200 for the API call. @dosu
What's the model name you use? I find that it should be prefixed with `ollama/`.
Right, the model name is "ollama/qwen:72b".
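For reference, the `ollama/` prefix is the convention litellm uses to route a request to an Ollama backend, and Quivr's model routing appears to rely on it. A minimal standalone sketch; the prompt and `api_base` are illustrative assumptions:

```python
from litellm import completion

# The "ollama/" prefix tells litellm to dispatch this call to an Ollama
# server rather than a hosted provider; api_base must point at that server.
response = completion(
    model="ollama/qwen:72b",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    api_base="http://localhost:11434",  # assumed local Ollama endpoint
)
print(response.choices[0].message.content)
```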
BTW, I am running Quivr with docker-compose on my laptop; I am not sure if this is the cause.
Thanks for your contributions, we'll be closing this issue as it has gone stale. Feel free to reopen if you'd like to continue the discussion.
What happened?
When I tried to use Quivr with the Ollama qwen:72b model, the chat response is always empty. There is no error shown in the log or on the web page. I checked the Supabase chat history table, and the assistant text is empty.