[Bug] Invalid Ollama configuration, please check Ollama configuration and try again #3047
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
I just noticed that I screwed up the URL in the bug report. The actual IP in use does not pass the Connection Check, and trying to chat returns an error. Now I use http://127.0.0.1:11434 and the Connection Check passes. And now it works... Sigh... I don't know what the hell happened. Consider this closed.
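(Aside, not from this thread: a quick way to check whether the Ollama server itself is reachable at a given address is to hit its tags endpoint, which lists installed models.)

```sh
# A JSON response here means Ollama is up and answering at this address
curl http://127.0.0.1:11434/api/tags
```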
And now it doesn't work again... While trying to debug ragapp, I wanted to see how I got Lobe Chat working the other day, except now Lobe Chat doesn't work again either: the connection check doesn't pass. Doing some research on Google, I see that almost no one can get Docker to connect to Ollama. It appears to be an utterly hit-and-miss proposition. Open-WebUI suggested this:

Example Docker Command:

```sh
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

I tried that - using Lobe Chat of course - didn't work. I used the "--network=host" and the "OLLAMA_BASE_URL" and it doesn't work. I hate Docker. It's almost impossible to make it work compared to a Flatpak or an AppImage. I have had at least three Ollama GUIs with Docker fail to work - the several I have with Flatpaks or AppImages all work fine.
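(Aside, not from this thread: the hit-and-miss behavior has a common cause. With Docker's default bridge network, 127.0.0.1 inside the container is the container's own loopback, not the host's, so a URL that works from the host fails from inside the container. A sketch of how to see the difference, assuming a container named lobe-chat whose image includes curl:)

```sh
# From the host: reaches Ollama listening on the host's loopback
curl http://127.0.0.1:11434/api/tags

# From inside a bridge-networked container, the same address points at the
# container itself, so this fails unless the container uses --network=host
docker exec lobe-chat curl http://127.0.0.1:11434/api/tags
```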
I just solved the problem, as I described in bug 3064. I reproduce my comment there here:

I just had a similar problem with the Ragapp utility; it wouldn't connect to Ollama via Docker. I solved that problem a few minutes ago by doing this. Based on an article on the Open-WebUI GitHub, they suggest:

> If you're experiencing connection issues, it's often due to the WebUI docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container. Use the --network=host flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: http://localhost:8080/.

Example Docker Command:

```sh
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434/ --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
So I executed the equivalent command for Lobe Chat. Then I entered the interface at port 3210, selected Ollama as the provider, and left the interface proxy address BLANK. The check passed; I went to Just Chat, selected qwen2, and entered "Hi" as the message. I got the response from Qwen2.

I don't know if connecting Docker to the local host network is a security risk, but since I'm running on my own machine with the usual firewalls and the like, I don't really care. Your mileage may vary. I'll also note that I'm running on openSUSE Tumbleweed Linux. For Windows or another Linux, your mileage may vary.
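(The exact command was not captured in this thread. A minimal sketch of the host-network variant, assuming the stock lobehub/lobe-chat image and the default port 3210 mentioned above:)

```sh
# Hypothetical reconstruction -- the commenter's actual command was not captured.
# With --network=host the container shares the host's network stack, so
# 127.0.0.1:11434 inside the container reaches Ollama on the host, and
# -p port mappings are unnecessary (the UI serves on the host's port 3210).
docker run -d --network=host --name lobe-chat lobehub/lobe-chat
```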
I consider this bug closed.
This issue is closed. If you have any questions, you can comment and reply.
📦 Environment
Docker
📌 Version
0.162.17
💻 Operating System
Other Linux
🌐 Browser
Firefox
🐛 Bug Description
Invalid Ollama configuration, please check Ollama configuration and try again
Use custom Ollama API Key
Enter your Ollama API Key to start the session
📷 Recurrence Steps
```sh
docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat
```
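(Editor's note, not from the original report: on Linux, host.docker.internal does not resolve inside containers by default. Docker 20.10+ can map it to the host's gateway explicitly; a sketch keeping the reporter's URL otherwise unchanged:)

```sh
# --add-host maps host.docker.internal to the host gateway IP (Docker 20.10+)
docker run -d -p 3210:3210 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 \
  lobehub/lobe-chat
```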
🚦 Expected Behavior
Expect LLM model to respond to the prompt.
📝 Additional Information
I've also tried using http://127.0.0.1:11434/v1. This does not pass the Connection Check, with details as:
{ "host": "http:https://127.0.0.1:11434/v1", "message": "please check whether your ollama service is available or set the CORS rules", "provider": "ollama" }
I have not set any CORS rules, because Ollama is running directly on the local host, and other front ends such as MSTY (which does not run from Docker) can connect to it fine at the above address. So I suspect this is some problem related to Lobe Chat interacting with Docker networking.
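(For reference, not from the original report: the "CORS rules" in the error message are controlled on the Ollama side by the OLLAMA_ORIGINS environment variable, which sets Ollama's allowed-origins list. A sketch, assuming Ollama is not already running as a system service:)

```sh
# OLLAMA_ORIGINS controls Ollama's CORS allow-list; '*' permits any origin
OLLAMA_ORIGINS=* ollama serve
```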