
[Bug] Lobe Chat Database -w- PGVECTOR - Cannot change Ollama url #5625

Open
sushibait opened this issue Jan 29, 2025 · 5 comments
Labels
🐛 Bug Something isn't working | 缺陷 ollama Relative to Ollama Provider and ollama models

Comments

@sushibait

📦 Environment

Docker

📌 Version

1.49.5

💻 Operating System

Windows

🌐 Browser

Chrome, Edge, Firefox

🐛 Bug Description

After installing the lobe-chat-database version in Docker with pgvector, I cannot change the Ollama URL in Lobe Chat. It defaults to localhost on port 11434, and attempting to change it has no effect, so the old CORS error is displayed every time, and every attempt to change the URL produces the same result. I have verified that regardless of what URL is entered, Lobe Chat still attempts to connect to 127.0.0.1:11434.

A few things I tried:

  • Complete reinstall.
  • Checked the Ollama logs on the remote machine (the one I am trying to connect to); no attempted connections appear in the Ollama server or app logs.
  • Closed Ollama on the remote machine and used netstat; no attempts to connect on port 11434.
  • Fired up an nginx server on another machine and attempted to reverse proxy; the nginx logs show no connection attempts either.
  • Chrome/Firefox debug consoles ALWAYS show the connection attempt on 127.0.0.1, regardless of what URL is specified in Lobe Chat.
  • SSHed into the WSL Docker machine and checked .env; it always has the Ollama URL set to localhost on 11434 (not sure if this matters).

All other lobe functions are working, including connections to several other LLM providers.
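One more check worth adding to that list: confirming from the Docker host that the remote Ollama instance answers at all. A minimal sketch, with a placeholder address standing in for the real machine:

```shell
# Placeholder address; substitute your Ollama machine's IP and port.
OLLAMA_URL="http://192.168.1.50:11434"

# Ollama's HTTP root responds with "Ollama is running" when reachable.
curl -sf "$OLLAMA_URL" || echo "unreachable: $OLLAMA_URL"

# The REST API lists installed models; a JSON reply confirms full access.
curl -sf "$OLLAMA_URL/api/tags"
```

If these succeed from the host but Lobe Chat still polls 127.0.0.1, the problem is in Lobe Chat's configuration rather than network reachability.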

📷 Recurrence Steps

  1. Open Lobe Chat in the browser.
  2. Open the browser's dev tools.
  3. Change the Ollama URL and attempt to check the connection.
  4. Regardless of the URL entered, the connection attempt is made to localhost.

🚦 Expected Behavior

Successful connection to Ollama at the configured URL.

📝 Additional Information

No response

@sushibait sushibait added the 🐛 Bug Something isn't working | 缺陷 label Jan 29, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Jan 29, 2025
@lobehubbot
Member

👀 @sushibait

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Jan 29, 2025

Hey @sushibait! I'm here to help you with the issue you're experiencing. I can assist you in solving bugs, answering questions, and becoming a contributor. Let's work together to resolve this issue!

The issue you're experiencing with changing the Ollama URL in Lobe Chat might be related to the configuration settings not being updated correctly. Here are a few steps you can try to resolve this:

  1. Environment Variable: Ensure that the OLLAMA_PROXY_URL environment variable is set correctly when running the Docker container. You can specify it in your Docker run command like this:

    docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://your-ollama-url:port lobehub/lobe-chat
  2. Check .env File: If you're using a .env file, make sure it contains the correct OLLAMA_PROXY_URL value. Update it if necessary and restart the Docker container.

  3. Cross-Origin Configuration: Ensure that Ollama is configured to allow cross-origin requests. This can be done by setting the OLLAMA_ORIGINS environment variable to * on the machine running Ollama. For example, on macOS, you can use:

    launchctl setenv OLLAMA_ORIGINS "*"
  4. Docker Configuration: If Ollama is running in a Docker container, ensure that the container is configured to allow connections from the Lobe Chat instance. You might need to adjust network settings or use Docker's --network option to ensure proper connectivity.

  5. Restart Services: After making changes, restart both the Lobe Chat and Ollama services to ensure that the new configurations are applied.

If these steps do not resolve the issue, there might be a deeper configuration problem or a bug in the version you are using. You may want to check for any updates or patches that address this issue.
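Step 3 above shows only the macOS command; for reference, the Ollama FAQ documents equivalents for Linux and Windows. A sketch, assuming Ollama runs as the standard systemd service on Linux:

```shell
# Linux, with Ollama installed as a systemd service:
# add the variables to the unit, then reload and restart.
sudo systemctl edit ollama.service
#   In the editor, add under [Service]:
#   Environment="OLLAMA_ORIGINS=*"
#   Environment="OLLAMA_HOST=0.0.0.0"   # listen beyond localhost
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Windows: set the variable for the user, then restart the Ollama app.
setx OLLAMA_ORIGINS "*"
```

`OLLAMA_HOST=0.0.0.0` matters for remote access: by default Ollama binds only to 127.0.0.1 and will never see connections from other machines.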


@dosubot dosubot bot added the ollama Relative to Ollama Provider and ollama models label Jan 29, 2025
@sushibait

I am using the docker-compose lobe-chat-database version. I did confirm the network settings/CORS on the Ollama machine, and Open WebUI connects to it fine from another machine. How do I address this in the docker-compose file?
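To answer the compose question: the `OLLAMA_PROXY_URL` variable from the earlier suggestions goes into the Lobe Chat service's `environment` block. A hypothetical excerpt, with placeholder service/image names and address that may differ from the actual lobe-chat-database compose file:

```yaml
# Hypothetical excerpt of docker-compose.yml; service and image
# names may differ from your copy.
services:
  lobe:
    image: lobehub/lobe-chat-database
    environment:
      # Replace the placeholder address with the remote Ollama machine.
      - OLLAMA_PROXY_URL=http://192.168.1.50:11434
```

After editing, recreate the container with `docker compose up -d` so the new environment takes effect. Note this sets the server-side proxy URL, which is separate from the URL typed into the browser settings panel.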

@sushibait

I have now tried editing the .env file before running the compose command. It did insert the correct URL into Lobe Chat, but when I use the "Check" button, it is still polling 127.0.0.1:11434.

[Screenshot: connection check still polling 127.0.0.1:11434 despite the configured URL]

Also, since you mentioned it: I am not running Ollama in Docker, and I know Ollama is running because I can connect to it remotely using other tools. Lobe Chat just will not accept any URL other than localhost; see the screenshot.

@sushibait

Tried forwarding the port with netsh, but that appears not to work either.
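For reference, a sketch of that netsh approach, run in an elevated Windows prompt (the remote address is a placeholder). One caveat: a container inside WSL2 does not share the Windows host's localhost, so even a working portproxy on the host may never be reached by the container; `host.docker.internal` is the usual hostname for reaching the host from inside Docker.

```bat
:: Forward local port 11434 to the remote Ollama machine
:: (run in an elevated cmd; 192.168.1.50 is a placeholder).
netsh interface portproxy add v4tov4 ^
    listenaddress=0.0.0.0 listenport=11434 ^
    connectaddress=192.168.1.50 connectport=11434

:: Verify the forwarding rule was added.
netsh interface portproxy show v4tov4
```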
