Access self hosted model from devcontainer #2112
martinzrrl asked this question in Q&A
I would like to access a self-hosted language model from a running AutoGen Studio devcontainer.
I am running the local language model on localhost with LM Studio, which serves it behind an OpenAI-compatible API. Without Docker the connection works and I can use the local language model with AutoGen.
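For reference, this is roughly what the working configuration looks like outside Docker. It is a minimal sketch assuming a pyautogen 0.2-style `config_list`; the model name and API key are placeholders, and 2222 is the LM Studio server port mentioned below.

```python
import autogen

# Minimal sketch of the setup that works outside Docker (names are placeholders).
# LM Studio exposes an OpenAI-compatible endpoint on localhost:2222.
config_list = [
    {
        "model": "local-model",                  # whatever model LM Studio is serving
        "base_url": "http://localhost:2222/v1",  # LM Studio's OpenAI-compatible endpoint
        "api_key": "lm-studio",                  # placeholder; a non-empty key is enough
    }
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
```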
From inside the devcontainer, however, I can no longer access the model and get a "Connection error".
I assume the problem is that the Docker container cannot reach the port the local model is served on.
AutoGen runs on port 8100 and the language model on port 2222.
I have tried to forward the port in the VS Code devcontainer via the Ports tab in the terminal panel, with port "2222" and forwarded address "8081", but I get the error "The local port 8100 is not available. Port number 8101 has been used instead."
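To narrow down whether this is a container-networking issue, I imagine a reachability check from inside the devcontainer would look something like the sketch below. This is an assumption on my part: `localhost` inside the container refers to the container itself, while `host.docker.internal` resolves to the host machine on Docker Desktop (on Linux it needs an explicit host-gateway mapping).

```python
import socket

# Probe the LM Studio port (2222) from inside the devcontainer.
# "localhost" resolves to the container itself, so it is expected to fail;
# "host.docker.internal" points at the host on Docker Desktop (or on Linux
# when the container is started with a host-gateway mapping).
for host in ("localhost", "host.docker.internal"):
    try:
        with socket.create_connection((host, 2222), timeout=3):
            print(f"{host}:2222 is reachable")
    except OSError as exc:
        print(f"{host}:2222 is not reachable: {exc}")
```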
How do I forward an outside port so that it can be used inside the devcontainer?
Has anybody successfully set up a connection to a locally hosted model from the AutoGen devcontainer before?