
Is it still intended and possible to use custom endpoint in 0.2.0? #134

Closed · Kaschi14 opened this issue Mar 13, 2024 · 9 comments

@Kaschi14

No description provided.

@senbinyu commented Mar 15, 2024

Same question. I could use my own custom endpoint in the previous version, but it does not work in the newest 0.2.0 version with the same configuration.

@github-actions bot commented

This issue is stale because it has been open for 30 days with no activity.

@github-actions bot added the stale label Apr 15, 2024

@f-ricci commented Apr 24, 2024

I have the same question.

@github-actions bot removed the stale label Apr 25, 2024
@github-actions bot added the stale label May 25, 2024

@McPatate (Member) commented

Sorry for the late response. What do you mean by custom endpoint? Could you share your configuration? llm-vscode supports multiple backends, namely Hugging Face, TGI, OpenAI, ollama & llama.cpp.
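
For a concrete picture, this is a minimal settings.json sketch of pointing the extension at a self-hosted backend, assuming the `llm.backend` and `llm.url` keys described in the llm-vscode README; key names and accepted values may differ between extension versions.

```jsonc
// settings.json: a minimal sketch, assuming the llm.backend / llm.url
// keys from the llm-vscode README. Names may vary between versions.
{
  // Which kind of server the extension talks to; the README lists
  // values such as "huggingface", "tgi", "openai" and "ollama".
  "llm.backend": "tgi",

  // URL of the self-hosted server. Depending on the version this may
  // need to be the base URL or the full route (e.g. ".../generate").
  "llm.url": "http://localhost:8080"
}
```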

@senbinyu commented

> Sorry for the late response. What do you mean by custom endpoint? Could you share your configuration? llm-vscode supports multiple backends, namely Hugging Face, TGI, OpenAI, ollama & llama.cpp.

@McPatate In the previous version of llm-vscode, I could use my own locally deployed model. For example, I served my model at http://192.168.1.73:8192/generate and could put that address in ModelID or Endpoint to get completions. But in the newest version, 0.2.0, it does not work anymore. Great thanks.
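
To make the before/after concrete, a sketch of the two configuration styles being described: `llm.modelIdOrEndpoint` is an assumed name for the old "ModelID or Endpoint" field, and the 0.2.0-style keys are the README keys mentioned above, so neither is guaranteed to match a given release exactly.

```jsonc
// Pre-0.2.0 style (as described above): a single setting accepted either
// a Hugging Face model ID or a full URL to a locally deployed endpoint.
// "llm.modelIdOrEndpoint" is an assumed key name.
{
  "llm.modelIdOrEndpoint": "http://192.168.1.73:8192/generate"
}

// 0.2.0-style equivalent (assumed): backend type and URL are configured
// separately rather than inferred from one value.
{
  "llm.backend": "tgi",
  "llm.url": "http://192.168.1.73:8192"
}
```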

@McPatate (Member) commented

Could you share your configuration settings, any logs, or anything else that could help us understand what is going on?

Have you tried updating to the latest version of llm-vscode?

@github-actions bot removed the stale label May 30, 2024
@github-actions bot added the stale label Jun 29, 2024

@McPatate (Member) commented Jul 3, 2024

I'll close this issue; feel free to re-open if you are still facing an issue.

@McPatate closed this as completed Jul 3, 2024