
Feature Request: 🛠 Support for local LLMs tools in Terminal Chat like Ollama #16471

Open
Samk13 opened this issue Dec 14, 2023 · 2 comments
Labels
Area-Chat All things LLM or "AI" Issue-Feature Complex enough to require an in depth planning process and actual budgeted, scheduled work. Needs-Tag-Fix Doesn't match tag requirements Product-Terminal The new Windows Terminal.
Milestone

Comments


Samk13 commented Dec 14, 2023

Description of the new feature/enhancement

Terminal Chat in Windows Terminal currently supports only the Azure OpenAI Service. This restriction limits developers who run or are building their own local Large Language Models (LLMs), or who use tools such as Ollama, and want to interface with them directly from the Terminal.
The ability to connect to a local LLM service would offer greater flexibility, especially for users concerned with privacy, working offline, or handling sensitive information that cannot be sent to cloud services.

Proposed technical implementation details (optional)

Add functionality to support local LLM services by allowing users to configure a connection to local AI models. This would involve:

  1. Providing an option in the Terminal Chat settings to specify the endpoint of a local LLM service.
  2. Allowing the user to set the port on which the local LLM service listens for incoming requests.
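A rough sketch of what such a setting might look like in `settings.json` (Windows Terminal settings are JSONC, so comments are allowed; the `chat` object and its field names below are hypothetical, not an existing Terminal setting):

```jsonc
{
    "chat": {
        // hypothetical provider switch: "azure-openai" (current) or "local"
        "provider": "local",
        // endpoint and port of the locally running LLM service,
        // e.g. Ollama's default OpenAI-compatible API
        "endpoint": "http://localhost",
        "port": 11434
    }
}
```

The exact shape would of course be up to the Terminal team; the point is just that an endpoint plus a port is enough to reach any OpenAI-compatible local backend.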

Thanks!

@Samk13 Samk13 added the Issue-Feature Complex enough to require an in depth planning process and actual budgeted, scheduled work. label Dec 14, 2023
@microsoft-github-policy-service microsoft-github-policy-service bot added Needs-Triage It's a new issue that the core contributor team needs to triage at the next triage meeting Needs-Tag-Fix Doesn't match tag requirements labels Dec 14, 2023
@adrastogi adrastogi added the Area-Chat All things LLM or "AI" label Dec 19, 2023
@carlos-zamora carlos-zamora added this to the Backlog milestone Jan 17, 2024
@carlos-zamora carlos-zamora added Product-Terminal The new Windows Terminal. and removed Needs-Triage It's a new issue that the core contributor team needs to triage at the next triage meeting Needs-Tag-Fix Doesn't match tag requirements labels Jan 17, 2024
@microsoft-github-policy-service microsoft-github-policy-service bot added the Needs-Tag-Fix Doesn't match tag requirements label Jan 17, 2024

dossjj commented May 28, 2024

Would love to see this feature. Phi models would be great for this.


g0t4 commented Jun 17, 2024

As a workaround, I set up https://github.com/g0t4/term-chat-ollama as an intermediate "proxy" that can forward requests to any OpenAI-compatible completions backend, e.g. ollama, OpenAI, groq.com, etc.

FYI, video overview here: https://youtu.be/-QcSRmrsND0

@dossjj with this, you can use phi3 by setting the endpoint to https://fake.openai.azure.com:5000/answer?model=phi3
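To illustrate the routing idea implied by that URL (the proxy itself is defined in the linked repo; this is only a hypothetical sketch of how a `model` query parameter on the configured endpoint could select the backend model):

```python
from urllib.parse import urlparse, parse_qs

def extract_model(endpoint: str, default: str = "llama3") -> str:
    """Pick the backend model from the endpoint's `model` query parameter.

    The endpoint the Terminal is configured with only needs to *look* like
    an Azure OpenAI URL; a proxy can read the query string to decide which
    local model to forward the request to. `default` is an assumed fallback.
    """
    query = parse_qs(urlparse(endpoint).query)
    return query.get("model", [default])[0]

# The endpoint from the comment above selects phi3:
print(extract_model("https://fake.openai.azure.com:5000/answer?model=phi3"))
```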

5 participants