
Ollama OpenAI compatibility.

@jamesrochabrun released this 25 Jun 07:17

Ollama

Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.
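
Because the endpoint shapes match, you don't even need an SDK to reach a local model. As a minimal sketch (assuming Ollama is running locally on its default port 11434, with llama3 already pulled as described below), a plain URLSession request against `/v1/chat/completions` works:

```swift
import Foundation

// Minimal sketch: call Ollama's OpenAI-compatible chat endpoint directly.
// Assumes Ollama is running locally on its default port (11434) and that
// the llama3 model has already been pulled.
func askOllama(_ prompt: String) async throws -> String {
    let url = URL(string: "http://localhost:11434/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body: [String: Any] = [
        "model": "llama3",
        "messages": [["role": "user", "content": prompt]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)
    let (data, _) = try await URLSession.shared.data(for: request)
    return String(decoding: data, as: UTF8.self)
}
```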


⚠️ Important

Remember that these models run locally, so you need to download them first. If you want to use llama3, open a terminal and run:

```
ollama pull llama3
```
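
Once the download finishes, running `ollama list` should show llama3 among the installed models.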

You can follow the Ollama documentation for more details.

How to use these models locally with SwiftOpenAI

To use local models with an OpenAIService in your application, you need to provide the local base URL:

```swift
let service = OpenAIServiceFactory.ollama(baseURL: "http://localhost:11434")
```
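
Port 11434 is Ollama's default; if your server runs on a different host or port, adjust the base URL accordingly.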

Then you can use the chat completions API as follows, for example with a streamed response:

```swift
let prompt = "Tell me a joke"
let parameters = ChatCompletionParameters(
    messages: [.init(role: .user, content: .text(prompt))],
    model: .custom("llama3"))
let stream = try await service.startStreamedChat(parameters: parameters)
```
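
For completeness, here is a minimal sketch of consuming the stream; it assumes the chunk objects mirror OpenAI's streaming format, with the partial text surfaced through `choices[].delta.content`:

```swift
// A sketch of reading the streamed reply. Assumes each chunk mirrors
// OpenAI's streaming format, i.e. partial text lives in choices[0].delta.content.
Task {
    do {
        let stream = try await service.startStreamedChat(parameters: parameters)
        for try await chunk in stream {
            if let delta = chunk.choices.first?.delta.content {
                print(delta, terminator: "") // print tokens as they arrive
            }
        }
    } catch {
        print("Streaming failed: \(error)")
    }
}
```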

Resources:

- Ollama: OpenAI compatibility announcement: https://ollama.com/blog/openai-compatibility