
Releases: jamesrochabrun/SwiftOpenAI

AIProxy updates.

26 Jun 22:04
6bc5fe0
  • The factory method OpenAIServiceFactory.ollama has been changed to OpenAIServiceFactory.service, where you specify the URL of an OpenAI-API-compatible service. To specify the URL and API key (for Bearer authentication), use:
OpenAIServiceFactory.service(apiKey: "YOUR_API_KEY", baseURL: "http://<DOMAIN>:<PORT>")
  • The AIProxy integration now uses certificate pinning to prevent threat actors from snooping on your traffic. No changes to your client code are necessary to take advantage of this security improvement.

  • The AIProxy integration changes how the DeviceCheck bypass token is hidden. This token is only intended for use on iOS simulators, and the previous approach made it too easy to leak the token into production builds of the app. Please change your integration code from:

#if DEBUG && targetEnvironment(simulator)
	OpenAIServiceFactory.service(
		aiproxyPartialKey: "hardcode-partial-key-here",
		aiproxyDeviceCheckBypass: "hardcode-device-check-bypass-here"
	)
#else
	OpenAIServiceFactory.service(aiproxyPartialKey: "hardcode-partial-key-here")
#endif

To this:

OpenAIServiceFactory.service(
   aiproxyPartialKey: "hardcode-partial-key-here"
)

Then use the method described in the README to add the bypass token as an environment variable to your Xcode project.
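For reference, here is a minimal sketch of how an environment variable set in an Xcode scheme can be read at runtime. The variable name AIPROXY_DEVICE_CHECK_BYPASS is an assumption based on the README; the library reads it internally, so your client code does not need to.

import Foundation

#if DEBUG && targetEnvironment(simulator)
// Hypothetical variable name; use whatever key the README specifies in your scheme.
let bypassToken = ProcessInfo.processInfo.environment["AIPROXY_DEVICE_CHECK_BYPASS"]
#endif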

Ollama OpenAI compatibility.

25 Jun 07:17

Ollama

Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.


⚠️ Important

Remember that these models run locally, so you need to download them. If you want to use llama3, you can open the terminal and run the following command:

ollama pull llama3

You can follow the Ollama documentation for more details.

How to use these models locally with SwiftOpenAI?

To use local models with an OpenAIService in your application, you need to provide a URL.

let service = OpenAIServiceFactory.ollama(baseURL: "http://localhost:11434")

Then you can use the completions API as follows:

let prompt = "Tell me a joke"
let parameters = ChatCompletionParameters(messages: [.init(role: .user, content: .text(prompt))], model: .custom("llama3"))
let chatCompletionObject = try await service.startStreamedChat(parameters: parameters)
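Since startStreamedChat returns an async stream, here is a minimal sketch of consuming it, assuming each chunk exposes its text under choices[].delta.content (the exact property names are assumptions):

var reply = ""
for try await chunk in chatCompletionObject {
   // Append each streamed text delta; delta.content may be nil for some chunks.
   if let text = chunk.choices.first?.delta.content {
      reply += text
   }
}
print(reply)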


Changelog Jun 6th, 2024

21 Jun 05:42

Streaming chat completions now support usage details.

10 Jun 18:39
e8b912b

On June 6th, OpenAI announced that streaming chat completions now support usage details. Previously, usage details were only available on non-streaming chat completions.
This patch makes streaming chat completions include usage details by default. The final chunk of each streaming response looks like this (note the prompt_tokens, completion_tokens, and total_tokens fields):
data: {"id":"chatcmpl-9YM1lpTbJLDBnrawPqt2CjT3gnoVA","object":"chat.completion.chunk","created":1717974853,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_319be4768e","choices":[],"usage":{"prompt_tokens":11,"completion_tokens":20,"total_tokens":31}}
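In SwiftOpenAI this surfaces on the streamed chunks themselves. A minimal sketch of reading it, assuming the chunk type exposes an optional usage value (only the final chunk carries it, and the property names are assumptions):

let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Tell me a joke"))],
   model: .gpt4o)
let stream = try await service.startStreamedChat(parameters: parameters)
for try await chunk in stream {
   // Only the last chunk of the stream carries usage details.
   if let usage = chunk.usage {
      print("Token usage:", usage) // prompt_tokens, completion_tokens, total_tokens
   }
}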

Run Status Decoded

05 Jun 05:37
cca9a9e

Added support for streaming Run event statuses.

// Decode the incoming run payload and surface its status as a stream event.
let decoded = try self.decoder.decode(RunObject.self, from: data)
switch RunObject.Status(rawValue: decoded.status) {
case .queued:
   continuation.yield(.threadRunQueued(decoded))
case .inProgress:
   continuation.yield(.threadRunInProgress(decoded))
case .requiresAction:
   continuation.yield(.threadRunRequiresAction(decoded))
case .cancelling:
   continuation.yield(.threadRunCancelling(decoded))
case .cancelled:
   continuation.yield(.threadRunCancelled(decoded))
case .failed:
   continuation.yield(.threadRunFailed(decoded))
case .completed:
   continuation.yield(.threadRunCompleted(decoded))
case .expired:
   continuation.yield(.threadRunExpired(decoded))
default:
   break // Status(rawValue:) is failable; ignore unrecognized or nil statuses.
}
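On the consumer side these statuses arrive as stream events. A minimal sketch of handling a few of them, assuming a streaming run API such as createRunStream and an event enum with the cases yielded above (the method and parameter type names are assumptions):

let stream = try await service.createRunStream(
   threadID: "THREAD_ID",
   parameters: RunParameter(assistantID: "ASSISTANT_ID"))
for try await event in stream {
   switch event {
   case .threadRunQueued(let run), .threadRunInProgress(let run):
      print("Run \(run.id) status: \(run.status)")
   case .threadRunCompleted(let run):
      print("Run \(run.id) completed")
   case .threadRunFailed(let run), .threadRunExpired(let run):
      print("Run \(run.id) did not finish: \(run.status)")
   default:
      break // Other event types (message deltas, step updates) are ignored here.
   }
}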

Vision support for GPT-4o

01 Jun 05:56
3be54cb


This adapts the message content payload to the latest Vision API changes.

Previous payload

"image_url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"

Current payload

          "image_url": {
            "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg",
            "detail": "high"
          }
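In SwiftOpenAI terms, here is a minimal sketch of sending an image with the new payload shape; the content-part initializers (.text, .imageUrl) and the detail parameter are assumptions about the message content API:

let imageURL = URL(string: "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg")!
let message = ChatCompletionParameters.Message(
   role: .user,
   content: .contentArray([
      .text("What is in this image?"),
      .imageUrl(.init(url: imageURL, detail: "high")) // "detail" mirrors the payload above.
   ]))
let parameters = ChatCompletionParameters(messages: [message], model: .gpt4o)
let chatCompletion = try await service.startChat(parameters: parameters)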

Assistants API support for Azure

29 May 19:59
7821552

Getting started with Azure OpenAI Assistants (Preview)


https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/assistant

enum AzureOpenAIAPI {
   
   static var azureOpenAIResource: String = ""
   
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference?tabs=python
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/assistant
   case assistant(AssistantCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#chat-completions
   case chat(deploymentID: String)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-messages?tabs=python
   case message(MessageCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-runs?tabs=python
   case run(RunCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-runs?tabs=python#list-run-steps
   case runStep(RunStepCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-threads?tabs=python#create-a-thread
   case thread(ThreadCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/file-search?tabs=python#vector-stores
   case vectorStore(VectorStoreCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/file-search?tabs=python#vector-stores
   case vectorStoreFile(VectorStoreFileCategory)
   
   enum AssistantCategory {
      case create
      case list
      case retrieve(assistantID: String)
      case modify(assistantID: String)
      case delete(assistantID: String)
   }

   enum MessageCategory {
      case create(threadID: String)
      case retrieve(threadID: String, messageID: String)
      case modify(threadID: String, messageID: String)
      case list(threadID: String)
   }
   
   enum RunCategory {
      case create(threadID: String)
      case retrieve(threadID: String, runID: String)
      case modify(threadID: String, runID: String)
      case list(threadID: String)
      case cancel(threadID: String, runID: String)
      case submitToolOutput(threadID: String, runID: String)
      case createThreadAndRun
   }
   
   enum RunStepCategory {
      case retrieve(threadID: String, runID: String, stepID: String)
      case list(threadID: String, runID: String)
   }
   
   enum ThreadCategory {
      case create
      case retrieve(threadID: String)
      case modify(threadID: String)
      case delete(threadID: String)
   }
   
   enum VectorStoreCategory {
      case create
      case list
      case retrieve(vectorStoreID: String)
      case modify(vectorStoreID: String)
      case delete(vectorStoreID: String)
   }
   
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/file-search?tabs=python#file-search-support
   enum VectorStoreFileCategory {
      case create(vectorStoreID: String)
      case list(vectorStoreID: String)
      case retrieve(vectorStoreID: String, fileID: String)
      case delete(vectorStoreID: String, fileID: String)
   }
}
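To point SwiftOpenAI at an Azure resource, a minimal sketch based on the README; the AzureOpenAIConfiguration parameter names and placeholder values are assumptions:

let azureConfiguration = AzureOpenAIConfiguration(
   resourceName: "YOUR_RESOURCE_NAME",
   openAIAPIKey: .apiKey("YOUR_AZURE_OPENAI_KEY"),
   apiVersion: "YOUR_API_VERSION")
let service = OpenAIServiceFactory.service(azureConfiguration: azureConfiguration)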

Support for gpt-4o

14 May 04:18
f0edddc

Support for gpt-4o.
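A minimal sketch of selecting the new model when building chat parameters; the .gpt4o case name is an assumption:

let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Hello, GPT-4o!"))],
   model: .gpt4o)
let chatCompletion = try await service.startChat(parameters: parameters)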

Function Calling Required

04 May 05:01

Also includes bug fixes.

Assistants API V2

30 Apr 05:14
97a0ec7

https://platform.openai.com/docs/assistants/whats-new


SwiftOpenAI has been migrated to Assistants API v2. If you need v1 support, make sure to use release v2.3.

Check OpenAI's migration guide:

We have changed the way that tools and files work in the Assistants API between the v1 and v2 versions of the beta. Both versions of the beta continue to be accessible via the API today, but we recommend migrating to the newest version of our APIs as soon as feasible. We will deprecate v1 of the beta by the end of 2024.

If you do not use tools or files with the Assistants API today, there should be no changes required for you to migrate from the v1 version to the v2 version of the beta. Simply pass the v2 beta version header and/or move to the latest version of our Node and Python SDKs!
What has changed
The v2 version of the Assistants API contains the following changes:

  • Tool rename: the retrieval tool has been renamed to the file_search tool.
  • Files belong to tools: files are now associated with tools instead of Assistants and Messages. This means that:
      • AssistantFile and MessageFile objects no longer exist.
      • Instead of AssistantFile and MessageFile, files are attached to Assistants and Threads using the new tool_resources object.
      • The tool_resources for the code interpreter tool are a list of file_ids.
      • The tool_resources for the file_search tool are a new object called a vector_store.
  • Messages now have an attachments parameter rather than a file_ids parameter. Message attachments are helpers that add files to a Thread’s tool_resources.


Assistants have tools and tool_resources instead of file_ids. The retrieval tool is now the file_search tool. The tool_resource for the file_search tool is a vector_store.


Threads can bring their own tool_resources into a conversation.


Messages have attachments instead of file_ids. attachments are helpers that add files to the Thread’s tool_resources.

All v1 endpoints and objects for the Assistants API can be found under the Legacy section of the API reference.

  • Support for batch.
  • Support for vector stores.
  • Support for vector store files.
  • Support for vector store file batch.

New interfaces:

 // MARK: Batch

   /// Creates and executes a batch from an uploaded file of requests
   ///
   /// - Parameter parameters: The parameters needed to create a batch.
   /// - Returns: A [batch](https://platform.openai.com/docs/api-reference/batch/object) object.
   /// - Throws: An error if the request fails
   ///
   /// For more information, refer to [OpenAI's Batch API documentation](https://platform.openai.com/docs/api-reference/batch/create).
   func createBatch(
      parameters: BatchParameter)
      async throws -> BatchObject

   /// Retrieves a batch.
   ///
   /// - Parameter id: The identifier of the batch to retrieve.
   /// - Returns: A [BatchObject](https://platform.openai.com/docs/api-reference/batch/object) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Batch documentation](https://platform.openai.com/docs/api-reference/batch/retrieve).
   func retrieveBatch(
      id: String)
      async throws -> BatchObject
   
   /// Cancels an in-progress batch.
   ///
   /// - Parameter id: The identifier of the batch to cancel.
   /// - Returns: A [BatchObject](https://platform.openai.com/docs/api-reference/batch/object) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Batch documentation](https://platform.openai.com/docs/api-reference/batch/cancel)
   func cancelBatch(
      id: String)
      async throws -> BatchObject

   /// List your organization's batches.
   ///
   /// - Parameters:
   ///   - after: A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
   ///   - limit: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
   /// - Returns: An `OpenAIResponse<BatchObject>` containing a list of paginated [Batch](https://platform.openai.com/docs/api-reference/batch/object) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Batch API documentation](https://platform.openai.com/docs/api-reference/batch/list).
   func listBatch(
      after: String?,
      limit: Int?)
      async throws -> OpenAIResponse<BatchObject>
   
   // MARK: Vector Store
   
   /// Create a vector store.
   ///
   /// - Parameter parameters: The parameters needed to create a vector store.
   /// - Returns: A [Vector store](https://platform.openai.com/docs/api-reference/vector-stores) object.
   /// - Throws: An error if the request fails
   ///
   /// For more information, refer to [OpenAI's Vector store API documentation](https://platform.openai.com/docs/api-reference/vector-stores/create).
   func createVectorStore(
      parameters: VectorStoreParameter)
      async throws -> VectorStoreObject
   
   /// Returns a list of vector stores.
   ///
   /// - Parameter limit: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
   /// - Parameter order: Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
   /// - Parameter after: A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
   /// - Parameter before: A cursor for use in pagination. before is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include before=obj_foo in order to fetch the previous page of the list.
   /// - Returns: A list of [VectorStoreObject](https://platform.openai.com/docs/api-reference/vector-stores) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector stores API documentation](https://platform.openai.com/docs/api-reference/vector-stores/list).
   func listVectorStores(
      limit: Int?,
      order: String?,
      after: String?,
      before: String?)
      async throws -> OpenAIResponse<VectorStoreObject>
   
   /// Retrieves a vector store.
   ///
   /// - Parameter id: The ID of the vector store to retrieve.
   /// - Returns: A [Vector Store](https://platform.openai.com/docs/api-reference/vector-stores) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store documentation](https://platform.openai.com/docs/api-reference/vector-stores/retrieve).
   func retrieveVectorStore(
      id: String)
      async throws -> VectorStoreObject
   
   /// Modifies a vector store.
   ///
   /// - Parameter id: The ID of the vector store to modify.
   /// - Returns: A [Vector Store](https://platform.openai.com/docs/api-reference/vector-stores) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store documentation](https://platform.openai.com/docs/api-reference/vector-stores/modify).
   func modifyVectorStore(
      id: String)
      async throws -> VectorStoreObject

   /// Delete a vector store.
   ///
   /// - Parameter id: The ID of the vector store to delete.
   /// - Returns: A Deletion status.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store documentation](https://platform.openai.com/docs/api-reference/vector-stores/delete).
   func deleteVectorStore(
      id: String)
      async throws -> DeletionStatus
   
   // MARK: Vector Store Files
   
   /// Create a vector store file by attaching a [File](https://platform.openai.com/docs/api-reference/files) to a vector store.
   ///
   /// - Parameter vectorStoreID: The ID of the vector store for which to create a File.
   /// - Parameter parameters: The parameters needed to create a vector store File.
   /// - Returns: A [VectorStoreFileObject](https://platform.openai.com/docs/api-reference/vector-stores-files/file-object)
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store file documentation](https://platform.openai.com/docs/api-reference/vector-stores-files/createFile).
   func createVectorStoreFile(
      vectorStoreID: String,
      parameters: VectorStoreFileParameter)
      async throws -> VectorStoreFileObject
   
   /// Returns a list of vector ...