
Releases: jamesrochabrun/SwiftOpenAI

Bug Fixes

25 Apr 04:45
3847516
  • Bug Fixes
  • Assistants API Response format fix.

Updates the Swift package with the latest OpenAI API changes.

Third Party Library AI Proxy

29 Mar 16:57
7278ddb

AIProxy

The AI Proxy team made this contribution independently. SwiftOpenAI's owner is not involved in its development but accepts it in the spirit of open-source collaboration. It is added for convenience, and its use is at the discretion of the developer.


Protect your OpenAI key without a backend.

What is it?

AIProxy is a backend for AI apps that proxies requests from your app to OpenAI. You can use this service to avoid exposing your OpenAI key in your app. We offer AIProxy support so that developers can build and distribute apps using SwiftOpenAI.

How does my SwiftOpenAI code change?
SwiftOpenAI supports proxying requests through AIProxy with a small change to your integration code.

Instead of initializing the service with:

let apiKey = "your_openai_api_key_here"
let service = OpenAIServiceFactory.service(apiKey: apiKey)

Use:

#if DEBUG && targetEnvironment(simulator)
let service = OpenAIServiceFactory.service(
    aiproxyPartialKey: "hardcode_partial_key_here",
    aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"
)
#else
let service = OpenAIServiceFactory.service(
    aiproxyPartialKey: "hardcode_partial_key_here"
)
#endif

The aiproxyPartialKey and aiproxyDeviceCheckBypass values are provided to you on the AIProxy developer dashboard.

⚠️ It is important that you do not let the aiproxyDeviceCheckBypass token leak into a distribution build of your app (including TestFlight distributions). Please retain the conditional compilation checks that are present in the sample code above.

What is the aiproxyDeviceCheckBypass constant?
AIProxy uses Apple's DeviceCheck to ensure that requests received by the backend originated from your app on a legitimate Apple device. However, the iOS simulator cannot produce DeviceCheck tokens. Rather than requiring you to constantly build and run on device during development, AIProxy provides a way to skip the DeviceCheck integrity check. The token is intended for use by developers only. If an attacker gets the token, they can make requests to your AIProxy project without including a DeviceCheck token, and thus remove one level of protection.
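
For context, here is a rough sketch of how a DeviceCheck token is produced on a physical device with Apple's DeviceCheck framework; the simulator cannot do this, which is why the bypass exists. This is an illustration only, not AIProxy's actual client code.

import DeviceCheck

// Illustration only: generating a DeviceCheck token on a physical device.
// DCDevice is not supported on the simulator, which is why AIProxy
// provides the bypass token for development builds.
func deviceCheckToken() async throws -> Data? {
    guard DCDevice.current.isSupported else {
        // Simulator or unsupported hardware: no token can be produced.
        return nil
    }
    return try await DCDevice.current.generateToken()
}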

What is the aiproxyPartialKey constant?
This constant is intended to be included in the distributed version of your app. As the name implies, it is a partial representation of your OpenAI key. Specifically, it is one half of an encrypted version of your key. The other half resides on AIProxy's backend. As your app makes requests to AIProxy, the two encrypted parts are paired, decrypted, and used to fulfill the request to OpenAI.

How do I set up my project on AIProxy?
Please see the AIProxy integration guide.

⚠️ Disclaimer
Contributors of SwiftOpenAI shall not be liable for any damages or losses caused by third parties. Contributors of this library provide third-party integrations as a convenience. Any use of a third party's services is at your own risk.

Assistant API Stream

22 Mar 22:55

Assistants API stream support.

You can stream events from the Create Thread and Run, Create Run, and Submit Tool Outputs endpoints by passing "stream": true. The response will be a Server-Sent Events stream.

In Swift:

   /// Creates a thread and run with stream enabled.
   ///
   /// - Parameter parameters: The parameters needed to create a thread and run.
   /// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/createThreadAndRun).
   func createThreadAndRunStream(
      parameters: CreateThreadAndRunParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
   
   /// Create a run with stream enabled.
   ///
   /// - Parameter threadID: The ID of the thread to run.
   /// - Parameter parameters: The parameters needed to build a Run.
   /// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/createRun).
   func createRunStream(
      threadID: String,
      parameters: RunParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
   
   
   /// When a run has the status "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request. Stream enabled.
   ///
   /// - Parameter threadID: The ID of the [thread](https://platform.openai.com/docs/api-reference/threads) to which this run belongs.
   /// - Parameter runID: The ID of the run that requires the tool output submission.
   /// - Parameter parameters: The parameters needed for the run tools output.
   /// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/submitToolOutputs).
   func submitToolOutputsToRunStream(
      threadID: String,
      runID: String,
      parameters: RunToolsOutputParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
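
As a usage sketch, consuming one of these streams works like any other AsyncThrowingStream. The service, threadID, and parameters values below are assumed to exist, and the event cases shown are examples; check AssistantStreamEvent in the package for the full list.

// Sketch: iterating an assistant event stream. `service`, `threadID`,
// and `parameters` are assumed to be defined elsewhere.
let stream = try await service.createRunStream(threadID: threadID, parameters: parameters)
for try await event in stream {
   switch event {
   case .threadMessageDelta(let messageDelta):
      // Incremental message content produced by the assistant.
      print(messageDelta)
   case .threadRunStepDelta(let runStepDelta):
      // Incremental updates for run steps such as tool calls.
      print(runStepDelta)
   default:
      break
   }
}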

Added a demo project and tutorial, based on the equivalent Python tutorial.


Adding the latest changes from the OpenAI API changelog: https://platform.openai.com/docs/changelog

20 Feb 06:29

Azure OpenAI

24 Jan 06:57
1153951

This release adds Azure OpenAI support. Currently, DefaultOpenAIAzureService supports chat completions, in both streamed and non-streamed variants.
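
A minimal configuration sketch, assuming the package's Azure configuration type; the parameter names below may differ slightly from the released API, and the values are placeholders for your own Azure OpenAI deployment.

// Sketch: configuring an Azure-backed service. Parameter names are
// assumed from the package's Azure support; values are placeholders.
let azureConfiguration = AzureOpenAIConfiguration(
   resourceName: "YOUR_RESOURCE_NAME",
   openAIAPIKey: .apiKey("YOUR_AZURE_OPENAI_KEY"),
   apiVersion: "2023-12-01-preview")

let service = OpenAIServiceFactory.service(azureConfiguration: azureConfiguration)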


v1.5

03 Jan 19:17
61d7539
  • Streams can now be canceled; a cancellation sketch is shown below.
  • Updated the demo to show stream cancellation.

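A minimal cancellation sketch: wrap the streaming loop in a Task and cancel that task when needed. The service and parameters values, and the startStreamedChat call shape, are assumed from the package's chat streaming API.

// Sketch: cancel a chat stream by canceling the Task that consumes it.
// `service` and `parameters` are assumed to be configured elsewhere.
let streamTask = Task {
   let stream = try await service.startStreamedChat(parameters: parameters)
   for try await result in stream {
      print(result.choices.first?.delta.content ?? "", terminator: "")
   }
}

// Later, for example when the user taps a stop button:
streamTask.cancel()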

Log probs are now available with chat completions.

19 Dec 07:12
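
A hedged sketch of requesting log probabilities on a chat completion; the logProbs and topLogprobs parameter names mirror OpenAI's logprobs and top_logprobs fields and are assumptions, so confirm them against ChatCompletionParameters.

// Sketch: request log probabilities with a chat completion.
// Parameter names are assumptions; check ChatCompletionParameters.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Hello!"))],
   model: .gpt35Turbo,
   logProbs: true,
   topLogprobs: 2)

let chat = try await service.startChat(parameters: parameters)
// Each choice should now carry its token log probabilities.
let logProbs = chat.choices.first?.logprobs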

OpenAI DevDay Updates: Final Part

29 Nov 09:30

This release features all the new endpoints introduced at OpenAI Dev Day, including the beta version of the Assistants API. It supports a range of functionalities such as assistants, messages, threads, runs, run steps, message file objects, the Vision API, the Text-to-Speech API, and more.
Developers can create their own Assistant client like this.
(Demo: an Assistants client built with SwiftOpenAI.)
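
A rough sketch of creating an assistant through the new endpoints; the AssistantParameters shape shown here is an assumption, so consult the package for the exact initializer.

// Sketch: create an assistant with the beta Assistants API.
// The `action: .create(model:)` shape is assumed; check AssistantParameters.
let parameters = AssistantParameters(
   action: .create(model: "gpt-4-1106-preview"),
   name: "Math Tutor",
   instructions: "You are a personal math tutor. Answer questions briefly.")

let assistant = try await service.createAssistant(parameters: parameters)
print(assistant.id)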

OpenAI DevDay Updates: Part Two

15 Nov 08:05
0d8a76c