Releases: jamesrochabrun/SwiftOpenAI
Third Party Library AI Proxy
The AI Proxy team made this contribution independently. SwiftOpenAI's owner is not involved in its development but accepts it in the spirit of open-source collaboration. It is added for convenience, and its use is at the discretion of the developer.
Protect your OpenAI key without running your own backend.
What is it?
AIProxy is a backend for AI apps that proxies requests from your app to OpenAI. You can use this service to avoid exposing your OpenAI key in your app. We offer AIProxy support so that developers can build and distribute apps using SwiftOpenAI.
How does my SwiftOpenAI code change?
SwiftOpenAI supports proxying requests through AIProxy with a small change to your integration code.
Instead of initializing the service with:
let apiKey = "your_openai_api_key_here"
let service = OpenAIServiceFactory.service(apiKey: apiKey)
Use:
#if DEBUG && targetEnvironment(simulator)
let service = OpenAIServiceFactory.service(
   aiproxyPartialKey: "hardcode_partial_key_here",
   aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"
)
#else
let service = OpenAIServiceFactory.service(
   aiproxyPartialKey: "hardcode_partial_key_here"
)
#endif
The aiproxyPartialKey and aiproxyDeviceCheckBypass values are provided to you on the AIProxy developer dashboard.
What is the aiproxyDeviceCheckBypass constant?
AIProxy uses Apple's DeviceCheck to ensure that requests received by the backend originated from your app on a legitimate Apple device. However, the iOS simulator cannot produce DeviceCheck tokens. Rather than requiring you to constantly build and run on device during development, AIProxy provides a way to skip the DeviceCheck integrity check. The token is intended for use by developers only. If an attacker gets the token, they can make requests to your AIProxy project without including a DeviceCheck token, and thus remove one level of protection.
What is the aiproxyPartialKey constant?
This constant is intended to be included in the distributed version of your app. As the name implies, it is a partial representation of your OpenAI key. Specifically, it is one half of an encrypted version of your key. The other half resides on AIProxy's backend. As your app makes requests to AIProxy, the two encrypted parts are paired, decrypted, and used to fulfill the request to OpenAI.
How do I set up my project on AIProxy?
Please see the AIProxy integration guide.
Contributors of SwiftOpenAI shall not be liable for any damages or losses caused by third parties. Contributors of this library provide third-party integrations as a convenience, and any use of a third party's services is at your own risk.
Assistants API Stream
Assistants API stream support.
You can stream events from the Create Thread and Run, Create Run, and Submit Tool Outputs endpoints by passing "stream": true. The response will be a Server-Sent Events stream.
In Swift:
/// Creates a thread and run with stream enabled.
///
/// - Parameter parameters: The parameters needed to create a thread and run.
/// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
/// - Throws: An error if the request fails.
///
/// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/createThreadAndRun).
func createThreadAndRunStream(
   parameters: CreateThreadAndRunParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
/// Create a run with stream enabled.
///
/// - Parameter threadID: The ID of the thread to run.
/// - Parameter parameters: The parameters needed to build a Run.
/// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
/// - Throws: An error if the request fails.
///
/// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/createRun).
func createRunStream(
   threadID: String,
   parameters: RunParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
/// When a run has the status: "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request. Stream enabled.
///
/// - Parameter threadID: The ID of the [thread](https://platform.openai.com/docs/api-reference/threads) to which this run belongs.
/// - Parameter runID: The ID of the run that requires the tool output submission.
/// - Parameter parameters: The parameters needed for the run tools output.
/// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
/// - Throws: An error if the request fails.
///
/// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/submitToolOutputs).
func submitToolOutputsToRunStream(
   threadID: String,
   runID: String,
   parameters: RunToolsOutputParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
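A minimal sketch of consuming one of these streams. The RunParameter field and the AssistantStreamEvent case matched below are assumptions for illustration and may differ from the library's exact names:

// Sketch only: assumes `service` was created via OpenAIServiceFactory as shown in earlier
// releases, that RunParameter takes an assistantID, and that AssistantStreamEvent has a
// .threadMessageDelta case.
let stream = try await service.createRunStream(
   threadID: "thread_abc123",                               // hypothetical thread ID
   parameters: RunParameter(assistantID: "asst_abc123"))    // hypothetical assistant ID

for try await event in stream {
   switch event {
   case .threadMessageDelta(let delta):
      // Incremental message content; append it to the UI as it arrives.
      print(delta)
   default:
      // Other run lifecycle events (created, in progress, completed, etc.).
      break
   }
}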
Added a demo project/tutorial based on the Python tutorial.
Adding latest changes from the OpenAI API: https://platform.openai.com/docs/changelog
Feb 9th, 2024
- Added timestamp_granularities parameter to the Audio API
Feb 1st, 2024
- Released gpt-3.5-turbo-0125, an updated GPT-3.5 Turbo model
Jan 25th, 2024
- Released embedding V3 models and an updated GPT-4 Turbo preview
- Added dimensions parameter to the Embeddings API (see the sketch after this list)
Dec 20th, 2023
- Added additional_instructions parameter to run creation in the Assistants API
Dec 15th, 2023
- Added logprobs and top_logprobs parameters to the Chat Completions API
Dec 14th, 2023
- Changed function parameters argument on a tool call to be optional.
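As referenced in the list above, a hedged sketch of requesting shortened embedding vectors via the new dimensions parameter; the EmbeddingParameter fields and the createEmbeddings method are assumptions about how the library exposes this and may not match the exact API:

// Sketch only: type, field, and method names are assumptions; assumes `service` exists.
let parameters = EmbeddingParameter(
   input: "The food was delicious.",
   model: "text-embedding-3-small",
   dimensions: 256)                        // truncate the returned vector to 256 dimensions

let response = try await service.createEmbeddings(parameters: parameters)
print(response.data.first?.embedding.count ?? 0)   // expected: 256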
Azure OpenAI
v1.5
Log probs are now available with chat completions.
- Added support for log probs in the chat completions API: https://platform.openai.com/docs/api-reference/chat/create#chat-create-logprobs
- Updated Chat demos with new log probs parameters.
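A hedged sketch of requesting log probs in a chat completion; the parameter names (logProbs, topLogprobs), the startChat method, and the response fields below are assumptions drawn from the OpenAI API shape, not a confirmed SwiftOpenAI signature:

// Sketch only: parameter, method, and field names are assumptions; assumes `service` exists.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Say hello"))],
   model: .gpt35Turbo,
   logProbs: true,        // return log probabilities for the output tokens
   topLogprobs: 2)        // also include the 2 most likely tokens at each position

let chat = try await service.startChat(parameters: parameters)
if let logprobs = chat.choices.first?.logprobs {
   print(logprobs)        // per-token log probability information
}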
OpenAI DevDay Updates Final part
This release features all the new endpoints introduced at OpenAI Dev Day, including the beta version of the Assistants API. It supports a range of functionalities such as assistants, messages, threads, runs, run steps, message file objects, the Vision API, the Text-to-Speech API, and more.
Developers can create their own Assistant client, as sketched below.
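A minimal sketch of what that could look like; the AssistantParameters fields and the createAssistant method are assumptions about the beta API's shape, not a confirmed signature:

// Sketch only: field and method names are assumptions; assumes `service` exists.
let parameters = AssistantParameters(
   action: .create(model: "gpt-4-1106-preview"),   // hypothetical initializer shape
   name: "Math Tutor",
   instructions: "You are a personal math tutor. Answer questions briefly.",
   tools: [.init(type: .codeInterpreter)])

let assistant = try await service.createAssistant(parameters: parameters)
print(assistant.id)   // use this ID when creating threads and runs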
OpenAI DevDay Updates Part two
- Support for text-to-speech.
- Demos for TTS (see the sketch after this list).
- https://platform.openai.com/docs/api-reference/audio/createSpeech
- Minor updates to shared models.
- Improved demos for the function call tutorial.
- https://medium.com/p/5b68b0f2e2f7
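As referenced in the list above, a hedged sketch of the text-to-speech call; AudioSpeechParameters, its fields, and the createSpeech method are assumptions about how the library wraps the createSpeech endpoint:

// Sketch only: type, field, and method names are assumptions; assumes `service` exists.
let parameters = AudioSpeechParameters(
   model: .tts1,                               // "tts-1"
   input: "Hello, welcome to SwiftOpenAI!",
   voice: .alloy)

let speech = try await service.createSpeech(parameters: parameters)
// The response is expected to carry the generated audio as Data,
// which can be written to disk or played back with AVAudioPlayer.
try speech.output.write(to: URL(fileURLWithPath: "hello.mp3"))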
OpenAI DevDay Updates Part one
SwiftOpenAI: OpenAI API Integration for Swift
SwiftOpenAI v1.0.0 is a Swift package that provides a wrapper for the OpenAI API. This version supports all the primary OpenAI endpoints, including Audio, Chat, Embeddings, Fine-tuning, Files, Images, Models, and Moderations. It also comes with demos for each endpoint to help users understand how to use them.
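A minimal usage sketch; the ChatCompletionParameters fields and the startChat method are assumptions and may differ slightly from the released API:

// Sketch only: field and method names are assumptions.
import SwiftOpenAI

let service = OpenAIServiceFactory.service(apiKey: "your_openai_api_key_here")
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Write a haiku about Swift."))],
   model: .gpt35Turbo)

let chat = try await service.startChat(parameters: parameters)
print(chat.choices.first?.message.content ?? "")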