
Releases: jamesrochabrun/SwiftOpenAI

SwiftOpenAI v4.0.1

02 Feb 07:12
e589864

SwiftOpenAI v4.0.0

What's Changed

Full Changelog: v4.0.0...v4.0.1

SwiftOpenAI v4.0.0

02 Feb 07:02
d72e7a7

DeepSeek


The DeepSeek API uses an API format compatible with OpenAI. By modifying the configuration, you can use SwiftOpenAI to access the DeepSeek API.

Creating the service

let apiKey = "your_api_key"
let service = OpenAIServiceFactory.service(
   apiKey: apiKey,
   overrideBaseURL: "https://api.deepseek.com")

Non-Streaming Example

let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(
    messages: [.init(role: .user, content: .text(prompt))],
    model: .custom("deepseek-reasoner")
)

do {
    let result = try await service.chat(parameters: parameters)
    
    // Access the response content
    if let content = result.choices.first?.message.content {
        print("Response: \(content)")
    }
    
    // Access reasoning content if available
    if let reasoning = result.choices.first?.message.reasoningContent {
        print("Reasoning: \(reasoning)")
    }
} catch {
    print("Error: \(error)")
}

Streaming Example

let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(
    messages: [.init(role: .user, content: .text(prompt))],
    model: .custom("deepseek-reasoner")
)

// Start the stream
do {
    let stream = try await service.startStreamedChat(parameters: parameters)
    for try await result in stream {
        let content = result.choices.first?.delta.content ?? ""
        self.message += content
        
        // Optional: Handle reasoning content if available
        if let reasoning = result.choices.first?.delta.reasoningContent {
            self.reasoningMessage += reasoning
        }
    }
} catch APIError.responseUnsuccessful(let description, let statusCode) {
    self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
} catch {
    self.errorMessage = error.localizedDescription
}

Notes

  • The DeepSeek API is compatible with OpenAI's format but uses different model names.
  • Use .custom("deepseek-reasoner") to specify the DeepSeek model.
  • The reasoningContent field is optional and specific to DeepSeek's API.
  • Error handling follows the same pattern as standard OpenAI requests.

For more information about the DeepSeek API, visit its documentation.

SwiftOpenAI v3.9.9

02 Feb 06:31
c581d02

OpenRouter


OpenRouter provides an OpenAI-compatible completion API to 314 models and providers that you can call directly or through the OpenAI SDK. Additionally, some third-party SDKs are available.

// Creating the service

let apiKey = "your_api_key"
let service = OpenAIServiceFactory.service(
   apiKey: apiKey,
   overrideBaseURL: "https://openrouter.ai",
   proxyPath: "api",
   extraHeaders: [
      "HTTP-Referer": "<YOUR_SITE_URL>", // Optional. Site URL for rankings on openrouter.ai.
      "X-Title": "<YOUR_SITE_NAME>" // Optional. Site title for rankings on openrouter.ai.
   ])

// Making a request

let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(messages: [.init(role: .user, content: .text(prompt))], model: .custom("deepseek/deepseek-r1:free"))
let stream = try await service.startStreamedChat(parameters: parameters)
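
A minimal sketch of consuming the stream, following the same delta-accumulation pattern as the DeepSeek streaming example above:

var message = ""
for try await result in stream {
    // Each chunk carries an incremental delta of the assistant's reply.
    message += result.choices.first?.delta.content ?? ""
}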

For more information about the OpenRouter API, visit its documentation.

DeepSeek


The DeepSeek API uses an API format compatible with OpenAI. By modifying the configuration, you can use SwiftOpenAI to access the DeepSeek API.

// Creating the service

let apiKey = "your_api_key"
let service = OpenAIServiceFactory.service(
   apiKey: apiKey,
   overrideBaseURL: "https://api.deepseek.com")

// Making a request

let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(messages: [.init(role: .user, content: .text(prompt))], model: .custom("deepseek-reasoner"))
let stream = try await service.startStreamedChat(parameters: parameters)

For more information about the DeepSeek API, visit its documentation.

SwiftOpenAI v3.9.8

23 Jan 08:23
6f1a8dd

What's Changed

Full Changelog: v.3.9.6...v3.9.8

SwiftOpenAI v3.9.7

16 Jan 05:49

What's Changed

New Contributors

Full Changelog: v3.9.5...v3.9.7

SwiftOpenAI v.3.9.6

03 Jan 22:18

What's Changed

Predicted outputs support:

Usage:

let code = """
      ScrollView {
         VStack {
            textArea
            Text(chatProvider.errorMessage)
               .foregroundColor(.red)
            streamedChatResultView
         }
      }
"""

let content: ChatCompletionParameters.Message.ContentType = .text("Change this ScrollView to be a list")
let parameters = ChatCompletionParameters(
     messages: [
         .init(role: .user, content: content),
         .init(role: .user, content: .text(code))], 
     model: .gpt4o,
     prediction: .init(content: .text(code)))
try await openAIService.startChat(parameters: parameters)

Other:

New Contributors

Full Changelog: v3.9.5...v.3.9.6

SwiftOpenAI v3.9.5

28 Dec 08:14
25c73a4

What's Changed

  • Added reasoning_effort parameter for o1 models (sketched below).
  • Added metadata for Chat Completions.
  • Model updates.
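
A minimal sketch of passing these options through ChatCompletionParameters; the Swift parameter spellings reasoningEffort and metadata, and the model identifier, are assumptions here rather than something these notes confirm:

let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Prove that the square root of 2 is irrational."))],
   model: .custom("o1"),
   reasoningEffort: "high", // Assumed Swift spelling of the reasoning_effort field.
   metadata: ["feature": "math_demo"]) // Assumed Swift spelling of the metadata field.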

Full Changelog: v3.9.4...v3.9.5

SwiftOpenAI v3.9.4

28 Dec 07:16
649a8f2

What's Changed

  • Message Content - added image url response type by @ebohdas in #103

New Contributors

Full Changelog: v3.9.3...v3.9.4

SwiftOpenAI v3.9.3

17 Nov 06:00
c1be624

Support for additional parameters; users can now use OpenRouter.

let service = OpenAIServiceFactory.service(
   apiKey: "${OPENROUTER_API_KEY}",
   baseURL: "https://openrouter.ai",
   proxyPath: "api",
   headers: ["HTTP-Referer": "${YOUR_SITE_URL}", "X-Title": "${YOUR_SITE_NAME}"])

What's Changed

Full Changelog: v3.9.2...v3.9.3

SwiftOpenAI v3.9.2

12 Nov 19:17
c3a04bb

Gemini


Gemini is now accessible through the OpenAI Library. See the announcement here.
SwiftOpenAI supports all OpenAI endpoints. However, please refer to the Gemini documentation to understand which APIs are currently compatible.

You can instantiate an OpenAIService using your Gemini token like this:

let geminiAPIKey = "your_api_key"
let baseURL = "https://generativelanguage.googleapis.com"
let version = "v1beta"

let service = OpenAIServiceFactory.service(
   apiKey: geminiAPIKey, 
   overrideBaseURL: baseURL, 
   overrideVersion: version)

You can now create a chat request using the .custom model parameter and pass the model name as a string.

let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(
   messages: [.init(
      role: .user,
      content: .text(prompt))],
   model: .custom("gemini-1.5-flash"))

let stream = try await service.startStreamedChat(parameters: parameters)
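
For a non-streamed request with the same parameters, a minimal sketch following the startChat pattern shown in the v.3.9.6 notes above (assuming that API applies unchanged here):

let result = try await service.startChat(parameters: parameters)
// Print the assistant's reply, if any.
print(result.choices.first?.message.content ?? "")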