
Commit

Merge branch 'release/2.0.0'
antonio-war committed Mar 30, 2023
2 parents 5979bb8 + f5d045d commit f03782d
Showing 16 changed files with 558 additions and 149 deletions.
5 changes: 4 additions & 1 deletion Package.swift
@@ -21,6 +21,9 @@ let package = Package(
dependencies: ["SwiftyHTTP", "SwiftyRanged"]),
.testTarget(
name: "SwiftyGPTTests",
dependencies: ["SwiftyGPT", "SwiftyHTTP"]),
dependencies: ["SwiftyGPT", "SwiftyHTTP"],
resources: [
.copy("OpenAI-Info.plist")
])
]
)
100 changes: 85 additions & 15 deletions README.md
@@ -51,10 +51,10 @@ Chat is the main feature of SwiftyGPT, as you can guess it allows you to ask Cha

## Deep Version

Deep versions allow you maximum control over request creation. The main element of a request is a SwiftyGPTMessage.
Deep versions allow you maximum control over request creation. The main element of a request is a SwiftyGPTChatMessage.

```swift
let message = SwiftyGPTMessage(role: .user, content: "Hi, how are you?")
let message = SwiftyGPTChatMessage(role: .user, content: "Hi, how are you?")
```

You can use role to instruct the model precisely as explained by the ChatGPT documentation and get the control you want.
@@ -65,7 +65,11 @@ swiftyGPT.chat(message: message) { result in
case .success(let response):
print(response)
case .failure(let error):
print(error)
if let error = error as? SwiftyGPTError {
print(error.message)
} else {
print(error.localizedDescription)
}
}
}
```
@@ -77,23 +81,41 @@ swiftyGPT.chat(messages: messages) { result in
case .success(let response):
print(response)
case .failure(let error):
print(error)
if let error = error as? SwiftyGPTError {
print(error.message)
} else {
print(error.localizedDescription)
}
}
}
```
In both methods you can specify some optional parameters like model, temperature, maxTokens and others established by OpenAI.

```swift
swiftyGPT.chat(message: SwiftyGPTMessage(role: .user, content: "Hi, how are you?"), temperature: 5, user: "Test") { result in
swiftyGPT.chat(message: SwiftyGPTChatMessage(role: .user, content: "Hi, how are you?"), temperature: 5, user: "Test") { result in
switch result {
case .success(let response):
print(response)
case .failure(let error):
print(error)
if let error = error as? SwiftyGPTError {
print(error.message)
} else {
print(error.localizedDescription)
}
}
}
```

In case of success, methods return a SwiftyGPTChatResponse object, which is the entire transcript of the ChatGPT HTTP response.
To access the received message or messages, check the content of the 'choices' attribute. By default the choices array has size one, so you can get the message as follows and read its content or other attributes.

```swift
let message = response.choices.first?.message
```

However, if you have requested a different number of choices, the array will have a larger size and you will have to manage the response in a custom way.
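One straightforward way to manage that case is to loop over the array; a minimal sketch, assuming a `SwiftyGPTChatResponse` value named `response` from a deep chat call made with more than one choice:

```swift
// Sketch: handling a response requested with multiple choices.
// Assumes `response` is a SwiftyGPTChatResponse from a deep chat call.
for choice in response.choices {
    // Each choice carries its own message, index and finish reason.
    print("[\(choice.index)] \(choice.message.content)")
}
```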


## High Version

If you don't need a lot of control on your requests you can use High Versions methods that works with simple Strings. Obviously this brings some limitations :
Expand All @@ -108,10 +130,15 @@ swiftyGPT.chat(message: "Hi how are you ?") { response in
case .success(let response):
print(response)
case .failure(let error):
print(error)
if let error = error as? SwiftyGPTError {
print(error.message)
} else {
print(error.localizedDescription)
}
}
}
```
In this case the method directly returns the message of the single choice in string format.

## Async/Await

@@ -121,21 +148,64 @@ All methods of the chat feature are also available in Async/Await version.
let result: Result<String, Error> = await swiftyGPT.chat(message: "Hi how are you ?")
```
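The awaited Result can then be switched on just like in the completion-based variants; a minimal sketch, assuming an async context:

```swift
// Sketch: consuming the awaited Result from the high-level async chat method.
let result: Result<String, Error> = await swiftyGPT.chat(message: "Hi how are you ?")
switch result {
case .success(let message):
    print(message)
case .failure(let error):
    if let error = error as? SwiftyGPTError {
        print(error.message)  // ChatGPT-side error
    } else {
        print(error.localizedDescription)  // system-side error
    }
}
```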

## Response Handling
---

In case you use high level methods the response will be directly in string format.
In Deep case instead methods return a SwiftyGPTResponse object which is the entire transcript of ChatGPT HTTP response.
To access the received message or messages you have to check the content of the 'choices' attribute. By default choices array size is one, so you can get the message in this way and read its content or other attributes.
# Image

SwiftyGPT uses DALL-E to generate images from textual descriptions. You can describe an object or a scene in words, and SwiftyGPT can create a corresponding image of it.

## Single Generation

The easiest way to generate an image is to use the following method, which accepts a prompt and a size. It is limited to generating square images in the following sizes: 256x256, 512x512 and 1024x1024. Also in this case, if necessary, you can specify a user for each call.

```swift
let message = response.choices.first?.message
swiftyGPT.image(prompt: "Draw an unicorn", size: .x256) { result in
switch result {
case .success(let response):
let image = UIImage(data: response)
case .failure(let error):
if let error = error as? SwiftyGPTError {
print(error.message)
} else {
print(error.localizedDescription)
}
}
}
```
If successful, the method returns an object of type Data, so you can build a UIImage if you use UIKit or an Image if you use SwiftUI, or use it in another way.
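For SwiftUI, the returned Data can be bridged through UIImage; a small sketch, where the helper name is hypothetical and not part of SwiftyGPT:

```swift
import SwiftUI
import UIKit

// Sketch: bridging the returned Data into a SwiftUI Image.
// `makeImage(from:)` is a hypothetical helper, not a SwiftyGPT API.
func makeImage(from data: Data) -> Image? {
    guard let uiImage = UIImage(data: data) else { return nil }
    return Image(uiImage: uiImage)
}
```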

However, if you have requested a different number of choices, the array will have a larger size and you will have to manage the response in a custom way.
## Multiple Generation

In case you want to generate several different images starting from the same description, you can specify the choices parameter. In this case the method will return an array of Data.

```swift
swiftyGPT.image(prompt: "Draw an unicorn", choices: 2, size: .x256) { result in
switch result {
case .success(let response):
let images = response.compactMap({UIImage(data: $0)})
case .failure(let error):
if let error = error as? SwiftyGPTError {
print(error.message)
} else {
print(error.localizedDescription)
}
}
}
```

## Async/Await

All methods of the image feature are also available in Async/Await version.

```swift
let result: Result<Data, Error> = await swiftyGPT.image(prompt: "Draw an unicorn", size: .x256)
```

---

## Error Handling
# Error Handling

In case of failure the methods return an error, it can be a system error in case something went wrong on the iOS side. For example, network-level issues. If instead the error is related to ChatGPT you will get a SwiftyGPTError.
In case of failure, methods return an error. It can be a system error if something went wrong on the iOS side, for example network-level or decoding issues. If instead the error is related to ChatGPT, you will get a SwiftyGPTError.

```swift
if let error = error as? SwiftyGPTError {
    print(error.message)
} else {
    print(error.localizedDescription)
}
```
88 changes: 88 additions & 0 deletions Sources/SwiftyGPT/Chat/SwiftyGPT+Chat.swift
@@ -0,0 +1,88 @@
//
// SwiftyGPT+Chat.swift
//
//
// Created by Antonio Guerra on 30/03/23.
//

import Foundation
import SwiftyHTTP
import SwiftyRanged

// MARK: - Chat
extension SwiftyGPT {

public func chat(messages: [SwiftyGPTChatMessage], model: SwiftyGPTChatModel = .stable, @SwiftyOptionalRanged(0...2) temperature: Float? = nil, choices: Int? = nil, @SwiftyOptionalRanged(0...4096) maxTokens: Int? = nil, @SwiftyOptionalRanged(-2...2) presencePenalty: Float? = nil, @SwiftyOptionalRanged(-2...2) frequencyPenalty: Float? = nil, user: String? = nil, completion: @escaping (Result<SwiftyGPTChatResponse, Error>) -> ()) {

let request = SwiftyGPTChatRequest(messages: messages, model: model, temperature: temperature, choices: choices, stream: false, maxTokens: maxTokens, presencePenalty: presencePenalty, frequencyPenalty: frequencyPenalty, user: user)
SwiftyHTTP.request(with: SwiftyGPTRouter.chat(apiKey, request)) { result in
switch result {
case .success(let response):
if response.statusCode == 200 {
guard let body = try? JSONDecoder().decode(SwiftyGPTChatResponse.self, from: response.body) else {
completion(.failure(URLError(.badServerResponse)))
return
}
completion(.success(body))
} else {
guard let error = try? JSONDecoder().decode(SwiftyGPTError.self, from: response.body) else {
completion(.failure(URLError(.badServerResponse)))
return
}
completion(.failure(error))
}
case .failure(let error):
completion(.failure(error))
}
}
}

public func chat(messages: [SwiftyGPTChatMessage], model: SwiftyGPTChatModel = .stable, @SwiftyOptionalRanged(0...2) temperature: Float? = nil, choices: Int? = nil, @SwiftyOptionalRanged(0...4096) maxTokens: Int? = nil, @SwiftyOptionalRanged(-2...2) presencePenalty: Float? = nil, @SwiftyOptionalRanged(-2...2) frequencyPenalty: Float? = nil, user: String? = nil) async -> Result<SwiftyGPTChatResponse, Error> {

return await withCheckedContinuation { continuation in
chat(messages: messages, model: model, temperature: temperature, choices: choices, maxTokens: maxTokens, presencePenalty: presencePenalty, frequencyPenalty: frequencyPenalty, user: user) { result in
continuation.resume(returning: result)
}
}
}

public func chat(message: SwiftyGPTChatMessage, model: SwiftyGPTChatModel = .stable, @SwiftyOptionalRanged(0...2) temperature: Float? = nil, choices: Int? = nil, @SwiftyOptionalRanged(0...4096) maxTokens: Int? = nil, @SwiftyOptionalRanged(-2...2) presencePenalty: Float? = nil, @SwiftyOptionalRanged(-2...2) frequencyPenalty: Float? = nil, user: String? = nil, completion: @escaping (Result<SwiftyGPTChatResponse, Error>) -> ()) {
chat(messages: [message], model: model, temperature: temperature, choices: choices, maxTokens: maxTokens, presencePenalty: presencePenalty, frequencyPenalty: frequencyPenalty, user: user, completion: completion)
}

public func chat(message: SwiftyGPTChatMessage, model: SwiftyGPTChatModel = .stable, @SwiftyOptionalRanged(0...2) temperature: Float? = nil, choices: Int? = nil, @SwiftyOptionalRanged(0...4096) maxTokens: Int? = nil, @SwiftyOptionalRanged(-2...2) presencePenalty: Float? = nil, @SwiftyOptionalRanged(-2...2) frequencyPenalty: Float? = nil, user: String? = nil) async -> Result<SwiftyGPTChatResponse, Error> {

await chat(messages: [message], model: model, temperature: temperature, choices: choices, maxTokens: maxTokens, presencePenalty: presencePenalty, frequencyPenalty: frequencyPenalty, user: user)
}

public func chat(messages: [String], model: SwiftyGPTChatModel = .stable, user: String? = nil, completion: @escaping (Result<String, Error>) -> ()) {
chat(messages: messages.map({SwiftyGPTChatMessage(content: $0)}), model: model, user: user) { result in
switch result {
case .success(let response):
guard let message = response.choices.first?.message else {
completion(.failure(URLError(.badServerResponse)))
return
}
completion(.success(message.content))
case .failure(let error):
completion(.failure(error))
}
}
}

public func chat(messages: [String], model: SwiftyGPTChatModel = .stable, user: String? = nil) async -> Result<String, Error> {
return await withCheckedContinuation { continuation in
chat(messages: messages, model: model, user: user) { result in
continuation.resume(returning: result)
}
}
}

public func chat(message: String, model: SwiftyGPTChatModel = .stable, user: String? = nil, completion: @escaping (Result<String, Error>) -> ()) {
chat(messages: [message], model: model, user: user, completion: completion)
}

public func chat(message: String, model: SwiftyGPTChatModel = .stable, user: String? = nil) async -> Result<String, Error> {
await chat(messages: [message], model: model, user: user)
}
}
@@ -1,13 +1,13 @@
//
// SwiftyGPTMessage.swift
// SwiftyGPTChatMessage.swift
//
//
// Created by Antonio Guerra on 27/03/23.
//

import Foundation

public struct SwiftyGPTMessage: Codable, Identifiable {
public struct SwiftyGPTChatMessage: Codable, Identifiable {
public let id: UUID = UUID()
public let date: Date = Date()
public let role: SwiftyGPTRole
@@ -1,21 +1,21 @@
//
// SwiftyGPTModel.swift
// SwiftyGPTChatModel.swift
//
//
// Created by Antonio Guerra on 27/03/23.
//

import Foundation

public enum SwiftyGPTModel: String, Codable {
public enum SwiftyGPTChatModel: String, Codable {
case gpt4 = "gpt-4"
case gpt4_0314 = "gpt-4-0314"
case gpt4_32k = "gpt-4-32k"
case gpt4_32k_0314 = "gpt-4-32k-0314"
case gpt3_5_turbo = "gpt-3.5-turbo"
case gpt3_5_turbo_0301 = "gpt-3.5-turbo-0301"
public static var stable: SwiftyGPTModel {

public static var stable: SwiftyGPTChatModel {
gpt3_5_turbo
}
}
@@ -1,5 +1,5 @@
//
// SwiftyGPTRequest.swift
// SwiftyGPTChatRequest.swift
//
//
// Created by Antonio Guerra on 27/03/23.
@@ -8,9 +8,9 @@
import Foundation
import SwiftyHTTP

public struct SwiftyGPTRequest: SwiftyHTTPRequestBody {
public let messages: [SwiftyGPTMessage]
public let model: SwiftyGPTModel
public struct SwiftyGPTChatRequest: SwiftyHTTPRequestBody {
public let messages: [SwiftyGPTChatMessage]
public let model: SwiftyGPTChatModel
public let temperature: Float?
public let choices: Int?
public let stream: Bool?
@@ -1,5 +1,5 @@
//
// SwiftyGPTResponse.swift
// SwiftyGPTChatResponse.swift
//
//
// Created by Antonio Guerra on 27/03/23.
@@ -8,18 +8,18 @@
import Foundation
import SwiftyHTTP

public struct SwiftyGPTResponse: SwiftyHTTPResponseBody {
public struct SwiftyGPTChatResponse: SwiftyHTTPResponseBody {
public let id, object: String
public let created: TimeInterval
public let model: SwiftyGPTModel
public let usage: SwiftyGPTUsage
public let choices: [SwiftyGPTChoice]
public let model: SwiftyGPTChatModel
public let usage: SwiftyGPTChatUsage
public let choices: [SwiftyGPTChatChoice]
}

// MARK: - Choice
public struct SwiftyGPTChoice: Codable {
public let message: SwiftyGPTMessage
public let finishReason: SwiftyGPTFinishReason
public struct SwiftyGPTChatChoice: Codable {
public let message: SwiftyGPTChatMessage
public let finishReason: SwiftyGPTChatFinishReason
public let index: Int

enum CodingKeys: String, CodingKey {
@@ -30,7 +30,7 @@ public struct SwiftyGPTChoice: Codable {
}

// MARK: - Usage
public struct SwiftyGPTUsage: Codable {
public struct SwiftyGPTChatUsage: Codable {
public let promptTokens, completionTokens, totalTokens: Int

enum CodingKeys: String, CodingKey {
@@ -41,7 +41,7 @@ public struct SwiftyGPTUsage: Codable {
}

// MARK: - FinishReason
public enum SwiftyGPTFinishReason: String, Codable {
public enum SwiftyGPTChatFinishReason: String, Codable {
case stop
case length
case contentFilter = "content_filter"
