feat(api): update via SDK Studio #4

Merged (1 commit, Jun 7, 2024)
98 changes: 49 additions & 49 deletions README.md
@@ -13,7 +13,7 @@ It is generated with [Stainless](https://www.stainlessapi.com/).

```go
import (
-	"github.com/openlayer-ai/openlayer-go" // imported as githubcomopenlayeraiopenlayergo
+	"github.com/openlayer-ai/openlayer-go" // imported as openlayer
)
```

@@ -49,21 +49,21 @@ import (
)

func main() {
-	client := githubcomopenlayeraiopenlayergo.NewClient(
+	client := openlayer.NewClient(
option.WithAPIKey("My API Key"), // defaults to os.LookupEnv("OPENLAYER_API_KEY")
)
inferencePipelineDataStreamResponse, err := client.InferencePipelines.Data.Stream(
context.TODO(),
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
-		githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParams{
-			Config: githubcomopenlayeraiopenlayergo.F[githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigUnion](githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigLlmData{
-				InputVariableNames:   githubcomopenlayeraiopenlayergo.F([]string{"user_query"}),
-				OutputColumnName:     githubcomopenlayeraiopenlayergo.F("output"),
-				NumOfTokenColumnName: githubcomopenlayeraiopenlayergo.F("tokens"),
-				CostColumnName:       githubcomopenlayeraiopenlayergo.F("cost"),
-				TimestampColumnName:  githubcomopenlayeraiopenlayergo.F("timestamp"),
+		openlayer.InferencePipelineDataStreamParams{
+			Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
+				InputVariableNames:   openlayer.F([]string{"user_query"}),
+				OutputColumnName:     openlayer.F("output"),
+				NumOfTokenColumnName: openlayer.F("tokens"),
+				CostColumnName:       openlayer.F("cost"),
+				TimestampColumnName:  openlayer.F("timestamp"),
			}),
-			Rows: githubcomopenlayeraiopenlayergo.F([]map[string]interface{}{{
+			Rows: openlayer.F([]map[string]interface{}{{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": map[string]interface{}{},
@@ -94,18 +94,18 @@ To send a null, use `Null[T]()`, and to send a nonconforming value, use `Raw[T](

```go
params := FooParams{
-	Name: githubcomopenlayeraiopenlayergo.F("hello"),
+	Name: openlayer.F("hello"),

// Explicitly send `"description": null`
-	Description: githubcomopenlayeraiopenlayergo.Null[string](),
+	Description: openlayer.Null[string](),

-	Point: githubcomopenlayeraiopenlayergo.F(githubcomopenlayeraiopenlayergo.Point{
-		X: githubcomopenlayeraiopenlayergo.Int(0),
-		Y: githubcomopenlayeraiopenlayergo.Int(1),
+	Point: openlayer.F(openlayer.Point{
+		X: openlayer.Int(0),
+		Y: openlayer.Int(1),

// In cases where the API specifies a given type,
// but you want to send something else, use `Raw`:
-		Z: githubcomopenlayeraiopenlayergo.Raw[int64](0.01), // sends a float
+		Z: openlayer.Raw[int64](0.01), // sends a float
}),
}
```
@@ -159,7 +159,7 @@ This library uses the functional options pattern. Functions defined in the
requests. For example:

```go
-client := githubcomopenlayeraiopenlayergo.NewClient(
+client := openlayer.NewClient(
// Adds a header to every request made by the client
option.WithHeader("X-Some-Header", "custom_header_info"),
)
@@ -186,7 +186,7 @@ with additional helper methods like `.GetNextPage()`, e.g.:
### Errors

When the API returns a non-success status code, we return an error with type
-`*githubcomopenlayeraiopenlayergo.Error`. This contains the `StatusCode`, `*http.Request`, and
+`*openlayer.Error`. This contains the `StatusCode`, `*http.Request`, and
`*http.Response` values of the request, as well as the JSON of the error body
(much like other response objects in the SDK).

@@ -196,15 +196,15 @@ To handle errors, we recommend that you use the `errors.As` pattern:
_, err := client.InferencePipelines.Data.Stream(
context.TODO(),
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
-	githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParams{
-		Config: githubcomopenlayeraiopenlayergo.F[githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigUnion](githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigLlmData{
-			InputVariableNames:   githubcomopenlayeraiopenlayergo.F([]string{"user_query"}),
-			OutputColumnName:     githubcomopenlayeraiopenlayergo.F("output"),
-			NumOfTokenColumnName: githubcomopenlayeraiopenlayergo.F("tokens"),
-			CostColumnName:       githubcomopenlayeraiopenlayergo.F("cost"),
-			TimestampColumnName:  githubcomopenlayeraiopenlayergo.F("timestamp"),
+	openlayer.InferencePipelineDataStreamParams{
+		Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
+			InputVariableNames:   openlayer.F([]string{"user_query"}),
+			OutputColumnName:     openlayer.F("output"),
+			NumOfTokenColumnName: openlayer.F("tokens"),
+			CostColumnName:       openlayer.F("cost"),
+			TimestampColumnName:  openlayer.F("timestamp"),
		}),
-		Rows: githubcomopenlayeraiopenlayergo.F([]map[string]interface{}{{
+		Rows: openlayer.F([]map[string]interface{}{{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": map[string]interface{}{},
@@ -214,7 +214,7 @@ _, err := client.InferencePipelines.Data.Stream(
},
)
if err != nil {
-	var apierr *githubcomopenlayeraiopenlayergo.Error
+	var apierr *openlayer.Error
if errors.As(err, &apierr) {
println(string(apierr.DumpRequest(true))) // Prints the serialized HTTP request
println(string(apierr.DumpResponse(true))) // Prints the serialized HTTP response
@@ -240,15 +240,15 @@ defer cancel()
client.InferencePipelines.Data.Stream(
ctx,
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
-	githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParams{
-		Config: githubcomopenlayeraiopenlayergo.F[githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigUnion](githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigLlmData{
-			InputVariableNames:   githubcomopenlayeraiopenlayergo.F([]string{"user_query"}),
-			OutputColumnName:     githubcomopenlayeraiopenlayergo.F("output"),
-			NumOfTokenColumnName: githubcomopenlayeraiopenlayergo.F("tokens"),
-			CostColumnName:       githubcomopenlayeraiopenlayergo.F("cost"),
-			TimestampColumnName:  githubcomopenlayeraiopenlayergo.F("timestamp"),
+	openlayer.InferencePipelineDataStreamParams{
+		Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
+			InputVariableNames:   openlayer.F([]string{"user_query"}),
+			OutputColumnName:     openlayer.F("output"),
+			NumOfTokenColumnName: openlayer.F("tokens"),
+			CostColumnName:       openlayer.F("cost"),
+			TimestampColumnName:  openlayer.F("timestamp"),
		}),
-		Rows: githubcomopenlayeraiopenlayergo.F([]map[string]interface{}{{
+		Rows: openlayer.F([]map[string]interface{}{{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": map[string]interface{}{},
@@ -271,7 +271,7 @@ The file name and content-type can be customized by implementing `Name() string`
string` on the run-time type of `io.Reader`. Note that `os.File` implements `Name() string`, so a
file returned by `os.Open` will be sent with the file name on disk.

-We also provide a helper `githubcomopenlayeraiopenlayergo.FileParam(reader io.Reader, filename string, contentType string)`
+We also provide a helper `openlayer.FileParam(reader io.Reader, filename string, contentType string)`
which can be used to wrap any `io.Reader` with the appropriate file name and content type.

### Retries
@@ -284,23 +284,23 @@ You can use the `WithMaxRetries` option to configure or disable this:

```go
// Configure the default for all requests:
-client := githubcomopenlayeraiopenlayergo.NewClient(
+client := openlayer.NewClient(
option.WithMaxRetries(0), // default is 2
)

// Override per-request:
client.InferencePipelines.Data.Stream(
context.TODO(),
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
-	githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParams{
-		Config: githubcomopenlayeraiopenlayergo.F[githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigUnion](githubcomopenlayeraiopenlayergo.InferencePipelineDataStreamParamsConfigLlmData{
-			InputVariableNames:   githubcomopenlayeraiopenlayergo.F([]string{"user_query"}),
-			OutputColumnName:     githubcomopenlayeraiopenlayergo.F("output"),
-			NumOfTokenColumnName: githubcomopenlayeraiopenlayergo.F("tokens"),
-			CostColumnName:       githubcomopenlayeraiopenlayergo.F("cost"),
-			TimestampColumnName:  githubcomopenlayeraiopenlayergo.F("timestamp"),
+	openlayer.InferencePipelineDataStreamParams{
+		Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
+			InputVariableNames:   openlayer.F([]string{"user_query"}),
+			OutputColumnName:     openlayer.F("output"),
+			NumOfTokenColumnName: openlayer.F("tokens"),
+			CostColumnName:       openlayer.F("cost"),
+			TimestampColumnName:  openlayer.F("timestamp"),
		}),
-		Rows: githubcomopenlayeraiopenlayergo.F([]map[string]interface{}{{
+		Rows: openlayer.F([]map[string]interface{}{{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": map[string]interface{}{},
@@ -345,9 +345,9 @@ or the `option.WithJSONSet()` methods.

```go
params := FooNewParams{
-	ID: githubcomopenlayeraiopenlayergo.F("id_xxxx"),
-	Data: githubcomopenlayeraiopenlayergo.F(FooNewParamsData{
-		FirstName: githubcomopenlayeraiopenlayergo.F("John"),
+	ID: openlayer.F("id_xxxx"),
+	Data: openlayer.F(FooNewParamsData{
+		FirstName: openlayer.F("John"),
}),
}
client.Foo.New(context.Background(), params, option.WithJSONSet("data.last_name", "Doe"))
@@ -382,7 +382,7 @@ func Logger(req *http.Request, next option.MiddlewareNext) (res *http.Response,
return res, err
}

-client := githubcomopenlayeraiopenlayergo.NewClient(
+client := openlayer.NewClient(
option.WithMiddleware(Logger),
)
```
2 changes: 1 addition & 1 deletion aliases.go
@@ -1,6 +1,6 @@
// File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

-package githubcomopenlayeraiopenlayergo
+package openlayer

import (
"github.com/openlayer-ai/openlayer-go/internal/apierror"