Openlayer Go API Library


The Openlayer Go library provides convenient access to the Openlayer REST API from applications written in Go. The full API of this library can be found in api.md.

It is generated with Stainless.

Installation

import (
	"github.com/openlayer-ai/openlayer-go" // imported as openlayer
)

Or to pin the version:

go get -u 'github.com/openlayer-ai/openlayer-go@v0.1.0-alpha.11'

Requirements

This library requires Go 1.18+.

Usage

The full API of this library can be found in api.md.

package main

import (
	"context"
	"fmt"

	"github.com/openlayer-ai/openlayer-go"
	"github.com/openlayer-ai/openlayer-go/option"
)

func main() {
	client := openlayer.NewClient(
		option.WithAPIKey("My API Key"), // defaults to os.LookupEnv("OPENLAYER_API_KEY")
	)
	response, err := client.InferencePipelines.Data.Stream(
		context.TODO(),
		"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
		openlayer.InferencePipelineDataStreamParams{
			Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
				InputVariableNames:   openlayer.F([]string{"user_query"}),
				OutputColumnName:     openlayer.F("output"),
				NumOfTokenColumnName: openlayer.F("tokens"),
				CostColumnName:       openlayer.F("cost"),
				TimestampColumnName:  openlayer.F("timestamp"),
			}),
			Rows: openlayer.F([]map[string]interface{}{{
				"user_query": "what is the meaning of life?",
				"output":     "42",
				"tokens":     map[string]interface{}{},
				"cost":       map[string]interface{}{},
				"timestamp":  map[string]interface{}{},
			}}),
		},
	)
	if err != nil {
		panic(err.Error())
	}
	fmt.Printf("%+v\n", response.Success)
}

Request fields

All request parameters are wrapped in a generic Field type, which we use to distinguish zero values from null or omitted fields.

This prevents accidentally sending a zero value if you forget a required parameter, and enables explicitly sending null, false, '', or 0 on optional parameters. Any field not specified is not sent.

To construct fields with values, use the helpers String(), Int(), Float(), or most commonly, the generic F[T](). To send a null, use Null[T](), and to send a nonconforming value, use Raw[T](any). For example:

params := FooParams{
	Name: openlayer.F("hello"),

	// Explicitly send `"description": null`
	Description: openlayer.Null[string](),

	Point: openlayer.F(openlayer.Point{
		X: openlayer.Int(0),
		Y: openlayer.Int(1),

		// In cases where the API specifies a given type,
		// but you want to send something else, use `Raw`:
		Z: openlayer.Raw[int64](0.01), // sends a float
	}),
}

Response objects

All fields in response structs are value types (not pointers or wrappers).

If a given field is null, not present, or invalid, the corresponding field will simply be its zero value.

All response structs also include a special JSON field, containing more detailed information about each property, which you can use like so:

if res.Name == "" {
	// true if `"name"` is either not present or explicitly null
	res.JSON.Name.IsNull()

	// true if the `"name"` key was not present in the response JSON at all
	res.JSON.Name.IsMissing()

	// When the API returns data that cannot be coerced to the expected type:
	if res.JSON.Name.IsInvalid() {
		raw := res.JSON.Name.Raw()

		legacyName := struct{
			First string `json:"first"`
			Last  string `json:"last"`
		}{}
		json.Unmarshal([]byte(raw), &legacyName)
		name = legacyName.First + " " + legacyName.Last
	}
}

These .JSON structs also include an ExtraFields map containing any properties in the JSON response that were not specified in the struct. This can be useful for API features not yet present in the SDK.

body := res.JSON.ExtraFields["my_unexpected_field"].Raw()

RequestOptions

This library uses the functional options pattern. Functions defined in the option package return a RequestOption, which is a closure that mutates a RequestConfig. These options can be supplied to the client or at individual requests. For example:

client := openlayer.NewClient(
	// Adds a header to every request made by the client
	option.WithHeader("X-Some-Header", "custom_header_info"),
)

client.InferencePipelines.Data.Stream(context.TODO(), ...,
	// Override the header
	option.WithHeader("X-Some-Header", "some_other_custom_header_info"),
	// Add an undocumented field to the request body, using sjson syntax
	option.WithJSONSet("some.json.path", map[string]string{"my": "object"}),
)

See the full list of request options.

Pagination

This library provides some conveniences for working with paginated list endpoints.

You can use .ListAutoPaging() methods to iterate through items across all pages.
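As a minimal sketch (the Foos resource and FooListParams type are hypothetical placeholders, not actual endpoints of this SDK):

iter := client.Foos.ListAutoPaging(context.TODO(), openlayer.FooListParams{})
// The iterator fetches additional pages automatically as you advance it.
for iter.Next() {
	foo := iter.Current()
	fmt.Printf("%+v\n", foo)
}
if err := iter.Err(); err != nil {
	panic(err.Error())
}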

Or you can use simple .List() methods to fetch a single page and receive a standard response object with additional helper methods like .GetNextPage().
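Again as a hypothetical sketch with placeholder names:

page, err := client.Foos.List(context.TODO(), openlayer.FooListParams{})
for page != nil {
	for _, foo := range page.Items { // Items is a placeholder for the page's result field
		fmt.Printf("%+v\n", foo)
	}
	// GetNextPage returns a nil page when there are no further pages.
	page, err = page.GetNextPage()
}
if err != nil {
	panic(err.Error())
}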

Errors

When the API returns a non-success status code, we return an error with type *openlayer.Error. This contains the StatusCode, *http.Request, and *http.Response values of the request, as well as the JSON of the error body (much like other response objects in the SDK).

To handle errors, we recommend that you use the errors.As pattern:

_, err := client.InferencePipelines.Data.Stream(
	context.TODO(),
	"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
	openlayer.InferencePipelineDataStreamParams{
		Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
			InputVariableNames:   openlayer.F([]string{"user_query"}),
			OutputColumnName:     openlayer.F("output"),
			NumOfTokenColumnName: openlayer.F("tokens"),
			CostColumnName:       openlayer.F("cost"),
			TimestampColumnName:  openlayer.F("timestamp"),
		}),
		Rows: openlayer.F([]map[string]interface{}{{
			"user_query": "what is the meaning of life?",
			"output":     "42",
			"tokens":     map[string]interface{}{},
			"cost":       map[string]interface{}{},
			"timestamp":  map[string]interface{}{},
		}}),
	},
)
if err != nil {
	var apierr *openlayer.Error
	if errors.As(err, &apierr) {
		println(string(apierr.DumpRequest(true)))  // Prints the serialized HTTP request
		println(string(apierr.DumpResponse(true))) // Prints the serialized HTTP response
	}
	panic(err.Error()) // GET "/inference-pipelines/{inferencePipelineId}/data-stream": 400 Bad Request { ... }
}

When other errors occur, they are returned unwrapped; for example, if HTTP transport fails, you might receive *url.Error wrapping *net.OpError.
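For instance, a transport failure can be inspected with the same errors.As pattern (a generic standard-library sketch, not an SDK-specific API):

var urlErr *url.Error
if errors.As(err, &urlErr) {
	// The request never produced an HTTP response (DNS failure, connection refused, etc.).
	fmt.Printf("transport error calling %s: %v\n", urlErr.URL, urlErr.Err)
}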

Timeouts

Requests do not time out by default; use context to configure a timeout for a request lifecycle.

Note that if a request is retried, the context timeout does not start over. To set a per-retry timeout, use option.WithRequestTimeout().

// This sets the timeout for the request, including all the retries.
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
defer cancel()
client.InferencePipelines.Data.Stream(
	ctx,
	"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
	openlayer.InferencePipelineDataStreamParams{
		Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
			InputVariableNames:   openlayer.F([]string{"user_query"}),
			OutputColumnName:     openlayer.F("output"),
			NumOfTokenColumnName: openlayer.F("tokens"),
			CostColumnName:       openlayer.F("cost"),
			TimestampColumnName:  openlayer.F("timestamp"),
		}),
		Rows: openlayer.F([]map[string]interface{}{{
			"user_query": "what is the meaning of life?",
			"output":     "42",
			"tokens":     map[string]interface{}{},
			"cost":       map[string]interface{}{},
			"timestamp":  map[string]interface{}{},
		}}),
	},
	// This sets the per-retry timeout
	option.WithRequestTimeout(20*time.Second),
)

File uploads

Request parameters that correspond to file uploads in multipart requests are typed as param.Field[io.Reader]. The contents of the io.Reader will by default be sent as a multipart form part with the file name of "anonymous_file" and content-type of "application/octet-stream".

The file name and content-type can be customized by implementing Name() string or ContentType() string on the run-time type of io.Reader. Note that os.File implements Name() string, so a file returned by os.Open will be sent with the file name on disk.

We also provide a helper openlayer.FileParam(reader io.Reader, filename string, contentType string) which can be used to wrap any io.Reader with the appropriate file name and content type.
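A sketch of the helper in use; SomeUploadParams and its File field are hypothetical, since the exact upload parameters depend on the endpoint:

file, err := os.Open("dataset.csv")
if err != nil {
	panic(err.Error())
}
defer file.Close()

params := SomeUploadParams{
	// Hypothetical field of type param.Field[io.Reader].
	File: openlayer.F[io.Reader](openlayer.FileParam(file, "dataset.csv", "text/csv")),
}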

Retries

Certain errors will be automatically retried 2 times by default, with a short exponential backoff. We retry by default all connection errors, 408 Request Timeout, 409 Conflict, 429 Rate Limit, and >=500 Internal errors.

You can use the WithMaxRetries option to configure or disable this:

// Configure the default for all requests:
client := openlayer.NewClient(
	option.WithMaxRetries(0), // default is 2
)

// Override per-request:
client.InferencePipelines.Data.Stream(
	context.TODO(),
	"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
	openlayer.InferencePipelineDataStreamParams{
		Config: openlayer.F[openlayer.InferencePipelineDataStreamParamsConfigUnion](openlayer.InferencePipelineDataStreamParamsConfigLlmData{
			InputVariableNames:   openlayer.F([]string{"user_query"}),
			OutputColumnName:     openlayer.F("output"),
			NumOfTokenColumnName: openlayer.F("tokens"),
			CostColumnName:       openlayer.F("cost"),
			TimestampColumnName:  openlayer.F("timestamp"),
		}),
		Rows: openlayer.F([]map[string]interface{}{{
			"user_query": "what is the meaning of life?",
			"output":     "42",
			"tokens":     map[string]interface{}{},
			"cost":       map[string]interface{}{},
			"timestamp":  map[string]interface{}{},
		}}),
	},
	option.WithMaxRetries(5),
)

Making custom/undocumented requests

This library is typed for convenient access to the documented API. If you need to access undocumented endpoints, params, or response properties, the library can still be used.

Undocumented endpoints

To make requests to undocumented endpoints, you can use client.Get, client.Post, and other HTTP verbs. RequestOptions on the client, such as retries, will be respected when making these requests.

var (
    // params can be an io.Reader, a []byte, an encoding/json serializable object,
    // or a "…Params" struct defined in this library.
    params map[string]interface{}

    // result can be a []byte, *http.Response, an encoding/json deserializable object,
    // or a model defined in this library.
    result *http.Response
)
err := client.Post(context.Background(), "/unspecified", params, &result)
if err != nil {
    …
}

Undocumented request params

To make requests using undocumented parameters, you may use either the option.WithQuerySet() or the option.WithJSONSet() methods.

params := FooNewParams{
    ID:   openlayer.F("id_xxxx"),
    Data: openlayer.F(FooNewParamsData{
        FirstName: openlayer.F("John"),
    }),
}
client.Foo.New(context.Background(), params, option.WithJSONSet("data.last_name", "Doe"))

Undocumented response properties

To access undocumented response properties, you may either access the raw JSON of the response as a string with result.JSON.RawJSON(), or get the raw JSON of a particular field on the result with result.JSON.Foo.Raw().

Any fields that are not present on the response struct will be saved and can be accessed by result.JSON.ExtraFields() which returns the extra fields as a map[string]Field.
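For example (Name is an illustrative modeled field, and the ExtraFields key is whatever the server happened to return):

// Entire response body as raw JSON:
raw := result.JSON.RawJSON()

// Raw JSON of one modeled field:
nameRaw := result.JSON.Name.Raw()

// A property not modeled in the struct, as in the earlier ExtraFields example:
extra := result.JSON.ExtraFields["undocumented_field"].Raw()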

Middleware

We provide option.WithMiddleware which applies the given middleware to requests.

func Logger(req *http.Request, next option.MiddlewareNext) (res *http.Response, err error) {
	// Before the request
	start := time.Now()
	LogReq(req)

	// Forward the request to the next handler
	res, err = next(req)

	// Handle stuff after the request
	end := time.Now()
	LogRes(res, err, end.Sub(start))

	return res, err
}

client := openlayer.NewClient(
	option.WithMiddleware(Logger),
)

When multiple middlewares are provided as variadic arguments, they are applied left to right. If option.WithMiddleware is given multiple times, for example first on the client and then on a method call, the middleware given on the client runs first and the middleware given on the method runs next.

You may also replace the default http.Client with option.WithHTTPClient(client). Only one http client is accepted (this overwrites any previous client) and receives requests after any middleware has been applied.
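For example, to supply a client with your own transport settings (a sketch; the 30-second timeout is arbitrary):

httpClient := &http.Client{
	Timeout: 30 * time.Second,
}

client := openlayer.NewClient(
	option.WithHTTPClient(httpClient),
)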

Semantic versioning

This package generally follows SemVer conventions, though certain backwards-incompatible changes may be released as minor versions:

  1. Changes to library internals which are technically public but not intended or documented for external use. (Please open a GitHub issue to let us know if you are relying on such internals).
  2. Changes that we do not expect to impact the vast majority of users in practice.

We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.

We are keen for your feedback; please open an issue with questions, bugs, or suggestions.

Contributing

See the contributing documentation.
