
Commit

Fix grammar and improve clarity in docs
rizerphe committed Jul 1, 2023
1 parent 4a5e17d commit 6ba3054
Showing 6 changed files with 24 additions and 29 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -14,7 +14,7 @@ pip install openai-functions

## Usage

1. Import the necessary modules and provide your api key:
1. Import the necessary modules and provide your API key:

```python
import enum
@@ -61,7 +61,7 @@ response = conversation.ask("What's the weather in San Francisco?")
# The current weather in San Francisco is 72 degrees Fahrenheit and it is sunny and windy.
```

You can read more about `Conversation`s [here](https://openai-functions.readthedocs.io/en/latest/conversation.html).
You can read more about how to use `Conversation` [here](https://openai-functions.readthedocs.io/en/latest/conversation.html).

## More barebones use - just schema generation and result parsing:

@@ -77,7 +77,7 @@ Or you could use [skills](https://openai-functions.readthedocs.io/en/latest/skil

## Another use case: data extraction

1. Import the necessary modules and provide your api key:
1. Import the necessary modules and provide your API key:

```python
from dataclasses import dataclass
8 changes: 4 additions & 4 deletions docs/conversation.md
@@ -1,6 +1,6 @@
# Conversations

For assistant-type applications, `Conversation` is the most intuitive tool. It allows you to store messages and generate new ones, either by using the AI or by calling a function you provide.
For assistant-type applications, `Conversation` is the most intuitive tool. It allows you to store messages and generate new ones, either using AI or calling a function you provide.

A conversation contains two things:

@@ -11,7 +11,7 @@ When initializing the conversation, you can pass in the list of skills and the m

## Managing messages

The main feature of a conversation is its management of messages. You can either directly access them with `conversation.messages` (which is a list of objects adhering to the [GenericMessage](openai_functions.GenericMessage) protocol - use [Message](openai_functions.Message) to create your own), or you can do these:
The main feature of a conversation is its management of messages. You can either access them directly through `conversation.messages` (a list of objects adhering to the [GenericMessage](openai_functions.GenericMessage) protocol - use [Message](openai_functions.Message) to create your own), or use these:

```python
conversation.add_message(Message("Hi there", role="user")) # "system", "user", "assistant"
@@ -27,7 +27,7 @@ conversation.clear_messages()

## Managing skills

A conversation also includes the skills - the functions the AI can call. You can either provide your skills when creating the conversation, or add skills/functions like this:
A conversation also includes the skills - the functions the AI can call. You can either provide your skills when creating the conversation or add skills/functions like this:

```python
conversation.add_skill(skill)
@@ -53,7 +53,7 @@ conversation.remove_function("my_amazing_function")
The arguments passed to `add_function` are the same as those an [OpenAIFunction](openai_functions.OpenAIFunction) inherently has:

- `save_return` - whether to send the return value of the function back to the AI; some functions - mainly those that don't return anything - don't need to do this
- `serialize` - whether to serialize the return value of the function before sending the result back to the AI; openai expects the result of a function call to be a string, so if this is set to False, the result of the function execution should be a string. Otherwise, it will use json serialization, so if `serialize` is set to True, the function return needs to be json-serializable
- `serialize` - whether to serialize the function's return value before sending the result back to the AI; OpenAI expects the result of a function call to be a string, so if this is False, the function itself should return a string. Otherwise, JSON serialization is used, so if `serialize` is set to True, the function's return value needs to be JSON-serializable
- `interpret_as_response` - whether to interpret the return value of the function (the serialized one if `serialize` is set to True) as the response from the AI, replacing the function call
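To make the `serialize` flag concrete, here is a minimal pure-Python sketch of the behavior described above - an illustration only, not the library's actual code, and `prepare_result` is a made-up name:

```python
import json


def prepare_result(result, serialize: bool) -> str:
    """Mimic turning a function's return value into the string OpenAI expects."""
    if serialize:
        # serialize=True: the return value must be JSON-serializable.
        return json.dumps(result)
    # serialize=False: the function itself must already return a string.
    return result


print(prepare_result({"temperature": 72, "unit": "fahrenheit"}, serialize=True))
# prints: {"temperature": 72, "unit": "fahrenheit"}
```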

You can read more about how to use skills [here](skills).
9 changes: 2 additions & 7 deletions docs/index.rst
@@ -1,18 +1,13 @@
.. openai-functions documentation master file, created by
sphinx-quickstart on Fri Jun 30 19:32:59 2023.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to openai-functions's documentation!
============================================


The `openai-functions` library simplifies the usage of OpenAI's function calling feature. It abstracts away the complexity of parsing function signatures and docstrings by providing developers with a clean and intuitive interface.
The ``openai-functions`` library simplifies the usage of OpenAI's function calling feature. It abstracts away the complexity of parsing function signatures and docstrings by providing developers with a clean and intuitive interface.

Where to start
--------------

Either way, you'll want to install `openai-functions`:
Either way, you'll want to install ``openai-functions``:

.. code-block:: bash
16 changes: 8 additions & 8 deletions docs/introduction.md
@@ -26,13 +26,13 @@ pip install openai-functions
Now, there are **three ways you can use this** - start with just one:

- For managing conversations, use the [conversational](#your-first-conversation) interface
- For data extraction etc, for working with just one function, use the [data extraction](extracting-data) interface
- For data extraction and similar tasks that work with just one function, use the [data extraction](extracting-data) interface
- For just generating schemas and parsing call results, nothing more, use [raw schema generation](just-generating-the-schemas) next.

(your-first-conversation)=
## Your first conversation

The easiest way to use `openai-functions` is through the [conversation](conversation) interface. For that, you first import all of the necessary modules and initialize openai with your api key:
The easiest way to use `openai-functions` is through the [conversation](conversation) interface. For that, you first import all of the necessary modules and initialize openai with your API key:

```python
import enum
@@ -48,7 +48,7 @@ Then, we can create a [conversation](openai_functions.Conversation).
conversation = Conversation()
```

A conversation contains our and the AI's messages, the functions we provide, as well as a set of methods for calling the AI with our functions. Now, we can add our functions to the conversation using the `@conversation.add_function` decorator to make them available for the AI:
A conversation contains our and the AI's messages, the functions we provide, and a set of methods for calling the AI with our functions. Now, we can add our functions to the conversation using the `@conversation.add_function` decorator to make them available for the AI:

```python
class Unit(enum.Enum):
@@ -71,20 +71,20 @@ def get_current_weather(location: str, unit: Unit = Unit.FAHRENHEIT) -> dict:
}
```

Note that the function _must_ have type annotations for all arguments, and this includes extended type annotations for lists/dictionaries (for example, `list[int]` and not just `list`), otherwise the tool won't be able to generate a schema. Our conversation is now ready for function calling. The easiest way to do so is through the `conversation.ask` method. This method will repeatedly ask the AI for a response, running function calls, until the AI responds with text to return:
Note that the function _must_ have type annotations for all arguments, including extended type annotations for lists/dictionaries (for example, `list[int]` and not just `list`); otherwise the tool won't be able to generate a schema. Our conversation is now ready for function calling. The easiest way to do so is through the `conversation.ask` method. This method will repeatedly ask the AI for a response, running function calls, until the AI responds with text to return:

```python
response = conversation.ask("What's the weather in San Francisco?")
# Should return something like:
# The current weather in San Francisco is 72 degrees Fahrenheit and it is sunny and windy.
```

The AI will probably (nobody can say for sure) then return a function call with the arguments of `{"location": "San Francisco, CA"}`, which will get translated to `get_current_weather("San Francisco, CA")`. The function response will be serialized and sent back to the AI, and a text description will be returned. You can read more about how to work with conversations [here](conversation).
The AI will probably (nobody can say for sure) then return a function call with the arguments of `{"location": "San Francisco, CA"}`, which will get translated to `get_current_weather("San Francisco, CA")`. The function response will be serialized and sent back to the AI, and the AI will return a text description. You can read more about how to work with conversations [here](conversation).

(extracting-data)=
## Extracting data

There are two common uses for function calls: assistant-type applications, which is what conversations are for, and data extraction, where you force the AI to call a specific function and to fill in the arguments. For data extraction, we have the [nlp interface](nlp_interface). It acts as a decorator, turning a function (or a class, including a dataclass) into a [wrapper](openai_functions.Wrapper) object, exposing methods for calling a function with natural language and annotating the call result with an AI response. To use it, you first import all of the necessary modules and initialize openai with your api key:
There are two common uses for function calls: assistant-type applications, which is what conversations are for, and data extraction, where you force the AI to call a specific function and fill in the arguments. We have the [nlp interface](nlp_interface) for data extraction. It acts as a decorator, turning a function (or a class, including a dataclass) into a [wrapper](openai_functions.Wrapper) object, exposing methods for calling a function with natural language and annotating the call result with an AI response. To use it, you first import all of the necessary modules and initialize openai with your API key:

```python
from dataclasses import dataclass
@@ -121,7 +121,7 @@ The tool will call the AI, telling it to call the function `Person`. It will the
(just-generating-the-schemas)=
## Just generating the schemas

If you just want to generate the schemas, you can use a [FunctionWrapper](openai_functions.FunctionWrapper):
If you want to generate the schemas, you can use a [FunctionWrapper](openai_functions.FunctionWrapper):

```python
from openai_functions import FunctionWrapper
@@ -131,7 +131,7 @@ schema = wrapper.schema
result = wrapper({"location": "San Francisco, CA"})
```

This creates an object that can both return you a schema of a function and provide the function with the properly parsed arguments. Another tool is a [FunctionSet](openai_functions.BasicFunctionSet) that allows you to aggregate multiple functions into one schema:
This creates an object that can both return you a schema of a function and provide the function with properly parsed arguments. Another tool is a [FunctionSet](openai_functions.BasicFunctionSet) that allows you to aggregate multiple functions into one schema:

```python
from openai_functions import BasicFunctionSet
12 changes: 6 additions & 6 deletions docs/skills.md
@@ -1,6 +1,6 @@
# Skills

A skill allows you to combine several functions into one object, generate schemas for all of those functions, and then call the function that the AI requests. The most basic skill is one defined with a [BasicFunctionSet](openai_functions.BasicFunctionSet) - it is just a container for functions. Here's an example of its useage:
A skill allows you to combine several functions into one object, generate schemas for all those functions, and then call the function the AI requests. The most basic skill is defined with a [BasicFunctionSet](openai_functions.BasicFunctionSet) - it is just a function container. Here's an example of its usage:

```python
skill = BasicFunctionSet()
@@ -20,7 +20,7 @@ def set_weather(location: str, weather_description: str):
schema = skill.functions_schema
```

`schema` will be a list of JSON objects, ready to be sent to OpenAI. You can then call your functions directly with the response returned from OpenAI:
`schema` will be a list of JSON objects ready to be sent to OpenAI. You can then call your functions directly with the response returned from OpenAI:

```python
weather = skill(
@@ -30,22 +30,22 @@ weather = skill(

## Union skills

A bit more advanced is a [union skillset](openai_functions.UnionSkillSet) - one that combines others. It exposes one new method:
A more advanced one is a [union skillset](openai_functions.UnionSkillSet) that combines others. It exposes one new method:

```python
union_skill.add_skill(skill)
```

It still supports everything a [BasicFunctionSet](openai_functions.BasicFunctionSet) though; it can have a few functions inherent to it while still combining the other skillsets.
It still supports everything a [BasicFunctionSet](openai_functions.BasicFunctionSet) does, though; it can have a few functions inherent to it while still combining the other skillsets.

## Developing your own

Skills are made to be extensible; they must be inherited from the [FunctionSet](openai_functions.FunctionSet) base class. You then have to provide these methods and properties:
Skills are extensible; you can build your own by inheriting from the [FunctionSet](openai_functions.FunctionSet) base class. You then have to provide these methods and properties:

- `functions_schema` - the schema of the functions; list of JSON objects
- `run_function(input_data)` - runs the function and returns the result; takes in the raw dictionary retrieved from OpenAI. It should raise [FunctionNotFoundError](openai_functions.FunctionNotFoundError) if there isn't a function with this name in the skillset

You can also inherit from the [MutableFunctionSet](openai_functions.MutableFunctionSet), which simplifies adding and removing functions from the skill greatly. Then, you have to define two additional methods:
You can also inherit from the [MutableFunctionSet](openai_functions.MutableFunctionSet), which greatly simplifies adding and removing functions from the skill. Then, you have to define two additional methods:

- `_add_function(function)` - adds an [OpenAIFunction](openai_functions.OpenAIFunction) to the skill
- `_remove_function(name)` - takes in a string and deletes the function with that name
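As a rough sketch of the shape such a skill takes - using plain Python stand-ins rather than the real base classes, so nothing below is the library's actual API - a one-function skill might look like:

```python
import json


class FunctionNotFound(Exception):
    """Stand-in for the library's FunctionNotFoundError."""


class GreetingSkill:
    """A toy skill exposing one function, mimicking the FunctionSet protocol."""

    @property
    def functions_schema(self) -> list:
        # One JSON-schema entry per function, in the format OpenAI expects.
        return [
            {
                "name": "greet",
                "description": "Greet someone by name.",
                "parameters": {
                    "type": "object",
                    "properties": {"name": {"type": "string"}},
                    "required": ["name"],
                },
            }
        ]

    def run_function(self, input_data: dict) -> str:
        # input_data is the raw function-call dictionary retrieved from OpenAI.
        if input_data["name"] != "greet":
            raise FunctionNotFound(input_data["name"])
        arguments = json.loads(input_data["arguments"])
        return f"Hello, {arguments['name']}!"


skill = GreetingSkill()
print(skill.run_function({"name": "greet", "arguments": '{"name": "Ada"}'}))
# prints: Hello, Ada!
```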
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "openai-functions"
version = "0.6.2"
version = "0.6.3"
description = "Simplifies the usage of OpenAI's function calling."
authors = ["rizerphe <44440399+rizerphe@users.noreply.github.com>"]
readme = "README.md"
