v0.3.54
github-actions committed May 17, 2024
1 parent d6b8dbc commit e1bbb7c
Showing 8 changed files with 107 additions and 401 deletions.
61 changes: 58 additions & 3 deletions README.md
@@ -65,6 +65,64 @@ for chunk in response:
print(chunk.choices[0].delta["content"], end="")
```

### Prompt Templates

If you frequently reuse the same prompt structures, the **Prompt Template** feature streamlines the process: define a prompt once and reuse it across requests, saving time and keeping your interactions consistent.

#### Creating a Prompt Template
Create your own Prompt Template in just a few steps:

- Navigate to the **Launchpad** section of your project.
- Click on the **Templates** tab.
- Hit the button to create a new Prompt Template.

From here, you can either create your custom Prompt Template or choose one of our default presets.

Within the template, you can include placeholders for dynamic content by using the `${placeholder_name}` syntax, as illustrated below:

```markdown
Summarize the following text:
"""
${text}
"""
```

In this example, we have a placeholder named `text`. To implement this through our SDK, follow this sample code:

```python
# Text you want to summarize
text_to_summarize = "This is the great tale of ... "
# Construct the message with the template
messages = [
{
"role": "user",
"template_id": TEMPLATE_ID, # Your template's ID
"params": {"text": text_to_summarize}
}
]

# `client` and `project_id` come from the client setup shown earlier
response = client.chat.completions.create(
    project_id=project_id,
    messages=messages
)
```
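
Placeholder substitution happens on the server, but the `${...}` syntax happens to match Python's built-in `string.Template`, so you can preview locally what a rendered prompt will look like. This is purely illustrative, not the SDK's implementation:

```python
from string import Template

# The same template text shown above, with a ${text} placeholder
prompt = Template('Summarize the following text:\n"""\n${text}\n"""')

preview = prompt.substitute(text="This is the great tale of ... ")
print(preview)
```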

#### Key Points for Using Prompt Templates
When using prompt templates, remember these important guidelines:
- Replace the `content` field with `template_id` and `params`.
- Both `template_id` (the unique ID of your prompt template) and `params` (a key-value object mapping your placeholders to their desired values) are required to utilize prompt templates.

Any keys in `params` not matching placeholders will be ignored. If a placeholder is omitted in `params`, it defaults to an empty string. For instance, if you provide the following message set, the `${text}` placeholder will be left empty:

```python
messages = [
{
"role": "user",
"template_id": TEMPLATE_ID,
"params": {} # No parameters provided for placeholders
}
]
```
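
These substitution rules can be mimicked locally with `string.Template` and a defaulting mapping — a sketch of the documented behavior, not the SDK's actual implementation:

```python
from collections import defaultdict
from string import Template

prompt = Template('Summarize the following text:\n"""\n${text}\n"""')

# No value supplied for ${text}: it renders as an empty string,
# and unknown keys (like "extra") are simply ignored.
rendered = prompt.substitute(defaultdict(str, {"extra": "unused"}))
print(rendered)
```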
### Optional parameters

By default, the `chat.completions` module uses the default launchpad parameters. You can also specify the following optional parameters:
@@ -74,9 +132,6 @@ By default, the `chat.completions` module uses the default launchpad parameters.
- `session_id`: A unique identifier to maintain session context, useful for tracking conversations or data across multiple requests.
- `temperature`: The temperature to use for completion. If omitted, the default launchpad temperature will be used.
- `max_tokens`: The maximum number of tokens to generate for completion. If omitted, the default launchpad max tokens will be used.
- `top_p`: The nucleus sampling probability to use for completion. If omitted, the default launchpad top p will be used.
- `frequency_penalty`: The frequency penalty to use for completion. If omitted, the default launchpad frequency penalty will be used.
- `presence_penalty`: The presence penalty to use for completion. If omitted, the default launchpad presence penalty will be used.

Example:

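The expanded example is not shown in this diff view; as a minimal sketch, a few of the optional parameters above could be combined like this (`client` and `project_id` follow the earlier snippets, and the values here are hypothetical):

```python
# Optional parameters are passed as keyword arguments alongside the
# required ones; anything omitted falls back to the launchpad defaults.
optional_params = {
    "session_id": "demo-session-1",  # hypothetical session identifier
    "temperature": 0.7,
    "max_tokens": 256,
}

# response = client.chat.completions.create(
#     project_id=project_id,
#     messages=[{"role": "user", "content": "Hello!"}],
#     **optional_params,
# )
```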
8 changes: 2 additions & 6 deletions premai/models/__init__.py
@@ -8,9 +8,6 @@
from .catch_all_error import CatchAllError
from .catch_all_error_code_enum import CatchAllErrorCodeEnum
from .chat_completion_input import ChatCompletionInput, ChatCompletionInputDict
from .chat_completion_input_logit_bias_type_0 import ChatCompletionInputLogitBiasType0
from .chat_completion_input_response_format_type_0 import ChatCompletionInputResponseFormatType0
from .chat_completion_input_tools_item import ChatCompletionInputToolsItem
from .chat_completion_response import ChatCompletionResponse
from .conflict_error import ConflictError
from .conflict_error_code_enum import ConflictErrorCodeEnum
@@ -26,6 +23,7 @@
from .feedback_create import FeedbackCreate, FeedbackCreateDict
from .feedback_create_feedback import FeedbackCreateFeedback
from .message import Message
from .message_params import MessageParams
from .message_role_enum import MessageRoleEnum
from .messages import Messages
from .messages_role_enum import MessagesRoleEnum
@@ -73,9 +71,6 @@
"CatchAllError",
"CatchAllErrorCodeEnum",
"ChatCompletionInput",
"ChatCompletionInputLogitBiasType0",
"ChatCompletionInputResponseFormatType0",
"ChatCompletionInputToolsItem",
"ChatCompletionResponse",
"ConflictError",
"ConflictErrorCodeEnum",
Expand All @@ -91,6 +86,7 @@
"FeedbackCreate",
"FeedbackCreateFeedback",
"Message",
"MessageParams",
"MessageRoleEnum",
"Messages",
"MessagesRoleEnum",