diff --git a/content/guides/box-ai/ask-questions.md b/content/guides/box-ai/ask-questions.md index 16df9a200..c76937962 100644 --- a/content/guides/box-ai/ask-questions.md +++ b/content/guides/box-ai/ask-questions.md @@ -53,7 +53,7 @@ Mandatory parameters are in **bold**. |**`items.id`** | The Box file ID you want to provide as input. | | `112233445566`| | **`items.type`** | The type of the provided input. Currently, it can be a single file or multiple files. | `file` | `file` | | `items.content` | The content of the item, often the text representation. | | `An application programming interface (API) is a way for two or more computer programs or components to communicate with each other. It is a type of software interface...` | -|`ai_agent` | The AI agent used to override the default agent configuration. You can use this parameter replace the default LLM with a custom one using the [`model`][model-param] parameter for shorter and longer texts, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before you use the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request. For specific use cases, see the [override tutorial][overrides] ||| +|`ai_agent` | The AI agent used to override the default agent configuration. You can use this parameter to replace the default LLM with a custom one using the [`model`][model-param] parameter for shorter and longer texts, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before you use the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request. For specific use cases, see the [AI model overrides tutorial][overrides]. 
|| [prereq]: g://box-ai/prerequisites [agent]: e://get_ai_agent_default diff --git a/content/guides/box-ai/extract-metadata-structured.md b/content/guides/box-ai/extract-metadata-structured.md index d8a5ba095..51f21b5c0 100644 --- a/content/guides/box-ai/extract-metadata-structured.md +++ b/content/guides/box-ai/extract-metadata-structured.md @@ -56,7 +56,7 @@ The `items` array can have exactly one element. | `fields.options` | A list of options for this field. This is most often used in combination with the `enum` and `multiSelect` field types. | `[{"key":"First Name"},{"key":"Last Name"}]` | | `fields.options.key` | A unique identifier for the field. | `First Name` | | `fields.prompt` | Additional context about the key (identifier) that may include how to find and format it. | `Name is the first and last name from the email address` | -| `ai_agent` | The AI agent used to override the default agent configuration. This parameter allows you to, for example, replace the default LLM with a custom one using the [`model`][model-param] parameter, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before using the `ai_agent` parameter for overrides, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request. For details on | | +| `ai_agent` | The AI agent used to override the default agent configuration. This parameter allows you to, for example, replace the default LLM with a custom one using the [`model`][model-param] parameter, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before you use the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request. 
For specific use cases, see the [AI model overrides tutorial][overrides]. | | ## Use case @@ -181,3 +181,4 @@ The response lists the fields included in the metadata template and their values [prompt-param]: r://ai_agent_text_gen#param_basic_gen_prompt_template [templates-console]: https://support.box.com/hc/en-us/articles/360044194033-Customizing-Metadata-Templates [templates-api]: g://metadata/templates/create +[overrides]: g://box-ai/ai-agents/overrides-tutorial \ No newline at end of file diff --git a/content/guides/box-ai/extract-metadata.md b/content/guides/box-ai/extract-metadata.md index 65ed52d59..1f62a0ea4 100644 --- a/content/guides/box-ai/extract-metadata.md +++ b/content/guides/box-ai/extract-metadata.md @@ -46,7 +46,7 @@ The `items` array can have exactly one element. | `dialogue_history.prompt` | The prompt previously provided by the client and answered by the Large Language Model (LLM). | `Make my email about public APIs sound more professional` | | `dialogue_history.answer` | The answer previously provided by the LLM. | `Here is a draft of your professional email about public APIs.` | | `dialogue_history.created_at` | The ISO date formatted timestamp of when the previous answer to the prompt was created. | `2012-12-12T10:53:43-08:00` | -|`ai_agent` | The AI agent used to override the default agent configuration. This parameter allows you to, for example, replace the default LLM with a custom one using the [`model`][model-param] parameter, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before using the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request.| | +|`ai_agent` | The AI agent used to override the default agent configuration. 
This parameter allows you to, for example, replace the default LLM with a custom one using the [`model`][model-param] parameter, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before you use the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request. For specific use cases, see the [AI model overrides tutorial][overrides].| | ## Use case @@ -155,4 +155,5 @@ In such a case, the response will be based on the keywords included in the query [prereq]: g://box-ai/prerequisites [agent]: e://get_ai_agent_default [model-param]: r://ai_agent_text_gen#param_basic_gen_model -[prompt-param]: r://ai_agent_text_gen#param_basic_gen_prompt_template \ No newline at end of file +[prompt-param]: r://ai_agent_text_gen#param_basic_gen_prompt_template +[overrides]: g://box-ai/ai-agents/overrides-tutorial \ No newline at end of file diff --git a/content/guides/box-ai/generate-text.md b/content/guides/box-ai/generate-text.md index bf5645ca7..45bb1eaf0 100644 --- a/content/guides/box-ai/generate-text.md +++ b/content/guides/box-ai/generate-text.md @@ -48,9 +48,10 @@ To make a call, you must pass the following parameters. Mandatory parameters are | `dialogue_history.prompt` | The prompt previously provided by the client and answered by the Large Language Model (LLM). | `Make my email about public APIs sound more professional` | | `dialogue_history.answer` | The answer previously provided by the LLM. | `Here is a draft of your professional email about public APIs.` | | `dialogue_history.created_at` | The ISO date formatted timestamp of when the previous answer to the prompt was created. | `2012-12-12T10:53:43-08:00` | -|`ai_agent` | The AI agent used to override the default agent configuration. 
This parameter allows you to, for example, replace the default LLM with a custom one using the [`model`][model-param] parameter, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before using the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request.| | +|`ai_agent` | The AI agent used to override the default agent configuration. This parameter allows you to, for example, replace the default LLM with a custom one using the [`model`][model-param] parameter, tweak the base [`prompt`][prompt-param] to allow for a more customized user experience, or change an LLM parameter, such as `temperature`, to make the results more or less creative. Before you use the `ai_agent` parameter, you can get the default configuration using the [`GET 2.0/ai_agent_default`][agent] request. For specific use cases, see the [AI model overrides tutorial][overrides].| | [prereq]: g://box-ai/prerequisites [agent]: e://get_ai_agent_default [model-param]: r://ai_agent_text_gen#param_basic_gen_model -[prompt-param]: r://ai_agent_text_gen#param_basic_gen_prompt_template \ No newline at end of file +[prompt-param]: r://ai_agent_text_gen#param_basic_gen_prompt_template +[overrides]: g://box-ai/ai-agents/overrides-tutorial \ No newline at end of file
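The `ai_agent` parameter rows edited above all describe the same override mechanism: replacing the default LLM via `model` and tuning an LLM parameter such as `temperature`. As a minimal sketch of what such a request body could look like (the `mode` value, model identifier, and temperature here are illustrative assumptions, not documented defaults):

```python
import json

# Sketch of a Box AI ask-style request body with an `ai_agent` override.
# Field names follow the parameter tables above; the model name and
# temperature are placeholder values for illustration only.
payload = {
    "mode": "single_item_qa",
    "prompt": "What is an API?",
    "items": [{"id": "112233445566", "type": "file"}],
    "ai_agent": {
        "type": "ai_agent_ask",
        "basic_text": {
            # Override the default LLM with a custom one (placeholder name).
            "model": "openai__gpt_3_5_turbo",
            "llm_endpoint_params": {
                "type": "openai_params",
                # A lower temperature makes the results less creative.
                "temperature": 0.2,
            },
        },
    },
}
print(json.dumps(payload, indent=2))
```

Omitting the `ai_agent` key entirely falls back to the default configuration, which you can inspect beforehand with the `GET 2.0/ai_agent_default` request mentioned in the tables.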