Merge branch 'main' into inardini--autosxs-custom-task
holtskinner authored Jul 11, 2024
2 parents f0921a1 + f49f23e commit a61de37
Showing 2 changed files with 3 additions and 3 deletions.
gemini/function-calling/intro_diy_react_agent.ipynb (4 changes: 2 additions & 2 deletions)
@@ -110,7 +110,7 @@
"\n",
"In the third example in this notebook, we leverage [Function Calling in Gemini](https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling) to build our simple agent. It lets developers create a description of a function in their code, then pass that description to a language model in a request. The response from the model includes the name of a function that matches the description and the arguments to call it with.\n",
"\n",
"There are also other tools-calling and agents building frameworks to increase developers productivity. For example, the [Tool-Calling Agents](https://python.langchain.com/v0.1/docs/modules/agents/agent_types/tool_calling/) from LangChain, and at an even higher level of abstraction, [Reasoning Engine](https://cloud.google.com/vertex-ai/generative-ai/docs/reasoning-engine/overview) is a Google Cloud managed service that helps you to build and deploy an agent reasoning framework ([See sample notebooks](https://github.com/gkcng/generative-ai/blob/gkcng-demo/gemini/reasoning-engine)). Reasoning Engine integrates closely with the Python SDK for the Gemini model in Vertex AI, and it can manage prompts, agents, and examples in a modular way. Reasoning Engine is compatible with LangChain, LlamaIndex, or other Python frameworks. "
"There are also other tools-calling and agents building frameworks to increase developers productivity. For example, the [Tool-Calling Agents](https://python.langchain.com/v0.1/docs/modules/agents/agent_types/tool_calling/) from LangChain, and at an even higher level of abstraction, [Reasoning Engine](https://cloud.google.com/vertex-ai/generative-ai/docs/reasoning-engine/overview) is a Google Cloud managed service that helps you to build and deploy an agent reasoning framework ([See sample notebooks](https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/reasoning-engine)). Reasoning Engine integrates closely with the Python SDK for the Gemini model in Vertex AI, and it can manage prompts, agents, and examples in a modular way. Reasoning Engine is compatible with LangChain, LlamaIndex, or other Python frameworks. "
]
},
{
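For context on the Function Calling flow described in the cell above, here is a minimal sketch using the Vertex AI Python SDK. It is not part of this commit; the function declaration (get_current_weather), its parameters, and the model name are illustrative assumptions, and vertexai.init() is assumed to have been called already.

```python
# Illustrative sketch of Gemini Function Calling (not part of this diff).
# Assumes vertexai.init(project=..., location=...) has already been called.
from vertexai.generative_models import FunctionDeclaration, GenerativeModel, Tool

# Describe a function so the model can decide when to call it.
get_current_weather = FunctionDeclaration(
    name="get_current_weather",
    description="Get the current weather for a given city",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string", "description": "City name"}},
        "required": ["location"],
    },
)

weather_tool = Tool(function_declarations=[get_current_weather])
model = GenerativeModel("gemini-1.5-pro", tools=[weather_tool])

response = model.generate_content("What is the weather like in Paris?")

# Instead of free-form text, the response names the function to call
# and the arguments to call it with.
function_call = response.candidates[0].content.parts[0].function_call
print(function_call.name, dict(function_call.args))
```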
@@ -1324,7 +1324,7 @@
" function_calls.append(function_call_dict)\n",
" return function_calls\n",
"```\n",
"In recent versions of specific Gemini Pro models (from May 2024 and on), Gemini has the ability to return two or more function calls in parallel (i.e., two or more function call responses within the first function call response object). Parallel function calling allows you to fan out and parallelize your API calls or other actions that you perform in your application code, so you don't have to work through each function call response and return one-by-one! Refer to the [Gemini Function Calling documentation](https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling) for more information on which Gemini model versions support parallel function calling, and this [notebook on parallel function calling](https://github.com/gkcng/generative-ai/blob/gkcng-demo/gemini/function-calling/parallel_function_calling.ipynb) for examples."
"In recent versions of specific Gemini Pro models (from May 2024 and on), Gemini has the ability to return two or more function calls in parallel (i.e., two or more function call responses within the first function call response object). Parallel function calling allows you to fan out and parallelize your API calls or other actions that you perform in your application code, so you don't have to work through each function call response and return one-by-one! Refer to the [Gemini Function Calling documentation](https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling) for more information on which Gemini model versions support parallel function calling, and this [notebook on parallel function calling](https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/function-calling/parallel_function_calling.ipynb) for examples."
]
},
{
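The code cell whose tail appears in the context lines above collects every function call from a response. A minimal sketch of such a helper is shown below; it is reconstructed from the visible lines, so the names and exact logic may differ from the notebook's actual code.

```python
# Sketch of a helper that gathers parallel function calls from one response.
# Reconstructed from the diff context above; names may differ from the notebook.
from vertexai.generative_models import GenerationResponse


def extract_function_calls(response: GenerationResponse) -> list[dict]:
    function_calls = []
    # A model that supports parallel function calling may return several
    # function calls in a single candidate; collect them all.
    for function_call in response.candidates[0].function_calls:
        function_call_dict = {function_call.name: dict(function_call.args)}
        function_calls.append(function_call_dict)
    return function_calls
```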
Second changed file (1 addition & 1 deletion)

@@ -213,7 +213,7 @@
},
"outputs": [],
"source": [
"PROJECT_ID = \"cloud_llm_preview1\" # @param {type:\"string\"}\n",
"PROJECT_ID = \"YOUR_PROJECT_ID\" # @param {type:\"string\"}\n",
"LOCATION = \"us-central1\" # @param {type:\"string\"}\n",
"\n",
"\n",

0 comments on commit a61de37
