From d27167af485370b77035cda6d073146423c35ccf Mon Sep 17 00:00:00 2001 From: Eashan Thakuria Date: Wed, 16 Oct 2024 17:56:21 -0400 Subject: [PATCH 1/4] updated docs --- docs/01-Business Overview/01-Overview.mdx | 114 +++++++ .../01-Identity_Access_Mgmt.mdx | 45 +++ .../02-Capabilities/02-GenAI_routing.mdx | 47 +++ .../02-Capabilities/03-RAG_DocumentSearch.mdx | 33 +++ .../02-Capabilities/04-Integrations.mdx | 22 ++ .../02-Capabilities/05-Governance.mdx | 45 +++ .../06-subordinate_bot_int.mdx | 18 ++ docs/01-Prepare/01-sample.mdx | 11 - docs/02-Create/01-sample.mdx | 11 - docs/03-Prepare/01-Requirements.mdx | 51 ++++ docs/03-Use Cases/01-sample.mdx | 9 - .../01-Static-Webpage.mdx | 46 +++ .../01-Watson_Discovery.mdx | 80 +++++ .../02-Watsonx_Discovery.mdx | 278 ++++++++++++++++++ .../03-Governance/01-watsonx_gov.mdx | 72 +++++ .../03-Governance/02-rag_sourcelinks.mdx | 77 +++++ .../04-Third Party Apps/01-ServiceNow.mdx | 86 ++++++ .../04-Third Party Apps/02-Workday.mdx | 24 ++ .../04-Third Party Apps/03-Genesys.mdx | 84 ++++++ .../01-assistant_custom_extension.mdx | 96 ++++++ .../02-watsonx_orchestrate.mdx | 26 ++ .../01-IBM_Security_Verify.mdx | 112 +++++++ .../03-Simulated.mdx | 64 ++++ docs/04-Create/07-GenAI_routing_create.mdx | 31 ++ docs/04-Takeaways/01-Takeaways.mdx | 8 + docs/04-Takeaways/01-sample.mdx | 11 - docs/05-Resources.mdx | 17 ++ docs/05-Resources/01-sample.mdx | 11 - docs/homepage.mdx | 9 +- 29 files changed, 1481 insertions(+), 57 deletions(-) create mode 100644 docs/01-Business Overview/01-Overview.mdx create mode 100644 docs/01-Business Overview/02-Capabilities/01-Identity_Access_Mgmt.mdx create mode 100644 docs/01-Business Overview/02-Capabilities/02-GenAI_routing.mdx create mode 100644 docs/01-Business Overview/02-Capabilities/03-RAG_DocumentSearch.mdx create mode 100644 docs/01-Business Overview/02-Capabilities/04-Integrations.mdx create mode 100644 docs/01-Business Overview/02-Capabilities/05-Governance.mdx create mode 100644 
docs/01-Business Overview/02-Capabilities/06-subordinate_bot_int.mdx delete mode 100644 docs/01-Prepare/01-sample.mdx delete mode 100644 docs/02-Create/01-sample.mdx create mode 100644 docs/03-Prepare/01-Requirements.mdx delete mode 100644 docs/03-Use Cases/01-sample.mdx create mode 100644 docs/04-Create/01-Webchat Interface/01-Static-Webpage.mdx create mode 100644 docs/04-Create/02-RAG Document Search/01-Watson_Discovery.mdx create mode 100644 docs/04-Create/02-RAG Document Search/02-Watsonx_Discovery.mdx create mode 100644 docs/04-Create/03-Governance/01-watsonx_gov.mdx create mode 100644 docs/04-Create/03-Governance/02-rag_sourcelinks.mdx create mode 100644 docs/04-Create/04-Third Party Apps/01-ServiceNow.mdx create mode 100644 docs/04-Create/04-Third Party Apps/02-Workday.mdx create mode 100755 docs/04-Create/04-Third Party Apps/03-Genesys.mdx create mode 100644 docs/04-Create/05-Subordinate Bots/01-assistant_custom_extension.mdx create mode 100644 docs/04-Create/05-Subordinate Bots/02-watsonx_orchestrate.mdx create mode 100644 docs/04-Create/06-Identity and Access Management/01-IBM_Security_Verify.mdx create mode 100644 docs/04-Create/06-Identity and Access Management/03-Simulated.mdx create mode 100644 docs/04-Create/07-GenAI_routing_create.mdx create mode 100644 docs/04-Takeaways/01-Takeaways.mdx delete mode 100644 docs/04-Takeaways/01-sample.mdx create mode 100644 docs/05-Resources.mdx delete mode 100644 docs/05-Resources/01-sample.mdx diff --git a/docs/01-Business Overview/01-Overview.mdx b/docs/01-Business Overview/01-Overview.mdx new file mode 100644 index 0000000..b3a2995 --- /dev/null +++ b/docs/01-Business Overview/01-Overview.mdx @@ -0,0 +1,114 @@ +--- +title: Overview +sidebar_position: 1 +description: sample page +custom_edit_url: null +--- + +
+The goal of a solution doc is to outline a clear and actionable plan for implementing a unified agent that effectively
+leverages generative AI to route conversations to the appropriate actions and integrations. This document serves as a blueprint for developers
+and project managers by providing a detailed roadmap for the overall implementation and configuration of various generative AI capabilities and third-party integrations,
+as well as outlining the necessary infrastructure and workflows.
+
+This solution doc will address the different use cases for this solution, as well as outline the necessary requirements for building out each solution component to ultimately
+demonstrate a successful implementation of the unified agent with generative AI capabilities.
+
+### Business Statement
+- A plethora of siloed internal chatbots is posing challenges to an enterprise's operational, compliance, and enterprise architecture standards.
+- Employees are frustrated because they have to locate the right chatbot to answer their specific questions.
+- The Enterprise Architecture team is frustrated because lines of business will launch their own chatbots to fill their teams’ specific needs.
+- Compliance teams are frustrated because there is no single point of oversight to ensure that chatbot responses are grounded on vetted information and that sensitive material remains protected.
+
+### Challenges
+- **User Experience challenges:**
+  - **Time to information:** Information is difficult to find in a timely manner, as employees need to first know which chatbot to leverage for their particular question.
+  - **Limited Access:** Users can only leverage the chatbots they are aware of. Many employees will go without the conversational search a chatbot provides unless they both know about it and have access to it.
+  - **Limited capabilities and integration:** Very few chatbots are currently designed to do more than provide information.
Employees need to take extra steps to act on the insight (like opening tickets, connecting with support, etc.)
+- **Cost and Operational challenges:**
+  - **Redundant efforts:** Lines of business across the enterprise are duplicating efforts by creating redundant chatbots.
+  - **Scalability:** Without a unified approach, every new chatbot that is added only magnifies the existing user experience and information-risk challenges.
+- **Information Challenges:**
+  - **Veracity Risk:** How can Enterprise Architecture and Compliance teams ensure that the many chatbots across each line of business are providing employees information grounded on vetted source material?
+  - **Security and Sensitivity Risk:** How can Enterprise Architecture and Compliance teams ensure that sensitive material remains protected across specified clearance levels?
+- **Enterprise-level governance:** Enterprise Architecture lacks a single view of all LLMs deployed in the organization.
+
+### Desired Outcomes
+- A platform that can unite across lines of business
+- Flexibility to connect with existing systems as well as extend to future additions.
+- Enable users to take actions (like opening tickets, changing passwords, etc.)
+- Enable users to get answers to questions grounded on pre-approved source material
+- Governed access to sensitive content (only allow access to content the user is cleared for)
+- Delegate specified questions to vetted chatbots that may already exist across the enterprise
+
+### Expected Benefits to the Business
+- A More Informed Workforce: less noise per inquiry, less time to answers
+- Cost and Operational Efficiency Gains: reduced duplicated effort and a scalable framework
+- Enhanced Compliance and Governance: a unified framework enables clearer centralized oversight, including models deployed, model health, source information, and governed access to sensitive material
+
+
+## Solution Components
+The objective of this solution is to demonstrate a chatbot that can orchestrate conversations to the
+appropriate channels and/or 3rd-party applications while leveraging generative AI technologies. This unified agent solution revolves around six key capabilities:
+
+**Core Products:** watsonx Orchestrate, IBM Cloud Object Storage
+
+| Capability | IBM Product |
+| -------- | ------- |
+| Identity Access Management | IBM Security Verify |
+| Generative AI-Driven Conversational Routing | watsonx.gov or watsonx.ai, Watson Machine Learning |
+| RAG Document Search | watsonx Discovery, watsonx.ai, Watson Machine Learning |
+| 3rd-Party Application Integrations | watsonx Orchestrate |
+| Governance | watsonx.gov |
+| Subordinate Bot Integration | watsonx Orchestrate Assistant |
+
+
+
+---
+
+![Solution Components](https://media.github.ibm.com/user/386696/files/275d5fa0-c2a4-416c-861d-b9c3c0dc83c0)
+
+----
+### **Web-Chat Interface**
+* Leveraging IBM Cloud Object Storage to host a static website for the chatbot.
+  * Configuration Steps [here](/Create/Webchat%20Interface/Static-Webpage)
+
+----
+### **Identity and Access Management (IAM)**
+
+Security can be demonstrated in one of two ways:
+
+1. **Security Verify SSO**
+   * Configuration Steps [here](/Create/Identity%20and%20Access%20Management/IBM_Security_Verify)
+
+2. **Simulation:**

+  * Leverage stored user information (e.g. username, password, role, access, etc.) in a data structure within the Assistant Builder.
+  * Configuration Steps [here](/Create/Identity%20and%20Access%20Management/Simulated)
+----
+### **Conversational Engine**
+* Built within the watsonx Orchestrate Assistant Builder
+* A generative AI layer routes conversations to the right actions based on the logged-in user's access/role.
+  * GenAI Routing: Configuration Steps [here](/Create/GenAI_routing_create)
+----
+### **Integrations**
+#### **RAG Document Search**
+* Leverage native assistant builder extensions to integrate with watsonx Discovery or Watson Discovery
+  * Watsonx Discovery Configuration Steps [here](/Create/RAG%20Document%20Search/Watsonx_Discovery)
+  * Watson Discovery Configuration Steps [here](/Create/RAG%20Document%20Search/Watson_Discovery)
+
+#### **Governance**
+* Leverage native assistant builder extensions to integrate with watsonx.gov
+* Configuration Steps [here](/Create/Governance/watsonx_gov)
+
+#### **3rd-Party Applications**
+* Leverage native assistant builder extensions to integrate with Genesys.
+  * Genesys Configuration Steps [here](/Create/Third%20Party%20Apps/Genesys)
+* Leverage watsonx Orchestrate skills to integrate with ServiceNow and Workday
+  * ServiceNow Configuration Steps [here](/Create/Third%20Party%20Apps/ServiceNow)
+  * Workday Configuration Steps [here](/Create/Third%20Party%20Apps/Workday)
+
+#### **Subordinate Bot**
+* Leverage an assistant custom extension or watsonx Orchestrate skills to hand off conversations to subordinate assistants
+  * Assistant Custom Extension Configuration Steps [here](/Create/Subordinate%20Bots/assistant_custom_extension)
+  * watsonx Orchestrate Skill Flow Configuration Steps [here](/Create/Subordinate%20Bots/watsonx_orchestrate)
+----
+### **Data Repository**
+* Leverage Cloud Object Storage to store documents relevant to the use case
diff --git a/docs/01-Business Overview/02-Capabilities/01-Identity_Access_Mgmt.mdx b/docs/01-Business Overview/02-Capabilities/01-Identity_Access_Mgmt.mdx
new file mode 100644
index 0000000..2fbeb18
--- /dev/null
+++ b/docs/01-Business Overview/02-Capabilities/01-Identity_Access_Mgmt.mdx
@@ -0,0 +1,45 @@
+---
+title: Identity and Access Management
+sidebar_position: 1
+description: sample page
+custom_edit_url: null
+---
+
+## Overview
+
+It is essential to consider the security and access control aspects of the user interface within the agent. Implementing user login functionality allows users to securely authenticate themselves, enabling them to access specific actions and features based on their access rights. By integrating user access information with the chatbot, the system can ensure that users can only interact with and modify data they are authorized to access. This not only enhances the overall security of the system but also provides a more personalized and controlled user experience.
+
+For instance, a user with administrative privileges may have access to change their own and others' passwords, while a regular user will be limited to basic functions.
+This approach ensures that users are only exposed to the features and actions they are qualified to perform, reducing the risk of errors and improving the usability of the chatbot.
+
+### **Examples**
+
+For this solution there was a focus on four simulated personas.
+
+![User Examples](https://media.github.ibm.com/user/386696/files/f08aec9d-0c3a-4984-8903-8f499ec14a42)
+
+**Admin**: Has the ability to change passwords
+
+**Manager**: Has the ability to file short-term disability requests for oneself and others
+
+**Employees**: Have the ability to search the corpus for answers and open tickets to change passwords or open short-term disability requests
+
+**Fixed-income access**: Has the ability to search documents pertaining to fixed-income reports
+
+**Real-estate access**: Has the ability to search documents pertaining to real-estate reports
+
+## Solution Implementation
+
+
+### **Method 1: Simulation**
+Simulating user login within the watsonx Orchestrate Assistant Builder can be achieved by storing simulated individuals in a data structure. This data structure, typically a list or map, stores information about each simulated user, such as their name, access level, and other relevant details. By initializing this data structure with a set of simulated users, you can create a realistic login scenario where the system checks user credentials and grants or denies access accordingly.
+
+Storing simulated individuals in a data structure within the Assistant Builder allows you to test the login functionality and ensure that the system behaves as expected, helping identify potential issues early in the development process. Additionally, you can customize the simulated users' data to represent different user types, enabling you to test the system's access control mechanisms and ensure that users are only granted access to the appropriate actions and data.
+
+By using simulated users in this manner, you can thoroughly test the user login functionality and ensure that the unified agent with generative AI capabilities is secure, reliable, and user-friendly.
+
+[Implementation Guide Here](/Create/Identity%20and%20Access%20Management/Simulated)
+### **Method 2: IBM Security Verify**
+Leverage the IBM Security Verify API to carry out different actions like authenticating user logins and resetting passwords.
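As a concrete illustration of the Method 1 data structure, a minimal sketch is shown below. All usernames, passwords, roles, and permission names are hypothetical, and in the Assistant Builder this data would typically live in session variables rather than a script:

```javascript
// Hypothetical simulated-user store (demo data only -- never store real
// passwords in plain text in a production system).
const simulatedUsers = [
  { username: "asmith", password: "demo-only", role: "admin", access: ["fixed-income"] },
  { username: "bjones", password: "demo-only", role: "manager", access: ["real-estate"] },
  { username: "cdoe", password: "demo-only", role: "employee", access: [] },
];

// Return the matching user record, or null if the credentials don't match.
function authenticate(username, password) {
  const user = simulatedUsers.find(
    (u) => u.username === username && u.password === password
  );
  return user ?? null;
}

// Illustrative permission map used to gate actions on the logged-in user's role.
const permissionsByRole = {
  admin: ["change_password_any"],
  manager: ["file_std_claim_any"],
  employee: ["search_corpus", "open_ticket"],
};

function canPerform(user, action) {
  return (permissionsByRole[user.role] ?? []).includes(action);
}
```

Initializing the store with several roles makes it easy to exercise the access-control paths described in Method 1 before wiring in a real identity provider.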
+ +[Implementation Guide Here](/Create/Identity%20and%20Access%20Management/IBM_Security_Verify) \ No newline at end of file diff --git a/docs/01-Business Overview/02-Capabilities/02-GenAI_routing.mdx b/docs/01-Business Overview/02-Capabilities/02-GenAI_routing.mdx new file mode 100644 index 0000000..b23e3c6 --- /dev/null +++ b/docs/01-Business Overview/02-Capabilities/02-GenAI_routing.mdx @@ -0,0 +1,47 @@ +--- +title: GenAI Routing +sidebar_position: 2 +description: sample page +custom_edit_url: null +--- +## Overview +
+Leveraging generative AI to classify user prompts into different groups can significantly improve the routing of conversations to the right actions
+within an assistant builder.
+
+This approach not only streamlines the conversation routing process but also enhances the overall user experience by providing more accurate and relevant responses.
+For example, if a user inputs "I need to reset my password," the system can classify the prompt as an "Action" intent and route the conversation to the relevant action or workflow.
+Similarly, if a user inputs "What are the password policies?", the system can classify the prompt as a "Query" intent and route the conversation to the appropriate knowledge base action or workflow.
+
+## Solution Implementation
+
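At its core, the routing described above amounts to classifying a prompt and dispatching on the resulting label. A minimal sketch follows, with a keyword stub standing in for the LLM classifier; the labels and destination names are illustrative only:

```javascript
// Stand-in for the LLM classifier; the real solution would call a
// watsonx.ai model with a classification prompt instead of matching keywords.
function classifyIntent(prompt) {
  const actionHints = ["reset", "change", "open", "file"];
  const text = prompt.toLowerCase();
  return actionHints.some((hint) => text.includes(hint)) ? "Action" : "Query";
}

// Dispatch the conversation based on the classifier's label.
function routeConversation(prompt) {
  switch (classifyIntent(prompt)) {
    case "Action":
      return "action-workflow"; // e.g. the password-reset action
    default:
      return "knowledge-base-search"; // e.g. a policy question
  }
}
```

The same classify-then-dispatch shape extends to the other routing decisions in this solution (Financial vs. Non-Financial, Individual vs. Other) by swapping the label set.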
+
+### **Action vs. Query**
+Generative AI is leveraged within the Assistant to determine whether a user's request is an "Action", such as changing a password, or a request to
+"Query" a knowledge base for an answer.
+
+For example, if a user inputs "change my password," the system can classify the request as an "Action" and route the conversation to the relevant action
+or workflow. Similarly, if a user inputs "What are the different password policies?" the system will classify the request as a "Query" and route the
+conversation to the appropriate knowledge base action or workflow, where the AI model can generate an accurate and relevant answer.
+
+By accurately determining the nature of a user's request, the system can provide more targeted and relevant responses, improving the overall user
+experience. Additionally, generative AI can be used to generate tailored responses for each action type, ensuring that users receive accurate and
+engaging information for both action requests and knowledge base queries.
+
+### **Financial vs. Non-Financial Query**
+Generative AI is leveraged within the Assistant to determine whether a user's query is related to financial reports. If it is, the Large Language Model
+classifies the request as "Financial"; otherwise it classifies the request as "None".
+
+For example, if a user inputs "How did the mortgage rates change from the previous year to the fourth quarter of 2023," the system will classify the request as "Financial" and route the conversation to the relevant action
+or workflow to search against a financial-focused corpus. Similarly, if a user inputs "What are the different password policies?" the system will classify the request as "None" and route the
+conversation to the appropriate knowledge base action or workflow where an AI model can generate an accurate and relevant answer.
+
+### **Individual Request vs. Request for Another**
+Generative AI is leveraged within the Assistant to determine whether a user's query is for the individual making the request or for someone else. If the query is for the individual
+user, the Large Language Model classifies the request as "Individual"; if it is a request on behalf of another individual, it classifies the request as "Other".
+
+For example, if a user inputs "I would like to change my password" the system will classify the request as "Individual" and route the conversation to the relevant action
+or workflow. Similarly, if a user inputs "I would like to file a short-term disability claim for Robert" the system will classify the request as "Other" and route the
+conversation to the appropriate action or workflow to ensure the right users can carry out specific actions.
+
+[Implementation Guide Here](/Create/GenAI_routing_create)
\ No newline at end of file
diff --git a/docs/01-Business Overview/02-Capabilities/03-RAG_DocumentSearch.mdx b/docs/01-Business Overview/02-Capabilities/03-RAG_DocumentSearch.mdx
new file mode 100644
index 0000000..614baac
--- /dev/null
+++ b/docs/01-Business Overview/02-Capabilities/03-RAG_DocumentSearch.mdx
@@ -0,0 +1,33 @@
+---
+title: RAG Document Search
+sidebar_position: 3
+description: sample page
+custom_edit_url: null
+---
+## Overview
+
+
+By leveraging a RAG pipeline to help users query a given knowledge base corpus, the Assistant can provide a more reliable and accurate knowledge base search experience.
+This not only enhances the overall user experience but also ensures that users receive the most relevant and up-to-date information possible by providing source links with the provided answers.
+
+A RAG pipeline for Document Search usually consists of a Data Repository, a Vector Database and a Large Language Model. This pipeline can be carried out as one of two patterns.
+
+## Solution Implementation
+
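Whichever pattern is chosen, the pipeline reduces to the same three steps: retrieve relevant passages, assemble them into a grounded prompt, and call an LLM. A schematic sketch, where `searchIndex` and `callModel` are placeholders for the Discovery/Elasticsearch search and the watsonx.ai inference call:

```javascript
// Schematic RAG flow: retrieve passages, build a grounded prompt, generate.
// `searchIndex` and `callModel` are placeholder callbacks, not real APIs.
function ragAnswer(query, searchIndex, callModel) {
  const passages = searchIndex(query); // vector or keyword search
  const context = passages
    .map((p, i) => `[${i + 1}] ${p.text} (source: ${p.url})`)
    .join("\n");
  const prompt =
    "Answer the question using only the passages below, and cite sources.\n\n" +
    context +
    "\n\nQuestion: " + query + "\nAnswer:";
  return callModel(prompt); // grounded answer, with source links available
}
```

Keeping the source URL attached to each passage is what later makes it possible to return source links alongside the generated answer.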
+
+### **Method 1: Watson Discovery**
+
+This pattern consists of creating two integrations: Watson Discovery and watsonx.ai. Watson Discovery is used to store and carry out searches on data collections. Its native search capability is used to pass relevant passages into an LLM prompt template to generate an answer to a user's query.
+
+**Required Integrations:**
+* Watson Discovery
+* Watsonx.ai
+
+![RAG Method 1](https://media.github.ibm.com/user/386696/files/e5296158-ada0-4aa8-beee-d9b1d47b6ab6)
+
+[Implementation Guide Here](/Create/RAG%20Document%20Search/Watson_Discovery)
+### **Method 2: Watsonx Discovery with Elasticsearch**
+
+This pattern replaces Watson Discovery with watsonx Discovery (Elasticsearch) as the search and vector store, again using watsonx.ai to generate the grounded answer.
+
+![RAG Method 2](https://media.github.ibm.com/user/386696/files/635e8b89-32f7-4899-9284-86d376e54d9e)
+
+[Implementation Guide Here](/Create/RAG%20Document%20Search/Watsonx_Discovery)
diff --git a/docs/01-Business Overview/02-Capabilities/04-Integrations.mdx b/docs/01-Business Overview/02-Capabilities/04-Integrations.mdx
new file mode 100644
index 0000000..2028c72
--- /dev/null
+++ b/docs/01-Business Overview/02-Capabilities/04-Integrations.mdx
@@ -0,0 +1,22 @@
+---
+title: 3rd-Party Integrations
+sidebar_position: 4
+description: sample page
+custom_edit_url: null
+---
+## Overview
+
+ +Integrating third-party applications with a chatbot is pivotal for predefined workflows within an organization because it allows the chatbot to seamlessly interact with various systems, streamlining processes and enhancing efficiency. By leveraging existing tools and platforms, the chatbot can execute tasks such as updating databases, triggering notifications, or retrieving real-time data without manual intervention. This integration ensures that the chatbot is not just a standalone solution but becomes a central hub for workflow automation, reducing operational silos and enabling more cohesive and agile business operations. Moreover, the ability to connect with third-party services empowers organizations to customize and expand their workflows, making the chatbot a more flexible and powerful tool that adapts to the organization's unique needs. + +## Solution Implementation +
+
+### ServiceNow
+[Implementation Guide Here](/Create/Third%20Party%20Apps/ServiceNow)
+
+### Workday
+[Implementation Guide Here](/Create/Third%20Party%20Apps/Workday)
+
+### Genesys
+[Implementation Guide Here](/Create/Third%20Party%20Apps/Genesys)
\ No newline at end of file
diff --git a/docs/01-Business Overview/02-Capabilities/05-Governance.mdx b/docs/01-Business Overview/02-Capabilities/05-Governance.mdx
new file mode 100644
index 0000000..45d4c79
--- /dev/null
+++ b/docs/01-Business Overview/02-Capabilities/05-Governance.mdx
@@ -0,0 +1,45 @@
+---
+title: Governance
+sidebar_position: 5
+description: sample page
+custom_edit_url: null
+---
+## Overview
+
+
+Implementing watsonx governance mechanisms for AI activities is essential to effectively manage the complexities and risks inherent in AI model deployments.
+
+Establishing clear guidelines, processes, and oversight mechanisms helps mitigate risks such as data bias and model drift, and enables continuous monitoring of models in real time.
+
+Moreover, governance fosters transparency and accountability by documenting the AI model development process, data sources, and decision criteria. This transparency builds trust among stakeholders and also streamlines the process of building AI solutions.
+
+Additionally, governance frameworks enable organizations to optimize resource allocation by prioritizing AI initiatives that deliver tangible business value. By providing oversight and guidance, governance ensures that resources are deployed efficiently and effectively.
+
+Overall, governance is essential for organizations to responsibly harness the power of AI while mitigating risks and ensuring compliance. IBM® watsonx.governance™ offers a comprehensive framework for organizations to establish transparency, accountability, and compliance in their AI initiatives, enabling them to unlock the full potential of AI technologies while minimizing associated risks.
+
+
+## Solution Implementation
+
+
+### watsonx.gov
+![Watsonx.gov architecture](https://media.github.ibm.com/user/195534/files/a4b5c77a-8fdd-4ece-9334-bb407aec47e8)
+
+
+Utilizing IBM® watsonx.governance™ offers a streamlined solution for the development, evaluation, deployment, and monitoring of AI models, ensuring optimal performance and compliance throughout the model lifecycle.
+
+The process begins with the AI Engineer writing prompts using the Prompt Lab interface, leveraging various models to achieve the desired outcomes. These prompts undergo continuous evaluation using watsonx.gov's evaluation features, leveraging pre-created evaluation datasets to assess model performance and effectiveness. Iterative cycles of prompt refinement and evaluation continue until the desired outcomes and evaluation metrics are achieved.
+
+Once a model meets the established evaluation metrics, it is seamlessly integrated into the model inventory within the governance framework. Here, the model's lifecycle is tracked, and its metadata is captured in AI Factsheets, providing comprehensive documentation and transparency.
+
+Upon approval within the model inventory, the model progresses to deployment status (Test, Pre-prod & Prod). Depending on its designated status, the model is deployed into Watson Machine Learning (WML) deployment spaces, including test, pre-production, and production environments. This ensures a controlled and systematic rollout of AI models across operational stages.
+
+Continuous monitoring is a key aspect of the solution, with deployed models being continuously monitored for incoming data and evaluated against predefined metrics such as model health and drift. Any deviations from established thresholds trigger automatic alerts, enabling proactive intervention and maintenance of model performance and integrity.
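The alerting behavior described above boils down to comparing incoming metrics against configured limits. A generic sketch of that check, where the metric names and threshold values are purely illustrative and are not watsonx.governance APIs:

```javascript
// Illustrative thresholds; real deployments configure these limits in the
// monitoring tool rather than in application code.
const thresholds = { drift: 0.1, healthErrorRate: 0.05 };

// Compare a deployment's latest metrics to the thresholds and return any
// alert messages (an empty array means the model is within limits).
function checkDeployment(metrics) {
  const alerts = [];
  for (const [name, limit] of Object.entries(thresholds)) {
    if (metrics[name] !== undefined && metrics[name] > limit) {
      alerts.push(`${name} ${metrics[name]} exceeds threshold ${limit}`);
    }
  }
  return alerts;
}
```

In the actual solution this evaluation runs continuously inside the monitoring service, with breaches surfacing as automatic alerts rather than return values.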
+
+In summary, the implementation of IBM® watsonx.governance™ provides a comprehensive and seamless solution for the development, evaluation, deployment, and monitoring of AI models, ensuring compliance, transparency, and optimal performance throughout the AI lifecycle.
+
+[Implementation Guide Here](/Create/Governance/watsonx_gov)
+
+### RAG Source Links
+For every user query that involves RAG Document Search, a source link will be provided with each response to ensure transparency by clearly indicating the data corpus from which the answer was derived.
+
+[Implementation Guide Here](/Create/Governance/rag_sourcelinks)
diff --git a/docs/01-Business Overview/02-Capabilities/06-subordinate_bot_int.mdx b/docs/01-Business Overview/02-Capabilities/06-subordinate_bot_int.mdx
new file mode 100644
index 0000000..b22df88
--- /dev/null
+++ b/docs/01-Business Overview/02-Capabilities/06-subordinate_bot_int.mdx
@@ -0,0 +1,18 @@
+---
+title: Subordinate Bot Integration
+sidebar_position: 6
+description: sample page
+custom_edit_url: null
+---
+
+## Overview
+
+
+One aspect of creating a unifying platform is the ability to connect and interface with existing chatbots, which may have their own data corpora or access policies. The parent or unifying agent should be able to hand off requests to the relevant chatbots.
+For example, if there is an existing chatbot that has access to a financial analyst reports data source, the parent/unifying bot should be able to hand off questions to that chatbot's domain and return the response to the user through the parent bot.
+
+## Solution Implementation
+
+To implement this capability, another assistant is required, built either with watsonx Assistant or within the assistant builder in watsonx Orchestrate. There are two ways to configure this solution:
+1. [Assistant Custom Extension](/Create/Subordinate%20Bots/assistant_custom_extension)
+2. [watsonx Orchestrate Skill Flow](/Create/Subordinate%20Bots/watsonx_orchestrate)
\ No newline at end of file
diff --git a/docs/01-Prepare/01-sample.mdx b/docs/01-Prepare/01-sample.mdx
deleted file mode 100644
index 0d9845c..0000000
--- a/docs/01-Prepare/01-sample.mdx
+++ /dev/null
@@ -1,11 +0,0 @@
----
-title: Sample Page
-description: sample page
-custom_edit_url: null
----
-
-# Sample Page
-
-- Staging work
-- Setup
-- Pre-requisites
\ No newline at end of file
diff --git a/docs/02-Create/01-sample.mdx b/docs/02-Create/01-sample.mdx
deleted file mode 100644
index dea9742..0000000
--- a/docs/02-Create/01-sample.mdx
+++ /dev/null
@@ -1,11 +0,0 @@
----
-title: Sample Page
-description: sample page
-custom_edit_url: null
----
-
-# Sample Page
-
-Details regarding the technical solution. What did you do and how you did it.
-
-i.e. Deployment, installment, upgrade, configuration, automation.
diff --git a/docs/03-Prepare/01-Requirements.mdx b/docs/03-Prepare/01-Requirements.mdx
new file mode 100644
index 0000000..48435af
--- /dev/null
+++ b/docs/03-Prepare/01-Requirements.mdx
@@ -0,0 +1,51 @@
+---
+title: Software Requirements
+sidebar_position: 2
+description: Software requirements for this POC
+custom_edit_url: null
+---
+
+# Prepare
+
+## **Web-chat Interface**
+- *Watsonx Orchestrate*
+  - Dedicated WxO Tenant with Admin access
+- *IBM Cloud Object Storage*: Lite Plan
+
+## **RAG Document Search**
+For RAG Document Search you can leverage two different methods:
+- Watson Discovery
+- Watsonx Discovery
+
+
+### Watson Discovery
+- *Watson Discovery*: Plus Plan
+- *Watsonx.ai*
+  - *Watson Machine Learning*: Standard Plan
+  - *Watsonx Studio*: Lite Plan
+
+### Watsonx Discovery
+- *Watsonx Discovery*: Platinum Plan (likely requires more than 16 GB of RAM)
+- *Watsonx.ai*
+  - *Watson Machine Learning*: Standard Plan
+  - *Watsonx Studio*: Lite Plan
+- *IBM Cloud Object Storage*: Lite Plan
+
+## **Governance**
+- *Watsonx.ai*
+  - *Watson Machine Learning*: Standard Plan
+- *Watsonx.governance*: Essentials Plan
+
+## **Third Party Applications**
+
+### ServiceNow
+- ServiceNow developer account
+
+### Workday
+- Workday Application
+
+### Genesys
+- Genesys Cloud CX Account
+
+## **User Authentication**
+- *IBM Security Verify*
\ No newline at end of file
diff --git a/docs/03-Use Cases/01-sample.mdx b/docs/03-Use Cases/01-sample.mdx
deleted file mode 100644
index 6b8e374..0000000
--- a/docs/03-Use Cases/01-sample.mdx
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: Sample Page
-description: sample page
-custom_edit_url: null
----
-
-# Sample Page
-
-What are the use cases we delivered for the engagement? Explain the use cases in detail.
diff --git a/docs/04-Create/01-Webchat Interface/01-Static-Webpage.mdx b/docs/04-Create/01-Webchat Interface/01-Static-Webpage.mdx new file mode 100644 index 0000000..343ad34 --- /dev/null +++ b/docs/04-Create/01-Webchat Interface/01-Static-Webpage.mdx @@ -0,0 +1,46 @@ +--- +title: Static-webpage +sidebar_position: 1 +description: how to create static website for chatbot +custom_edit_url: null +--- + +# Static Webpage Integration + +## Overview +
+
+This doc will go through how to build an externally accessible webpage with the embedded assistant webchat for public access. This webpage is created from a static HTML page hosted within a Cloud Object Storage bucket.
+
+**Software Requirements:**
+* IBM Cloud Object Storage - Lite
+* Watsonx Orchestrate or Watsonx Assistant
+
+## Build Walkthrough
+
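For orientation, the embed snippet copied in step 4 of the walkthrough below generally has the following shape. The `integrationID`, `region`, and `serviceInstanceID` values are placeholders — always copy the actual snippet from your assistant's "Embed" tab rather than this sketch:

```javascript
// Placeholder values -- use the snippet generated in the "Embed" tab.
window.watsonAssistantChatOptions = {
  integrationID: "YOUR_INTEGRATION_ID",
  region: "us-south",                            // your service region
  serviceInstanceID: "YOUR_SERVICE_INSTANCE_ID",
  showRestartButton: true,                       // optional (see step 5)
  onLoad: async (instance) => { await instance.render(); },
};
setTimeout(function () {
  const t = document.createElement("script");
  t.src =
    "https://web-chat.global.assistant.watson.appdomain.cloud/versions/" +
    (window.watsonAssistantChatOptions.clientVersion || "latest") +
    "/WatsonAssistantChatEntry.js";
  document.head.appendChild(t);
});
```

This script goes inside a `<script>` tag in the static HTML page that the COS bucket will serve.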
+
+### Embed Assistant Webchat into HTML file
+1. Within the Assistant Builder's sidebar, navigate to the "Integrations" section
+2. Under the "Essential channels" section select "Open" within the "Webchat" channel
+3. Select the appropriate environment and navigate to the "Embed" tab
+4. Copy the provided script and insert it into your HTML file
+5. Optional: Add `showRestartButton: true` inside `window.watsonAssistantChatOptions = {}` to display the web-chat restart button
+
+### Create Cloud Object Storage (COS) Instance
+1. Go to the dedicated IBM Cloud Account "Resource" list [here](https://cloud.ibm.com/resources) and click "Create Resource +"
+2. Search and select "Object Storage"
+3. Select "IBM Cloud" as the infrastructure and the appropriate pricing plan
+4. Name the service and click "Create"
+
+### Create a Custom COS Bucket
+1. From the "Resource" list select the newly created COS instance
+2. Click "Create a Custom Bucket"
+3. Enter a valid bucket name and select the appropriate values for "Resiliency", "Location", "Storage class", "Object Versioning" and "Immutability"
+4. Within the "Advanced configurations (optional)" section click the "Add +" for "Static website hosting"
+   COS Static website
+5. Ensure the "Public access" toggle is switched to "On"
+6. Enter the name of the target HTML file which will be used to build the desired web app
+7. Click "Save" and then click "Create bucket"
+
+### **Upload HTML file**
+1. Upload the HTML file from [here](#embed-assistant-webchat-into-html-file)
\ No newline at end of file
diff --git a/docs/04-Create/02-RAG Document Search/01-Watson_Discovery.mdx b/docs/04-Create/02-RAG Document Search/01-Watson_Discovery.mdx
new file mode 100644
index 0000000..e933df2
--- /dev/null
+++ b/docs/04-Create/02-RAG Document Search/01-Watson_Discovery.mdx
@@ -0,0 +1,80 @@
+---
+title: Watson Discovery
+sidebar_position: 1
+description: How to setup RAG with Watson Discovery
+custom_edit_url: null
+---
+
+## Overview
+
+
+:::warning
+ Prerequisite: [Watson Discovery Software Requirement](/Prepare/Requirements#watson-discovery)
+:::
+
+This build consists of three main components:
+1. [Setup Watson Discovery](#1-setup-watson-discovery)
+2. [Create Watsonx and Watson Discovery Extensions](#2-create-watsonx-and-watson-discovery-extensions)
+    1. [Create Watson Discovery custom extension](#21-create-watson-discovery-custom-extension)
+    2. [Create Watsonx custom extension](#22-create-watsonx-custom-extension)
+    3. [Integrate Watsonx Search using Watson Discovery to Assistant](#23-integrate-watsonx-search-using-watson-discovery-to-assistant)
+3. [Setup Watson Assistant](#3-setup-watson-assistant)
+
+--------
+### 1. Setup Watson Discovery
+
+1. Under "New projects", input a Project Name, select "None of the above — I’m working on a custom project", and click "Next"
+2. Select the appropriate method of upload and click "Next"
+3. Input a Collection Name
+4. Upper left Hamburger icon -> Manage Collections -> New collections
+5. Select the data source
+6. If using a web crawl, input the URL links in "Starting URLs" and click "Add" -> Finish
+--------
+### 2. Create Watsonx and Watson Discovery Extensions
+**Required Steps:**
+
+#### **2.1 Create Watson Discovery custom extension**
+1. In your assistant, navigate to the "Integrations" page.
+2. Click "Build custom extensions" -> click "Next" -> input the Extension name `Watson Discovery` -> click "Next"
+3. Download the JSON file [watson-discovery-query-openapi.json](https://github.com/watson-developer-cloud/assistant-toolkit/blob/master/integrations/extensions/starter-kits/watson-discovery/watson-discovery-query-openapi.json) and import the file into the assistant
+4. Click "Next" -> click "Finish"
+5. In the lower right corner of the Watson Discovery extension, click "Add" -> click "Add" -> click "Next"
+6. On the Authentication page, in the Authentication type dropdown, select "Basic auth"
+    1. For Username enter `apikey`
+    2. 
For password, create and copy a new API key from [API key](https://cloud.ibm.com/iam/apikeys)
+    3. For discovery_url, within IBM Cloud -> resource list -> Watson Discovery Instance -> Manage -> Credentials -> URL
+    4. Paste the URL into discovery_url and remove `https://` from the beginning of the string
+7. Click "Next", click "Finish", click "Close"
+
+- Reference: [starter kit for accessing the IBM Watson Discovery v2 search API via a custom extension to IBM Watson Assistant](https://github.com/watson-developer-cloud/assistant-toolkit/tree/master/integrations/extensions/starter-kits/watson-discovery)
+
+#### **2.2 Create Watsonx custom extension**
+1. In your assistant, navigate to the Integrations page, click "Build custom extension" -> click "Next" -> input the Extension name `watsonx` -> click "Next".
+2. Download the JSON file [watsonx-openapi.json](https://github.com/watson-developer-cloud/assistant-toolkit/blob/master/integrations/extensions/starter-kits/language-model-watsonx/watsonx-openapi.json) and import the file into the assistant
+3. Click "Next" -> click "Finish"
+4. In the lower right corner of the watsonx extension, click "Add" -> click "Add" -> click "Next"
+5. On the Authentication page, in the Authentication type dropdown, select "OAuth 2.0"
+    1. For Apikey, create and copy a new API key from [API key](https://cloud.ibm.com/iam/apikeys)
+6. Click "Next", click "Finish", click "Close"
+
+#### **2.3 Integrate Watsonx Search using Watson Discovery to Assistant**
+##### Upload Actions:
+1. Download [discovery-watsonx-actions.json](https://github.com/watson-developer-cloud/assistant-toolkit/blob/master/integrations/extensions/starter-kits/language-model-conversational-search/discovery-watsonx-actions.json)
+2. Navigate to the "Actions" page and click the "Global Settings" icon in the upper right corner
+3. 
Navigate to the Upload/Download tab, drag the downloaded JSON file [discovery-watsonx-actions.json](https://github.com/watson-developer-cloud/assistant-toolkit/blob/master/integrations/extensions/starter-kits/language-model-conversational-search/discovery-watsonx-actions.json) onto the tab or click to select the file from your local system, then click "Upload" and "Upload and replace".
+4. Within the Actions page, navigate to "Actions / Variables / Created by you". Set the `discovery_project_id` and `watsonx_project_id` session variables
+   :::info
+   **Where to get credentials**
+   - **discovery_project_id**: within Watson Discovery: Upper left Hamburger icon -> Integrate and deploy -> API Information
+   - **watsonx_project_id**:
+     - Go to the [watsonx Platform](https://dataplatform.cloud.ibm.com/wx/home?context=wx)
+     - Projects (click on the project) -> Manage -> General -> Details -> Project ID
+   :::
+
+##### No action matches Setup
+1. Navigate to "All items" -> "Set by assistant" -> "No action matches".
+2. Click on the "No action matches" action and delete the existing step 1 and step 2.
+3. "New Step". In the "And then" section, select "go to a subaction" -> select "Search" in the dropdown options -> "Apply".
+4. "Save" and "Close"
+5. You're all set. Navigate to "Preview" to test the integration!
+--------
\ No newline at end of file
diff --git a/docs/04-Create/02-RAG Document Search/02-Watsonx_Discovery.mdx b/docs/04-Create/02-RAG Document Search/02-Watsonx_Discovery.mdx
new file mode 100644
index 0000000..ae9bc28
--- /dev/null
+++ b/docs/04-Create/02-RAG Document Search/02-Watsonx_Discovery.mdx
@@ -0,0 +1,278 @@
+---
+title: Watsonx Discovery
+sidebar_position: 2
+description: How to setup RAG with Watsonx Discovery
+custom_edit_url: null
+---
+
+# Watsonx Discovery
+:::warning
+ Prerequisite: [Watsonx Discovery Software Requirement](/Prepare/Requirements#watsonx-discovery)
+:::
+
+## Build setup
+
+Navigate to this [GitHub repository](https://github.ibm.com/skol-assets/watsonx-RAG-w-watsonxdiscovery-method2/)
+and download [this project file](https://github.ibm.com/skol-assets/watsonx-RAG-w-watsonxdiscovery-method2/blob/main/project/WatsonStudioProjectTemplate.zip)
+
+
+## Build Walkthrough
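The walkthrough below provisions an ELSER-enabled Elasticsearch instance, ingests documents, and deploys a RAG function. The retrieval at the core of it is a sparse-vector `text_expansion` query; as a minimal sketch (the model ID matches the query body used later in this doc, while host, credentials, and index name are placeholders):

```python
def build_elser_query(query_text: str, model_id: str = ".elser_model_1") -> dict:
    """ELSER text_expansion query body -- the same shape the native search
    extension is configured with later in this walkthrough."""
    return {
        "sort": [{"_score": "desc"}],
        "query": {
            "text_expansion": {
                "ml.tokens": {
                    "model_id": model_id,
                    "model_text": query_text,
                }
            }
        },
    }

# Usage sketch with the Python client (placeholders -- use the Databases for
# Elasticsearch hostname, port, credentials, and TLS certificate gathered below):
# from elasticsearch import Elasticsearch   # pip install elasticsearch
# es = Elasticsearch("https://HOSTNAME:PORT", basic_auth=("USER", "PASS"),
#                    ca_certs="path/to/tls.crt")
# resp = es.search(index="my-index", body=build_elser_query("reset my password"), size=3)
```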
+
+### **Create Elasticsearch Resource**
+
+Make sure to select the Platinum version with the native ELSER model, and be mindful of the RAM allocation.
+
+:::note
+We ran into issues with extremely high RAM usage and unreliability, so we provisioned an instance of Elasticsearch with 64GB of RAM, 100GB of storage, and 16 cores. This was probably overkill, and the RAM usage was still over 80%, but we didn't have the same issues with Elasticsearch again. We weren't exactly sure why this was happening, as our largest index was about 6000 documents and 56.2 MB.
+:::
+
+### **Create Watson Machine Learning Deployment Space**
+
+1. Navigate to the Deployments section of Watson Studio.
+2. Create a new deployment space.
+   - The storage service should automatically be assigned to your Cloud Object Storage service.
+   - Assign your Watson Machine Learning service to the Machine Learning Service.
+3. Once the deployment space is created, navigate to the space's Manage tab and copy the Space GUID.
+
+### **Upload Documents to COS Bucket**
+1. Navigate to the appropriate "Cloud Object Storage" resource and select "Create Bucket"
+2. Select "Create a Custom Bucket", fill out the necessary fields, and press "Create"
+3. Upload the relevant documents
+4. Create a "Content Reader" credential for the Cloud Object Storage resource
+
+### **Create an IBM Cloud API Key**
+
+Create an IBM Cloud API key in IBM Cloud [here](https://cloud.ibm.com/iam/apikeys) and save it.
+
+### **Create the Watson Studio Project**
+
+1. Download the "WatsonStudioProjectTemplate.zip" [here](https://github.ibm.com/skol-assets/watsonx-RAG-w-watsonxdiscovery-method2/tree/main/project).
+2. Navigate to the Projects section of Watson Studio and change the context from "Watson Studio" to "watsonx". You can change the context in the top right corner of the UI.
+3. 
Select `New Project` -> `Create a new project from a sample or file` and upload the zip file from [the "Build Setup"](#build-setup)
+
+#### Associate WML instance
+
+1. Navigate to the `Manage` tab, select `Associate service +`, and select the appropriate WML instance
+
+#### **Populate Parameter Set**
+
+1. Click on the `Notebook_and_Deployment_Parameters` parameter set in the project.
+2. Set the `wml_space_id` and `ibm_cloud_apikey` to the Space GUID and IBM Cloud API key, respectively.
+
+#### **Complete Connection to Databases for Elasticsearch**
+
+1. Click on the `WatsonxDiscovery` connection in the project.
+2. In a separate tab, navigate to the Databases for Elasticsearch service on IBM Cloud.
+3. Within the `Overview` tab scroll down to the `Endpoints` section and select the `HTTPS` tab.
+4. Copy the **hostname**, **port**, and **TLS certificate**.
+5. Go to the service credentials tab of the service and create a `New Credential`.
+6. Copy the username and password under `connection.https.authentication` in the new service credential JSON.
+7. Return to the Watson Studio project's `WatsonxDiscovery` connection and set the following fields:
+   - Edit the URL with the saved values for the `HOSTNAME` and `PORT` with the format of `https://{HOSTNAME}:{PORT}`.
+   - The username and password should be the ones copied from the service credentials.
+   - The SSL certificate should be the TLS certificate.
+8. Select `Test connection` in the top right corner to validate a working connection.
+
+#### **Complete Connection to Cloud Object Storage**
+1. Click on the `CloudObjectStorage` connection in the project.
+2. Set the bucket name to the name of the bucket created in the ["Upload Documents to COS Bucket" step](#upload-documents-to-cos-bucket).
+3. In the configuration tab of the bucket in Cloud Object Storage, copy the public endpoint to the Login URL field.
+4. 
Paste the Cloud Object Storage service credential you created earlier into the "Service credentials" field.
+5. Test the connection by clicking the "Test connection" button. The test should be successful.
+
+### **Run Notebooks**
+
+Once the setup is complete, the notebooks in the project can be run without errors. For each of the notebooks, make sure to insert the project token via the top-right menu in the notebook UI before running any cells. This creates a cell that connects your notebook to the project and its assets.
+
+Steps:
+
+1. Ingest documents into Elasticsearch via COS or Watson Discovery
+2. Deploy the RAG function
+
+---
+
+#### **Watson Discovery: Ingest Documents to Elasticsearch**
+
+The `1-file-ingestion-from-dis` notebook in the project handles document ingestion from Watson Discovery into Elasticsearch.
+
+1. Update the Project ID and Project Token values:
+
+   - Navigate to the "Manage" tab of the Watson Studio project and copy the "Project ID" to be used in the notebook.
+   - Navigate to the "Access control" tab and copy the existing token to be used in the notebook.
+
+2. Install the necessary Python libraries:
+   :::note
+   You only need to install the Python libraries on the first run of the notebook
+   :::
+
+   Insert a new cell and run:
+
+   ```shell
+   !pip install nltk --quiet
+   !pip install ibm_watson --quiet
+   !pip install elasticsearch --quiet
+   !pip install llama-index --quiet
+   !pip install llama_index.vector_stores.elasticsearch --quiet
+   ```
+
+3. Update the Watson Discovery credentials
+    1. On the IBM Cloud resource list, select the "Watson Discovery" instance
+    2. On the "Manage" page under "Overview", copy the "API Key" and "URL" values
+    3. Click "Launch Discovery" -> select the relevant Project -> "Manage Collections" -> click the relevant collection
+    4. Grab the Collection ID from the URL after "/collection/"
+    5. 
Under the "Integrate and Deploy" section, copy the "Project ID"
+
+---
+
+#### **COS: Ingest Documents to Elasticsearch**
+
+The `1-file-ingestion-from-cos` notebook in the project handles document ingestion from Cloud Object Storage into Elasticsearch.
+
+1. Update the Project ID and Project Token values:
+
+   - Navigate to the "Manage" tab of the Watson Studio project and copy the "Project ID" to be used in the notebook.
+   - Navigate to the "Access control" tab and copy the existing token to be used in the notebook.
+
+2. Install the necessary Python libraries:
+   :::note
+   You only need to install the Python libraries on the first run of the notebook
+   :::
+   Insert a new cell and run:
+   ```shell
+   !pip install nltk --quiet
+   !pip install ibm_watson --quiet
+   !pip install elasticsearch --quiet
+   !pip install llama-index --quiet
+   !pip install llama_index.vector_stores.elasticsearch --quiet
+   ```
+3. Update the COS credentials
+4. Create a new `index_name` for each new collection of data (if applicable)
+
+---
+
+#### **Deploy RAG Function in Watson Machine Learning**
+
+The `2-deploy-rag-function-in-wml` notebook in the project handles the deployment of a Python function that performs RAG using the Databases for Elasticsearch database and watsonx.ai. This step is not necessary if you plan to use the native search integration in watsonx Assistant. Optionally, you can test your deployment using the third notebook `3-test-rag-deployment` in the project. This notebook calls the deployment endpoint and reformats the deployment responses for better readability.
+
+1. Update the Project ID and Project Token values
+
+   - Navigate to the "Manage" tab of the Watson Studio project and copy the "Project ID" to be used in the notebook.
+   - Navigate to the "Access control" tab and copy the existing token to be used in the notebook.
+
+2. 
Install the following libraries:
+
+   ```shell
+   !pip install nltk --quiet
+   !pip install ibm_watson --quiet
+   !pip install elasticsearch --quiet
+   !pip install llama-index --quiet
+   !pip install llama_index.vector_stores.elasticsearch --quiet
+   ```
+
+3. Deploy separate functions for each index created
+   :::tip
+   Note down the `deployment_id` of the function for each index, to be leveraged for the Assistant integration
+   :::
+
+4. Run the cell under "Update project assets" to generate the OpenAPI spec for assistant integrations (if applicable)
+
+---
+
+### **Assistant Integration**
+
+Watsonx Assistant provides the query interface, using either:
+
+- [Custom Extension for RAG Deployment Configuration](#deploy-rag-function-in-watson-machine-learning)
+  :::tip
+  Useful for routing queries/requests to different collections of data
+  :::
+- Native Extension.
+  - The assets for setting up watsonx Assistant are located in the [assistant folder](./assistant/) of this repository, and are also included in the Watson Studio project for convenience.
+
+#### **Custom Extension**
+
+1. Within the Assistant Builder's sidebar, navigate to the "Integrations" section
+2. Select "Build custom extension"
+3. Upload the OpenAPI spec from step 4 of [RAG Deployment Extension Configuration](#deploy-rag-function-in-watson-machine-learning) and press "Finish"
+4. Select "Add+" within the newly configured extension
+5. Select "Next" -> "Authentication type: OAuth 2.0"
+6. Enter the API Key value from [here](#create-an-ibm-cloud-api-key)
+
+#### **Native Extension**
+
+To configure watsonx Assistant to use the native search extension, follow these steps:
+
+1. In the integrations tab on the bottom left of the watsonx Assistant user interface, select the search extension and then Elasticsearch.
+2. 
Use the Databases for Elasticsearch credentials obtained in the [Complete Connection to Databases For Elasticsearch section](#complete-connection-to-databases-for-elasticsearch) to fill out the next page.
+   - Note that "https://" should be appended before the hostname obtained from the Elasticsearch credentials.
+   - The index name should be the `es_index_name` from your project's parameter set.
+3. In the "Configure result content" section, set the Title to `file_name`, Body to `text_field`, and URL to `url`. If you modified your `es_index_text_field` in your static notebook parameters, the body should be set to the modified value.
+4. Under "Advanced Elasticsearch settings", set the query body to
+
+```json
+{
+  "sort": [
+    {
+      "_score": "desc"
+    }
+  ],
+  "query": {
+    "text_expansion": {
+      "ml.tokens": {
+        "model_id": ".elser_model_1",
+        "model_text": "$QUERY$"
+      }
+    }
+  }
+}
+```
+
+5. Enable conversational search and save the extension. Conversational search is a beta feature that you need to request access for [here](https://form.asana.com/?k=U0gIIpwhM2_LY8r8LC_qDw&d=8612789739828).
+
+Once you have finished configuring the search extension, configure the assistant's actions as follows:
+
+6. In the "No action matches" action, set the assistant to "Search for the answer".
+7. Save your action and navigate to the preview tab. Your assistant is now configured. Test it out by passing in natural language queries regarding your document collection.
+
+Please refer to [the official documentation](https://cloud.ibm.com/docs/watson-assistant?topic=watson-assistant-search-elasticsearch-add) for more information on using the native search extension.
+
+### **Assistant Integration Utility**
+
+:::warning
+Dependent Steps: [Assistant Integration](#assistant-integration)
+:::
+
+#### **Assistant Action Integration**
+
+1. Within the appropriate action step, set the "And then" option to "Use an extension" and select the appropriate Watsonx Discovery extension
+2. 
For "Operation" select "Get the predictions"
+3. For the "Parameters" set:
+   - input_data: Expression type as `[{"fields": ["Text"],"values": [[input.text]]}]`
+   - wml_deployment_id: the deployment_id from step 3 of [Custom Extension for RAG Deployment Configuration](#deploy-rag-function-in-watson-machine-learning)
+
+#### **Extract WxD Values to User**
+
+llm_response value:
+
+`${[APPROPRIATE_STEP]_result_1.body.predictions[0].llm_response}`
+
+Source links:
+
+`${[APPROPRIATE_STEP]_result_1.body.references[VALUE].metadata.file_name}`
diff --git a/docs/04-Create/03-Governance/01-watsonx_gov.mdx b/docs/04-Create/03-Governance/01-watsonx_gov.mdx
new file mode 100644
index 0000000..691f7a8
--- /dev/null
+++ b/docs/04-Create/03-Governance/01-watsonx_gov.mdx
@@ -0,0 +1,72 @@
+---
+title: watsonx.gov
+sidebar_position: 1
+description: How to configure watsonx.gov
+custom_edit_url: null
+---
+
+## Overview
+:::warning
+ Prerequisite: [watsonx.gov Software Requirement](/Prepare/Requirements#governance)
+:::
+
+This portion of the Create tab will walk you through how to create and integrate the watsonx.gov platform with the Orchestrate solutions, so that you can monitor, track, and update your models in real time and before deployment. To check out how watsonx.gov adds to the Business Value of this particular use case, visit the
+[Business Use Case](/Business%20Overview/Use%20Cases/Governance#overview) tab.
+
+---
+
+## Steps to Track Models in the watsonx.gov Platform
+
+1. Save your Prompt Lab as a Prompt Template.
+2. Within the project underneath _Assets_, click on the vertical 3 dots --> "Go to AI Factsheet" button.
+3. Once you're in your AI Factsheet, make sure to click "Track in AI use case", which will allow you to track the model's lifecycle from development to deployment.
+   - If you have not created an AI Use Case yet, please do so in order to track this model in your AI Use Case.
+4. 
Now, back in your project underneath _Assets_, click on the vertical 3 dots again --> "Promote to space" button.
+   - If you do not have a deployment space created yet, please create one on the main watsonx.ai homepage to hold all the deployed models that you create.
+   - OPTIONAL: It can be beneficial to create a Pre-Production and a Post-Production deployment for the same model, because different deployment stages unlock different metrics for showcasing the attributes of the model.
+5. Now in the Deployment Space underneath the _Assets_ tab, click again on the vertical 3 dots --> "Deploy" button.
+6. Once you have deployed, the page brings you to all of your _Deployments_; click into the model that you would like to evaluate. Next, click on the "Evaluations" header, then click the "Evaluate" button. A prerequisite for evaluation is a feedback dataset of pre-generated inputs and the ideal outputs you expect from the model; the evaluation compares the model's results against this dataset.
+   - If you do not have this dataset, an easy way to create it is to use LLMs to generate synthetic data. Or you can use one that we have provided for you:
+     - [query_actions_feedback.csv](../../../assets/governance_assets/query_actions_feedback.csv)
+     - [IndividualvsOther_feedback.csv](../../../assets/governance_assets/IndividualVsOther.csv)
+     - [IndividualvsOther_self_feedback.csv](../../../assets/governance_assets/IndividualVsOther_self.csv)
+     - [financialVsOther_ground_truth_feedback.csv](../../../assets/governance_assets/financialVsOther_ground_truth.csv)
+     - [Extract_references.csv](../../../assets/governance_assets/Extract_references.csv)
+7. 
After clicking the "Evaluate" button, upload your feedback dataset under the "Select text data" header.
+8. The model will likely take a few minutes to evaluate, but once it has, in the _Actions_ tab, feel free to adjust parameters depending on what you feel your model thresholds should be -- each use case requires a different set of thresholds.
+   - Also, note that if you set your model to Production, then you will be able to perform a Drift evaluation. This requires another dataset called the payload dataset, which is typically shorter than the feedback one but similarly contains synthetic data. Drift tracks whether or not your model degrades over time.
+9. Once you have deployed and evaluated the models on the watsonx.ai platform, you will be able to view them in OpenScale, where additional metrics regarding quality and model health are shared. Within OpenScale, you can also perform various evaluations on your models and readjust thresholds.
+
+---
+
+## Assistant Integration
+
+### Create Custom Extension
+
+1. Within the watsonx Orchestrate instance, navigate to the sidebar and go to the header "AI Assistant Builder"
+2. Once you've entered the AI Assistant Builder, navigate to the sidebar and go to the header "Integrations"
+3. Under the "Extensions" tab, click on "Build custom extension" to create a custom extension for watsonx.gov in your assistant.
+   - Give the extension a descriptive name. When it asks for the OpenAPI spec, you can either create your own, or use the one that we provide, located [here](assets/openAPI_specs/watsonx_Discovery/governance_openapi_spec.json).
+4. Once you've created the extension, add it to the assistant by clicking the "Add +" button on the Integrations homepage.
+   - Click on Next, then click OAuth authentication. 
To create an API key, go to IBM Cloud at this [link](https://cloud.ibm.com/iam/apikeys), create a new API key, and paste it into the field that asks for an API key.
+
+### Assistant Action Configuration
+
+1. To use the extension that you've just created, go to the Actions tab on the sidebar of the Assistant. Create a new action for the Assistant where you want to use the extension.
+2. Under the step where you want to use the extension, in the "And then" section, click on "Use an extension" and choose the extension that you just created. Click on the operation "Get the predictions" and be sure to click on "Apply" for the **wml_deployment_id, version, and query text** variables.
+   - To get the _wml_deployment_id_, go to your cloud environment, underneath Deployments, click on your Deployment Space, then on the right-hand side where it says "About this deployment," under "Deployment Details," copy the Deployment ID.
+   - The _query_text_ variable should be filled with the input that you want to send to the model.
+3. Congratulations, you've successfully integrated a custom extension!
+
+### Retrieve Relevant watsonx.gov Metadata
+
+1. Within the watsonx Orchestrate instance, navigate to the sidebar and go to the header "AI Assistant Builder"
+2. Once you've entered the AI Assistant Builder, navigate to the sidebar and go to the header "Assistant"
+3. Go to the action where you use the custom extension (the action that you built in the last section). Navigate to the conversation step where you added "Use an extension", then click on "New Step +" to add a step after this extension.
+4. In that new step, set a new variable to a new expression, and give the variable a short, descriptive name. 
In the expression free-text space, type in something like this:
+   ```${step_001_result_1}.body.results[0].generated_text```
+   :::note
+   - Replace step_001 with the step number of the previous step, where you called the extension. To find this step number, open that step and look at the far right of the URL; it shows the step number. Copy that step number and substitute it for step_001.
+   - Also, make sure to put the step\_#_result_1 in curly brackets.
+   :::
+5. Great, now you can use this variable in any of your "Assistant says" sections to display the model's response in the Assistant's reply.
diff --git a/docs/04-Create/03-Governance/02-rag_sourcelinks.mdx b/docs/04-Create/03-Governance/02-rag_sourcelinks.mdx
new file mode 100644
index 0000000..e2eb647
--- /dev/null
+++ b/docs/04-Create/03-Governance/02-rag_sourcelinks.mdx
@@ -0,0 +1,77 @@
+---
+title: RAG Source Links
+sidebar_position: 2
+description: How to setup document source links for RAG
+custom_edit_url: null
+---
+
+## Overview
+:::warning
+ Prerequisite: [watsonx Orchestrate Software Requirement](/Prepare/Requirements#web-chat-interface)
+:::
+
+This section goes over how to add source links for files within an Assistant. Though source links can be created in different ways, this
+document covers the following method:
+- Access via COS (Cloud Object Storage)
+
+### Access via COS (Cloud Object Storage)
+1. In the appropriate Cloud Object Storage instance, create a new "Custom Bucket"
+2. Within the new bucket, navigate to the "Permissions" tab
+3. Within the "Public Access" section, select "Content Reader" as the "Role for this bucket"
+4. Select "Create access policy" to enable public access for the bucket
+5. 
Upload the necessary documents, which will be the source for the dedicated RAG Document Search
+   :::note
+   Ensure the names of the sources are the same as the sources in the target data corpus/source (i.e. COS, Watson Discovery, etc.)
+   :::
+
+## Assistant Integration
+:::warning
+**Prerequisites**: Ensure the necessary documents have been uploaded to the appropriate COS buckets [here: Access via COS](#access-via-cos-cloud-object-storage)
+:::
+
+### Extract Source Links' Base URL
+1. Navigate to the COS bucket with all the designated source links for the RAG Document Search functionality.
+2. On any of the documents, click the three dots to the left and find the "Object public url". You may need to refresh the page if the "Object public url" is not there.
+3. From the URL, extract and copy the first part of the link up to the first "/" after the domain (ex. "https://[domain].s3.us-south.cloud-object-storage.appdomain.cloud")
+
+### Configure to existing metadata
+
+There are two ways to retrieve source links from RAG Document Search demonstrated in these docs:
+- Watson Discovery
+- Watsonx Discovery
+
+#### **Watsonx Discovery**
+1. Within the appropriate Assistant step, concatenate the link value from step 3 with the RAG Document Search metadata source URL
+2. Create a source link variable with the expression value of:
+
+   ``` "[BASE URL LINK]" + ${step_[x]_result_1}.body.references[y].metadata.file_name ```
+
+   :::note
+   - Replace "x" with the appropriate step number of the result of the watsonx discovery extension
+   - Replace "y" with 0-2 to reference source file_name 1-3
+   :::
+
+   *Reference Example:*
+
+   ``` "https://examplebucket.s3.us-south.cloud-object-storage.appdomain.cloud/" + ${step_001_result_1.body.references[0].file_name} ```
+
+3. Optional: Insert the source link variable in the output for clickable links
+
+   ``` For more information click here ```
+
+#### **Watson Discovery**
+1. 
Within the appropriate Assistant step, concatenate the link value from step 3 with the RAG Document Search metadata source URL
+2. Create a source link variable with the expression value of:
+
+   ``` "[BASE URL LINK]" + ${step_[x]_result_2}.body.results[0].metadata.source.url ```
+
+   :::note
+   - Replace "x" with the appropriate step number of the result of the Watson Discovery extension, or choose from the appropriate step variables
+   :::
+
+   *Reference Example:*
+
+   ``` "https://examplebucket.s3.us-south.cloud-object-storage.appdomain.cloud/" + ${step_001_result_2}.body.results[0].metadata.source.url ```
+3. Optional: Insert the source link variable in the output for clickable links
+
+   ``` For more information click here ```
\ No newline at end of file
diff --git a/docs/04-Create/04-Third Party Apps/01-ServiceNow.mdx b/docs/04-Create/04-Third Party Apps/01-ServiceNow.mdx
new file mode 100644
index 0000000..f8c8bdd
--- /dev/null
+++ b/docs/04-Create/04-Third Party Apps/01-ServiceNow.mdx
@@ -0,0 +1,86 @@
+---
+title: ServiceNow
+sidebar_position: 1
+description: How to configure ServiceNow to an assistant
+custom_edit_url: null
+---
+
+# ServiceNow
+
+## Setup ServiceNow Developer Instance
+
+### Create Instance
+Follow the steps [here](https://developer.servicenow.com/dev.do#!/learn/learning-plans/tokyo/new_to_servicenow/app_store_learnv2_buildmyfirstapp_tokyo_personal_developer_instances)
+
+### Acquire Relevant Instance Credentials
+1. Log in to the developer [site](https://developer.servicenow.com/dev.do)
+2. Click on the dropdown arrow near your profile in the top right corner and select "Manage Instance Password"
+3. Save the values for:
+   * Instance URL
+   * Username
+   * Password
+4. Exit. Within the "My Instance" view, select "Start Building"
+5. Select "All" in the top navigation panel, search "System OAuth", and select "Application Registry"
+6. In the top right select "New" and select "Create an OAuth API endpoint for external clients"
+7. Enter the necessary details and save the Client ID and Client Secret for later use
+
+
+## Assistant Integration
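Before wiring these credentials into a skill or extension, it can help to sanity-check them by requesting an OAuth token directly. A minimal sketch, assuming ServiceNow's standard `/oauth_token.do` token endpoint and the password grant (the instance URL and account values are placeholders for the ones saved in the steps above):

```python
import urllib.parse

TOKEN_PATH = "/oauth_token.do"  # ServiceNow's OAuth token endpoint

def token_request(instance_url, client_id, client_secret, username, password):
    """Build the URL and form-encoded body for a password-grant token request."""
    body = {
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
    }
    return instance_url.rstrip("/") + TOKEN_PATH, urllib.parse.urlencode(body)

# Usage sketch (placeholders -- values come from the steps above):
# import requests
# url, data = token_request("https://devNNNNN.service-now.com",
#                           CLIENT_ID, CLIENT_SECRET, USERNAME, PASSWORD)
# resp = requests.post(url, data=data,
#                      headers={"Content-Type": "application/x-www-form-urlencoded"})
# access_token = resp.json()["access_token"]
```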
+There are two ways in which the ServiceNow application can be integrated with the Assistant:
+* [Watsonx Orchestrate Skill](#watsonx-orchestrate-skill)
+* [Assistant Custom Extension](#assistant-custom-extension)
+
+### Watsonx Orchestrate Skill
+1. Within the watsonx Orchestrate platform, navigate to the sidebar and select "Chat"
+2. In the dropdown menu at the top, select the target Assistant environment
+3. Select "Add skills from the catalog" and select "ServiceNow"
+4. In the top right select "Connect App"
+5. Enter the values from [Acquire Relevant Instance Credentials](#acquire-relevant-instance-credentials):
+   * **Endpoint URL** -> Step 3
+   * **Username** -> Step 3
+   * **Password** -> Step 3
+   * **Client ID** -> Step 7
+   * **Client Secret** -> Step 7
+6. Select "Connect App"
+7. Navigate to the platform's sidebar and select "AI Assistant Builder"
+8. Within the "Actions" section of the builder, select "New Action+" -> "Action from skills" -> the appropriate ServiceNow skill
+
+### Assistant Custom Extension
+
+#### **Get Developer Instance Credentials and OpenAPI spec**
+
+1. Log in to the developer [site](https://developer.servicenow.com/dev.do)
+2. Click on the dropdown arrow near your profile in the top right corner and select "Manage Instance Password"
+3. Make note of the "username" and "password" values (these will be used later)
+4. Exit out of the window and select "Start Building"
+5. Press "All" in the header and search "REST API Explorer"
+6. Press "Export OpenAPI Specification (YAML/JSON)"
+
+#### **Edit ServiceNow OpenAPI spec**
+1. Open the downloaded API spec
+2. Remove the forward slash at the end of the URL string within the "servers" block
+3. Add a BasicAuth component to the OpenAPI spec (make sure each block is comma delimited):
+   ```json
+   "components":{
+     "securitySchemes": {
+       "basicAuth": {
+         "type": "http",
+         "scheme": "basic"
+       }
+     }
+   }
+   ```
+4. Save the file
+#### **Build Custom Extension**
+1. 
Within watsonx Assistant, navigate to the sidebar and select "Integrations"
+2. Select "Build Custom Extension"
+3. On the "Basic Information" page, fill out all appropriate fields and click "Next"
+4. Upload the ServiceNow OpenAPI spec, click "Next" and then "Finish"
+5. Within the extensions in Watson Assistant, click "Add+" on the newly created ServiceNow custom extension
+6. On the Authentication page, fill out the username and password fields with the values saved from step 3 of "Get Developer Instance Credentials and OpenAPI spec"
+7. Click "Next" and then "Finish"
diff --git a/docs/04-Create/04-Third Party Apps/02-Workday.mdx b/docs/04-Create/04-Third Party Apps/02-Workday.mdx
new file mode 100644
index 0000000..46532af
--- /dev/null
+++ b/docs/04-Create/04-Third Party Apps/02-Workday.mdx
@@ -0,0 +1,24 @@
+---
+title: Workday
+sidebar_position: 2
+description: How to configure Workday to an assistant
+custom_edit_url: null
+---
+
+# Workday
+
+## Assistant Integration
+
+The Workday application can be integrated with the Assistant via:
+* [Watsonx Orchestrate Skill](#watsonx-orchestrate-skill)
+
+### Watsonx Orchestrate Skill
+1. Within the watsonx orchestrate platform, navigate to the sidebar and select "Chat"
+2. In the drop-down menu at the top, select the target Assistant environment
+3. Select "Add skills from the catalog" and select "Workday"
+4. In the top right, select "Connect App"
+5. Enter the appropriate values
+6. Select "Connect App"
+7. Navigate to the platform's sidebar and select "AI Assistant Builder"
+8. Within the "Actions" section of the builder, select "New Action+" -> "Action from skills" -> the appropriate Workday skill
+
diff --git a/docs/04-Create/04-Third Party Apps/03-Genesys.mdx b/docs/04-Create/04-Third Party Apps/03-Genesys.mdx
new file mode 100755
index 0000000..ddbd244
--- /dev/null
+++ b/docs/04-Create/04-Third Party Apps/03-Genesys.mdx
@@ -0,0 +1,84 @@
+---
+title: Genesys
+sidebar_position: 3
+description: How to configure Genesys to an assistant
+custom_edit_url: null
+---
+
+# Genesys Setup
+:::warning
+    Prerequisite: [Genesys Requirement](/Prepare/Requirements#genesys)
+:::
+## 1. Set Up Genesys Web Messenger
+
+1. Sign up for a free trial of Genesys Cloud CX at [this link](https://www.genesys.com/campaign/try-genesys-cloud-for-free). The trial lasts for 14 days.
+2. Once Genesys Cloud CX is running, configure the web messenger following [these steps](https://help.mypurecloud.com/articles/configure-messenger/):
+    1. Click **Admin**.
+    2. Under **Message**, click **Messenger Configurations**.
+    3. Enter a name and description for your web messenger.
+    4. In the **Appearance** tab, select **Hide** for **Set your Launcher Button Visibility** and select **Off** for the **User Interface**.
+    5. The other settings are optional for basic setup. Once you have looked over all the options, click **Save New Version**.
+3. Once your web messenger is configured, deploy the messenger following [these steps](https://help.mypurecloud.com/articles/deploy-messenger/):
+    1. Click **Admin**.
+    2. Under **Message**, click **Messenger Deployments**.
+    3. Click **New Deployment**.
+    4. Enter a name and description.
+    5. Set the **Status** to **Active**.
+    6. Under **Select your Configuration**, click **Select Configuration** and select the version of the web messenger that you configured in the previous section.
+    7. Under **Restrict domain access**, select **Allow all domains** for testing and development purposes.
+    8. Click **Save**. The **Messenger Deployments** page now displays your deployed messenger; click it to access the deployment script, which is used in the next section to integrate with Watson Assistant.
+
+## 2. Connect Genesys to Watson Assistant
+
+:::warning
+    Prerequisite: [Webchat Interface](../01-Webchat%20Interface/01-Static-Webpage.mdx)
+
+    Note: The Genesys SDK does not work if launched from a local HTML file opened directly in your browser (using file://). It needs to be served from a server (which can be localhost) over HTTP. You can use [http-server](https://www.npmjs.com/package/http-server) if you need a simple local server for testing purposes.
+:::
+
+### Configure HTML File for Genesys
+1. Get the **script URL, environment, and deployment ID** values from the **Messenger Deployments** page on Genesys and have them ready to add to the Watson Assistant embed script.
+2. Follow the sample script, which shows how to enable the Genesys integration, at [this link](https://web-chat.global.assistant.watson.cloud.ibm.com/docs.html?to=service-desks-genesys#enabling).
+    1. The serviceDesk portion of the script enables the Genesys integration. Add those values wherever you have the Watson Assistant embed script written out, as shown in the linked sample.
+
+### Configure Bot Connector in Assistant
+1. Navigate to the Assistant sidebar and select "Integrations"
+2. Within the "Channels" section, select "Genesys Bot Connector"
+3. Select the appropriate environment
+4. Fill out the necessary credentials from the Genesys Console
+
+## 3. Using Genesys Live Agent
+
+1. Log in to your Genesys trial account and toggle the **Off Queue** option in the top right to become **On Queue**.
+2. Open your Watson Assistant that has been integrated with Genesys (served over HTTP, not opened as a local HTML file) and follow the action to connect to a live agent.
+3. Once you have clicked **Connect to Live Agent**, go back to the Genesys account; you should see an incoming message that you can accept or decline. Click **Accept**, and you can now communicate with the customer through your Watson Assistant instance.
+
+## References
+
+- [Integrating Watson Assistant with Genesys](https://cloud.ibm.com/docs/watson-assistant?topic=watson-assistant-deploy-genesys)
diff --git a/docs/04-Create/05-Subordinate Bots/01-assistant_custom_extension.mdx b/docs/04-Create/05-Subordinate Bots/01-assistant_custom_extension.mdx
new file mode 100644
index 0000000..41d678b
--- /dev/null
+++ b/docs/04-Create/05-Subordinate Bots/01-assistant_custom_extension.mdx
@@ -0,0 +1,96 @@
+---
+title: Assistant Custom Extension
+sidebar_position: 1
+description: How to set up a bot-to-bot integration with another watsonx assistant
+custom_edit_url: null
+---
+
+# watsonx Assistant Subordinate Bot
+
+This section describes how to query one watsonx assistant from another using the watsonx assistant API. It also covers how to update session variables on the subordinate bot from the calling bot.
+There are two different ways to integrate with a subordinate watsonx assistant bot:
+1. [Assistant Custom Extension](#assistant-custom-extension)
+2. [Watsonx Orchestrate Skill](#watsonx-orchestrate-skill)
+
+## Assistant Custom Extension
+
+### Generate API Key
+1. Navigate to the subordinate bot instance
+2. In the sidebar, navigate to "Assistant Settings"
+3. Under "Assistant IDs and API details", select "Generate API key" and save the key for later
+
+### Identify OpenAPI spec values
+Two values will be needed to connect to a subordinate Assistant:
+* service instance URL
+* environment ID
+
+1. Within the Assistant Builder's sidebar, navigate to "Assistant Settings"
+2. Under "Assistant IDs and API details", select "View details"
+3. Save/copy the values for:
+    * ```service instance URL```
+    * ```Draft/Live Environment ID``` (whichever is applicable)
+
+### Create the Custom Extension
+1. Download the OpenAPI spec for querying another assistant bot [here](https://github.ibm.com/ibm-client-engineering/solution-watsonx-orchestrate/tree/main/assets/openAPI_specs/otherBot)
+2. Replace the server URL at line 10 with the ```service instance URL``` from step 3 of [Identify OpenAPI spec values](#identify-openapi-spec-values)
+3. Navigate to the integrations section within the Assistant Builder sidebar
+4. Select "Build custom extension" and name the extension
+5. Upload the OpenAPI spec from step 1 and press "Finish"
+6. Select "Add+" within the newly configured extension
+7. Select "Next" and update the values:
+    * Authentication type: Basic auth
+    * Username: apikey
+    * Password: the API key saved in [Generate API Key](#generate-api-key)
+
+
+### Action Integration
+Two API calls are needed to successfully integrate this extension:
+1. [Create a session id for the subordinate bot](#create-a-session-id)
+2. [Make a dialog request to the subordinate bot](#make-a-dialog-request)
+
+#### **Create a session id**
+1. Within the appropriate action, create a step and under "And then" select "Use an extension"
+2. Select the subordinate bot extension made in [Create the Custom Extension](#create-the-custom-extension)
+3. Select the Operation as "Create Session"
+4. 
Set the Parameters to:
+    * **environment_id**: Environment ID from step 3 of [Identify OpenAPI spec values](#identify-openapi-spec-values)
+
+#### **Make a dialog request**
+1. Create a new step after the step with the "Create Session" extension call
+2. Under the "And then" section, select "Use an extension"
+3. Select the subordinate bot extension made in [Create the Custom Extension](#create-the-custom-extension)
+4. Select the Operation as "Make dialog request"
+5. Set the Parameters to:
+    * **input.text**: input.text
+    * **input.message_type**: text
+    * **useContext.skills.actions skill.skill_variables**: ```{"DEFINED_VARIABLE" : "VARIABLE_VALUE"}```
+    * **session_id**: [session_id from the previous step's "Create Session" call]
+    * **environment_id**: Environment ID from step 3 of [Identify OpenAPI spec values](#identify-openapi-spec-values)
+
+
+#### **A note about session variables**
+Session variables are an important tool for storing information within and between watsonx assistant actions. They are defined either by the user or by the assistant, and each session has a unique instance of each session variable.
+
+If you need to pass more information than just the input query to the subordinate bot, a session variable is the way to go. The session variable must already be defined on the subordinate side before its value can be updated by a dialog request. Once the variable is defined, it can be updated by including its name along with a new value in a JSON object under the **useContext.skills.actions skill.skill_variables** field.
+
+## Watsonx Orchestrate Skill
+
+This section will cover how the bot-to-bot communication OpenAPI spec was adapted for use by watsonx orchestrate. The example file can be found [here](https://github.ibm.com/ibm-client-engineering/solution-watsonx-orchestrate/tree/main/assets/openAPI_specs/otherBot/wxo_otherBot_v2_OAS.yaml). The setup instructions for the specification itself are the same as for the watsonx assistant spec.
+
+### Changes made
+Of the significant changes made to the specification, there was one functional change and one more cosmetic change.
+
+The functional change was to declare, in the specification itself, the session variable that will be changed by the watsonx orchestrate skill. Watsonx orchestrate cannot accept arbitrary expressions, such as a JSON object, as inputs. As a workaround, the session variables must be explicitly defined in the specification. Because I was feeling whimsical when I made this, the example session variable is named "jellybelly".
+
+The other change was to give all of the inputs default values in order to be able to hide the input forms that would otherwise appear in the watsonx orchestrate chat. This will not, however, allow you to hide the forms when the skill is imported into watsonx assistant. Unfortunately, those forms cannot be hidden.
+
+Another minor change is that the second step, sending the dialog request, is given an output format using the watsonx orchestrate tag "x-ibm-nl-output-template". This lets the user forgo the unwieldy table that shows up by default, although this can also be done through the watsonx orchestrate UI.
+
+### Setup
+
+In order to build a skill flow from this [OpenAPI specification](https://github.ibm.com/ibm-client-engineering/solution-watsonx-orchestrate/blob/main/assets/openAPI_specs/otherBot/wxo_otherBot_v2_OAS.yaml), import the spec and set up the authorization in the same way that you would for watsonx assistant. Then follow these steps:
+1. 
Click "Enhance this skill" for both skills in order to publish them.
+2. Click the dropdown next to "Add Skills" and select "Create a skill flow"
+3. Arrange the two skills you just imported into the skill flow, with the first skill being the "Create Session" skill and the second being "Make a dialog request w required"
+4. Complete the inputs for the two skills, and select "hide this form from the user" if you do not want to show the form when running the skill flow. Make sure that the "input.message_type" is set to "text" for the second skill.
diff --git a/docs/04-Create/05-Subordinate Bots/02-watsonx_orchestrate.mdx b/docs/04-Create/05-Subordinate Bots/02-watsonx_orchestrate.mdx
new file mode 100644
index 0000000..12f599d
--- /dev/null
+++ b/docs/04-Create/05-Subordinate Bots/02-watsonx_orchestrate.mdx
@@ -0,0 +1,26 @@
+---
+title: watsonx Orchestrate Skill Flow
+sidebar_position: 2
+description: Implementing bot communication on watsonx orchestrate
+custom_edit_url: null
+---
+
+## Overview
+This section will cover how the bot-to-bot communication OpenAPI spec was adapted for use by watsonx orchestrate. The example file can be found [here](https://github.ibm.com/ibm-client-engineering/solution-watsonx-orchestrate/tree/main/assets/openAPI_specs/otherBot/wxo_otherBot_v2_OAS.yaml). The setup instructions for the specification itself are the same as for the watsonx assistant spec.
+
+### Changes made
+Of the significant changes made to the specification, there was one functional change and one more cosmetic change.
+
+The functional change was to declare, in the specification itself, the session variable that will be changed by the watsonx orchestrate skill. Watsonx orchestrate cannot accept arbitrary expressions, such as a JSON object, as inputs. As a workaround, the session variables must be explicitly defined in the specification. Because I was feeling whimsical when I made this, the example session variable is named "jellybelly".
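
For illustration, an explicit declaration of this kind might look like the fragment below. This is an assumed sketch, not a verified excerpt of the linked spec; only the standard watsonx Assistant message fields and the "jellybelly" variable come from this solution.

```yaml
# Illustrative request-body schema fragment (assumed shape, not the exact spec).
requestBody:
  content:
    application/json:
      schema:
        type: object
        properties:
          input:
            type: object
            properties:
              message_type:
                type: string
                default: text   # a default value lets the input form be hidden
              text:
                type: string
          context:
            type: object
            properties:
              skills:
                type: object
                properties:
                  "actions skill":
                    type: object
                    properties:
                      skill_variables:
                        type: object
                        properties:
                          jellybelly:   # the explicitly declared session variable
                            type: string
```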
+
+The other change was to give all of the inputs default values in order to be able to hide the input forms that would otherwise appear in the watsonx orchestrate chat. This will not, however, allow you to hide the forms when the skill is imported into watsonx assistant. Unfortunately, those forms cannot be hidden.
+
+Another minor change is that the second step, sending the dialog request, is given an output format using the watsonx orchestrate tag "x-ibm-nl-output-template". This lets the user forgo the unwieldy table that shows up by default, although this can also be done through the watsonx orchestrate UI.
+
+### Setup
+
+In order to build a skill flow from this [OpenAPI specification](https://github.ibm.com/ibm-client-engineering/solution-watsonx-orchestrate/blob/main/assets/openAPI_specs/otherBot/wxo_otherBot_v2_OAS.yaml), import the spec and set up the authorization in the same way that you would for watsonx assistant. Then follow these steps:
+1. Click "Enhance this skill" for both skills in order to publish them.
+2. Click the dropdown next to "Add Skills" and select "Create a skill flow"
+3. Arrange the two skills you just imported into the skill flow, with the first skill being the "Create Session" skill and the second being "Make a dialog request w required"
+4. Complete the inputs for the two skills, and select "hide this form from the user" if you do not want to show the form when running the skill flow. Make sure that the "input.message_type" is set to "text" for the second skill.
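
Under the hood, the two skills in this flow correspond to the watsonx Assistant v2 session APIs (create a session, then post a message within it). The sketch below shows what the two calls look like; the base URL, environment ID, version date, and the "jellybelly" variable are placeholders, not values from this solution.

```python
import json

# Sketch of the two watsonx Assistant v2 API calls the skill flow performs.
# Authentication is basic auth with username "apikey" and the generated API key.

def session_url(base_url, environment_id, version="2023-06-15"):
    # POST here to create a session; the response body contains session_id.
    return f"{base_url}/v2/assistants/{environment_id}/sessions?version={version}"

def message_url(base_url, environment_id, session_id, version="2023-06-15"):
    # POST here to send the dialog request within that session.
    return f"{base_url}/v2/assistants/{environment_id}/sessions/{session_id}/message?version={version}"

def dialog_payload(text, session_vars=None):
    # Mirrors the inputs set on the "Make a dialog request" skill:
    # input.text, input.message_type, and any session variables passed under
    # context.skills["actions skill"].skill_variables.
    payload = {"input": {"message_type": "text", "text": text}}
    if session_vars:
        payload["context"] = {
            "skills": {"actions skill": {"skill_variables": session_vars}}
        }
    return payload

print(json.dumps(dialog_payload("hello", {"jellybelly": "grape"}), indent=2))
```

The payload builder makes explicit which fields the skill flow fills in for you; only `input.message_type` must always be `"text"`.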
diff --git a/docs/04-Create/06-Identity and Access Management/01-IBM_Security_Verify.mdx b/docs/04-Create/06-Identity and Access Management/01-IBM_Security_Verify.mdx
new file mode 100644
index 0000000..2c5322f
--- /dev/null
+++ b/docs/04-Create/06-Identity and Access Management/01-IBM_Security_Verify.mdx
@@ -0,0 +1,112 @@
+---
+title: IBM Security Verify
+sidebar_position: 1
+description: Integrating IBM Security Verify into Watsonx Assistant
+custom_edit_url: null
+---
+
+# IBM Security Verify Integration
+
+:::warning
+    Prerequisite: [IBM Security Verify Software Requirement](/Prepare/Requirements#user-authentication)
+:::
+
+This section will go over how to set up IBM Security Verify and integrate it with watsonx Assistant to verify users, authenticate using a one-time password (OTP),
+and reset passwords.
+
+## Setting up IBM Security Verify
+
+### Create Users and Groups
+1. Log in to the dedicated IBM Security Verify instance
+2. Click on the user profile in the top right corner and select "Switch to admin"
+3. In the sidebar, navigate to "Directory" -> "Users and Groups"
+4. Select "Add User+" and fill out the necessary fields.
+    :::note
+    - Provide an email you have access to for each added user.
+    - Make sure to "View extended profile" to fill out the other fields if applicable
+    :::
+5. Navigate to the "Groups" tab, select "Add Group", fill out the required fields, and assign the appropriate users.
+
+### Create API Client
+1. Log in to the dedicated IBM Security Verify instance
+2. Click on the user profile in the top right corner and select "Switch to admin"
+3. In the sidebar, navigate to "Security" -> "API Access"
+4. Select "Add API Client +"
+5. Fill out the necessary fields and save the "Client ID" and "Client Secret" values for later reference
+## Assistant Custom Extension
+
+Before jumping into the custom extension we created for IBM Security Verify, it is prudent to explore the included Postman collection for the Verify API. Because of the way that watsonx assistant handles JSON arrays (or to be more specific, doesn't handle JSON arrays), you will need to manually set some of the inputs in case they are arrays. Making API calls using the Postman collection makes this process quite a bit easier.
+
+### Identify OpenAPI spec values
+Three values will be needed to connect to IBM Security Verify:
+* Verify instance URL
+* client_id
+* client_secret
+
+The client_id and client_secret will need to be for a corresponding admin account.
+These values come from the [Create API Client](#create-api-client) step above.
+
+
+### Create the Custom Extension
+1. Download the OpenAPI spec for IBM Security Verify [here](https://github.ibm.com/ibm-client-engineering/solution-watsonx-orchestrate/tree/main/assets/openAPI_specs/IBM%20Verify/Verify_Spec_V3.json)
+2. Replace the server URL at line 10 with the ```Verify instance URL``` from [Identify OpenAPI spec values](#identify-openapi-spec-values)
+3. Navigate to the integrations section within the Assistant Builder sidebar
+4. Select "Build custom extension" and name the extension
+5. Upload the OpenAPI spec from step 1 and press "Finish"
+6. Select "Add+" within the newly configured extension
+7. Select "Next" and update the values:
+    * Authentication type: OAuth 2.0
+    * Client ID: client_id
+    * Client Secret: client_secret
+
+
+### Action Integration
+This extension has three features with four API calls in total:
+
+1. [Verify username and password](#verify-a-username-and-password)
+2. [Authenticate using an OTP](#authenticate-using-a-one-time-password-otp) (two API calls)
+3. [Reset Password](#reset-password)
+
+#### **Verify a Username and Password**
+1. Within the appropriate action, create a step and under "And then" select "Use an extension"
+2. 
Select the Verify extension made in [Create the Custom Extension](#create-the-custom-extension)
+3. Select the Operation as "verify user and pass"
+4. Set the Parameters to:
+    * **returnUserRecord**: *true* or *false* depending on whether you want the user data to be returned
+    * **Accept**: \*/\* see the note below
+    * **schemas**: *["urn:ietf:params:scim:schemas:ibm:core:2.0:AuthenticateUser"]* see the note below
+    * **userName**: The username for the user logging in
+    * **password**: The password for the user logging in
+
+#### **Authenticate using a One Time Password (OTP)**
+1. Create a new step after the previous extension step (either in the same action or another)
+2. Under the "And then" section, select "Use an extension"
+3. Select the Verify extension made in [Create the Custom Extension](#create-the-custom-extension)
+4. Select the Operation as "Create and email otp transaction"
+5. Set the Parameters to:
+    * **correlation**: any sequence of 4 numbers (these will appear to the verifying user as a sequence before a dash, followed by the actual verification code)
+    * **emailAddress**: The email address to send the verification code to. If **returnUserRecord** is set to true above, the response JSON will include a list of all of the user's associated emails.
+6. Add a step and prompt the user to input the verification code they received, excluding the correlation prefix.
+7. Add another extension call in a further step using the "Attempt email OTP verification" Operation
+8. Set the Parameters to:
+    * **trxnid**: body.id from the extension call in steps 4 and 5
+    * **otp**: The user input
+
+#### **Reset Password**
+1. Create a new step after the previous extension step (either in the same action or another)
+2. Under the "And then" section, select "Use an extension"
+3. Select the Verify extension made in [Create the Custom Extension](#create-the-custom-extension)
+4. Select the Operation as "Change user's password"
+5. 
Set the Parameters to:
+    * **userid**: The user's Verify user ID. If **returnUserRecord** is set to true above, the response JSON will include the user's userid.
+    * **content-type**: *application/scim+json* see the note below
+    * **Accept**: \*/\* see the note below
+    * **schemas**: *["urn:ietf:params:scim:api:messages:2.0:PatchOp"]* see the note below
+    * **operations**: ```[{"op":"replace","value": {"password": "new_pass", "urn:ietf:params:scim:schemas:extension:ibm:2.0:Notification":{"notifyType":"EMAIL","notifyPassword": true}}}]``` replace new_pass with the new password you would like, and if you would like to disable email notifications set "notifyPassword" to false. For clarity on why it needs to be this way, see the note below
+
+#### A note on schemas, arrays, and watsonx assistant's general quirkiness
+You may have noticed that some of the inputs to these extensions were unusual. Watsonx assistant's goal is to abstract most of the complicated technical details away from the user so that they can focus on making the chatbot flows that they need. This works well for the most part, but when the user needs to add custom extensions with more complex request or response data formats, the system starts to show its cracks. In particular, any input that involves a JSON array has to be manually typed as an expression, which is what happens for the **schema** inputs above as well as the **operations** input for the reset password action. Additionally, watsonx assistant offers no native method for accessing data in JSON arrays, so the user must determine the step at which an array is received and use that to write a custom expression to access array values.
+
+With regard to the **Accept** and **content-type** fields, for whatever reason IBM Security Verify will not work if those values are not set as described. I do not know why this is, but trust me, it is necessary. 
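
To make the **operations** expression above easier to read, here is a small sketch that builds the same JSON in Python. As in the expression itself, `new_pass` is a placeholder for the new password.

```python
import json

def password_patch_operations(new_pass, notify_by_email=True):
    # Builds the array pasted into the "operations" parameter: a single SCIM
    # PatchOp "replace" that sets the password and controls email notification.
    return [{
        "op": "replace",
        "value": {
            "password": new_pass,
            "urn:ietf:params:scim:schemas:extension:ibm:2.0:Notification": {
                "notifyType": "EMAIL",
                "notifyPassword": notify_by_email,
            },
        },
    }]

# Serialized form matches the expression shown in the Reset Password step.
print(json.dumps(password_patch_operations("new_pass")))
```

Setting `notify_by_email=False` corresponds to setting "notifyPassword" to false in the expression.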
diff --git a/docs/04-Create/06-Identity and Access Management/03-Simulated.mdx b/docs/04-Create/06-Identity and Access Management/03-Simulated.mdx
new file mode 100644
index 0000000..9aa941f
--- /dev/null
+++ b/docs/04-Create/06-Identity and Access Management/03-Simulated.mdx
@@ -0,0 +1,64 @@
+---
+title: Simulated
+sidebar_position: 2
+description: How to set up a simulated user authentication process
+custom_edit_url: null
+---
+## Overview
+
+This section will go through how to simulate User and Access Management for an Assistant. The steps consist of:
+
+1. Create a variable within the assistant platform whose initial value is a dictionary expression containing the necessary users and their metadata.
+2. Modify the "Greet Customer" action within the assistant
+
+### Variable Creation
+1. Within the "Actions" section of the Assistant, navigate to the "Created by you" variables and select "New Variable+"
+2. Create a variable name and set the variable type to "Any"
+3. Update the "initial value":
+    - Toggle the "Use Expression" to on
+    - Create the dictionary value:
+
+    Example)
+    ```json
+    [
+        {
+            "name":"martha",
+            "password":"MARTHA",
+            "access":"Admin",
+            "role":"Manager"
+        },
+        {
+            "name":"robert",
+            "password":"ROBERT",
+            "access":"Employee",
+            "role":"Employee"
+        }
+    ]
+    ```
+4. Click "Save"
+
+### Modify "Greet Customer" action
+- Within the "Actions" section of the Assistant, navigate to the "Greet Customer" action
+
+#### **Authenticate User**
+1. Check for a valid user by setting a new boolean variable to the expression:
+
+    ``` !(([users variable from "Variable Creation" step above]).filter("user", "user.name == [current_user value]").isEmpty()) ```
+
+    Example)
+    Valid User Assistant Example
+
+
+2. 
Ensure a valid password by creating a conditional statement with the expression:
+
+    ``` [current_password value] == (users.filter("user", "user.name == [current_user value]"))[0].password ```
+
+    Example)
+    Valid Password Assistant Example
+
+#### **Change User's Password**
+1. Set a new password for the current user with an expression like:
+    ```((( ${users}.filter(\"user\", \"user.name == ${current_user} \"))[0]).password) = ${new_pass}```
+2. Validate the changed password with:
+    ```((( ${users}.filter(\"user\", \"user.name == ${current_user} \"))[0]).password) ```
+
diff --git a/docs/04-Create/07-GenAI_routing_create.mdx b/docs/04-Create/07-GenAI_routing_create.mdx
new file mode 100644
index 0000000..1451922
--- /dev/null
+++ b/docs/04-Create/07-GenAI_routing_create.mdx
@@ -0,0 +1,31 @@
+---
+title: GenAI Routing
+sidebar_position: 1
+description: How to set up the necessary components for genAI routing
+custom_edit_url: null
+---
+
+## Overview
+This documentation walks through how to leverage generative AI to route a user's request to the most appropriate action/workflow or perform RAG on the most relevant data corpora. This generative AI routing can be carried out by one of two methods:
+- watsonx.ai
+- watsonx.gov
+
+### watsonx.ai
+#### **Create watsonx.ai custom extension**
+1. In your assistant, navigate to the Integrations page, click "Build custom extension" -> click "Next" -> input the extension name `watsonx` -> click "Next".
+2. Download the JSON file [watsonx-openapi.json](https://github.com/watson-developer-cloud/assistant-toolkit/blob/master/integrations/extensions/starter-kits/language-model-watsonx/watsonx-openapi.json) and import the file into watsonx Assistant
+3. Click "Next" -> click "Finish"
+4. In the lower right corner of the watsonx extension, click "Add" -> click "Add" -> click "Next"
+5. On the Authentication page, in the Authentication type dropdown, select "OAuth 2.0"
+    1. For the API key, create and copy a new API key from [API key](https://cloud.ibm.com/iam/apikeys)
+6. 
Click "Next", click "Finish", click "Close" + +#### **Action configuration** +1. Create a new action leveraging the extension created above +2. Configure the model parameters to the appropriate values for the desired use case + + +### watsonx.gov Walkthrough +Follow the instructions from [here](/Create/Governance/watsonx_gov#create-custom-extension) + + diff --git a/docs/04-Takeaways/01-Takeaways.mdx b/docs/04-Takeaways/01-Takeaways.mdx new file mode 100644 index 0000000..aca6c97 --- /dev/null +++ b/docs/04-Takeaways/01-Takeaways.mdx @@ -0,0 +1,8 @@ +--- +title: Takeaways +description: Learnings from this watsonx Assistant pilot +custom_edit_url: null +--- + +# Closing Thoughts + diff --git a/docs/04-Takeaways/01-sample.mdx b/docs/04-Takeaways/01-sample.mdx deleted file mode 100644 index 0f6f512..0000000 --- a/docs/04-Takeaways/01-sample.mdx +++ /dev/null @@ -1,11 +0,0 @@ ---- -title: Sample Page -description: sample page -custom_edit_url: null ---- - -# Sample Page - -- What went well? -- What kind of challenges did we encounter? -- What did we learn from the challenges? 
diff --git a/docs/05-Resources.mdx b/docs/05-Resources.mdx new file mode 100644 index 0000000..da60260 --- /dev/null +++ b/docs/05-Resources.mdx @@ -0,0 +1,17 @@ +--- +title: Resources +sidebar_position: 5 +description: All resources for build +custom_edit_url: null +--- + +### RAG Document Search + + + +### Governance + + + + +### Security \ No newline at end of file diff --git a/docs/05-Resources/01-sample.mdx b/docs/05-Resources/01-sample.mdx deleted file mode 100644 index 741097e..0000000 --- a/docs/05-Resources/01-sample.mdx +++ /dev/null @@ -1,11 +0,0 @@ ---- -title: Sample Page -description: sample page -custom_edit_url: null ---- - -# Sample Page - -- Learning materials -- References -- Links \ No newline at end of file diff --git a/docs/homepage.mdx b/docs/homepage.mdx index 2b9df89..9cf273e 100644 --- a/docs/homepage.mdx +++ b/docs/homepage.mdx @@ -1,15 +1,16 @@ --- sidebar_position: 0 slug: / -title: 'Unified Virtual Agent' +title: 'Watsonx Orchestrate Unified Agent' custom_edit_url: null --- +**Contacts:** Eashan Thakuria, John Scott
+**Team:** FSM CE East
-#### Flight Path -This is a living document for an adoption journey that synthesizes the best practices from IBM when considering and implementing **`Unified Virtual Agent`** using IBM watsonx. +This is a living document for an adoption journey that synthesizes the best practices from IBM when considering and implementing a **`watsonx Orchestrate Unified Virtual Agent`** -#### Working In The Open +### Working In The Open The Flight Path approach embodies IBM Client Engineering's dedication to transparency and collaboration, which is evident through the creation of this accessible repository that showcases real-life customer experiences. By sharing this knowledge, IBM aims to develop user-friendly and scalable landing zones that encourage the adoption of IBM Technology while prioritizing innovation and user experience. This repository represents IBM Client Engineering's commitment to _working in the open_, where stakeholders and interested parties can participate, provide feedback and benefit from collective knowledge. 
From 3709817e83e67f912fc37a4608155497c87727c0 Mon Sep 17 00:00:00 2001 From: Eashan Thakuria Date: Wed, 16 Oct 2024 18:11:29 -0400 Subject: [PATCH 2/4] Delete 05-Resources.mdx --- docs/05-Resources.mdx | 17 ----------------- 1 file changed, 17 deletions(-) delete mode 100644 docs/05-Resources.mdx diff --git a/docs/05-Resources.mdx b/docs/05-Resources.mdx deleted file mode 100644 index da60260..0000000 --- a/docs/05-Resources.mdx +++ /dev/null @@ -1,17 +0,0 @@ ---- -title: Resources -sidebar_position: 5 -description: All resources for build -custom_edit_url: null ---- - -### RAG Document Search - - - -### Governance - - - - -### Security \ No newline at end of file From 23557d36b447998f767c22b17e55db97a9e2c691 Mon Sep 17 00:00:00 2001 From: Eashan Thakuria Date: Wed, 16 Oct 2024 18:12:58 -0400 Subject: [PATCH 3/4] updated takeaways folder structure --- docs/{04-Takeaways/01-Takeaways.mdx => 05-Takeaways.mdx} | 1 + 1 file changed, 1 insertion(+) rename docs/{04-Takeaways/01-Takeaways.mdx => 05-Takeaways.mdx} (86%) diff --git a/docs/04-Takeaways/01-Takeaways.mdx b/docs/05-Takeaways.mdx similarity index 86% rename from docs/04-Takeaways/01-Takeaways.mdx rename to docs/05-Takeaways.mdx index aca6c97..484d1b1 100644 --- a/docs/04-Takeaways/01-Takeaways.mdx +++ b/docs/05-Takeaways.mdx @@ -2,6 +2,7 @@ title: Takeaways description: Learnings from this watsonx Assistant pilot custom_edit_url: null +sidebar_position: 4 --- # Closing Thoughts From c3ba023c8471ec8f4c8647a8ff8b30e2e73df20e Mon Sep 17 00:00:00 2001 From: Eashan Thakuria Date: Wed, 16 Oct 2024 18:13:23 -0400 Subject: [PATCH 4/4] Update 05-Takeaways.mdx --- docs/05-Takeaways.mdx | 2 -- 1 file changed, 2 deletions(-) diff --git a/docs/05-Takeaways.mdx b/docs/05-Takeaways.mdx index 484d1b1..f8d91e1 100644 --- a/docs/05-Takeaways.mdx +++ b/docs/05-Takeaways.mdx @@ -5,5 +5,3 @@ custom_edit_url: null sidebar_position: 4 --- -# Closing Thoughts -