This application demonstrates how the Watson Assistant (formerly Conversation) service can be adapted to use Tone Analyzer's tone along with intents and entities in a simple chat interface.
Demo: http://food-coach.ng.bluemix.net/
For more information on the Assistant service, see the detailed documentation. For more information on the Tone Analyzer Service, see the detailed documentation.
If you want to experiment with the application or use it as a basis for building your own application, you need to deploy it in your own environment. You can then explore the files, make changes, and see how those changes affect the running application. After making modifications, you can deploy your modified version of the application to IBM Cloud.
- Sign up for an IBM Cloud account.
- Download the IBM Cloud CLI.
- Create an instance of the Watson Assistant service and get your credentials:
- Go to the Watson Assistant page in the IBM Cloud Catalog.
- Log in to your IBM Cloud account.
- Click Create.
- Click Show to view the service credentials.
- Copy the `apikey` value, or copy the `username` and `password` values if your service instance doesn't provide an `apikey`.
- Copy the `url` value.
- Create an instance of the Tone Analyzer service and get your credentials:
- Go to the Tone Analyzer page in the IBM Cloud Catalog.
- Log in to your IBM Cloud account.
- Click Create.
- Click Show to view the service credentials.
- Copy the `apikey` value, or copy the `username` and `password` values if your service instance doesn't provide an `apikey`.
- Copy the `url` value.
- In your IBM Cloud console, open the Watson Assistant service instance.

- Click the Launch tool button in the service to launch the Watson Assistant tool.

- Click Skills from the navigation menu at the top left corner of the page. Click the Create new button to create a new skill. Select Import skill and specify the location of the workspace JSON file in your local copy of the app project:

  `<project_root>/food-coach/training/food-coach-workspace.json`

- Select Everything (Intents, Entities, and Dialog) and then click Import. The workspace card is created in the Skills dashboard.

- Select Skills from the navigation menu to return to the Skills dashboard.

- Click the menu icon in the upper-right corner of the workspace tile, and then select View API details.

- Copy the Workspace ID.
- In the application folder, copy the .env.example file and create a file called .env:

  ```
  cp .env.example .env
  ```
- Open the .env file and add the service credentials that you obtained in the previous step.

  Example .env file that configures the `apikey` and `url` for a Watson Assistant service instance hosted in the US East region:

  ```
  ASSISTANT_IAM_APIKEY=X4rbi8vwZmKpXfowaS3GAsA7vdy17Qh7km5D6EzKLHL2
  ASSISTANT_URL=https://gateway-wdc.watsonplatform.net/assistant/api
  ```

  If your service instance uses `username` and `password` credentials, add the `ASSISTANT_USERNAME` and `ASSISTANT_PASSWORD` variables to the .env file instead.

  Example .env file that configures the `username`, `password`, and `url` for a Watson Assistant service instance hosted in the US South region:

  ```
  ASSISTANT_USERNAME=522be-7b41-ab44-dec3-g1eab2ha73c6
  ASSISTANT_PASSWORD=A4Z5BdGENrwu8
  ASSISTANT_URL=https://gateway.watsonplatform.net/assistant/api
  ```
- Add the `WORKSPACE_ID` to the previous properties:

  ```
  WORKSPACE_ID=522be-7b41-ab44-dec3-g1eab2ha73c6
  ```
- Your .env file should look like this:

  ```
  # Environment variables
  WORKSPACE_ID=1c464fa0-2b2f-4464-b2fb-af0ffebc3aab
  ASSISTANT_IAM_APIKEY=_5iLGHasd86t9NddddrbJPOFDdxrixnOJYvAATKi1
  ASSISTANT_URL=https://gateway-syd.watsonplatform.net/assistant/api
  TONE_ANALYZER_IAM_APIKEY=UdHqOFLzoOCFD2M50AbsasdYhOnLV6sd_C3ua5zah
  TONE_ANALYZER_URL=https://gateway-syd.watsonplatform.net/tone-analyzer/api
  ```
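At startup, the app reads these variables from the environment. As a rough, hypothetical sketch (the variable names match the .env example above, but this helper is illustrative and not the app's actual code), picking between IAM apikey and username/password credentials might look like:

```javascript
// Illustrative helper: build SDK credential options from environment
// variables. Prefers an IAM apikey; falls back to username/password
// when no apikey is set.
function assistantCredentials(env) {
  const base = { url: env.ASSISTANT_URL, version: '2018-09-20' };
  if (env.ASSISTANT_IAM_APIKEY) {
    return Object.assign(base, { iam_apikey: env.ASSISTANT_IAM_APIKEY });
  }
  return Object.assign(base, {
    username: env.ASSISTANT_USERNAME,
    password: env.ASSISTANT_PASSWORD,
  });
}
```

The same pattern would apply to the Tone Analyzer variables.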
- Install the dependencies:

  ```
  npm install
  ```

- Run the application:

  ```
  npm start
  ```

- View the application in a browser at `localhost:3000`.
- Log in to IBM Cloud with the IBM Cloud CLI:

  ```
  ibmcloud login
  ```

- Target a Cloud Foundry organization and space:

  ```
  ibmcloud target --cf
  ```

- Edit the manifest.yml file. Change the name field to something unique. For example, `- name: my-app-name`.

- Deploy the application:

  ```
  ibmcloud app push
  ```

- View the application online at the app URL. For example: https://my-app-name.mybluemix.net
After you have the application installed and running, experiment with it to see how it responds to your input.
After you have the application deployed and running, you can explore the source files and make changes. Try the following:
- Modify the .js files to change the application logic.

- Modify the .html file to change the appearance of the application page.

- Use the Assistant tool to train the service for new intents, or to modify the dialog flow. For more information, see the Assistant service documentation.
The application interface is designed for chatting with a coaching bot. Based on the time of day, it asks you if you've had a particular meal (breakfast, lunch, or dinner) and what you ate for that meal.
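The time-of-day behavior can be sketched as follows. This is an illustrative helper, not the app's actual logic; the hour boundaries here are assumptions:

```javascript
// Illustrative sketch: choose which meal the bot asks about,
// based on the current hour (0-23).
function mealForHour(hour) {
  if (hour >= 5 && hour < 11) return 'breakfast';
  if (hour >= 11 && hour < 16) return 'lunch';
  return 'dinner';
}
```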
The chat interface is in the left panel of the UI, and the JSON response object returned by the Assistant service is displayed in the right panel. Your input is run against a small set of sample data trained with the following intents:
- yes: acknowledgment that the specified meal was eaten
- no: the specified meal was not eaten
- help
- exit
The dialog is also trained on two types of entities:

- food items
- unhealthy food items
These intents and entities help the bot understand variations of your input.
After asking you what you ate (if a meal was consumed), the bot asks you how you feel about it. Depending on your emotional tone, the bot provides different feedback.
Below you can find some sample interactions:
In order to integrate the Tone Analyzer with the Assistant service, the following approach was taken:
- Intercept the user's message. Before sending it to the Assistant service, invoke the Tone Analyzer service. See the call to `toneDetection.invokeToneAsync` in the `invokeToneConversation` function in app.js.

- Parse the JSON response object from the Tone Analyzer service, and add appropriate variables to the context object of the JSON payload to be sent to the Assistant service. See the `updateUserTone` function in tone_detection.js.

- Send the user input, along with the updated context object in the payload, to the Assistant service. See the call to `assistant.message` in the `invokeToneConversation` function in app.js.
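The middle step can be sketched as follows. This is a simplified, hypothetical version of what a function like `updateUserTone` does (the real implementation is in tone_detection.js); the `document_tone` / `tones` field names follow the Tone Analyzer v3 response format, and the `user_tone` context variable name is an assumption:

```javascript
// Simplified sketch: copy the strongest document-level tone from a
// Tone Analyzer v3 response into the Assistant payload's context.
function updateUserTone(payload, toneResponse) {
  const tones =
    (toneResponse.document_tone && toneResponse.document_tone.tones) || [];
  if (tones.length > 0) {
    // Treat the highest-scoring tone as the user's primary emotion.
    const primary = tones.reduce((a, b) => (b.score > a.score ? b : a));
    payload.context = payload.context || {};
    payload.context.user_tone = primary.tone_id;
  }
  return payload;
}
```

The dialog can then branch on the context variable when composing its reply.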
You can see the JSON response object from the Assistant service in the right-hand panel.
In the conversation template, alternative bot responses were encoded based on the user's emotional tone. For example:
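As a hypothetical illustration of tone-conditioned responses (the actual alternatives live in the dialog nodes of the workspace JSON; these strings and tone IDs are made up for the sketch):

```javascript
// Hypothetical illustration: pick a bot reply based on the detected tone.
const feedbackByTone = {
  joy: 'Great! Glad you enjoyed your meal.',
  sadness: "Sorry to hear that. Tomorrow's meal is another chance.",
  anger: "Let's not be too hard on ourselves; one meal won't derail you.",
};

function feedbackFor(toneId) {
  // Fall back to a neutral reply for tones without a tailored response.
  return feedbackByTone[toneId] || 'Thanks for sharing how you feel.';
}
```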
This sample code is licensed under Apache 2.0. Full license text is available in LICENSE.
See CONTRIBUTING.
Find more open source projects on the IBM GitHub page.