This project deploys a reference architecture with IBM Cloud Functions to execute code in response to messages or to handle streams of data records. No code runs until messages arrive via IBM Event Streams (powered by Apache Kafka). When that happens, function instances are started and automatically scale to match the load needed to handle the stream of messages.
You can learn more about the benefits of building a serverless architecture for this use case in the accompanying IBM Code Pattern.
Deploy this reference architecture:
- Through the IBM Cloud Functions user interface.
- By using command line tools on your own system.
If you haven't already, sign up for an IBM Cloud account, then go to the Cloud Functions dashboard to explore other reference architecture templates and to download command line tools if needed.
- IBM Cloud Functions (powered by Apache OpenWhisk)
- IBM Event Streams (powered by Apache Kafka)
The application deploys two IBM Cloud Functions (based on Apache OpenWhisk) that read from and write messages to IBM Event Streams (based on Apache Kafka). This demonstrates how to work with data services and execute logic in response to message events.
One function, or action, is triggered by message streams of one or more data records. These records are piped to another action in a sequence (a way to link actions declaratively in a chain). The second action aggregates the message and posts a transformed summary message to another topic.
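The aggregation step of the sequence can be sketched as a pure function. This is an illustrative sketch only; the actual action code lives under `runtimes/` in this repository, and the field names (`count`, `events`) are assumptions, not the repository's schema:

```python
import json

def transform(messages):
    """Aggregate a batch of Kafka record values into one summary message.

    Illustrative sketch: the summary shape is an assumption, not the
    repository's actual output format.
    """
    events = [json.loads(m) for m in messages]
    summary = {
        "count": len(events),
        "events": events,
    }
    # The real second action would post this summary to the output topic.
    return json.dumps(summary)

print(transform(['{"id": 1}', '{"id": 2}']))
```

The point of the sequence is that the first action only consumes and forwards records, while logic like this stays in a separate, independently testable action.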
Choose "Start Creating", select "Deploy template", and then pick "Event Streams Events" from the list. A wizard then walks you through configuration and connection to event sources step by step.
Behind the scenes, the UI uses the `wskdeploy` tool, which you can also use directly from the CLI by following the steps in the next section.
This approach will deploy the Cloud Functions actions, triggers, and rules using the runtime-specific manifest file available in this repository.
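For orientation, a `wskdeploy` manifest for this pattern has roughly the following shape. This is an illustrative sketch, not the repository's actual manifest; the package name, file paths, runtime, and rule name are assumptions, while the action, sequence, and trigger names match the activation log shown later in this README:

```yaml
packages:
  data-processing-message-hub:
    actions:
      receive-consume:
        function: actions/receive-consume.js
        runtime: nodejs:default
      transform-produce:
        function: actions/transform-produce.js
        runtime: nodejs:default
    sequences:
      message-processing-sequence:
        actions: receive-consume, transform-produce
    triggers:
      message-trigger:
        feed: Bluemix_${KAFKA_INSTANCE}_${KAFKA_CREDS}/messageHubFeed
    rules:
      message-rule:
        trigger: message-trigger
        action: message-processing-sequence
```

Check the manifest file in this repository's `runtimes/` directories for the authoritative definitions.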
- Download the latest `ibmcloud` CLI and Cloud Functions plugin.
- Download the latest `wskdeploy` CLI.
- Provision an IBM Event Streams instance and name it `kafka-broker`. You can create the instance in the web console or with a CLI command like:
ibmcloud service create messagehub standard kafka-broker
- Create new service credentials named `kafka-credentials` for the `kafka-broker` instance. You can do this from the "Service credentials" tab on the service instance page in IBM Cloud or with the following CLI command:
ibmcloud service key-create kafka-broker kafka-credentials
- From the "Manage" tab on the Event Streams instance page in IBM Cloud, create the following topics:
  - `in-topic`
  - `out-topic`
- Copy `template.local.env` to a new file named `local.env` and update the `KAFKA_INSTANCE`, `SRC_TOPIC`, and `DEST_TOPIC` values for your instance if they differ.
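With the setup above, `local.env` ends up looking something like this. These are illustrative values, not the actual template contents; `KAFKA_CREDS` is assumed from the invoke command used later in this README, and you should copy the real names from your own instance:

```shell
# Illustrative local.env contents -- example values, not real credentials
KAFKA_INSTANCE=kafka-broker
KAFKA_CREDS=kafka-credentials
SRC_TOPIC=in-topic
DEST_TOPIC=out-topic
```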
# Clone a local copy of this repository
git clone https://github.com/IBM/ibm-cloud-functions-refarch-data-processing-message-hub.git
cd ibm-cloud-functions-refarch-data-processing-message-hub
# Make service credentials available to your environment
source local.env
ibmcloud fn package refresh
# Deploy the packages, actions, triggers, and rules using your preferred language
cd runtimes/nodejs # Or runtimes/[php|python|swift]
wskdeploy
# Undeploy the packages, actions, triggers, and rules
wskdeploy undeploy
- Run the following command to poll for activation logs:
ibmcloud wsk activation poll
- Send a test message to the input topic using the package action.
$ DATA=$( base64 events.json | tr -d '\n' | tr -d '\r' )
$ ibmcloud wsk action invoke Bluemix_${KAFKA_INSTANCE}_${KAFKA_CREDS}/messageHubProduce \
--param topic $SRC_TOPIC \
--param value "$DATA" \
--param base64DecodeValue true
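The `--param base64DecodeValue true` flag is needed because the message payload is passed to the action as a base64 string; the action is expected to decode it back to the original JSON before producing it to the topic. The encoding step above can be mirrored in Python to check the round trip (a standalone sketch, not part of the deployment; the sample event is a stand-in for `events.json`):

```python
import base64
import json

# Mirror of: DATA=$( base64 events.json | tr -d '\n' | tr -d '\r' )
events = json.dumps([{"id": 1, "value": 42}])  # stand-in for events.json
data = base64.b64encode(events.encode("utf-8")).decode("ascii")

# With base64DecodeValue=true, the value is decoded back to the
# original JSON before being written to the topic.
decoded = base64.b64decode(data).decode("utf-8")
assert decoded == events
print(decoded)
```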
- Review the output from the activation polling command to see the activation events from the trigger and actions.
Activation: 'transform-produce' (76b37762ec28417bb37762ec28317b43)
...
Activation: 'receive-consume' (ee0423e3c9d742918423e3c9d7329147)
...
Activation: 'message-processing-sequence' (4adcd36bf44b4cf39cd36bf44bfcf32f)
...
Activation: 'message-trigger' (8886c1ff06a04d4d86c1ff06a07d4d76)
...
Activation: 'messageHubProduce' (431bb7acb23e4cd99bb7acb23e2cd94c)
This approach shows you how to deploy the individual packages, actions, triggers, and rules with CLI commands. It helps you understand and control the underlying deployment artifacts.
This approach sets up a continuous delivery pipeline that redeploys on changes to a personal clone of this repository. It may be of interest if you want to build an overall software delivery lifecycle around Cloud Functions that redeploys automatically when changes are pushed to a Git repository.