From cffa23ae72f23a8521429b7acc2755f6db155346 Mon Sep 17 00:00:00 2001
From: pneumacare
Date: Mon, 5 Aug 2024 16:25:34 +0100
Subject: [PATCH] Adding formatter

---
 docs/build/triggers.md | 69 +++++++++++++++++++++++++-----------------
 docs/jobs/state.md     |  9 +++---
 2 files changed, 46 insertions(+), 32 deletions(-)

diff --git a/docs/build/triggers.md b/docs/build/triggers.md
index a7e14c071d7..2442e9b0544 100644
--- a/docs/build/triggers.md
+++ b/docs/build/triggers.md
@@ -3,7 +3,7 @@ title: Triggers
 ---

 Triggers allow you to start the execution of workflows automatically. They come
-in two types: Cron triggers and Webhook Event triggers.
+in three types: Cron triggers, Webhook Event triggers, and Kafka triggers.

 ## Webhook Event Triggers

@@ -96,37 +96,50 @@ fn(state => {

 ## Kafka Triggers

-The Kafka trigger allows OpenFn users to build workflows triggered by messages
-published by a Kafka cluster. The triggers make use of Kafka consumer groups that are
-set up on-demand to receive messages from the cluster and pass them to an OpenFn pipleine
-that transforms the message into dataclips that used to initialize a workorder.
+The Kafka trigger allows OpenFn users to build workflows triggered by messages
+published by a Kafka cluster. The triggers make use of Kafka consumer groups
+that are set up on demand to receive messages from the cluster and pass them to
+an OpenFn pipeline that transforms the message into dataclips that are used to
+initialize a work order.

 ![Configuring Kafka Trigger](/img/configuring-kafka.png)

 ### How to configure a Kafka trigger
+
 1. Create a new workflow or open an existing workflow
-2. Click on the trigger step and change the trigger type to Kafka Consumer in the Trigger type dropdown
-3. Fill out the required connection details:
-   - Hosts: Provide the URL of the host(s) your trigger should listen to for message.
-   - Topics: What specific topics should the Kafka consumers subscribe to. You need at
-     least one topic for a successful connection
-   - Some Kafka cluster require SSL connection. If you are connecting to an environment
-     that requires SSL connection, select `Enable SSL`
-   - Select the type of Authentication and provide the username and password for
-     connecting to the instance. (see tip below for more information on authenticating Kafka)
-   - Set the initial offset policy. The intial offset dictates where the consumer starts
-     reading messages from a topic when it subscribes for the first time. There are three
-     possible options here : `earliest`, `latest` or specific `timestamp` *for example: 1721889238000*.
-   - Set the Connect timeout. The connect timeout specified in secs is the duraiton of time
-     the consumer can wait before timing out when attempting to connect with a Kafka cluster.
-4. If you have not completed budiling your workflow or you're not ready to start receiving messages
-from the Kafka cluster, please disable the trigger until you're ready.
-
-:::warning Disable the trigger during workflw design
-Once the required connection information is provided via the modal, the triggest start attempting to connect to the
-cluster. We advice that you disable the workflow until your workflow is ready to consumer data from cluster. To stop
-the trigger from receiving and processing messages, disable the trigger.
+2. Click on the trigger step and change the trigger type to Kafka Consumer in
+   the Trigger type dropdown
+3. Fill out the required connection details:
+
+- Hosts: Provide the URL of the host(s) your trigger should listen to for
+  messages.
+- Topics: Specify the topics the Kafka consumers should subscribe to. You need
+  at least one topic for a successful connection.
+- Some Kafka clusters require an SSL connection. If you are connecting to an
+  environment that requires SSL, select `Enable SSL`.
+- Select the type of Authentication and provide the username and password for
+  connecting to the instance. (See the tip below for more information on
+  authenticating with Kafka.)
+- Set the initial offset policy. The initial offset dictates where the consumer
+  starts reading messages from a topic when it subscribes for the first time.
+  There are three possible options here: `earliest`, `latest`, or a specific
+  `timestamp` _for example: 1721889238000_.
+- Set the Connect timeout. The connect timeout, specified in seconds, is the
+  duration of time the consumer can wait before timing out when attempting to
+  connect to a Kafka cluster.
+
+4. If you have not finished building your workflow or you're not ready to start
+   receiving messages from the Kafka cluster, please disable the trigger until
+   you're ready.
+
+:::warning Disable the trigger during workflow design
+
+Once the required connection information is provided via the modal, the trigger
+starts attempting to connect to the cluster. We advise that you disable the
+trigger until your workflow is ready to consume data from the cluster. To stop
+the trigger from receiving and processing messages, disable the trigger.
+
 :::

-Learn how a Kafka trigger workflow's initial `state` gets built from a webhook trigger
-[here](../jobs/state#kafka-triggered-runs).
\ No newline at end of file
+Learn how a Kafka-triggered workflow's initial `state` gets built from a Kafka
+message [here](../jobs/state#kafka-triggered-runs).
diff --git a/docs/jobs/state.md b/docs/jobs/state.md
index ade0073d997..45888d1dc49 100644
--- a/docs/jobs/state.md
+++ b/docs/jobs/state.md
@@ -79,12 +79,13 @@ The input state will look something like this:
 }
 ```

-### Kafka triggered runs 
+### Kafka triggered runs

-When a kafka message is received by the trigger, the input state contains important information for
-auditing or recovering from any loss of connection or failure of the workorder.
+When a Kafka message is received by the trigger, the input state contains
+important information for auditing or recovering from any loss of connection or
+failure of the work order.

-The input state looks like this: 
+The input state looks like this:

 ```js
 {