Bxmsdoc 7507 main (apache#181) (apache#4230)
* Quality improvements

* Convert property lists to tables for AMQ Streams integration doc

* Peer review 1

* Peer review 2

* Build fix
mramendi authored and csherrar committed Jun 21, 2022
1 parent e8fa5f3 commit 18d7058
Showing 8 changed files with 162 additions and 47 deletions.
@@ -10,13 +10,18 @@ To use a custom class in your business application, use {CENTRAL} to upload the

Alternatively, if you deploy your application on SpringBoot, you can compile the classes separately and include them in the class path. In this case, do not complete this procedure.

.Prerequisites

* You are logged in to {CENTRAL} and have permission to edit business processes.
* You created a project for your business process.

.Procedure

. Prepare Java source files with the required custom classes, for example, `MyCustomSerializer`. Use the package name for your space and project, for example, `com.myspace.test`. A minimal sketch of such a class is shown after this procedure.
. In {CENTRAL}, enter your project and click the *Settings* -> *Dependencies* tab.
. Add any dependencies that your custom classes require, for example, `org.apache.kafka.kafka-clients`.
. In the *Dependencies* field, add dependencies that your custom classes require, for example, `org.apache.kafka.kafka-clients`, as a comma-separated list.
. Click the *Assets* tab.
. For each of the class source files, complete the following steps:
.. Click *Import Asset*.
.. In the *Please select a file to upload* field, select the location of the Java source file for the custom serializer class.
.. Click *Ok* to upload the file.
.. Click *Ok*.
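
The following is a minimal sketch of such a custom class, written against the `org.apache.kafka.common.serialization.Serializer` interface from the `kafka-clients` dependency. The package name, the class name, and the choice of a plain `String` payload are illustrative assumptions; adapt them to the data objects that your process actually exchanges.

[source,java]
----
package com.myspace.test;

import java.nio.charset.StandardCharsets;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical custom serializer: converts a String payload to UTF-8 bytes.
// Replace String with your own data object and conversion logic as needed.
public class MyCustomSerializer implements Serializer<String> {

    @Override
    public byte[] serialize(String topic, String data) {
        if (data == null) {
            return null;
        }
        return data.getBytes(StandardCharsets.UTF_8);
    }
}
----
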
@@ -9,12 +9,12 @@ ifdef::PAM,DM[]
endif::PAM,DM[]
is a streaming platform. It acts as a message broker, passing messages, which are sorted into topics, between applications in a software environment.

Using {PRODUCT}, you can create business processes that send and receive Kafka messages in the following ways:
Using {PRODUCT}, you can create business processes that send and receive Kafka messages. You can use the following methods to create business processes that send and receive Kafka messages:

* Create a start event, intermediate catch event, or boundary event (attached to a human task) of the type `message`. {KIE_SERVER} automatically subscribes to the Kafka topic that is defined in the message. A message triggers the event. The event node acts as the consumer of the message and can pass the content of the message to the subsequent node in the process.
* Create a start event, intermediate catch event, or boundary event (attached to a human task) of the `message` type. {KIE_SERVER} automatically subscribes to the Kafka topic that is defined in the message. A message triggers the event. The event node acts as the consumer of the message and can pass the content of the message to the subsequent node in the process.

* Create an end event or intermediate throw event of the type `message`. When the process triggers the event, {KIE_SERVER} sends a Kafka message in the topic that is defined in the message. The message contains the data that is configured in the event. The event node acts as the producer of the message.

* Add the `KafkaPublishMessages` custom task to the process. This task does not require the {KIE_SERVER} Kafka capability but can be more complicated to configure than message events.

* Configure your service and {KIE_SERVER} to emit Kafka messages about every completed process, case, and task when transactions are committed.
* Configure your service and {KIE_SERVER} with an _emitter_ to automatically send Kafka messages about every completed process, case, and task when transactions are committed.
@@ -20,5 +20,17 @@ You can download the interface definitions from the https://github.com/kiegroup/d
+
.. Provide the classes to your business application. For instructions, see xref:custom-class-provide-proc_{context}[].
. Set the following {KIE_SERVER} system properties to set the custom writer or reader:
** `org.kie.server.jbpm-kafka.ext.eventWriterClass`: the custom event writer class. Set this property to use a different format to send messages. If you want to use a custom format, set the property to the fully qualified name of your custom event writer class. If you want to use a raw JSON data format, set the property to `org.kie.server.services.jbpm.kafka.RawJsonEventWriter`.
** `org.kie.server.jbpm-kafka.ext.eventReaderClass`: the custom event reader class. Set this property to use a different format to receive messages. If you want to use a custom format, set the property to the fully qualified name of your custom event reader class. If you want to use a raw JSON data format, set the property to `org.kie.server.services.jbpm.kafka.RawJsonEventReader`.
+
.{KIE_SERVER} system properties for setting a custom writer or reader
[cols="45%,55%", options="header"]
|===
|Property
|Description

|`org.kie.server.jbpm-kafka.ext.eventWriterClass`
|The custom event writer class. Set this property to use a different format to send messages. If you want to use a custom format, set the property to the fully qualified name of your custom event writer class. If you want to use a raw JSON data format, set the property to `org.kie.server.services.jbpm.kafka.RawJsonEventWriter`.

|`org.kie.server.jbpm-kafka.ext.eventReaderClass`
|The custom event reader class. Set this property to use a different format to receive messages. If you want to use a custom format, set the property to the fully qualified name of your custom event reader class. If you want to use a raw JSON data format, set the property to `org.kie.server.services.jbpm.kafka.RawJsonEventReader`.

|===
@@ -1,7 +1,7 @@
[id='kieserver-kafka-emit-proc_{context}']
= Configuring a service and {KIE_SERVER} to emit Kafka messages when a transaction is committed
= Configuring a service and {KIE_SERVER} to send Kafka messages when a transaction is committed

You can configure {KIE_SERVER} to emit Kafka messages automatically. In this case, {KIE_SERVER} sends a message every time a task, process, case, or variable is created, updated, or deleted. The Kafka message contains information about the modified object. {KIE_SERVER} sends the message when it commits the transaction with the change.
You can configure {KIE_SERVER} with an _emitter_ that sends Kafka messages automatically. In this case, {KIE_SERVER} sends a message every time a task, process, case, or variable is created, updated, or deleted. The Kafka message contains information about the modified object. {KIE_SERVER} sends the message when it commits the transaction with the change.

You can use this functionality with any business process or case. You do not need to change anything in the process design.

@@ -23,9 +23,13 @@ The published messages comply with the https://github.com/cloudevents/spec[Cloud
* `time`: The timestamp of the event, by default in the https://tools.ietf.org/html/rfc3339[RFC3339] format
* `data`: Information about the process, case, or task, presented in a JSON format

.Prerequisites

* A {KIE_SERVER} instance is installed.

.Procedure

. To enable emitting Kafka messages, complete one of the following steps:
. To send Kafka messages automatically, complete one of the following tasks:
.. If you deployed {KIE_SERVER} on {EAP} or another application server, complete the following steps:
ifdef::PAM,DM[]
... Download the `{PRODUCT_FILE}-maven-repository.zip` product deliverable file from the {PRODUCT_DOWNLOAD_LINK}[Software Downloads] page of the Red Hat Customer Portal.
@@ -49,24 +53,55 @@ endif::JBPM,DROOLS,OP[]
----
+
. Configure any of the following {KIE_SERVER} system properties as necessary:
* `org.kie.jbpm.event.emitters.kafka.bootstrap.servers`: The host and port of the Kafka broker. The default value is `localhost:9092`. You can use a comma-separated list of multiple host:port pairs.
* `org.kie.jbpm.event.emitters.kafka.date_format`: The timestamp format for the `time` field of the messages. The default value is `yyyy-MM-dd'T'HH:mm:ss.SSSZ` .
* `org.kie.jbpm.event.emitters.kafka.topic.processes`: The topic name for process event messages. The default value is `jbpm-processes-events`.
* `org.kie.jbpm.event.emitters.kafka.topic.cases`: The topic name for process event messages. The default value is `jbpm-cases-events`.
* `org.kie.jbpm.event.emitters.kafka.topic.tasks`: The topic name for process event messages. The default value is `jbpm-processes-tasks`.
* `org.kie.jbpm.event.emitters.kafka.client.id`: An identifier string to pass to the server when making requests. The server uses this string for logging.
* `org.kie.jbpm.event.emitters.kafka._property_name_`: You can set any {KAFKA_PRODUCT} consumer or producer property by using the `org.kie.jbpm.event.emitters.kafka` prefix. For example, to set a value for the `buffer.memory` producer property, set the `org.kie.jbpm.event.emitters.kafka.buffer.memory` {KIE_SERVER} system property.
+
This setting applies when {KIE_SERVER} emits Kafka messages automatically when completing transactions.
+
.{KIE_SERVER} system properties related to the Kafka emitter
[cols="35%,35%,30%", options="header"]
|===
|Property
|Description
|Default value

|`org.kie.jbpm.event.emitters.kafka.bootstrap.servers`
|The host and port of the Kafka broker. You can use a comma-separated list of multiple host:port pairs.
|`localhost:9092`

|`org.kie.jbpm.event.emitters.kafka.date_format`
|The timestamp format for the `time` field of the messages.
|`yyyy-MM-dd'T'HH:mm:ss.SSSZ`

|`org.kie.jbpm.event.emitters.kafka.topic.processes`
|The topic name for process event messages.
|`jbpm-processes-events`

|`org.kie.jbpm.event.emitters.kafka.topic.cases`
|The topic name for case event messages.
|`jbpm-cases-events`

|`org.kie.jbpm.event.emitters.kafka.topic.tasks`
|The topic name for task event messages.
|`jbpm-processes-tasks`

|`org.kie.jbpm.event.emitters.kafka.client.id`
|An identifier string to pass to the server when making requests. The server uses this string for logging.
|

|`org.kie.jbpm.event.emitters.kafka._property_name_`
|Set any {KAFKA_PRODUCT} consumer or producer property by using this prefix. For example, to set a value for the `buffer.memory` producer property, set the `org.kie.jbpm.event.emitters.kafka.buffer.memory` {KIE_SERVER} system property.

This setting applies when {KIE_SERVER} is configured with an emitter to send Kafka messages automatically when completing transactions.

For a list of {KAFKA_PRODUCT} consumer and producer properties, see
ifdef::PAM,DM[]
the _Consumer configuration parameters_ and _Producer configuration parameters_ appendixes in https://access.redhat.com/documentation/en-us/red_hat_amq/{AMQ_URL_QUARTERLY}/html-single/using_amq_streams_on_rhel/index[_Using AMQ Streams on RHEL_].
endif::PAM,DM[]
ifdef::JBPM,DROOLS,OP[]
the _Consumer Configs_ and _Producer Configs_ sections in https://kafka.apache.org/documentation/[the Apache Kafka documentation].
endif::JBPM,DROOLS,OP[]
+
* `org.kie.jbpm.event.emitters.eagerInit`: By default, {KIE_SERVER} initializes the Kafka emitter only when sending a message. If you want to initialize the Kafka emitter when {KIE_SERVER} starts, set this property to `true`.
+
|

|`org.kie.jbpm.event.emitters.eagerInit`
|By default, {KIE_SERVER} initializes the Kafka emitter only when sending a message. If you want to initialize the Kafka emitter when {KIE_SERVER} starts, set this property to `true`.

When {KIE_SERVER} initializes the Kafka emitter, it logs any errors in Kafka emitter configuration and any Kafka communication errors. If you set the `org.kie.jbpm.event.emitters.eagerInit` property to `true`, any such errors appear in the log output when {KIE_SERVER} starts.
|`false`
|===
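
For illustration, the following sketch shows how an external Java application might consume the messages that the emitter sends, assuming the default `jbpm-processes-events` topic, a broker at `localhost:9092`, and the `kafka-clients` library. The consumer group name is arbitrary, and this client is not part of the {KIE_SERVER} configuration itself.

[source,java]
----
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Minimal sketch: reads the process event messages that the emitter sends
// to the default topic and prints the CloudEvents-formatted JSON payload.
public class ProcessEventsListener {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // default broker address
        props.put("group.id", "process-events-listener");  // any consumer group name
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("jbpm-processes-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value()); // CloudEvents JSON payload
                }
            }
        }
    }
}
----
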
@@ -1,7 +1,11 @@
[id='kieserver-kafka-proc_{context}']
= Configuring a {KIE_SERVER} to send and receive Kafka messages from the process
= Configuring {KIE_SERVER} to send and receive Kafka messages from the process

To run a process that sends or receives Kafka messages using events, you must use a {KIE_SERVER}. You must configure this {KIE_SERVER} to integrate with {KAFKA_PRODUCT}.
To run a process that sends or receives Kafka messages using events, you must use {KIE_SERVER}. You must configure the {KIE_SERVER} instance to integrate with {KAFKA_PRODUCT}.

.Prerequisites

* A {KIE_SERVER} instance is installed.

.Procedure

@@ -10,24 +14,69 @@ To run a process that sends or receives Kafka messages using events, you must use
** If you are using Spring Boot, set the `kieserver.kafka.enabled` system property to `true`.
+
. To configure the connection to the Kafka broker, set the `org.kie.server.jbpm-kafka.ext.bootstrap.servers` system property to the host and port of the broker. The default value is `localhost:9092`. You can use a comma-separated list of multiple host:port pairs.
. Optional: Set any of the following system properties to configure sending and receiving Kafka messages:
** `org.kie.server.jbpm-kafka.ext.client.id`: An identifier string to pass to the broker when making requests. {KAFKA_PRODUCT} uses this string for logging.
** `org.kie.server.jbpm-kafka.ext.topics.*`: Mapping of message names to topic names. For example, if you want to send or receive a message in the `ExampleTopic` topic when `ExampleName` is the name of the message, set the `org.kie.server.jbpm-kafka.ext.topics.ExampleName` system property to `ExampleTopic`. You can set any number of such system properties. If a message name is not mapped using a system property, the {PROCESS_ENGINE} uses this name as the topic name.
** `org.kie.server.jbpm-kafka.ext._property_name_`: You can set any {KAFKA_PRODUCT} consumer or producer property by using the `org.kie.server.jbpm-kafka.ext` prefix. For example, to set a value for the `buffer.memory` producer property, set the `org.kie.server.jbpm-kafka.ext.buffer.memory` {KIE_SERVER} system property.
. Optional: Set any of the following system properties related to both sending and receiving Kafka messages:
+
.Optional {KIE_SERVER} system properties related to both sending and receiving Kafka messages
[cols="45%,55%", options="header"]
|===
|Property
|Description

|`org.kie.server.jbpm-kafka.ext.client.id`
|An identifier string to pass to the broker when making requests. {KAFKA_PRODUCT} uses this string for logging.

|`org.kie.server.jbpm-kafka.ext.topics.*`
|Mapping of message names to topic names. For example, if you want to send or receive a message in the `ExampleTopic` topic when `ExampleName` is the name of the message, set the `org.kie.server.jbpm-kafka.ext.topics.ExampleName` system property to `ExampleTopic`. You can set any number of such system properties. If a message name is not mapped using a system property, the {PROCESS_ENGINE} uses this name as the topic name.

|`org.kie.server.jbpm-kafka.ext._property_name_`
|Set any {KAFKA_PRODUCT} consumer or producer property by using the `org.kie.server.jbpm-kafka.ext` prefix. For example, to set a value for the `buffer.memory` producer property, set the `org.kie.server.jbpm-kafka.ext.buffer.memory` {KIE_SERVER} system property.

This setting applies to all processes that send or receive Kafka messages using events on this {KIE_SERVER}.
+

For a list of {KAFKA_PRODUCT} consumer and producer properties, see
ifdef::PAM,DM[]
the _Consumer configuration parameters_ and _Producer configuration parameters_ appendixes in https://access.redhat.com/documentation/en-us/red_hat_amq/{AMQ_URL_QUARTERLY}/html-single/using_amq_streams_on_rhel/index[_Using AMQ Streams on RHEL_].
endif::PAM,DM[]
ifdef::JBPM,DROOLS,OP[]
the _Consumer Configs_ and _Producer Configs_ sections in https://kafka.apache.org/documentation/[the Apache Kafka documentation].
endif::JBPM,DROOLS,OP[]
|===
+
. Optional: Set any of the following system properties related to receiving Kafka messages:
+
. Optional: Set any of the following system properties to configure receiving Kafka messages:
** `org.kie.server.jbpm-kafka.ext.allow.auto.create.topics`: Allow automatic topic creation. Enabled by default.
** `org.kie.server.jbpm-kafka.ext.group.id`: A unique string that identifies the group to which this Kafka message consumer belongs. The default value is `jbpm-consumer`.
. Optional: Set any of the following system properties to configure sending Kafka messages:
** `org.kie.server.jbpm-kafka.ext.acks`: The number of acknowledgements that the Kafka leader must receive before marking the request as complete. The default value is `1`, which means the leader writes the record to its local log and then responds to the {PROCESS_ENGINE}, without waiting for full acknowledgement from all followers.
** `org.kie.server.jbpm-kafka.ext.max.block.ms`: The number of milliseconds for which the publish method blocks. After this time, the {PROCESS_ENGINE} can resume execution of the business process. The default value is `2000` (2 seconds).
.Optional {KIE_SERVER} system properties related to receiving Kafka messages
[cols="35%,35%,30%", options="header"]
|===
|Property
|Description
|Default value

|`org.kie.server.jbpm-kafka.ext.allow.auto.create.topics`
|Allow automatic topic creation.
|`true`

|`org.kie.server.jbpm-kafka.ext.group.id`
|A unique string that identifies the group to which this Kafka message consumer belongs.
|`jbpm-consumer`

|===
+
. Optional: Set any of the following system properties related to sending Kafka messages:
+
.Optional {KIE_SERVER} system properties related to sending Kafka messages
+
[cols="35%,35%,30%", options="header"]
|===
|Property
|Description
|Default value

|`org.kie.server.jbpm-kafka.ext.acks`
|The number of acknowledgements that the Kafka leader must receive before marking the request as complete.
|`1`. This value means the leader writes the record to its local log and then responds to the {PROCESS_ENGINE}, without waiting for full acknowledgement from all followers.

|`org.kie.server.jbpm-kafka.ext.max.block.ms`
|The number of milliseconds for which the publish method blocks. After this time, the {PROCESS_ENGINE} can resume execution of the business process.
|`2000` (2 seconds).

|===
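
As a rough illustration of the other direction, the following sketch shows how an external Java application might publish a message to a topic that a `message` start or catch event in a deployed process listens on. The topic name `ExampleTopic` and the JSON payload are assumptions; the payload must be in the format expected by the event reader configured on {KIE_SERVER}.

[source,java]
----
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Minimal sketch: publishes one message to a topic that a message event
// in a deployed process is expected to consume. Topic name and payload
// are illustrative; use the topic mapped to your message name.
public class MessageEventPublisher {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // default broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String payload = "{\"data\": {\"customerId\": \"1234\"}}"; // assumed payload shape
            producer.send(new ProducerRecord<>("ExampleTopic", payload));
            producer.flush();
        }
    }
}
----
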
@@ -3,9 +3,13 @@

You can add a `KafkaPublishMessages` custom task to your process. This task sends Kafka messages. It does not use the {KIE_SERVER} Kafka capability, so you can use this task in processes that do not run on a {KIE_SERVER}. However, this task can be more complicated to configure than other {KAFKA_PRODUCT} integration options.

.Prerequisites

* You are logged in to {CENTRAL} as an administrative user.

.Procedure

. In the {CENTRAL} administrative settings menu, as the administrative user, select *Custom Tasks Administration*.
. In the {CENTRAL} administrative settings menu, select *Custom Tasks Administration*.
. Ensure that *KafkaPublishMessages* is set to *On*.
. In {CENTRAL}, select *Menu* -> *Design* -> *Projects* and then click the space name and the project name.
. Select the *Settings* -> *Custom Tasks* tab.