GBDI 4.1

Producing Events/Data to be sent to Kafka

GBDI can produce data and publish it to Kafka topics. Select the pipeline from which you want to generate Kafka messages, and make sure it projects two fields named key and value; the value field should contain the data formatted exactly as you want it pushed to the topic. Then schedule or dispatch the pipeline as a job, selecting Kafka as the output type and defining which Kafka nodes and topics to publish to, as follows:
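As a rough illustration of the projection requirement above, the final stage of such a pipeline might look like the following, assuming a MongoDB-style aggregation pipeline. The field names key and value are the ones the Kafka output expects; the source fields (server_name, message_body) are hypothetical placeholders.

```python
import json

def kafka_projection_stage():
    # Project a "key" field (here, a hypothetical server_name field) and a
    # "value" field holding the message body to be published to the topic.
    # Only the names "key" and "value" matter; the right-hand sides are
    # illustrative and depend on your own pipeline's data.
    return {
        "$project": {
            "key": "$server_name",
            "value": "$message_body",
        }
    }

stage = kafka_projection_stage()
print(json.dumps(stage))
```

The point of the stage is simply that, whatever the rest of the pipeline does, the documents it emits must carry a key and a value for the Kafka output to use.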

  1. Enter a name in the 'Name' field.

  2. Enter a valid cron string in the 'Cron string' field.

  3. Select 'Kafka' as the 'Output type'.

  4. In the 'Kafka Servers' field, enter one or more ip:port entries for Kafka servers that do not require authentication. Alternatively, to authenticate using Kerberos or SSL, define an alias with the relevant parameters in /etc/sonar/sonar_kafka_aliases.yaml on the GBDI server (the file includes examples of all of the optional aliases). Once such an alias is defined, enter the alias name enclosed in brackets in the 'Kafka Servers' field (for example, [test-alias]). You can also create aliases for non-authenticated Kafka servers.

  5. In the 'Destinations to copy to' field, enter a comma-separated list of topics.
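For step 4, an alias entry in /etc/sonar/sonar_kafka_aliases.yaml might look roughly like the sketch below. The alias names and every parameter shown are hypothetical; the actual key names and accepted values may differ, so consult the examples shipped in the file itself for the exact syntax your GBDI version expects.

```yaml
# Hypothetical alias definitions -- the real key names may differ;
# see the examples included in /etc/sonar/sonar_kafka_aliases.yaml.
test-alias:
  bootstrap_servers: "kafka1.example.com:9092,kafka2.example.com:9092"
  security_protocol: "SASL_SSL"            # e.g. Kerberos over SSL
  kerberos_service_name: "kafka"
  ssl_ca_location: "/etc/ssl/certs/kafka-ca.pem"

plain-alias:                               # aliases work for non-authenticated servers too
  bootstrap_servers: "kafka3.example.com:9092"
```

With an entry like the first one defined, you would enter [test-alias] in the 'Kafka Servers' field instead of an ip:port list.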

Example form: