Kafka integration

Overview

Verta supports near real-time (asynchronous) models with Kafka integration, allowing system admins to seamlessly connect their Kafka streaming servers to Verta. Once integrated, Verta endpoints can be configured to read data from a Kafka topic, perform predictions, and write the output to another Kafka topic. If an error occurs, Verta can send it to a dedicated error topic for further analysis or handling.
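The flow above can be sketched in a few lines. This is an in-memory illustration of the consume → predict → produce/error routing, not Verta's implementation; the topic lists and `toy_model` are stand-ins for real Kafka topics and a deployed model.

```python
# Minimal sketch of the consume -> predict -> produce flow of a
# Kafka-enabled endpoint. Lists stand in for Kafka topics.

def run_endpoint(input_topic, predict):
    """Consume each message, predict, and route results or errors."""
    output_topic, error_topic = [], []
    for message in input_topic:
        try:
            output_topic.append(predict(message))
        except Exception as exc:
            # Failed predictions go to the error topic for later analysis.
            error_topic.append({"input": message, "error": str(exc)})
    return output_topic, error_topic

# A toy model that doubles numeric inputs and rejects everything else.
def toy_model(x):
    if not isinstance(x, (int, float)):
        raise ValueError("non-numeric input")
    return x * 2

outputs, errors = run_endpoint([1, 2, "bad", 3], toy_model)
print(outputs)  # [2, 4, 6]
print(errors)   # one error record for "bad"
```

The key property is that a failing message does not halt the stream: it is captured on the error topic while processing continues.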

Set up Kafka integration

Step 1

If you are a system admin, go to the Integrations tab in the system admin panel to set up Kafka integration.

Step 2

During the setup process, you will be prompted to provide the following information for the Kafka integration:

  • Integration Name: Choose a descriptive name for future reference.

  • Broker Address: Enter the address of the Kafka broker, typically in the format "BROKER1:PORT1". The broker allows consumers to fetch messages based on topic, partition, and offset.

  • TLS (Transport Layer Security): Optionally enable TLS for server authentication and encryption.

  • Kerberos: Optionally enable Kerberos network authentication protocol to establish a connection between Verta and the Kafka cluster.

These details will be requested when setting up the Kafka integration in Verta; a sample integration is shown below.
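For illustration, the fields above might be represented as follows. The dictionary keys and the `parse_broker` helper are assumptions for this sketch, not Verta's actual API or payload shape; only the "HOST:PORT" broker format comes from the text above.

```python
# Illustrative representation of the integration form's fields.

def parse_broker(address):
    """Split a 'HOST:PORT' broker address and validate the port."""
    host, sep, port = address.rpartition(":")
    if not sep or not host or not port.isdigit():
        raise ValueError(f"expected HOST:PORT, got {address!r}")
    return host, int(port)

integration = {
    "name": "prod-kafka",               # Integration Name
    "broker": "kafka-1.internal:9092",  # Broker Address ("BROKER1:PORT1")
    "tls": True,                        # optional TLS
    "kerberos": False,                  # optional Kerberos
}

print(parse_broker(integration["broker"]))  # ('kafka-1.internal', 9092)
```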

Set up TLS

When TLS (Transport Layer Security) is enabled, you will need to upload the certificate for the Verta system to establish a secure connection. You have the option to either upload the certificate file or paste the certificate content directly into the provided field.
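The "file or pasted content" choice can be normalized by a small helper. This is an illustrative sketch, not Verta's code: `load_certificate` and its basic PEM sanity check are assumptions.

```python
# Accept a certificate either as a file path or as pasted PEM text,
# returning the PEM string either way.
from pathlib import Path

def load_certificate(file_path=None, pasted_pem=None):
    if file_path is not None:
        pem = Path(file_path).read_text()
    elif pasted_pem is not None:
        pem = pasted_pem
    else:
        raise ValueError("provide a certificate file or pasted PEM content")
    # Cheap sanity check that the input looks like a PEM certificate.
    if "BEGIN CERTIFICATE" not in pem:
        raise ValueError("input does not look like a PEM certificate")
    return pem
```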

Set up Kerberos

When Kerberos is enabled, you will need to provide the following information:

  • Client Name: Enter the client name associated with the Kerberos authentication.

  • Service Name: Specify the service name to be used for the connection.

  • krb5.conf File: This file contains the configuration information for Kerberos. You can either upload the krb5.conf file or paste its contents into the provided field.

  • Keytab: The Keytab file stores your keys and is typically a binary file. You can either upload the Keytab file or paste its contents.
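For reference, a minimal krb5.conf looks like the following. The realm and KDC hostnames here are placeholders; your Kerberos administrator supplies the real values.

```ini
[libdefaults]
    default_realm = EXAMPLE.COM
    dns_lookup_kdc = false

[realms]
    EXAMPLE.COM = {
        kdc = kdc.example.com
        admin_server = kdc.example.com
    }
```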

Step 3

Test the connection before saving.

Note: Use the "Run Test" button to verify the connection between Verta and the Kafka cluster. The integration cannot be saved until the test succeeds, indicating a valid connection.
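At minimum, such a test must confirm the broker is reachable. The sketch below shows a bare TCP reachability check; it is an assumption for illustration, since Verta's actual "Run Test" would also validate the Kafka protocol handshake and any TLS/Kerberos settings.

```python
# Check that a TCP connection to the broker can be opened within a timeout.
import socket

def broker_reachable(host, port, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Usage: `broker_reachable("kafka-1.internal", 9092)` returns True only if the broker port accepts connections.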

After the setup is successfully tested and saved, the Kafka integration becomes active: all users within your organization can then create endpoints with the Kafka setup, using the integration to connect to Kafka topics, perform predictions, and write output.

To learn more about deploying a model with Kafka, follow this link.

Note: Every endpoint should be uniquely associated with an input, output, and error topic. Topics should not be reused across endpoints.
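The uniqueness rule above can be enforced with a simple guard. This helper and its config shape are illustrative assumptions, not part of Verta's API.

```python
# Raise if any input/output/error topic is shared across (or within) endpoints.

def validate_unique_topics(endpoints):
    seen = {}  # topic name -> endpoint that claimed it
    for name, cfg in endpoints.items():
        for topic in (cfg["input"], cfg["output"], cfg["error"]):
            if topic in seen:
                raise ValueError(
                    f"topic {topic!r} used by both {seen[topic]!r} and {name!r}"
                )
            seen[topic] = name

validate_unique_topics({
    "endpoint-a": {"input": "a-in", "output": "a-out", "error": "a-err"},
    "endpoint-b": {"input": "b-in", "output": "b-out", "error": "b-err"},
})  # passes: no topic is shared
```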
