How to use external Kafka services

How to make use of a centralized external Kafka service instead of the local Kafka service inside a KOBIL Shift hosting namespace. A Kafka service is required to run KOBIL Shift services.

Limitations:

  • KOBIL Shift supports only a single Kafka user, which is used by all Shift services.
  • KOBIL Shift supports SASL/SCRAM-SHA-512 authentication only.
  • The Kafka user's password must be available in an existing Kubernetes secret. It cannot be configured in values.yaml.
  • The TLS trust store must be available in an existing Kubernetes secret. It cannot be configured in values.yaml.

Authentication and TLS should work seamlessly with Kafka clusters created by the Strimzi Kafka operator. This simplifies testing and allows these features to be used when the Kafka cluster is created with Strimzi.
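
For example, when the Kafka cluster is managed by Strimzi, a matching Kafka user can be created with a KafkaUser custom resource. The following is a minimal sketch, assuming a Strimzi-managed cluster named my-kafka-cluster (adjust the cluster name and namespace to your environment):

# create a SCRAM-SHA-512 user for KOBIL Shift in the Strimzi-managed cluster
kubectl apply -f - <<'EOF'
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaUser
metadata:
  name: shift-kafka-username
  labels:
    # assumption: name of your Strimzi-managed Kafka cluster
    strimzi.io/cluster: my-kafka-cluster
spec:
  authentication:
    type: scram-sha-512
EOF

The Strimzi User Operator then creates a Kubernetes secret named after the user that contains the generated password under the key password, matching the secret layout expected by KOBIL Shift (see below).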

External Kafka clusters

KOBIL Shift supports external Kafka clusters for the components used in Shift-lite (ast services, scp-notifier, idp services).

The required topics must be created manually in the external Kafka cluster. See the file topics.yaml for a list of the required topics and their configuration (partitions, retention times). The file topics.yaml is part of the KOBIL Shift chart package and may differ between KOBIL Shift releases. Only the topics from the sections common:, asts:, scp:, and idp: must be created.
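
The topics can be created with the standard Kafka CLI, for example kafka-topics.sh. The sketch below uses the topic com.kobil.audit mentioned in this guide; the partition count, replication factor, bootstrap server, and the client.properties file (containing your SASL/TLS client settings) are assumptions and must be adapted to the values from topics.yaml and your cluster:

# create one of the required topics in the external Kafka cluster
kafka-topics.sh --create \
  --bootstrap-server kafka-broker:9092 \
  --command-config client.properties \
  --topic com.kobil.audit \
  --partitions 3 \
  --replication-factor 3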

To configure Shift to use an external Kafka cluster, the following parameters must be set:

Disable KOBIL Shift internal Kafka setup

  • Disable the creation of the custom resources for the Strimzi Kafka operator by the KOBIL Shift deployment (integrated Strimzi Operator API usage). By default, the KOBIL Shift deployment includes an internal kafka-chart that completely creates the Kafka-related objects when a Strimzi Operator API is available.

    strimzi:
      # disabling strimzi requires pre-allocated Kafka topics in the KOBIL Shift namespace
      enabled: false

  • Enable usage of an external Kafka cluster and provide the hostname and port, which are propagated to the KOBIL Shift services:

    common:
      datastores:
        kafka:
          external:
            # required when strimzi.enabled is set to false
            enabled: true
            broker:
              host: kafka-broker
              port: 9092

Configure Authentication to external Kafka

KOBIL Shift supports only one user for all connections to Kafka. Only the SASL mechanism SCRAM-SHA-512 is supported. Use the following parameters to enable authentication and configure the username.

common:
  datastores:
    kafka:
      auth:
        enabled: true
        username: shift-kafka-username

The password must be provided in an existing Kubernetes secret. The name of the secret must match the username. The password must be contained in the key password.

For username shift-kafka-username and password shift-kafka-password, the required Kubernetes secret can be generated using the following command:

kubectl create secret generic shift-kafka-username \
--from-literal=password=shift-kafka-password

The resulting secret should look like this:

apiVersion: v1
kind: Secret
metadata:
  name: shift-kafka-username
type: Opaque
data:
  password: c2hpZnQta2Fma2EtcGFzc3dvcmQ=

Using TLS connection configuration

KOBIL Shift supports TLS for Kafka connections. When TLS is used, authentication must also be enabled. The TLS trust store must be provided in an existing Kubernetes secret. Use the following parameters to enable TLS and configure the name of the existing Kubernetes secret containing the trust store.

common:
  datastores:
    kafka:
      external:
        tls:
          enabled: true
          trustStoreSecret: shift-kafka-tls-truststore

The existing Kubernetes secret must contain the trust store in two formats. A file containing all required certificates in PEM format must be provided in the key ca.crt encoded as a base64 string. A file containing all required certificates in PKCS#12 format must be provided in the key ca.p12 encoded as a base64 string. The import password for the PKCS#12 file must be provided in the key ca.password encoded as a base64 string.
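
If only the PEM file is available, the matching PKCS#12 trust store can be generated from it, for example with keytool; the alias kafka-ca is a placeholder, and the paths and import password match the example below (if ca.crt contains more than one certificate, repeat the import with a separate alias for each one):

# import the PEM certificate into a new PKCS#12 trust store
keytool -importcert -trustcacerts -noprompt \
  -alias kafka-ca \
  -file /path/to/ca.crt \
  -keystore /path/to/ca.p12 \
  -storetype PKCS12 \
  -storepass ca-import-password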

Given files

  • /path/to/ca.crt containing all required certificates in PEM format
  • /path/to/ca.p12 containing all required certificates in PKCS#12 format

as well as import password ca-import-password, the required Kubernetes secret can be generated using the following command:

kubectl create secret generic shift-kafka-tls-truststore \
--from-file=ca.crt=/path/to/ca.crt \
--from-file=ca.p12=/path/to/ca.p12 \
--from-literal=ca.password=ca-import-password

The resulting secret should look like this:

apiVersion: v1
kind: Secret
metadata:
  name: shift-kafka-tls-truststore
type: Opaque
data:
  ca.crt: LS0tLS1...LS0tLS0K
  ca.p12: MIIGogI...xAgInEA==
  ca.password: Y2EtaW1wb3J0LXBhc3N3b3Jk

Set Kafka Topics prefix

KOBIL Shift supports a single prefix that is added to all Kafka topics. This must be used when running multiple Shift deployments against the same external Kafka cluster to ensure that each deployment uses unique topics. KOBIL strongly recommends using the prefix configuration for external Kafka services. The prefix must contain only lowercase alphanumeric characters and dashes ('-'). The prefix must start and end with an alphanumeric character and consist of no more than 16 characters. A dot character ('.') is automatically inserted as a delimiter between the prefix and the internal topic name. For example, when using the prefix prod, the topic com.kobil.audit becomes prod.com.kobil.audit.

Use the following parameter to configure a topics prefix.

common:
  datastores:
    kafka:
      external:
        topics:
          prefix: ""
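
After deployment, it can be useful to verify that all expected prefixed topics exist in the external cluster. A sketch, assuming the prefix test, the broker from the examples above, and a client.properties file with your SASL/TLS client settings:

# list all topics that carry the configured prefix
kafka-topics.sh --list \
  --bootstrap-server kafka-broker:9092 \
  --command-config client.properties | grep '^test\.'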

Full example

Below is a full example to configure Shift against an external Kafka cluster using TLS, authentication, and a topics prefix 'test'.


# Deploy only Shift lite services
smartscreen:
  enabled: false
smartdashboard:
  enabled: false
scpAddressbook:
  enabled: false
scpPresence:
  enabled: false
scpMessenger:
  enabled: false
scpMedia:
  enabled: false
scpGateway:
  enabled: false

# Disable custom resources for Strimzi Kafka operator
strimzi:
  enabled: false

# Configure Shift against external Kafka using authentication, TLS, and topics prefix.
common:
  datastores:
    kafka:

      auth:
        enabled: true
        username: shift-kafka-username

      external:
        enabled: true
        broker:
          host: kafka-broker
          port: 9092

        topics:
          prefix: "test"

        tls:
          enabled: true
          trustStoreSecret: shift-kafka-tls-truststore