
Putting Kafka Connect passwords in a separate file / externalising secrets

Published May 24, 2019 in Kafka Connect at https://rmoff.net/2019/05/24/putting-kafka-connect-passwords-in-a-separate-file-/-externalising-secrets/

Kafka Connect configuration is easy - you just write some JSON! But what if you've got credentials that you need to pass? Embedding those in plaintext in a config file is not always such a smart idea. Fortunately, KIP-297 (released in Apache Kafka 2.0) added support for externalising secrets. It's pluggable, so you can write your own ConfigProvider, and it ships with one for reading credentials from a file - which is what I'll show here. You can read more in the Confluent docs on externalising secrets.
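
The placeholders you use follow the pattern defined by KIP-297 - the name of the config provider, an optional path, and the key to look up:

    ${provider:[path:]key}

With the file provider, the path is the location of the properties file and the key is the property name within it, as you'll see below.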

Kafka Connect connector secrets management

  1. Set up your credentials file, e.g. data/foo_credentials.properties

    FOO_USERNAME="rick"
    FOO_PASSWORD="n3v3r_g0nn4_g1ve_y0u_up"
  2. Add the ConfigProvider to your Kafka Connect worker. I run mine with Docker Compose, so the config looks like this. I'm also mounting the folder containing the credentials file into the container:

      kafka-connect:
        image: confluentinc/cp-kafka-connect:5.2.1
        […]
        environment:
        […]
          # External secrets config
          # See https://docs.confluent.io/current/connect/security.html#externalizing-secrets
          CONNECT_CONFIG_PROVIDERS: 'file'
          CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
        […]
        volumes:
          - ./data:/data

    Restart your Kafka Connect worker

  3. Now simply replace the credentials in your connector config with placeholders for the values (see the complete worked example after this list):

    • Before:

      '{"name": "source-activemq-01",
        "config": {
          "connector.class": "io.confluent.connect.activemq.ActiveMQSourceConnector",
          "activemq.username": "rick",
          "activemq.password": "n3v3r_g0nn4_g1ve_y0u_up",
          […]
    • After:

      '{"name": "source-activemq-01",
        "config": {
          "connector.class": "io.confluent.connect.activemq.ActiveMQSourceConnector",
          "activemq.username": "${file:/data/foo_credentials.properties:FOO_USERNAME}",
          "activemq.password": "${file:/data/foo_credentials.properties:FOO_PASSWORD}",
          […]
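
Putting it all together, submitting a connector config with externalised credentials to the Kafka Connect REST API might look like the following. This is just a sketch - the worker address (localhost:8083), the connector name, and the JDBC connection details are all illustrative assumptions, not values from the config above:

    curl -s -X POST http://localhost:8083/connectors \
         -H "Content-Type: application/json" \
         -d '{
               "name": "source-jdbc-01",
               "config": {
                 "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                 "connection.url": "jdbc:postgresql://postgres:5432/demo",
                 "connection.user": "${file:/data/foo_credentials.properties:FOO_USERNAME}",
                 "connection.password": "${file:/data/foo_credentials.properties:FOO_PASSWORD}",
                 "topic.prefix": "jdbc-",
                 "mode": "bulk"
               }
             }'

The worker resolves the placeholders at runtime when the connector starts, so the plaintext values are never written into the connector config stored in Kafka - if you query the config back through the REST API you'll get the placeholder, not the secret.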

Kafka Connect worker secrets management

You can use the same approach to externalise sensitive values from the worker configuration file itself too. Whilst the worker masks sensitive values in the logfile, you still have the plaintext stored in the worker configuration file. Moving that to another file as shown below may not be so useful (unless the file is secured differently), but combined with a configuration provider such as a password vault it becomes very useful.

  1. As above, in the worker configuration, define the config provider. For a file provider it looks like this:

    • Properties file

      config.providers=file
      config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider
    • Docker environment variables

      CONNECT_CONFIG_PROVIDERS: 'file'
      CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
  2. For the file provider, create a file with the key/value configuration items:

    SSL_KEYSTORE_PASSWORD=nevergonnagiveyouup
    GROUP_ID=my_connect_group
  3. In the worker configuration, specify the configuration items you'd like to source from the configuration provider, just as you would for a connector itself. For example, to override the group id and the SSL keystore password using the config specified in the sample file above:

    • Properties file

      group.id=${file:/data/connect_external.properties:GROUP_ID}
      ssl.keystore.password=${file:/data/connect_external.properties:SSL_KEYSTORE_PASSWORD}
    • Docker environment variables

      Note the double $$ - with a single $ Docker Compose will try to do the interpolation itself and fail with the error `Invalid interpolation format`.

      CONNECT_GROUP_ID: $${file:/data/connect_external.properties:GROUP_ID}
      CONNECT_SSL_KEYSTORE_PASSWORD: $${file:/data/connect_external.properties:SSL_KEYSTORE_PASSWORD}

When the Kafka Connect worker launches you'll see it uses the new values. Since the SSL credentials are masked in the log, you just see that it's a hidden value:

[2020-06-16 13:03:09,721] INFO DistributedConfig values:
…
group.id = my_connect_group
ssl.keystore.password = [hidden]
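
Since the ConfigProvider interface is pluggable, the same placeholder mechanism works with providers that fetch secrets from elsewhere - a password vault, environment variables, and so on. As a minimal sketch of what writing your own looks like, here's a hypothetical provider that resolves keys from environment variables. The class and package names are made up for illustration; only the ConfigProvider interface and ConfigData class are part of Apache Kafka:

    package net.rmoff.example; // hypothetical package

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Set;

    import org.apache.kafka.common.config.ConfigData;
    import org.apache.kafka.common.config.provider.ConfigProvider;

    // Resolves config placeholders such as ${env:FOO_PASSWORD} from
    // environment variables. Illustrative sketch only.
    public class EnvVarConfigProvider implements ConfigProvider {

        @Override
        public void configure(Map<String, ?> configs) {
            // No provider-level configuration needed for this sketch
        }

        // Called when a placeholder names a path but no keys;
        // return everything under that path (here: all env vars)
        @Override
        public ConfigData get(String path) {
            return new ConfigData(new HashMap<>(System.getenv()));
        }

        // Called with the specific keys referenced in placeholders
        @Override
        public ConfigData get(String path, Set<String> keys) {
            Map<String, String> data = new HashMap<>();
            for (String key : keys) {
                String value = System.getenv(key);
                if (value != null) {
                    data.put(key, value);
                }
            }
            return new ConfigData(data);
        }

        @Override
        public void close() {
            // Nothing to clean up
        }
    }

To use it you'd build it into a JAR, put it on the worker's classpath, and register it in the worker config just like the file provider (e.g. config.providers=env and config.providers.env.class=net.rmoff.example.EnvVarConfigProvider), after which placeholders like ${env:FOO_PASSWORD} resolve against the environment.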

Robin Moffatt

Robin Moffatt works on the DevRel team at Confluent. He likes writing about himself in the third person, eating good breakfasts, and drinking good beer.
