
E-MapReduce:Use SSL to encrypt Kafka data

Last Updated: Mar 26, 2026

SSL encrypts data in transit between producers, consumers, and brokers in a Dataflow cluster, protecting against eavesdropping and tampering.

By default, SSL is disabled for Kafka clusters. E-MapReduce (EMR) supports two methods to enable it: using the default EMR certificate (faster to set up) or providing a custom certificate.

Prerequisites

Before you begin, ensure that a Dataflow cluster that includes the Kafka service is created.

Step 1: Enable SSL

EMR controls SSL configuration through the kafka.ssl.config.type item in server.properties. Set it to DEFAULT to use the EMR-managed certificate, or CUSTOM to supply your own.
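For reference, the relevant item in server.properties looks like the following. This is a sketch; the surrounding content of the file is cluster-specific, and the comments are explanatory additions rather than part of the managed file.

```properties
# Selects the SSL certificate source for the Kafka brokers.
# DEFAULT = EMR-managed certificate, CUSTOM = user-supplied keystore/truststore
kafka.ssl.config.type=DEFAULT
```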

Method 1: Use the default certificate

  1. Go to the Configure tab of the Kafka service page.

    1. Log on to the EMR console. In the left-side navigation pane, click EMR on ECS.

    2. In the top navigation bar, select the region where your cluster resides and select a resource group.

    3. On the EMR on ECS page, find your cluster and click Services in the Actions column.

    4. On the Services tab, find the Kafka service and click Configure.

  2. On the Configure tab, click the server.properties tab. Change the value of kafka.ssl.config.type to DEFAULT.


  3. Click Save. In the dialog box that appears, enter an execution reason and click Save.

  4. Restart the Kafka service: choose More > Restart in the upper-right corner. Enter an execution reason, click OK, then click OK in the Confirm dialog box.

Method 2: Use a custom certificate

Use this method when you need to supply your own keystore and truststore files for compliance or organizational PKI requirements.

  1. Go to the Configure tab of the Kafka service page.

    1. Log on to the EMR console. In the left-side navigation pane, click EMR on ECS.

    2. In the top navigation bar, select the region where your cluster resides and select a resource group.

    3. On the EMR on ECS page, find your cluster and click Services in the Actions column.

    4. On the Services tab, find the Kafka service and click Configure.

  2. On the Configure tab, click the server.properties tab. Change the value of kafka.ssl.config.type to CUSTOM.

  3. Click Save. In the dialog box that appears, enter an execution reason and click Save.

  4. Configure the SSL-related items (all except listeners) based on your requirements:

    Configuration item        Description
    ------------------------  ---------------------------------------------------------------------------
    ssl.keystore.location     Path to the keystore file, which holds the broker's own identity certificate.
    ssl.keystore.password     Password that unlocks the keystore.
    ssl.key.password          Password for the private key within the keystore.
    ssl.keystore.type         Keystore file format (for example, JKS).
    ssl.truststore.location   Path to the truststore file, which holds the certificates that the broker trusts.
    ssl.truststore.password   Password that unlocks the truststore.
    ssl.truststore.type       Truststore file format (for example, JKS).
  5. Restart the Kafka service: choose More > Restart in the upper-right corner. Enter an execution reason, click OK, then click OK in the Confirm dialog box.
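If you do not yet have keystore and truststore files, you can generate a self-signed set for testing with OpenSSL. This is a generic sketch, not EMR-specific guidance: the file names, the changeit password, and the CN=core-1-1 subject are placeholders, and PKCS12 is one of several store formats Kafka accepts (set ssl.keystore.type accordingly).

```shell
# Sketch: generate a test CA, a broker certificate signed by it, and a
# PKCS12 keystore. All names, subjects, and passwords are placeholders.

# 1. Create a self-signed CA (private key + certificate)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout ca-key.pem -out ca-cert.pem -subj "/CN=KafkaTestCA"

# 2. Create a broker key and certificate signing request, then sign it with the CA
openssl req -newkey rsa:2048 -nodes \
  -keyout broker-key.pem -out broker.csr -subj "/CN=core-1-1"
openssl x509 -req -in broker.csr -CA ca-cert.pem -CAkey ca-key.pem \
  -CAcreateserial -days 365 -out broker-cert.pem

# 3. Bundle the broker key and certificate chain into a PKCS12 keystore
#    (referenced by ssl.keystore.location / ssl.keystore.password)
openssl pkcs12 -export -in broker-cert.pem -inkey broker-key.pem \
  -certfile ca-cert.pem -name broker \
  -out broker.keystore.p12 -passout pass:changeit
```

Point ssl.keystore.location at broker.keystore.p12 with ssl.keystore.type set to PKCS12. The CA certificate (ca-cert.pem) is what clients must trust, for example by importing it into the truststore referenced by ssl.truststore.location.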

Step 2: Access Kafka over SSL

To connect to an SSL-enabled Kafka cluster, configure a client properties file with the minimum required settings: security.protocol, ssl.truststore.location, and ssl.truststore.password.

The following example uses the Kafka built-in producer and consumer tools to verify the SSL connection from the cluster master node.

  1. Log on to the master node of your cluster over SSH. See Log on to a cluster.

  2. Create ssl.properties with the following content:

    Note

    To run jobs outside the Kafka cluster, copy the truststore and keystore files from /var/taihao-security/ssl/ssl/ on a cluster node to your runtime environment, then update the paths in ssl.properties accordingly.

    security.protocol=SSL
    ssl.truststore.location=/var/taihao-security/ssl/ssl/truststore
    ssl.truststore.password=${password}
    ssl.keystore.location=/var/taihao-security/ssl/ssl/keystore
    ssl.keystore.password=${password}
    ssl.endpoint.identification.algorithm=

    You can view the values of the preceding configuration items on the Configure tab of the Kafka service page in the EMR console. Leaving ssl.endpoint.identification.algorithm empty disables hostname verification of the broker certificate.

  3. Create a test topic:

    kafka-topics.sh --partitions 10 --replication-factor 2 --bootstrap-server core-1-1:9092 --topic test --create --command-config ssl.properties
  4. Produce test messages:

    export IP=<your_InnerIP>
    kafka-producer-perf-test.sh --topic test --num-records 123456 --throughput 10000 --record-size 1024 --producer-props bootstrap.servers=${IP}:9092 --producer.config ssl.properties
  5. Consume the test messages:

    export IP=<your_InnerIP>
    kafka-consumer-perf-test.sh --broker-list ${IP}:9092 --messages 100000000 --topic test --consumer.config ssl.properties

    Replace <your_InnerIP> with the internal IP address of the master-1-1 node.
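Before launching a client, it can help to confirm that the properties file defines the minimum required settings named at the start of this step. The following check is a hypothetical helper, not an EMR tool; the properties content and password below are placeholders that mirror the example above.

```shell
# Sketch: verify that a client properties file defines the minimum keys
# required for an SSL connection. The file content is a placeholder.
cat > ssl.properties <<'EOF'
security.protocol=SSL
ssl.truststore.location=/var/taihao-security/ssl/ssl/truststore
ssl.truststore.password=example-password
EOF

for key in security.protocol ssl.truststore.location ssl.truststore.password; do
  grep -q "^${key}=" ssl.properties || { echo "missing required key: ${key}"; exit 1; }
done
echo "ssl.properties contains the minimum required settings"
```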

What's next

To add user identity verification on top of SSL encryption, configure Simple Authentication and Security Layer (SASL) for your cluster. See Log on to a Kafka cluster by using SASL.