5 Security - SSL Encryption with SASL-SCRAM Authentication

This topic describes SSL encryption with SASL-SCRAM authentication.

Generate Keystore

The placeholder items in the command below (alias, keyalg, keysize, sigalg, valDays, and keystore) should be replaced with suitable values when running the command.

keytool -genkeypair -alias alias -keyalg keyalg -keysize keysize -sigalg 
sigalg -validity valDays -keystore keystore

Table 5-1 Generate Keystore - Keyword Details

Keyword Description
alias Used to identify the public and private key pair created.
keyalg The key algorithm used to generate the public and private key pair. The RSA key algorithm is recommended.
keysize The size of the public and private key pair generated. A key size of 1024 or more is recommended. Please consult with your CA on the key size support for different types of certificates.
sigalg The algorithm used to generate the signature. This algorithm should be compatible with the key algorithm and should be one of the values specified in the Java Cryptography API Specification and Reference.
valDays The number of days for which the certificate is to be considered valid. Please consult with your CA on this period.
keystore The location of the JKS file. If no JKS file is present in the path provided, one will be created.

The command prompts for the following attributes of the certificate and Keystore:

Table 5-2 Generate Keystore - Attributes

Attributes Description
Keystore Password Specify a password used to access the Keystore. This password needs to be specified later when configuring the identity store in the Kafka server.
Key Password Specify a password used to access the private key stored in the Keystore. This password needs to be specified later when configuring the SSL attributes of the Kafka server.
First and Last Name (CN) Enter the domain name of the machine used to access Oracle Banking Liquidity Management. For example, www.example.com.
Name of your Organizational Unit The name of the department or unit making the request. Use this field to further identify the SSL Certificate you are creating, for example, by department or by physical server.
Name of your Organization The name of the organization making the certificate request. For example, Oracle Financial Services. It is recommended to use the company or organization's formal name, and the name entered here must match the name found in official records.
Name of your City or Locality The city in which your organization is physically located. For example, Bengaluru.
Name of your State or Province The state/province in which your organization is physically located. For example, Karnataka.
Two-letter Country Code for this Unit The country in which your organization is physically located. For example, US, UK, IN, etc.

Example 5-1 Sample Execution

Listed below is the result of a sample execution.

keytool -genkeypair -alias OBLMcert -keyalg RSA -keysize 1024 -sigalg SHA512withRSA 
-validity 365 -keystore D:\kafka\securityKeys\KafkaServerKeystore.jks

Enter keystore password:<Enter a password to protect the keystore>

Re-enter new password:<Confirm the password keyed above>

What is your first and last name?

[Unknown]: name.oracle.com

What is the name of your organizational unit?

[Unknown]: OBLM

What is the name of your organization?

[Unknown]: Oracle Financial Services

What is the name of your City or Locality?

[Unknown]: Bengaluru

What is the name of your State or Province?

[Unknown]: Karnataka

What is the two-letter country code for this unit?

[Unknown]: IN

Is CN=name.oracle.com, OU=OBLM, O=Oracle Financial Services, L=Bengaluru, ST=Karnataka, C=IN correct? [no]: yes

Enter key password for <OBLMcert>

(RETURN if same as keystore password): <Enter a password to protect the key>

Re-enter new password: <Confirm the password keyed above>
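
Optionally, the user can verify the generated key pair by listing the Keystore contents. The command below is a sample only and assumes the keystore path and store password used in the examples of this topic; the output should show a PrivateKeyEntry for the alias OBLMcert.

keytool -list -v -keystore D:\kafka\securityKeys\KafkaServerKeystore.jks -storepass oracle123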

Export Private Key as Certificate

To export the private key as a certificate, run the following command:

keytool -export -alias <alias_name> -file <export_certificate_file_name_with_location.cer> 
-keystore <keystore_name.jks> -keypass <Private key Password> -storepass <Store Password>

Example:

keytool -export -alias OBLMcert -file D:\kafka\securityKeys\KafkaCert.cer 
-keystore D:\kafka\securityKeys\KafkaServerKeystore.jks -keypass oracle123 -storepass oracle123

If successful, the following message will be displayed:

Certificate stored in file <KafkaCert.cer>

Import the Certificate and Generate Trust Store

To import the certificate and generate the TrustStore, run the following command:

keytool -import -alias alias -file cert_file -keystore truststore -storepass storepass

Table 5-3 Generate Trust Store - Keyword Details

Keyword Description
alias Used to identify the public and private key pair. Specify the alias of the key pair used to create the CSR in the earlier step.
cert_file The location of the file containing the PKCS#7 formatted reply from the CA, containing the signed certificate.
truststore The location where the TrustStore should be generated.
storepass The password for the TrustStore.

The user can generate two TrustStores from the same certificate:
  • One used for the Kafka server
  • One used for clients

Example:

keytool -import -alias OBLMcert -file D:\kafka\securityKeys\KafkaCert.cer 
-keystore D:\kafka\securityKeys\KafkaServerTrustStore.jks -storepass oracle123
keytool -import -alias OBLMcert -file D:\kafka\securityKeys\KafkaCert.cer 
-keystore D:\kafka\securityKeys\KafkaClientTrustStore.jks -storepass oracle123

Three keystore files are required for this method, as given in the table below:

Table 5-4 Keystore Files

File Name Description
KafkaServerKeystore.jks Keystore file for Kafka brokers
KafkaServerTrustStore.jks TrustStore file for server
KafkaClientTrustStore.jks TrustStore file for client

To validate the server, each client should import the KafkaClientTrustStore.jks file.
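
Optionally, each client can confirm that the certificate was imported by listing the client TrustStore. The command below is a sample only and assumes the truststore path and password used in the examples above; the output should show the OBLMcert alias as a trustedCertEntry.

keytool -list -v -keystore D:\kafka\securityKeys\KafkaClientTrustStore.jks -storepass oracle123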

Note:

The truststore files should be generated using the same CA. The user can generate and place these files on all the Kafka servers so that they can be accessed from the server*.properties files. The KafkaClientTrustStore.jks file should be placed on a server that is also accessible by the microservices.

Create Users in Zookeeper

To create users in Zookeeper, follow the below steps:
  1. Start Zookeeper.

    Note:

    Refer to Zookeeper Setup topic.
  2. Follow the below steps for user creation.
    1. Execute the admin command for admin user creation.
      ./kafka-configs.sh --zookeeper localhost:2181,localhost:2182 --alter --add-config 
      "SCRAM-SHA-256=[password=admin-secret],SCRAM-SHA-512=[password=admin-secret]" 
      --entity-type users --entity-name admin

      Note:

      This creates the user admin (with the password admin-secret), and the credentials are set up for each SCRAM mechanism. Here, the user admin is used for Kafka broker authentication.
    2. Execute the test command for test user creation.
      ./kafka-configs.sh --zookeeper localhost:2181,localhost:2182 --alter --add-config 
      "SCRAM-SHA-256=[iterations=8192,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]" 
      --entity-type users --entity-name alice

      Note:

      This creates the user alice (with the password alice-secret), and the credentials are set up for each SCRAM mechanism. Here, the user alice is used for client authentication. For multiple Zookeeper nodes, use comma-separated serverIP:port values as in the above example (localhost:2181,localhost:2182). A command to verify the created credentials is sketched after these steps.
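
Optionally, the user can confirm that the credentials were stored by describing the user entity with the same kafka-configs.sh tool. The command below is a sample only and uses the Zookeeper addresses and the alice user from the examples above; the output lists the SCRAM mechanisms configured for the user.

./kafka-configs.sh --zookeeper localhost:2181,localhost:2182 --describe --entity-type users --entity-name alice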

Configure Brokers

Some modifications need to be made in the server*.properties files of the Kafka servers. The following properties need to be added in the server1.properties file of Kafka.

############################# SSL-SCRAM Settings #############################
ssl.endpoint.identification.algorithm=
ssl.truststore.location=D:\\kafka\\securityKeys\\KafkaServerTrustStore.jks
ssl.truststore.password=oracle123
ssl.keystore.location=D:\\kafka\\securityKeys\\KafkaServerKeystore.jks
ssl.keystore.password=oracle123
ssl.key.password=oracle123
sasl.enabled.mechanisms=SCRAM-SHA-256
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-256
security.inter.broker.protocol=SASL_SSL
listeners=SASL_SSL://HOSTNAME:9092
advertised.listeners=SASL_SSL://IP:9091
listener.name.sasl_ssl.scram-sha-256.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="admin-secret";

Note:

In the properties above, give the absolute paths of the Kafka server truststore and keystore, and their respective passwords. Modify the hostname and IP in the listeners and advertised.listeners properties accordingly.

Copy the above properties into the server2.properties file and modify the hostname/IP and port in the listeners and advertised.listeners properties. Sample properties files can be downloaded through the below links.

Download server1.properties and server2.properties and save them locally.
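
For illustration only, the entries that typically change in the server2.properties file are sketched below. HOSTNAME2, IP2, and PORT2 are placeholders and must be replaced with the actual values of the second broker; all other SSL-SCRAM settings remain the same as in server1.properties.

listeners=SASL_SSL://HOSTNAME2:PORT2
advertised.listeners=SASL_SSL://IP2:PORT2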

Start the Kafka servers.

Note:

Refer to Kafka Setup topic.

Changes to Clients

For the microservices that publish/consume data through Kafka, insert the following values in the PROPERTIES table in the PLATO schema before deployment.

Table 5-5 PLATO PROPERTIES Table - Key Values

KEY VALUE
plato.services.kafka.brokers <comma separated kafka hostname:port>
plato.services.zknodes <comma separated Zookeeper hostname:port>
plato.services.kafka.security.protocol SASL_SSL
plato.services.kafka.truststore.location <absolute path of client truststore>
plato.services.kafka.truststore.password <encrypted truststore password>
spring.cloud.stream.kafka.binder.configuration.sasl.mechanism SCRAM-SHA-256
spring.cloud.stream.kafka.binder.jaas.loginModule org.apache.kafka.common.security.scram.ScramLoginModule
spring.cloud.stream.kafka.binder.jaas.options.username <Zookeeper SCRAM user created for clients>
spring.cloud.stream.kafka.binder.jaas.options.password <Zookeeper SCRAM user encrypted password for clients>

To encrypt the password, use the following API of the plato-config-service of Oracle Banking Liquidity Management:

API: http://hostname:port/config-service/encrypt
Request Type: Text
Request Body: Password
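
One possible way to invoke this API from the command line is shown below. This is a sample only and assumes curl is available; replace hostname and port with the actual plato-config-service host and port. The response body is the encrypted value, as shown in Example 1 below.

curl -X POST -H "Content-Type: text/plain" --data "alice-secret" http://hostname:port/config-service/encrypt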

Example 1:

When the above API is invoked for the following passwords, the encrypted values returned in the response are as follows:
alice-secret : 2f32dc1770acec085105e3ba585cc44c71534451b88b6047504f11191ad8cc1f
oracle123 : 7ec1250634259a1af12f74a7e4705ade7493a4695cc1efd3b713571453fda266

Example 2:

When inserting into the PROPERTIES table, prefix the encrypted values with the keyword {cipher} so that they are decrypted by the config-service during fetch, as given in the example below.
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values  (10110,'oblm-structure-services','jdbc','jdbc','plato.services.kafka.brokers','localhost:9092,localhost:9093');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10111,'oblm-structure-services','jdbc','jdbc','plato.services.zknodes','localhost:2181');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10112,'oblm-structure-services','jdbc','jdbc','plato.services.kafka.security.protocol','SASL_SSL');

insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10113,'oblm-structure-services','jdbc','jdbc','plato.services.kafka.truststore.location','D:\kafka\securityKeys\KafkaClientTrustStore.jks');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10114,'oblm-structure-services','jdbc','jdbc','plato.services.kafka.truststore.password','{cipher}7ec1250634259a1af12f74a7e4705ade7493a4695cc1efd3b713571453fda266');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10115,'oblm-structure-services','jdbc','jdbc','spring.cloud.stream.kafka.binder.configuration.sasl.mechanism','SCRAM-SHA-256');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10116,'oblm-structure-services','jdbc','jdbc','spring.cloud.stream.kafka.binder.jaas.loginModule','org.apache.kafka.common.security.scram.ScramLoginModule');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10117,'oblm-structure-services','jdbc','jdbc','spring.cloud.stream.kafka.binder.jaas.options.username','alice');
insert into PROPERTIES (ID,APPLICATION,PROFILE,LABEL,KEY,VALUE) values (10118,'oblm-structure-services','jdbc','jdbc','spring.cloud.stream.kafka.binder.jaas.options.password','{cipher}2f32dc1770acec085105e3ba585cc44c71534451b88b6047504f11191ad8cc1f');

Important Commands

The command to create topics manually is the same as the one mentioned in the Create Kafka Topics Manually topic. To view the messages being sent through Kafka, store the below lines in a file and name it ssl.properties.

ssl.truststore.location=D:\\kafka\\securityKeys\\KafkaClientTrustStore.jks
ssl.truststore.password=oracle123
security.protocol=SASL_SSL
ssl.endpoint.identification.algorithm=
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
username="alice" \
password="alice-secret";

Note:

Update the trust store location and password.

Download the ssl.properties file and save it locally.

Command to view the messages being published:

./kafka-console-consumer.sh --bootstrap-server kafka-server --topic topicName --consumer.config absolute-path-of-consumer-config --from-beginning

Example:

./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic oblm --consumer.config D:\kafka\kafka_2.12-2.3.1\config\ssl.properties --from-beginning
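
To test publishing over the secured listener, a console producer can be run with the same ssl.properties file. The command below is a sample only and follows the consumer example above.

./kafka-console-producer.sh --broker-list localhost:9092 --topic oblm --producer.config D:\kafka\kafka_2.12-2.3.1\config\ssl.properties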