Kafka setup guide
Ingesting Kafka messages into Propel.
This guide covers how to:
- Create a user in your Kafka cluster
- Make sure your Kafka cluster is accessible from Propel IPs
- Create a Kafka Data Pool in Propel
Requirements
- A Propel account.
- A Kafka cluster with the topics to ingest.
- Access to create users and grant permissions in your Kafka cluster.
1. Create a user in your Kafka cluster
First, you’ll need to create a user with the necessary permissions for Propel to connect to your Kafka cluster.
Create the user
Kafka doesn’t manage users directly; it relies on the underlying authentication system. So if you’re using SASL/PLAIN for authentication, you would add the user to the JAAS configuration file.
- Open the JAAS configuration file (e.g., `kafka_server_jaas.conf`) in a text editor.
- Add the following entry to create the user “propel” with a password. Replace `YOUR_SUPER_SECURE_PASSWORD` with a secure password.
- Save the file.
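As a sketch, the JAAS entry might look like the following. The `KafkaServer` section name and the admin credentials shown are placeholders for your existing broker configuration; the `user_propel` line is what defines the “propel” user:

```
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret"
    user_propel="YOUR_SUPER_SECURE_PASSWORD";
};
```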
Set environment variable
Set the `KAFKA_OPTS` environment variable to point to your JAAS config file:
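For example (the path shown is a placeholder; adjust it to wherever your JAAS file lives):

```shell
# Point the Kafka JVM at the JAAS file created above
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/kafka_server_jaas.conf"
```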
Restart the Kafka server for the changes to take effect.
Grant permissions
Now, you’ll use Kafka’s Access Control Lists (ACLs) to grant permissions to the “propel” user.
Use the `kafka-acls` CLI to add ACLs for the “propel” user so that it can operate on the `propel-*` consumer groups.
For each topic you need to ingest to Propel, run the following command:
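For example, assuming the classic ZooKeeper-based authorizer (`YOUR_TOPIC` is a placeholder for your topic name, and the script may be named `kafka-acls` or `kafka-acls.sh` depending on your distribution):

```shell
# Grant Describe and Read on the topic to the "propel" user
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:propel \
  --operation Describe --operation Read \
  --topic YOUR_TOPIC

# Allow the "propel" user to use consumer groups prefixed with "propel-"
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:propel \
  --operation Describe --operation Read \
  --group propel- --resource-pattern-type prefixed
```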
Make sure to replace `localhost:2181` with your ZooKeeper server. These commands grant `Describe` and `Read` access to the topics for the user “propel”.
Verify the ACLs
Verify that the ACLs have been correctly set by listing the ACLs for the topics you authorized.
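For example (again assuming the ZooKeeper-based authorizer and using `YOUR_TOPIC` as a placeholder):

```shell
# List the ACLs currently attached to the topic
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --list --topic YOUR_TOPIC
```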
You should see the ACLs you added for the user “propel”.
These instructions set up an API key and secret with `READ` and `DESCRIBE` permissions in Confluent Cloud.
Create an API Key in the Confluent Cloud console
In Confluent Cloud, you generally use API keys for authentication rather than user/password combinations.
- Log into your Confluent Cloud account.
- Go to the “Environments” section in the sidebar and click on your environment.
- Click on the “Clusters” tab and select your cluster.
- Click on “API Keys” and then on “Create key”.
- Choose “Service account”, then “Create new one”, and name it “Propel”.
- In the “Add ACLs to service account” section, assign:
  - `DESCRIBE` and `READ` operations to the topic you need to ingest to Propel.
  - `DESCRIBE`, `READ`, and `DELETE` operations to the consumer group `propel-*`.
- Click “Next” to get your key and secret, which you will use as the username and password to connect to your Kafka cluster.
Verify the permissions
After setting the permissions, you can verify them by clicking on the API key in the “API Keys” section and reviewing the roles and resources it has access to.
These instructions set up an IAM user with `READ` and `DESCRIBE` permissions in AWS MSK.
Create the IAM user and policy
- Sign in to the AWS IAM Management Console.
- In the navigation pane, choose “Users” and then choose “Create User”.
- For “Username”, enter “propel” and click “Next”.
- Select “Attach policies directly”, search for and select the `AmazonMSKReadOnlyAccess` policy, and click “Next”.
- Then, create a custom policy that grants describe, read, and delete permissions on MSK consumer groups prefixed with “propel-”:
- Click “Create policy”, then choose the JSON tab.
- Paste the following JSON policy into the editor:
- Replace `<region>`, `<account-id>`, and `<cluster-name>` with your actual AWS region, account ID, and MSK cluster name.
- Click “Review policy”, give your policy a name (e.g., “PropelMSKPolicy”), and click “Create policy”.
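The custom policy referenced above might look like the following sketch. It uses MSK’s `kafka-cluster` IAM actions scoped to consumer groups prefixed with “propel-”; note that the exact group ARN on your cluster may also include a cluster UUID segment, so check the ARN format in the MSK console:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:DescribeGroup",
        "kafka-cluster:AlterGroup",
        "kafka-cluster:DeleteGroup"
      ],
      "Resource": "arn:aws:kafka:<region>:<account-id>:group/<cluster-name>/propel-*"
    }
  ]
}
```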
Create the security credentials for the user
- Click on the user “propel” you just created.
- Click on the “Security credentials” tab and click on “Create access key”.
- Select “Other” and click “Next”.
- Enter any tags and click “Create access key” (optional: add metadata to the user by attaching tags as key-value pairs).
- You now have the access key and secret you can use as user and password to connect to your Kafka cluster. Save these credentials securely, as you will not have access to the secret access key again after this step.
Enable authentication
Follow these steps to enable SASL/SCRAM for authentication with Redpanda.
- Edit the Redpanda configuration file (usually located at `/etc/redpanda/redpanda.yaml`).
- Enable SASL/SCRAM by adding or updating the following configuration:
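For example (the `admin` superuser shown here is a placeholder for an administrative account you already use):

```yaml
redpanda:
  enable_sasl: true
  superusers:
    - admin
```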
See Redpanda docs for enabling SASL/SCRAM.
- Enable TLS encryption. SASL provides authentication but not encryption. To enable SASL authentication with TLS encryption for the Kafka API, add the following to `redpanda.yaml`:
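A minimal sketch of such a listener configuration follows; the address, port, and certificate paths are placeholders for your own values:

```yaml
redpanda:
  kafka_api:
    - address: 0.0.0.0
      port: 9092
      name: sasl_tls_listener
      authentication_method: sasl
  kafka_api_tls:
    - name: sasl_tls_listener
      enabled: true
      cert_file: /path/to/broker.crt
      key_file: /path/to/broker.key
      truststore_file: /path/to/ca.crt
```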
See Redpanda docs for enabling TLS encryption.
- Confirm the SCRAM mechanism is enabled. To check whether SASL/SCRAM is enabled, run the following command:
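For example, by reading the `sasl_mechanisms` cluster property (the property name may vary across Redpanda versions):

```shell
# Show which SASL mechanisms the cluster has enabled
rpk cluster config get sasl_mechanisms
```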
You should see `SCRAM` in the output.
See Redpanda docs for checking SASL/SCRAM.
- Restart Redpanda to apply the changes.
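If Redpanda runs under systemd, the restart might look like:

```shell
sudo systemctl restart redpanda
```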
Create the user
Redpanda uses `rpk`, a command-line tool, to manage users and ACLs.
Create the user “propel”:
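For example (the exact password flag may vary by `rpk` version):

```shell
rpk acl user create propel -p '<YOUR_SUPER_SECURE_PASSWORD>'
```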
Replace `<YOUR_SUPER_SECURE_PASSWORD>` with a secure password.
Grant permissions
Grant `DESCRIBE`, `READ`, and `DELETE` permissions to the “propel” user for the topics you need to ingest.
Use the `rpk acl` command to add ACLs for the “propel” user so that it can operate on the `propel-*` consumer groups.
For each topic you need to ingest to Propel, run the following command:
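For example (`YOUR_TOPIC` is a placeholder; you may also need to pass your admin credentials, e.g. `--user`/`--password`, depending on your setup):

```shell
# Allow "propel" to describe and read the topic
rpk acl create --allow-principal User:propel \
  --operation describe --operation read \
  --topic YOUR_TOPIC

# Allow "propel" to use consumer groups prefixed with "propel-"
rpk acl create --allow-principal User:propel \
  --operation describe --operation read --operation delete \
  --group propel- --resource-pattern-type prefixed
```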
These commands grant `DESCRIBE` and `READ` access to the topic “YOUR_TOPIC” for the user “propel”.
Verify the ACLs
Verify that the ACLs have been correctly set by listing the ACLs for the topic “YOUR_TOPIC”:
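For example:

```shell
# List the ACLs attached to the topic
rpk acl list --topic YOUR_TOPIC
```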
You should see the ACLs you added for the user “propel”.
2. Make sure your Kafka cluster is accessible from Propel IPs
To ensure that Propel can connect to your Kafka cluster, you need to authorize access from the following IP addresses:
3. Create a Kafka Data Pool
Create a Kafka Data Pool
Go to the “Data Pools” section in the Console, click “Create Data Pool” and click on the “Kafka” tile.
If this is your first Kafka Data Pool, you must create Kafka credentials so that Propel can connect to your Kafka servers.
Create your Kafka credentials
To create your Kafka credentials, you will need the following details:
- Bootstrap servers: The list of addresses for your Kafka cluster’s brokers.
- Authentication type: The authentication protocol used by your Kafka cluster: SASL/SCRAM-SHA-256, SASL/SCRAM-SHA-512, SASL/PLAIN, or NONE.
- TLS: Whether your Kafka cluster uses TLS for secure communication.
- Username: The username for the user you created in your Kafka cluster.
- Password: The password for the user you created in your Kafka cluster.
Test your credentials
After entering your Kafka credentials, click “Create and test credentials” to ensure Propel can successfully connect to your Kafka cluster. If the connection is successful, you will see a confirmation message. If not, check your entered credentials and try again.
Introspect your Kafka topics
Here, you will see a list of topics available to ingest. If you don’t see the topic you want to ingest, make sure your user has the right permissions to access the topic.
Select the topic to ingest and timestamp
Select the topic you want to ingest into this Data Pool. You will then see the schema of the Data Pool.
Next, you need to select the timestamp column. This is the column that will be used to order the data in the Data Pool. By default, Propel selects the `_timestamp` column generated by Kafka.
Name your Data Pool and start ingesting
After you’ve selected the topic, provide a name for your Data Pool. This name will be used to identify the Data Pool in Propel. Once you’ve named your Data Pool, click “Create Data Pool”. Propel will then start ingesting data from the selected Kafka topics into your Data Pool.
Look at the data in your Data Pool
Once you’ve started ingesting data, you can view the data in your Data Pool. Go to the “Data Pools” section in the Console, click on your Kafka Data Pool, and click on the “Preview Data” tab. Here, you can see the data that has been ingested from your Kafka topic.
First, you need to create a Data Source with your Kafka credentials.
To create the Data Pool, you need to:
- Take the `id` of the Data Source and use it to create the Data Pool, replacing `<DATA_SOURCE_ID>` in the example below.
- Provide the name of the topic to ingest in the `table` field.
- Do not add any columns, as Kafka Data Pools have a set schema.
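As a rough sketch, a GraphQL request to create the Data Pool could take a shape like the following. The mutation and field names here are illustrative assumptions only; consult the Propel API reference for the exact schema:

```graphql
mutation {
  # Illustrative only: check the Propel API reference for exact names.
  createKafkaDataPool(
    input: {
      dataSource: "<DATA_SOURCE_ID>"   # the Data Source id from the previous step
      table: "YOUR_TOPIC"              # the Kafka topic to ingest
    }
  ) {
    dataPool {
      id
    }
  }
}
```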