In this tutorial, you will build C client applications which produce and consume messages from an Apache Kafka® cluster.
As you're learning how to run your first Kafka application, we recommend using Confluent Cloud so that you don't have to run your own Kafka cluster and can focus on the client development. If you do not already have an account, be sure to sign up. New signups receive $400 to spend within Confluent Cloud during their first 30 days. To avoid having to enter a credit card, navigate to Billing & payment, scroll to the bottom, and add the promo code `CONFLUENTDEV1`. With this promo code, you will not have to enter your credit card info for 30 days or until your credits run out.
If you prefer to set up a local Kafka cluster, the tutorial will walk you through those steps as well.
Using Windows? You'll need to install the Windows Subsystem for Linux (WSL).
This guide assumes that you already have a C compiler installed. The code in this guide has been tested with GCC and Clang/LLVM.
You'll also need to install librdkafka, pkg-config, and GLib. These libraries are widely available: search your package manager for `librdkafka`, `pkg-config`, and `glib`.
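For example, the development packages are typically named as follows on Debian/Ubuntu and on macOS with Homebrew (exact names vary by platform):

```sh
# Debian/Ubuntu
sudo apt-get install librdkafka-dev libglib2.0-dev pkg-config

# macOS (Homebrew)
brew install librdkafka glib pkg-config
```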
Later in this tutorial you will set up a new Kafka cluster or connect to an existing one.
Create a new directory anywhere you’d like for this project:
mkdir kafka-c-getting-started && cd kafka-c-getting-started
Create a `Makefile` for the project.
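A minimal version along these lines works, assuming `pkg-config` can locate both rdkafka and glib-2.0 and that `producer.c` and `consumer.c` (created below) share the `common.c` helper:

```makefile
# Compiler and linker flags come from pkg-config, which assumes the
# rdkafka and glib-2.0 packages are installed and discoverable.
CFLAGS = $(shell pkg-config --cflags glib-2.0 rdkafka) -Wall
LDLIBS = $(shell pkg-config --libs glib-2.0 rdkafka)

all: producer consumer

producer: producer.c common.c
	$(CC) $(CFLAGS) -o $@ $^ $(LDLIBS)

consumer: consumer.c common.c
	$(CC) $(CFLAGS) -o $@ $^ $(LDLIBS)

clean:
	rm -f producer consumer
```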
We are going to need a Kafka cluster for our client applications to operate with. You can configure an existing Confluent Cloud cluster, create a new Kafka cluster, or enter the bootstrap server of another existing cluster to connect to; the following steps cover each option.
From within the Confluent Cloud Console, creating a new cluster is just a few clicks:
Next, note your Confluent Cloud bootstrap server as we will need it to configure the producer and consumer clients in upcoming steps. You can obtain your Confluent Cloud Kafka cluster bootstrap server configuration using the Confluent Cloud Console:
Next, choose the authentication mechanism that the producer and consumer client applications will use to access Confluent Cloud: either basic authentication or OAuth.
Basic authentication is quicker to implement since you only need to create an API key in Confluent Cloud, whereas OAuth requires that you have an OAuth provider, as well as an OAuth application created within it for use with Confluent Cloud, in order to proceed.
Select your authentication mechanism:
You can use the Confluent Cloud Console to create a key for you by navigating to the API Keys section under Cluster Overview.
Note the API key and secret as we will use them when configuring the producer and consumer clients in upcoming steps.
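For reference, an API key and secret typically map to the following librdkafka client properties when connecting to Confluent Cloud over SASL/PLAIN (the angle-bracket values are placeholders for your own credentials):

```
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<API KEY>
sasl.password=<API SECRET>
```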
You can use the Confluent Cloud Console to add an OAuth/OIDC identity provider and create an identity pool with your OAuth/OIDC identity provider.
Note the following OAuth/OIDC-specific configuration values, which we will use to configure the producer and consumer clients in upcoming steps:
- OAUTH2 CLIENT ID: The public identifier for your client. In Okta, this is a 20-character alphanumeric string.
- OAUTH2 CLIENT SECRET: The secret corresponding to the client ID. In Okta, this is a 64-character alphanumeric string.
- OAUTH2 TOKEN ENDPOINT URL: The token-issuing URL that your OAuth/OIDC provider exposes. E.g., Okta's token endpoint URL format is https://<okta-domain>.okta.com/oauth2/default/v1/token
- OAUTH2 SCOPE: The name of the scope that you created in your OAuth/OIDC provider to restrict access privileges for issued tokens. In Okta, you or your Okta administrator provided the scope name when configuring your authorization server. In your Okta Developer account, you can find this by navigating to Security > API, clicking the authorization server name, and finding the defined scopes under the Scopes tab.
- LOGICAL CLUSTER ID: Your Confluent Cloud logical cluster ID of the form lkc-123456. You can view your Kafka cluster ID in the Confluent Cloud Console by navigating to Cluster Settings in the left navigation of your cluster homepage.
- IDENTITY POOL ID: Your Confluent Cloud identity pool ID of the form pool-1234. You can find this in the Confluent Cloud Console by navigating to Accounts & access in the top right menu, selecting the Identity providers tab, clicking your identity provider, and viewing the Identity pools section of the page.
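For reference, these values typically map to librdkafka's OIDC-based OAUTHBEARER properties (available in recent librdkafka versions; the angle-bracket values are placeholders for the items listed above):

```
security.protocol=SASL_SSL
sasl.mechanisms=OAUTHBEARER
sasl.oauthbearer.method=oidc
sasl.oauthbearer.client.id=<OAUTH2 CLIENT ID>
sasl.oauthbearer.client.secret=<OAUTH2 CLIENT SECRET>
sasl.oauthbearer.token.endpoint.url=<OAUTH2 TOKEN ENDPOINT URL>
sasl.oauthbearer.scope=<OAUTH2 SCOPE>
sasl.oauthbearer.extensions=logicalCluster=<LOGICAL CLUSTER ID>,identityPoolId=<IDENTITY POOL ID>
```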
This guide runs Kafka in Docker via the Confluent CLI.
First, install and start Docker Desktop or Docker Engine if you don't already have it. Verify that Docker is set up properly by ensuring that no errors are output when you run `docker info` in your terminal.
Install the Confluent CLI if you don't already have it. In your terminal:
brew install confluentinc/tap/cli
If you don't use Homebrew, you can use a different installation method.
This guide requires version 3.34.1 or later of the Confluent CLI. If you have an older version, run `confluent update` to get the latest release (or `brew upgrade confluentinc/tap/cli` if you installed the CLI with Homebrew).
Now start the Kafka broker:
confluent local kafka start
Note the Plaintext Ports printed in your terminal, which you will need to configure the producer and consumer clients in upcoming steps.
Note your Kafka cluster bootstrap server URL as you will need it to configure the producer and consumer clients in upcoming steps.
A topic is an immutable, append-only log of events. Usually, a topic comprises events of the same kind; in this guide, for example, we create a topic for retail purchases.
Create a new topic, `purchases`, which you will use to produce and consume events.
When using Confluent Cloud, you can use the Confluent Cloud Console to create a topic. Create a topic with 1 partition and defaults for the remaining settings.
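If you prefer the CLI to the Console, the Confluent CLI can create the topic on your cloud cluster; this assumes you have already logged in and selected your environment and cluster:

```sh
confluent kafka topic create purchases --partitions 1
```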
confluent local kafka topic create purchases
Depending on your available Kafka cluster, you have multiple options for creating a topic. You may have access to Confluent Control Center, where you can create a topic with a UI. You may have already installed a Kafka distribution, in which case you can use the kafka-topics command. Note that, if your cluster is centrally managed, you may need to request the creation of a topic from your operations team.
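For example, with a Kafka distribution on your PATH, a command along these lines creates the topic (localhost:9092 is a placeholder for your cluster's bootstrap server):

```sh
kafka-topics --bootstrap-server localhost:9092 --create --topic purchases --partitions 1
```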
Let's create the producer application by first adding a utility method for setting configuration in a file named `common.c`.
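A helper along these lines is enough: it wraps librdkafka's `rd_kafka_conf_set` and aborts with a readable message if a property is rejected (this is a minimal sketch; your error handling can be as elaborate as you like):

```c
#include <glib.h>
#include <librdkafka/rdkafka.h>

/* Set a single property on a librdkafka config object, aborting
 * with a descriptive message if the property is rejected. */
void set_config(rd_kafka_conf_t *conf, const char *key, const char *value) {
    char errstr[512];

    if (rd_kafka_conf_set(conf, key, value, errstr, sizeof(errstr)) !=
        RD_KAFKA_CONF_OK) {
        g_error("Failed to set config %s: %s", key, errstr);
    }
}
```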
Next, paste the following C code into a file named `producer.c`. Fill in the appropriate `bootstrap.servers` endpoint and any additional security configuration needed inline where the `set_config` function is called.
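Here is a minimal sketch of what `producer.c` can look like. It assumes the `set_config` helper from `common.c`, uses `<BOOTSTRAP SERVER>` as a placeholder you must replace, and generates random purchase events matching the sample output shown later:

```c
#include <glib.h>
#include <librdkafka/rdkafka.h>
#include <stdlib.h>
#include <string.h>

void set_config(rd_kafka_conf_t *conf, const char *key, const char *value);

/* Delivery report callback: invoked from rd_kafka_poll() or
 * rd_kafka_flush() once the broker acknowledges (or rejects) a message. */
static void dr_msg_cb(rd_kafka_t *kafka_handle,
                      const rd_kafka_message_t *rkmessage, void *opaque) {
    if (rkmessage->err) {
        g_warning("Message delivery failed: %s",
                  rd_kafka_err2str(rkmessage->err));
    } else {
        g_message("Produced event to topic %s: key = %.*s value = %.*s",
                  rd_kafka_topic_name(rkmessage->rkt),
                  (int)rkmessage->key_len, (char *)rkmessage->key,
                  (int)rkmessage->len, (char *)rkmessage->payload);
    }
}

int main(void) {
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    char errstr[512];

    /* Replace with your bootstrap server (e.g. localhost:9092 for a local
     * cluster) and add any security properties noted earlier in this guide. */
    set_config(conf, "bootstrap.servers", "<BOOTSTRAP SERVER>");
    set_config(conf, "acks", "all");

    rd_kafka_conf_set_dr_msg_cb(conf, dr_msg_cb);

    rd_kafka_t *producer =
        rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
    if (!producer) {
        g_error("Failed to create producer: %s", errstr);
    }

    const char *topic = "purchases";
    const char *user_ids[] = {"eabara",   "jsmith",  "sgarcia",
                              "jbernard", "htanaka", "awalther"};
    const char *products[] = {"book", "alarm clock", "t-shirts",
                              "gift card", "batteries"};

    for (int i = 0; i < 10; i++) {
        const char *key = user_ids[rand() % G_N_ELEMENTS(user_ids)];
        const char *value = products[rand() % G_N_ELEMENTS(products)];

        rd_kafka_resp_err_t err = rd_kafka_producev(
            producer,
            RD_KAFKA_V_TOPIC(topic),
            RD_KAFKA_V_KEY((void *)key, strlen(key)),
            RD_KAFKA_V_VALUE((void *)value, strlen(value)),
            RD_KAFKA_V_OPAQUE(NULL),
            RD_KAFKA_V_END);

        if (err) {
            g_error("Failed to produce to topic %s: %s", topic,
                    rd_kafka_err2str(err));
        }

        rd_kafka_poll(producer, 0); /* serve pending delivery reports */
    }

    g_message("Flushing final messages..");
    rd_kafka_flush(producer, 10 * 1000);

    if (rd_kafka_outq_len(producer) > 0) {
        g_warning("%d message(s) were not delivered",
                  rd_kafka_outq_len(producer));
    }

    g_message("10 events were produced to topic %s.", topic);

    rd_kafka_destroy(producer);
    return 0;
}
```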
Next, create the consumer application by pasting the following C code into a file named `consumer.c`. Fill in the appropriate `bootstrap.servers` endpoint and any additional security configuration needed inline where the `set_config` function is called.
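Here is a minimal sketch of what `consumer.c` can look like. It assumes the `set_config` helper from `common.c`; the group ID `kafka-c-getting-started` is an arbitrary choice, and `<BOOTSTRAP SERVER>` is a placeholder:

```c
#include <glib.h>
#include <librdkafka/rdkafka.h>
#include <signal.h>

void set_config(rd_kafka_conf_t *conf, const char *key, const char *value);

static volatile sig_atomic_t running = 1;

/* Ctrl-C handler: break out of the poll loop so we can close cleanly. */
static void stop(int sig) { running = 0; }

int main(void) {
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    char errstr[512];

    /* Replace with your bootstrap server and add any security
     * properties noted earlier in this guide. */
    set_config(conf, "bootstrap.servers", "<BOOTSTRAP SERVER>");
    set_config(conf, "group.id", "kafka-c-getting-started");
    set_config(conf, "auto.offset.reset", "earliest");

    rd_kafka_t *consumer =
        rd_kafka_new(RD_KAFKA_CONSUMER, conf, errstr, sizeof(errstr));
    if (!consumer) {
        g_error("Failed to create consumer: %s", errstr);
    }
    rd_kafka_poll_set_consumer(consumer);

    /* Subscribe to the purchases topic. */
    rd_kafka_topic_partition_list_t *subscription =
        rd_kafka_topic_partition_list_new(1);
    rd_kafka_topic_partition_list_add(subscription, "purchases",
                                      RD_KAFKA_PARTITION_UA);
    rd_kafka_resp_err_t err = rd_kafka_subscribe(consumer, subscription);
    if (err) {
        g_error("Failed to subscribe: %s", rd_kafka_err2str(err));
    }
    rd_kafka_topic_partition_list_destroy(subscription);

    signal(SIGINT, stop);

    while (running) {
        rd_kafka_message_t *msg = rd_kafka_consumer_poll(consumer, 500);
        if (!msg) {
            g_message("Waiting...");
            continue;
        }
        if (msg->err) {
            g_message("Consumer error: %s", rd_kafka_message_errstr(msg));
        } else {
            g_message("Consumed event from topic %s: key = %.*s value = %.*s",
                      rd_kafka_topic_name(msg->rkt),
                      (int)msg->key_len, (char *)msg->key,
                      (int)msg->len, (char *)msg->payload);
        }
        rd_kafka_message_destroy(msg);
    }

    rd_kafka_consumer_close(consumer);
    rd_kafka_destroy(consumer);
    return 0;
}
```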
Make the producer executable and run it:
make producer
./producer
You should see output resembling this:
** Message: 13:42:43.513: Produced event to topic purchases: key = eabara value = batteries
** Message: 13:42:43.514: Produced event to topic purchases: key = htanaka value = t-shirts
** Message: 13:42:43.514: Produced event to topic purchases: key = jbernard value = t-shirts
** Message: 13:42:43.514: Produced event to topic purchases: key = eabara value = batteries
** Message: 13:42:43.514: Produced event to topic purchases: key = eabara value = gift card
** Message: 13:42:43.514: Produced event to topic purchases: key = eabara value = book
** Message: 13:42:43.514: Produced event to topic purchases: key = jbernard value = book
** Message: 13:42:43.514: Produced event to topic purchases: key = awalther value = t-shirts
** Message: 13:42:43.514: Produced event to topic purchases: key = jsmith value = batteries
** Message: 13:42:43.514: Produced event to topic purchases: key = eabara value = book
** Message: 13:42:43.514: Flushing final messages..
** Message: 13:42:44.520: 10 events were produced to topic purchases.
Make the consumer executable and run it:
make consumer
./consumer
You should see output resembling this:
** Message: 13:48:09.293: Consumed event from topic purchases: key = htanaka value = gift card
** Message: 13:48:09.293: Consumed event from topic purchases: key = awalther value = alarm clock
** Message: 13:48:09.293: Consumed event from topic purchases: key = htanaka value = alarm clock
** Message: 13:48:09.293: Consumed event from topic purchases: key = eabara value = book
** Message: 13:48:09.293: Consumed event from topic purchases: key = awalther value = t-shirts
** Message: 13:48:09.293: Consumed event from topic purchases: key = sgarcia value = book
** Message: 13:48:09.293: Consumed event from topic purchases: key = htanaka value = batteries
** Message: 13:48:09.293: Consumed event from topic purchases: key = eabara value = batteries
** Message: 13:48:09.294: Consumed event from topic purchases: key = jsmith value = book
** Message: 13:48:09.294: Consumed event from topic purchases: key = eabara value = t-shirts
** Message: 13:48:09.895: Waiting...
** Message: 13:48:10.399: Waiting...
** Message: 13:48:10.900: Waiting...
Rerun the producer to see more events, or feel free to modify the code as necessary to create more or different events.
Once you are done with the consumer, enter `Ctrl-C` to terminate the consumer application.
Shut down Kafka when you are done with it:
confluent local kafka stop
- For the librdkafka client API, check out the librdkafka documentation.
- For information on testing in the Kafka ecosystem, check out Testing Event Streaming Apps.
- If you're interested in using streaming SQL for data creation, processing, and querying in your applications, check out the ksqlDB 101 course.
- Interested in performance tuning of your event streaming applications? Check out the Kafka Performance resources.