
# Getting Started with Apache Kafka and .NET

## Introduction

In this tutorial, you will build C# client applications that produce and consume messages from an Apache Kafka® cluster.

As you're learning how to run your first Kafka application, we recommend using Confluent Cloud so that you don't have to run your own Kafka cluster and can focus on the client development. If you do not already have an account, be sure to sign up. New signups receive $400 to spend within Confluent Cloud during their first 30 days. To avoid having to enter a credit card, navigate to Billing & payment, scroll to the bottom, and add the promo code CONFLUENTDEV1. With this promo code, you will not have to enter your credit card info for 30 days or until your credits run out.

If you prefer to set up a local Kafka cluster, the tutorial will walk you through those steps as well.

## Prerequisites

Using Windows? We recommend using PowerShell, as not all of these commands will work with Command Prompt.

This guide assumes that you already have the .NET SDK (version 8.0 or later) installed.

## Create Project

Create a new directory anywhere you’d like for this project:

```
mkdir kafka-dotnet-getting-started
cd kafka-dotnet-getting-started
mkdir producer
mkdir consumer
```

Next we’ll create two C# project files, one for the producer and one for the consumer. Each project file specifies the output type of the build artifact, which is an executable in both cases, as well as the dependencies that the .NET platform needs for the project.

Copy the following into a project file named producer.csproj in the producer subdirectory:
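As a reference point, a minimal producer project file might look like the following sketch; the Confluent.Kafka package version shown is an assumption, so pin whichever version you are actually using:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- Build an executable rather than a library. -->
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <!-- Kafka client dependency; the version here is illustrative. -->
    <PackageReference Include="Confluent.Kafka" Version="2.3.0" />
  </ItemGroup>

</Project>
```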

Copy the following into a project file named consumer.csproj in the consumer subdirectory:
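The consumer project file can take the same shape, since both projects build an executable with the same dependency; again, a sketch:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <!-- Same illustrative version as the producer project. -->
    <PackageReference Include="Confluent.Kafka" Version="2.3.0" />
  </ItemGroup>

</Project>
```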

## Kafka Setup

We are going to need a Kafka cluster for our client applications to operate with. You can use Confluent Cloud, run a local cluster, or connect to an existing cluster via its bootstrap server; follow whichever subsection below matches your setup.

### Confluent Cloud

From within the Confluent Cloud Console, creating a new cluster is just a few clicks.

Next, note your Confluent Cloud bootstrap server, as we will need it to configure the producer and consumer clients in upcoming steps. You can obtain your Kafka cluster's bootstrap server configuration from the Confluent Cloud Console.

Next, choose the authentication mechanism that the producer and consumer client applications will use to access Confluent Cloud: either basic authentication or OAuth.

Basic authentication is quicker to implement since you only need to create an API key in Confluent Cloud, whereas OAuth requires that you have an OAuth provider, as well as an OAuth application created within it for use with Confluent Cloud, in order to proceed.

Select your authentication mechanism and follow the matching subsection below.

#### Basic

You can use the Confluent Cloud Console to create a key for you by navigating to the API Keys section under Cluster Overview.

Note the API key and secret as we will use them when configuring the producer and consumer clients in upcoming steps.
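For reference, the API key and secret map onto the client's SASL settings. Here is a minimal sketch of the security-related properties, assuming SASL/PLAIN over TLS as Confluent Cloud uses; the placeholders are yours to fill in, and the same properties apply to ConsumerConfig:

```csharp
using Confluent.Kafka;

// Sketch: security settings for basic authentication against Confluent Cloud.
var config = new ProducerConfig
{
    BootstrapServers = "<BOOTSTRAP SERVERS>",   // from the previous step
    SecurityProtocol = SecurityProtocol.SaslSsl,
    SaslMechanism = SaslMechanism.Plain,
    SaslUsername = "<API KEY>",
    SaslPassword = "<API SECRET>",
};
```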

#### OAuth

You can use the Confluent Cloud Console to add an OAuth/OIDC identity provider and create an identity pool for it.

Note the following OAuth/OIDC-specific configuration values, which we will use to configure the producer and consumer clients in upcoming steps:

- OAUTH2 CLIENT ID: The public identifier for your client. In Okta, this is a 20-character alphanumeric string.
- OAUTH2 CLIENT SECRET: The secret corresponding to the client ID. In Okta, this is a 64-character alphanumeric string.
- OAUTH2 TOKEN ENDPOINT URL: The token-issuing URL that your OAuth/OIDC provider exposes. E.g., Okta's token endpoint URL format is https://<okta-domain>.okta.com/oauth2/default/v1/token
- OAUTH2 SCOPE: The name of the scope that you created in your OAuth/OIDC provider to restrict access privileges for issued tokens. In Okta, you or your Okta administrator provided the scope name when configuring your authorization server. In the navigation bar of your Okta Developer account, you can find this by navigating to Security > API, clicking the authorization server name, and finding the defined scopes under the Scopes tab.
- LOGICAL CLUSTER ID: Your Confluent Cloud logical cluster ID of the form lkc-123456. You can view your Kafka cluster ID in the Confluent Cloud Console by navigating to Cluster Settings in the left navigation of your cluster homepage.
- IDENTITY POOL ID: Your Confluent Cloud identity pool ID of the form pool-1234. You can find this in the Confluent Cloud Console by navigating to Accounts & access in the top right menu, selecting the Identity providers tab, clicking your identity provider, and viewing the Identity pools section of the page.
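As a sketch of how these values map onto client configuration, here are the OAuth-related properties of the Confluent.Kafka client config; verify the property names against the client documentation for your version, and substitute the placeholders with the values noted above:

```csharp
using Confluent.Kafka;

// Sketch: OAuth/OIDC settings for Confluent Cloud.
var config = new ProducerConfig
{
    BootstrapServers = "<BOOTSTRAP SERVERS>",
    SecurityProtocol = SecurityProtocol.SaslSsl,
    SaslMechanism = SaslMechanism.OAuthBearer,
    SaslOauthbearerMethod = SaslOauthbearerMethod.Oidc,
    SaslOauthbearerClientId = "<OAUTH2 CLIENT ID>",
    SaslOauthbearerClientSecret = "<OAUTH2 CLIENT SECRET>",
    SaslOauthbearerTokenEndpointUrl = "<OAUTH2 TOKEN ENDPOINT URL>",
    SaslOauthbearerScope = "<OAUTH2 SCOPE>",
    // Confluent Cloud routes the token to a logical cluster and
    // identity pool via SASL extensions.
    SaslOauthbearerExtensions = "logicalCluster=<LOGICAL CLUSTER ID>,identityPoolId=<IDENTITY POOL ID>",
};
```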

### Local

This guide runs Kafka in Docker via the Confluent CLI.

First, install and start Docker Desktop or Docker Engine if you don't already have it. Verify that Docker is set up properly by ensuring that no errors are output when you run docker info in your terminal.

Install the Confluent CLI if you don't already have it. In your terminal:

```
brew install confluentinc/tap/cli
```

If you don't use Homebrew, you can use a different installation method.

This guide requires version 3.34.1 or later of the Confluent CLI. If you have an older version, run confluent update to get the latest release (or brew upgrade confluentinc/tap/cli if you installed the CLI with Homebrew).

Now start the Kafka broker:

```
confluent local kafka start
```

Note the Plaintext Ports printed in your terminal, which you will need to configure the producer and consumer clients in upcoming steps.

### Existing cluster

Note your Kafka cluster bootstrap server URL, as you will need it to configure the producer and consumer clients in upcoming steps.

## Create Topic

A topic is an immutable, append-only log of events. Usually, a topic comprises the same kind of events; in this guide, for example, we create a topic for retail purchases.

Create a new topic, purchases, which you will use to produce and consume events.

If you are using Confluent Cloud, create the topic in the Confluent Cloud Console, with 1 partition and the defaults for the remaining settings.

If you are running Kafka locally, create the topic with the Confluent CLI:

```
confluent local kafka topic create purchases
```

If you are connecting to an existing cluster, you have multiple options for creating a topic. You may have access to Confluent Control Center, where you can create a topic with a UI. You may have a Kafka distribution installed, in which case you can use the kafka-topics command, as sketched below. Note that, if your cluster is centrally managed, you may need to request the creation of a topic from your operations team.
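For example, with the kafka-topics tool (kafka-topics.sh in an Apache Kafka distribution), a sketch assuming a broker reachable at localhost:9092:

```
kafka-topics --bootstrap-server localhost:9092 --create --topic purchases --partitions 1
```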

## Build Producer

Let's create the .NET producer application by pasting the following C# code into a file named producer/producer.cs.
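A minimal sketch of such a producer, built on the Confluent.Kafka client, is shown below; the user and item lists mirror the sample output later in this guide, and the configuration placeholder is yours to fill in:

```csharp
using System;
using Confluent.Kafka;

var config = new ProducerConfig
{
    BootstrapServers = "<BOOTSTRAP SERVERS>",
    // Add the security settings from the Kafka Setup step here if your
    // cluster requires authentication.
};

const string topic = "purchases";

string[] users = { "eabara", "jsmith", "sgarcia", "jbernard", "htanaka", "awalther" };
string[] items = { "book", "alarm clock", "t-shirts", "gift card", "batteries" };

using (var producer = new ProducerBuilder<string, string>(config).Build())
{
    var numProduced = 0;
    var rnd = new Random();

    for (int i = 0; i < 10; ++i)
    {
        var user = users[rnd.Next(users.Length)];
        var item = items[rnd.Next(items.Length)];

        // Produce asynchronously; the delivery handler reports per-event results.
        producer.Produce(topic, new Message<string, string> { Key = user, Value = item },
            deliveryReport =>
            {
                if (deliveryReport.Error.Code != ErrorCode.NoError)
                {
                    Console.WriteLine($"Failed to deliver message: {deliveryReport.Error.Reason}");
                }
                else
                {
                    Console.WriteLine($"Produced event to topic {topic}: key = {user,-10} value = {item}");
                    numProduced += 1;
                }
            });
    }

    // Block until outstanding produce requests complete (up to 10 seconds).
    producer.Flush(TimeSpan.FromSeconds(10));
    Console.WriteLine($"{numProduced} events were produced to topic {topic}");
}
```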

Fill in the appropriate BootstrapServers endpoint and any additional security configuration needed inline where the ProducerConfig object is instantiated.

You can test the syntax before proceeding by compiling with:

```
cd producer
dotnet build producer.csproj
```

## Build Consumer

Next, create the .NET consumer application by pasting the following C# code into a file named consumer/consumer.cs.
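Again, a minimal sketch built on the Confluent.Kafka client; the group ID is an arbitrary choice, and AutoOffsetReset.Earliest makes a fresh consumer group start from the beginning of the topic:

```csharp
using System;
using System.Threading;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "<BOOTSTRAP SERVERS>",
    // Add the security settings from the Kafka Setup step here if needed.
    GroupId = "kafka-dotnet-getting-started",
    AutoOffsetReset = AutoOffsetReset.Earliest,
};

const string topic = "purchases";

// Let Ctrl-C trigger a clean shutdown instead of killing the process.
var cts = new CancellationTokenSource();
Console.CancelKeyPress += (_, e) =>
{
    e.Cancel = true;
    cts.Cancel();
};

using (var consumer = new ConsumerBuilder<string, string>(config).Build())
{
    consumer.Subscribe(topic);
    try
    {
        while (true)
        {
            var cr = consumer.Consume(cts.Token);
            Console.WriteLine($"Consumed event from topic {topic}: key = {cr.Message.Key,-10} value = {cr.Message.Value}");
        }
    }
    catch (OperationCanceledException)
    {
        // Ctrl-C was pressed.
    }
    finally
    {
        // Leave the consumer group cleanly and commit final offsets.
        consumer.Close();
    }
}
```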

Fill in the appropriate BootstrapServers endpoint and any additional security configuration needed inline where the ConsumerConfig object is instantiated.

You can test the syntax before proceeding by compiling with:

```
cd ../consumer
dotnet build consumer.csproj
cd ..
```

## Produce Events

The dotnet command line tool gives us a handy run command we can use to execute the programs we just built.

In order to run the producer, use the dotnet run command:

```
cd producer
dotnet run
```

You should see output resembling this:

```
Produced event to topic purchases: key = jsmith     value = alarm clock
Produced event to topic purchases: key = htanaka    value = book
Produced event to topic purchases: key = eabara     value = batteries
Produced event to topic purchases: key = htanaka    value = t-shirts
Produced event to topic purchases: key = htanaka    value = t-shirts
Produced event to topic purchases: key = htanaka    value = gift card
Produced event to topic purchases: key = sgarcia    value = gift card
Produced event to topic purchases: key = jbernard   value = gift card
Produced event to topic purchases: key = awalther   value = alarm clock
Produced event to topic purchases: key = htanaka    value = book
10 events were produced to topic purchases
```

## Consume Events

From another terminal, run the following command to start the consumer application, which will read events from the purchases topic and write them to the terminal.

```
cd consumer
dotnet run
```

The consumer application will start and print any events it has not yet consumed and then wait for more events to arrive. On startup of the consumer, you should see output resembling this:

```
Consumed event from topic purchases: key = jsmith     value = alarm clock
Consumed event from topic purchases: key = htanaka    value = book
Consumed event from topic purchases: key = eabara     value = batteries
Consumed event from topic purchases: key = htanaka    value = t-shirts
Consumed event from topic purchases: key = htanaka    value = t-shirts
Consumed event from topic purchases: key = htanaka    value = gift card
Consumed event from topic purchases: key = sgarcia    value = gift card
Consumed event from topic purchases: key = jbernard   value = gift card
Consumed event from topic purchases: key = awalther   value = alarm clock
Consumed event from topic purchases: key = htanaka    value = book
```

Rerun the producer to see more events, or feel free to modify the code as necessary to create more or different events.

Once you are done with the consumer, press Ctrl-C to terminate the consumer application.

Shut down Kafka when you are done with it:

```
confluent local kafka stop
```

## Where next?