Today we are going to bridge two worlds: Serverless Kafka and Serverless Functions.
This tutorial demonstrates how to generate Kafka messages that trigger AWS Lambda functions. The Kafka instance will run on Upstash, and we will use AWS for the Lambda functions.
Upstash is an on-demand, pay-as-you-go solution that frees you from managing hardware, virtual machines, or Docker containers. It scales down to zero when not in use, so an idle cluster costs nothing.
Several steps must be completed before we can send messages from our local machine to Kafka on Upstash, which will then be received by AWS and trigger our Lambda function.
If you prefer to use your own Kafka instance, adapt the steps of this tutorial accordingly.
The steps are:
- Create a Kafka cluster and topic on Upstash
- Start a producer and a consumer for testing the cluster
- Create a secret
- Create a Lambda function
- Create a Lambda role
- Create a trigger
- Test the setup
There is a lot to do, so let’s get started.
Create a Kafka Cluster and Topic on Upstash
If you haven’t already, create an account on Upstash. Then go to the Console and create a new cluster.
First, provide the cluster name and region.

Next, create the topic that will receive our messages. I simply provided a name and kept the defaults.

When finished, the cluster overview page is shown. This page provides all the information required to connect to the cluster. We will need it in the next section.

Now that our Kafka cluster is in place, we will test it.
Start a Producer and a Consumer for Testing the Cluster
Kafka provides command-line tools that are useful for querying and testing a Kafka cluster. These can be installed locally or used via the official Kafka Docker image.
We will use the commands kafka-console-producer and kafka-console-consumer. To keep things simple, I show them as if they are run on a local machine.
Here is an example of how to start a consumer if Kafka is installed locally:
kafka-console-consumer.sh --topic planes --from-beginning --bootstrap-server kafka:9092
Doing the same, but this time using the official Kafka Docker image, looks like this:
docker-compose exec kafka bash -c 'kafka-console-consumer.sh --topic planes --from-beginning --bootstrap-server kafka:9092'
For Upstash we need to add a bit of configuration because we must authenticate. You don’t want everyone to be able to connect to your cluster, right?
In your local working directory, create a file called upstash.config and paste in the connection properties shown in the Upstash console, replacing the username and password with the values for your account.
sasl.mechanism=SCRAM-SHA-256
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="YXBwYXJlbnQtYXNwLTExNjU2JMPd9pk9KbaNkvMiqW7h0l3FOHgPks2Ia3lgup4" password="WcsQuBndTvo1zRYIHffSj5ZwN5A8G6sryjeG2k_1JyThzOzMDTURls2Fo-pboUX4hoza9g==";
Then start the producer.
kafka-console-producer --bootstrap-server apparent-asp-11656-eu1-kafka.upstash.io:9092 --producer.config upstash.config --topic tutorial-topic
In another console, start the consumer.
kafka-console-consumer --bootstrap-server apparent-asp-11656-eu1-kafka.upstash.io:9092 --consumer.config upstash.config --from-beginning --topic tutorial-topic
Finally, send some messages from the producer and receive them in the consumer.
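If you would rather produce messages from code instead of the console tools, here is a minimal Node.js sketch using the kafkajs library (my own addition; the tutorial itself only uses the Kafka CLI tools). The bootstrap server and topic match the commands above; the credentials are the ones from upstash.config.

const { Kafka } = require('kafkajs')

// SASL/SCRAM over TLS, mirroring the settings in upstash.config
const kafka = new Kafka({
  clientId: 'tutorial-producer',
  brokers: ['apparent-asp-11656-eu1-kafka.upstash.io:9092'],
  ssl: true,
  sasl: {
    mechanism: 'scram-sha-256',
    username: 'REPLACE_WITH_YOUR_UPSTASH_USERNAME',
    password: 'REPLACE_WITH_YOUR_UPSTASH_PASSWORD',
  },
})

const producer = kafka.producer()

async function send() {
  await producer.connect()
  // Send a single message to the tutorial topic
  await producer.send({
    topic: 'tutorial-topic',
    messages: [{ value: 'Hello' }],
  })
  await producer.disconnect()
}

send().catch(console.error)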

Now head over to Amazon AWS to create and set up the pieces needed to trigger our Lambda function with a Kafka message.
Create AWS Secret
Go to AWS Secrets Manager and click “Store a new secret”.

Choose “Other type of secret” and add the following key/value pairs for your Upstash username and password.

Click “Next,” provide a name for the secret,

then click “Next” through the remaining pages without changing anything, and finish by clicking “Store”.
With that done, let’s move on to creating our Lambda function.
Create the Lambda Function
In AWS Lambda, click “Create function.”

Create a Node.js function called “kafka-consumer.” Click “Create function” when ready.

Replace the contents of index.js with
exports.handler = async (event) => {
  for (const key in event.records) {
    console.log('Key: ', key)
    event.records[key].forEach((record) => {
      console.log('Record: ', record)
      // Kafka message values arrive base64-encoded; decode to plain text
      const msg = Buffer.from(record.value, 'base64').toString()
      console.log('Message:', msg)
    })
  }
}
This function will print the contents of the Kafka event that triggered it.
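For reference, the event a self-managed Kafka trigger hands to Lambda looks roughly like this (a trimmed sketch with illustrative values). The keys under records follow a topic-partition pattern, and value holds the base64-encoded message, here “Hello”:

{
  "eventSource": "SelfManagedKafka",
  "bootstrapServers": "apparent-asp-11656-eu1-kafka.upstash.io:9092",
  "records": {
    "tutorial-topic-0": [
      {
        "topic": "tutorial-topic",
        "partition": 0,
        "offset": 0,
        "timestamp": 1642000000000,
        "timestampType": "CREATE_TIME",
        "value": "SGVsbG8="
      }
    ]
  }
}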

Then click “Deploy”.
Create Lambda Role
Click on “Configuration” then “Permissions” and select the role name, here “kafka-consumer-role-eov2kffe”.

The “Identity and Access Management (IAM)” console opens.
Under “Add permissions” select “Create inline policy”.

In the “JSON” tab add the following:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetSecretValue"
      ],
      "Resource": "arn:aws:secretsmanager:eu-west-1:450055871228:secret:upstash-xcYDmy"
    }
  ]
}
Make sure to replace the secret ARN with the one for your secret (you can find it on the Secrets Manager overview page).
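If you have the AWS CLI installed, you can also fetch the ARN from the terminal (assuming your secret is named upstash, as the ARN above suggests):

aws secretsmanager describe-secret --secret-id upstash --query ARN --output text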

When you are done, the policy should look like this.

Click “Review policy”.
Under “Review Policy” add a name (for example “SelfHostedKafkaPolicy”) and click “Create policy”.
If successful, the policy is now listed on the role’s page in the IAM console.

This was the trickiest part of the tutorial. Well done if you managed to get this far. There is only one small step left before we can test everything.
Create Trigger
Click “Add Trigger” and select the “Apache Kafka” trigger.
Under “Bootstrap servers”, add the endpoint of your Kafka cluster, and fill in the topic name.
Under “Authentication” click “Add” and provide the secret created above.

Then click “Add”. If everything is successful, you return to the function page.
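For completeness, the same trigger can also be created from the AWS CLI (a sketch, not required by the console flow; replace the endpoint, topic, and secret ARN with your own):

aws lambda create-event-source-mapping \
  --function-name kafka-consumer \
  --topics tutorial-topic \
  --starting-position LATEST \
  --self-managed-event-source '{"Endpoints":{"KAFKA_BOOTSTRAP_SERVERS":["apparent-asp-11656-eu1-kafka.upstash.io:9092"]}}' \
  --source-access-configurations '[{"Type":"SASL_SCRAM_256_AUTH","URI":"arn:aws:secretsmanager:eu-west-1:450055871228:secret:upstash-xcYDmy"}]'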

We are done setting everything up. I hope this worked for you.
It is time to test everything.
Testing the Setup
While still in AWS, click “Monitor”.
On your local machine start the kafka-console-producer again if it’s not running.
kafka-console-producer --bootstrap-server apparent-asp-11656-eu1-kafka.upstash.io:9092 --producer.config upstash.config --topic tutorial-topic
Then send one or more messages.

Under your AWS CloudWatch metrics you should see invocations of the Lambda function.

Click “View logs in CloudWatch” to go to the logs page.

Select the log stream to open the page showing the log contents.

There it is! In the third-to-last line is the message I sent, “Hello”. Depending on what you sent, you will find your message there as well.
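If you prefer the terminal, you can also tail the logs with AWS CLI v2 (assuming the default log group name /aws/lambda/kafka-consumer):

aws logs tail /aws/lambda/kafka-consumer --follow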
Conclusion
What a trip! We started by generating Kafka messages on a local machine, sent them to a Kafka instance on Upstash, and finally received them with an AWS Lambda function to perform an action.
I hope you enjoyed this tutorial and found it useful. Feel free to ping me in the comments with questions, suggestions for improvement, or ideas for new topics or tutorials.
Thank you for reading!