
page_type: sample
languages: csharp
products: azure-functions
description: This is a simple sample which shows how to set up and write a function app that is triggered by messages on a Kafka topic

Azure Functions Kafka extension sample using Confluent Cloud

This sample shows how to set up and write a .NET Function app that is triggered by messages on a Kafka topic. It uses Confluent Cloud for the Kafka cluster. It also shows how to deploy this app on a Premium Function app plan.

Prerequisites

To follow along you will need:

  • A Confluent Cloud account
  • An Azure subscription
  • The Azure CLI and Azure Functions Core Tools (for the az and func commands used below)
  • The .NET Core SDK

Steps to set up

Set up the Kafka cluster

After you create a Confluent Cloud account, follow these steps to get set up. Some of the main ones are also highlighted below.

  • Log in to your Confluent Cloud account and create a new Kafka cluster. To minimize your data transfer costs, you should provision the cluster in the same Azure region where your Function app will run.

CreateConfluentCluster

  • Create a new Kafka Topic called "users" using the default topic settings.

CreateKafkaTopic

  • Create a new API Key and Secret, and note these values; you will need them when updating the code below.

Update the code

  • Clone this repository to a local folder using Git

  • Change the code in kafka_example.cs to point to the Kafka cluster that you set up in the previous step:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Kafka;
using Microsoft.Extensions.Logging;

public static class kafka_example
{
    // Triggered for every message that arrives on the "users" topic in Confluent Cloud
    [FunctionName("kafkaApp")]
    public static void ConfluentCloudStringTrigger(
        [KafkaTrigger(
            "BootstrapServer",
            "users",
            ConsumerGroup = "<ConsumerGroup>",
            Protocol = BrokerProtocol.SaslSsl,
            AuthenticationMode = BrokerAuthenticationMode.Plain,
            Username = "<APIKey>",
            Password = "<APISecret>",
            SslCaLocation = "confluent_cloud_cacert.pem")]
        KafkaEventData<string> kafkaEvent,
        ILogger logger)
    {
        // Log the raw message value
        logger.LogInformation(kafkaEvent.Value.ToString());
    }
}

Replace the following values:

  • BootstrapServer: should contain the value of the Bootstrap server found on the Confluent Cloud settings page. It will look something like "pkc-xyzxy.westeurope.azure.confluent.cloud:9092".

  • ConsumerGroup: set this to any string of your choosing; Kafka uses it to track the consumer group's committed offsets.

  • APIKey: This is your API access key, obtained from the Confluent Cloud web portal.

  • APISecret: This is your API secret, obtained from the Confluent Cloud web portal.

  • Note about the CA certificate: As described in the Confluent documentation, the .NET library does not have the capability to access root CA certificates.
    Missing this step will cause your function to raise the error "sasl_ssl://pkc-xyzxy.westeurope.azure.confluent.cloud:9092/bootstrap: Failed to verify broker certificate: unable to get local issuer certificate".
    To overcome this, you need to:

    • Download a CA certificate bundle (e.g. from https://curl.haxx.se/ca/cacert.pem).
    • Rename the certificate file to anything other than cacert.pem to avoid a conflict with the existing Event Hubs Kafka certificate that is part of the extension.
    • Include the file in the project, set "Copy to Output Directory" on it, and point the SslCaLocation trigger attribute property at it (see the sketch after this list).
    • In this example the file has already been downloaded and renamed to confluent_cloud_cacert.pem.
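The following is a minimal sketch of how the certificate file can be marked to be copied to the build output in the function app's .csproj; the file name matches this sample, but your project file will contain other items as well:

    <!-- In the function app's .csproj: copy the CA certificate alongside the build output -->
    <ItemGroup>
      <None Update="confluent_cloud_cacert.pem">
        <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      </None>
    </ItemGroup>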

Running the sample

  • Send some messages to the users topic. You can do so using the sample application given in the quick start, the ccloud CLI, or the Confluent Cloud web interface. Instructions for producing messages with the ccloud CLI can be found on the "Tools & Client Configuration" tab in the Confluent Cloud web portal; a short example is also shown after this step.

For instructions using the sample application, see Steps 5 and 6 in the quick start

CreateKafkaMessages
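A hedged example of producing a couple of test messages from the command line, assuming you have already logged in with the ccloud CLI, selected your cluster, and configured an API key for it (the newer Confluent CLI uses confluent kafka topic produce instead):

    # Produce newline-delimited string messages to the "users" topic from stdin
    ccloud kafka topic produce users
    {"name": "alice"}
    {"name": "bob"}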

  • Run the following from the folder where you cloned the project to start the Function app locally
func host start

The Function app starts executing and should connect to your Confluent Cloud Kafka cluster.

You should see the partitions of your topic that have been assigned to this client show up, as well as the messages that were sent earlier being processed.

CreateKafkaMessages

  • Note: You may notice that the Kafka topic "users" has 6 partitions but this client has been assigned only 3 of them. This is because I have another client listening to the same topic, and Kafka has load-balanced the partitions among the clients in the consumer group.

Deploying the sample to an Azure Functions Premium plan

  • Now you are ready to deploy this Function app to an Azure Functions Premium plan. Use the following link for instructions on how to first create an Azure Functions Premium plan Function app. Note the name of the Function app.

  • To enable scaling in the Premium Function app, you currently have to toggle a property on the Function app.

You can use the Azure portal to toggle the Runtime Scale Monitoring setting under Function runtime settings.

ChangeSettings

Alternatively, you can use the Azure CLI:

az resource update -g <resource_group> -n <NameOfFunctionApp>/config/web --set properties.functionsRuntimeScaleMonitoringEnabled=1 --resource-type Microsoft.Web/sites
  • You can now deploy your locally created Function app to the app created in Azure by using the following func command, replacing NameOfFunctionApp with the name of the Function app you created in Azure in the previous step.
    Note: To use this command you have to be logged in to Azure using the Azure CLI.
func azure functionapp publish <NameOfFunctionApp>
  • Finally, you can head over to the portal and for example use the Live Metrics view to see the logs and requests.

KafkaPortal

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

azure-functions-kafka-extension-sample-confluent's Issues

The same messages from Kafka are picked up again in the next run of the Azure Function, resulting in duplicates

We have created an Azure Function along the lines of the sample shared here. We are using batch mode to process multiple messages in one go and found that the same messages are being picked up across multiple Azure Function runs.

Can you suggest settings in host.json, or changes to the code, so that only new messages are picked up in subsequent Azure Function runs? Please refer to the enclosed code file for the Azure Function. Kindly review and suggest.
ProcessMessages.txt
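For context, a minimal sketch of a batch-mode Kafka trigger with this extension is shown below; the function name and settings are illustrative, not an official fix. The Kafka trigger delivers messages at least once, so redelivery (for example after a host restart or a partition rebalance) is possible, and batch handlers are typically written to be idempotent rather than assuming each message is seen exactly once:

    // Uses the same usings as kafka_example.cs above
    // (Microsoft.Azure.WebJobs, Microsoft.Azure.WebJobs.Extensions.Kafka, Microsoft.Extensions.Logging)
    [FunctionName("kafkaBatchExample")]
    public static void ConfluentCloudBatchTrigger(
        [KafkaTrigger(
            "BootstrapServer",
            "users",
            ConsumerGroup = "<ConsumerGroup>",
            Protocol = BrokerProtocol.SaslSsl,
            AuthenticationMode = BrokerAuthenticationMode.Plain,
            Username = "<APIKey>",
            Password = "<APISecret>",
            SslCaLocation = "confluent_cloud_cacert.pem")]
        KafkaEventData<string>[] kafkaEvents,   // binding to an array delivers a batch per invocation
        ILogger logger)
    {
        foreach (var kafkaEvent in kafkaEvents)
        {
            // Processing should be idempotent: the same message can be delivered again,
            // so de-duplicate downstream (for example by a key carried in the message).
            logger.LogInformation(kafkaEvent.Value);
        }
    }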

How do I externalise the parameters for the KafkaTrigger?

I need to move (e.g.) UserName and Password to configuration, but I can't find the correct syntax. I have been looking at IBinder, and it makes complete sense to me how this works for output bindings, but I can't see how to wire it up to provide the trigger parameters.

I do see that there are Kafka attributes in the repo, but I could do with an example of how to do that (can't find much on this on the web generally)...
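A hedged sketch of one way to do this, assuming the trigger attribute properties can reference application settings with the %SettingName% binding-expression syntax; the setting names ConfluentCloudBrokerList, ConfluentCloudUserName, and ConfluentCloudPassword below are illustrative. The values then live in local.settings.json locally and in the Function app's application settings in Azure:

    // Assumed app settings (local.settings.json locally, application settings in Azure):
    //   "ConfluentCloudBrokerList": "pkc-xyzxy.westeurope.azure.confluent.cloud:9092"
    //   "ConfluentCloudUserName":   "<APIKey>"
    //   "ConfluentCloudPassword":   "<APISecret>"
    // Uses the same usings as kafka_example.cs above
    [FunctionName("kafkaAppFromSettings")]
    public static void ConfluentCloudTriggerFromSettings(
        [KafkaTrigger(
            "%ConfluentCloudBrokerList%",
            "users",
            ConsumerGroup = "<ConsumerGroup>",
            Protocol = BrokerProtocol.SaslSsl,
            AuthenticationMode = BrokerAuthenticationMode.Plain,
            Username = "%ConfluentCloudUserName%",
            Password = "%ConfluentCloudPassword%",
            SslCaLocation = "confluent_cloud_cacert.pem")]
        KafkaEventData<string> kafkaEvent,
        ILogger logger)
    {
        logger.LogInformation(kafkaEvent.Value);
    }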
