
connector-kafka's Issues

[Apache Kafka] pre-release testing on Pre-Prod

Hi QA Team,

As part of release testing, I would kindly ask you to test the Apache Kafka OOTB connector on Pre-Prod.

You can use the following instance for testing:

  • Bootstrap Server: pkc-lzvrd.us-west4.gcp.confluent.cloud:9092
  • Username: KRFPEZMLK6EAR44Q
  • Password: /qVQ+Rl7bSewwxyhpkuQ/DDRz/N/zQYke9oeyd8nVyaMs4/F5zL4QsHpSXdzdnUi
  • Topic: QATestTopic
  • Key: any string or JSON (FEEL)
  • Value: any string or JSON (FEEL)
  • Additional properties: e.g. = {"client.id":"MyDemoClient"}, or any other valid producer properties.
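
For a quick connectivity smoke test against this instance, a minimal Java producer along the following lines should work. This is only a sketch: the class name is made up and the serializers are plain String serializers; the bootstrap server, credentials, and topic are the ones listed above.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class QaSmokeTest {
  public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    // Connection details of the QA instance listed above
    props.put("bootstrap.servers", "pkc-lzvrd.us-west4.gcp.confluent.cloud:9092");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required"
            + " username='KRFPEZMLK6EAR44Q'"
            + " password='/qVQ+Rl7bSewwxyhpkuQ/DDRz/N/zQYke9oeyd8nVyaMs4/F5zL4QsHpSXdzdnUi';");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      // Key and value can be any string or JSON, as noted above
      producer.send(new ProducerRecord<>("QATestTopic", "test-key", "{\"hello\":\"QA\"}")).get();
    }
  }
}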

Alternatively, you can create your own cluster: either local, or at https://confluent.cloud/ (free tier eligible).

Doc page:

Epic:

Kafka: validate implementation with customer

What should we do?

We would like to validate the existing Outbound Kafka Connector implementation with 1-2 customers.

Why should we do it?

Kafka supports an extreme variety of configuration options. The implementation we currently have is supposed to be generic enough to cover them all; however, it is based on assumptions and anecdotal evidence.

Provide infrastructure to test Kafka connector with multiple bootstrap servers

What should we do?

Infrastructure needs to be provided for QA to test the scenario of the Kafka connector supporting multiple bootstrap servers.

Why should we do it?

This is a valid scenario as per the original Epic of the Kafka connectors, and since we do not have the necessary infrastructure, QA could not proceed with testing it.

Hence, the following test case is blocked due to the lack of infrastructure: https://camunda.testrail.com/index.php?/tests/view/45309&group_by=cases:section_id&group_order=asc&group_id=1387
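
For reference, from the client's point of view "multiple bootstrap servers" is simply a comma-separated bootstrap.servers list; the sketch below uses placeholder broker host names.

import java.util.Properties;

class MultiBootstrapConfig {
  // Sketch only: the broker host names below are placeholders for a multi-broker test cluster.
  static Properties producerProps() {
    Properties props = new Properties();
    props.put("bootstrap.servers",
        "broker-1.example.com:9092,broker-2.example.com:9092,broker-3.example.com:9092");
    // The client uses this list only for the initial connection; afterwards it discovers
    // the full broker set from cluster metadata, so any reachable subset is enough to bootstrap.
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    return props;
  }
}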

[Kafka] [Discover]: Kafka MVP for producer and consumer

Parent of:

What should we do?

  • Discover MVP to push Kafka message
  • Discover MVP to pull Kafka message
  • Dive deep into realistic use-cases: real architectures, authn/z

โ• Disclaimer: due to Kafka being extremely advance technology, some of the following statements are based on assumptions.

Kafka APIs

Producer API

Allows sending data to a Kafka topic. See reference. This is the recommended API to implement first for the Camunda SaaS Connector.
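
A minimal example of what Producer API usage looks like (a sketch only; broker address, topic, and serializers are placeholders):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

class ProducerSketch {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      // send() is asynchronous; the callback reports either the record metadata or the failure.
      producer.send(new ProducerRecord<>("my-topic", "key", "value"), (metadata, exception) -> {
        if (exception != null) {
          exception.printStackTrace();
        } else {
          System.out.printf("Delivered to %s-%d@%d%n",
              metadata.topic(), metadata.partition(), metadata.offset());
        }
      });
      producer.flush(); // ensure delivery before a (potentially short-lived) function exits
    }
  }
}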

Consumer API

Allows consuming data from Kafka. See reference. It is required to validate with customers whether this API is needed for the Camunda SaaS Connector, due to the nature of SaaS Connectors: they are short-lived cloud functions, while a typical consumer polls data constantly.
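
For reference, the typical consumer shape is an endless poll loop, which is exactly what sits uneasily with short-lived cloud functions. A minimal sketch (broker, group id, and topic are placeholders):

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

class ConsumerSketch {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
    props.put("group.id", "demo-group");              // placeholder consumer group
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("my-topic"));
      // This loop is expected to run indefinitely - the part that does not map well
      // onto short-lived cloud functions.
      while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
        for (ConsumerRecord<String, String> record : records) {
          System.out.printf("%s: %s%n", record.key(), record.value());
        }
      }
    }
  }
}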

Streams API

Allows transforming data. See reference. It is likely not applicable for the Camunda SaaS use case.

Kafka Connect

Similar to the Consumer and Producer APIs, but designed for continuously pulling/pushing data from/into Kafka.

Types of Kafka setup

Challenge: we need to identify optimal / most common use-cases that customers opt for.

Currently, there are 3 known Kafka setup types:

  • Cloud-native Kafka, e.g. Amazon MSK, Confluent Cloud Kafka. Can be supplied as serverless. It can be reached from Camunda SaaS when the correct ACL policies are applied.
  • DMZ Kafka. A private Kafka that is isolated from the public internet. It can be reached from Camunda SaaS via proxies: either at the direct protocol level or via HTTP(S) Kafka plugins.
  • Open-world Kafka. A private Kafka, exposed to the internet. It can be reached from Camunda SaaS directly.

Kafka Security Mechanisms

Challenge: we need to identify optimal / most common use-cases that customers opt for.

Transit Encryption

It is highly recommended to use SSL transit encryption. However, there might be cases where Kafka stays unprotected because it resides in the customer's DMZ.

Authentication

There are currently 3 major ways to authenticate Kafka clients:

  • SASL: a set of mechanisms, integrated into Kafka via JAAS:
    • Kerberos/AD: integration with Kerberos or AD. Likely not applicable for Camunda SaaS Connectors.
    • OAuthBearer: OAuth-based authn mechanism that works with external token providers. Not recommended for production systems.
    • Plain: login/password provided as plain text. It provides fair security but is not safe against replay attacks. It can be a quite common means of authn for customers.
    • Delegation tokens: temporary delegation tokens for lightweight authentication. Won't fit Camunda SaaS Connectors due to the short-lived nature of the token - up to 7 days.
    • LDAP: integration with AD. Likely not applicable for Camunda SaaS Connectors.
  • mTLS: a mechanism where the client and Kafka identify each other based on certificates. This goes very naturally with transit encryption. It is not clear whether it can be applied to Camunda SaaS Connectors, because clients load certificates via JKS storage. There is a known KIP-651 / kafka#9345 issue that is supposed to enable real-time certificate loading as PEM (not JKS); see the configuration sketch after this list.
  • HTTP(S) Proxy: a collection of different plugins, such as Confluent REST Proxy, that play the role of a proxy client: they take over the heavy lifting of the native Kafka AAA mechanisms and provide an HTTP-based authentication interface instead. Some of those proxies come with monetary and licensing strings attached.
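
For the mTLS case, assuming a client version that already ships KIP-651 (PEM support), the certificate material could be passed inline instead of via a JKS file. This is only a sketch; all certificate and key values are placeholders.

import java.util.Properties;

class MtlsPemConfigSketch {
  // Sketch only, assuming a Kafka client that includes KIP-651 (PEM support).
  static Properties mtlsProps() {
    Properties props = new Properties();
    props.put("security.protocol", "SSL");
    // Client certificate and private key supplied inline as PEM strings
    // instead of a JKS keystore file (placeholders below):
    props.put("ssl.keystore.type", "PEM");
    props.put("ssl.keystore.certificate.chain",
        "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----");
    props.put("ssl.keystore.key",
        "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----");
    // Broker CA certificate, also as an inline PEM string:
    props.put("ssl.truststore.type", "PEM");
    props.put("ssl.truststore.certificates",
        "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----");
    return props;
  }
}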

ACLs

Kafka may use both OOTB and custom AuthZ plugins. The default OOTB ACL mechanism is rule-based and can be applied to both user/password and certificate CN (including wildcard) use cases; it is therefore out of scope for Camunda SaaS Connectors, since nothing needs to be configured on the connector side for any of the 3 authn mechanisms.

Kafka: implement an ability to produce message

What should we do?

As per #3, we need to implement a Kafka producer.
The producer has to be a Kafka client compatible with Confluent Cloud, with the ability to connect directly via a broker.

# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers=pkc-6ojv2.us-west4.gcp.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';
sasl.mechanism=PLAIN
# Required for correctness in Apache Kafka clients prior to 2.6
client.dns.lookup=use_all_dns_ips

# Best practice for higher availability in Apache Kafka clients prior to 3.0
session.timeout.ms=45000

# Best practice for Kafka producer to prevent data loss
acks=all

# Required connection configs for Confluent Cloud Schema Registry
schema.registry.url=https://{{ SR_ENDPOINT }}
basic.auth.credentials.source=USER_INFO
basic.auth.user.info={{ SR_API_KEY }}:{{ SR_API_SECRET }}
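
As a rough sketch of what the connector side might do with these values (not part of the config above): build the producer Properties from the element inputs and let the user-supplied "Additional properties" override the defaults. The helper name and signature are hypothetical.

import java.util.Map;
import java.util.Properties;

class ProducerConfigSketch {
  // Hypothetical helper: maps the connector inputs (bootstrap servers, API key/secret
  // and the optional "Additional properties" map) onto Kafka producer Properties.
  static Properties buildProducerProperties(
      String bootstrapServers, String username, String password, Map<String, Object> additional) {
    Properties props = new Properties();
    props.put("bootstrap.servers", bootstrapServers);
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required"
            + " username='" + username + "' password='" + password + "';");
    props.put("client.dns.lookup", "use_all_dns_ips");
    props.put("acks", "all");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    if (additional != null) {
      additional.forEach(props::put); // user-supplied entries win over the defaults
    }
    return props;
  }
}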


Kafka: remove mandatory username/password

What should we do?

Some customers want to use the Kafka Connector in a self-managed manner, so the connector can be placed in a DMZ Docker container. This means username/password might not always be necessary.

Remove the mandatory attribute for both username and password for Kafka, in the UI and on the backend.

Make sure it still works for self-managed.
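
A minimal sketch of what the backend change could look like (the helper name is hypothetical); the point is simply that SASL settings are applied only when credentials are actually present:

import java.util.Properties;

class OptionalAuthSketch {
  // Hypothetical helper: only configure SASL when credentials were provided,
  // so a self-managed connector in the DMZ can talk to an unauthenticated broker.
  static void applyAuthentication(Properties props, String username, String password) {
    if (username == null || username.isBlank() || password == null || password.isBlank()) {
      // No credentials given: leave security settings to the additional
      // properties (or to the PLAINTEXT default).
      return;
    }
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required"
            + " username='" + username + "' password='" + password + "';");
  }
}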
