camunda / connector-kafka
License: Other
Hi QA Team,
As part of release testing, I would kindly ask you to test the Apache Kafka OOTB connector on Pre-Prod.
You can use the following instance for testing:
Bootstrap server: pkc-lzvrd.us-west4.gcp.confluent.cloud:9092
Username: KRFPEZMLK6EAR44Q
Password: /qVQ+Rl7bSewwxyhpkuQ/DDRz/N/zQYke9oeyd8nVyaMs4/F5zL4QsHpSXdzdnUi
Topic: QATestTopic
Key: any string or JSON (FEEL)
Value: any string or JSON (FEEL)
Additional properties: = {"client.id":"MyDemoClient"} or other.
Alternatively, you can create your own cluster: either local, or at https://confluent.cloud/ (free tier eligible).
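The "Additional properties" field above lets the user layer extra client settings, such as a custom client.id, over the base connection config. A minimal sketch of how such a merge could work (the `build_client_config` helper and the base dict are illustrative, not the connector's actual implementation):

```python
# Base settings matching the QA test instance above.
BASE_CONFIG = {
    "bootstrap.servers": "pkc-lzvrd.us-west4.gcp.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
}


def build_client_config(additional_properties=None):
    """Return the effective Kafka client config: base connection settings
    plus user-supplied overrides such as {"client.id": "MyDemoClient"}."""
    config = dict(BASE_CONFIG)
    # User-supplied properties win over the defaults.
    config.update(additional_properties or {})
    return config


config = build_client_config({"client.id": "MyDemoClient"})
```

Letting user-supplied properties override the defaults keeps the connector generic: any of Kafka's many client settings can be passed through without UI changes.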
Doc page:
Epic:
Parent:
As per:
The MVP model has been defined. Now we need to convert the model into the element template.
What should we do?
Add Kafka documentation at https://docs.camunda.io/docs/components/connectors/out-of-the-box-connectors/available-connectors-overview/
What should we do?
Make Kafka connector publicly available
What should we do?
We would like to validate existing Outbound Kafka Connector implementation with 1-2 customers.
Why should we do it?
Kafka supports an extremely wide variety of configuration options. Our current implementation is intended to be generic enough to cover all of them, but it is based on assumptions and anecdotal evidence.
What should we do?
Infrastructure needs to be provided for QA to test the scenario of the Kafka connector supporting multiple bootstrap servers.
Why should we do it?
This is a valid scenario as per the original Kafka connectors Epic, but since we do not have the necessary infrastructure, QA could not proceed with testing it.
Hence the following test case is blocked due to the lack of infrastructure: https://camunda.testrail.com/index.php?/tests/view/45309&group_by=cases:section_id&group_order=asc&group_id=1387
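For reference, Kafka clients accept multiple bootstrap servers as a comma-separated host:port list in `bootstrap.servers`. A small sketch of how the connector might normalize and validate such a list before handing it to the client (the `parse_bootstrap_servers` helper is hypothetical):

```python
def parse_bootstrap_servers(raw):
    """Split a comma-separated bootstrap server list and check that each
    entry looks like host:port. Returns the cleaned-up list of servers."""
    servers = [s.strip() for s in raw.split(",") if s.strip()]
    for server in servers:
        host, sep, port = server.rpartition(":")
        if not host or not sep or not port.isdigit():
            raise ValueError(f"invalid bootstrap server entry: {server!r}")
    return servers


servers = parse_bootstrap_servers("broker1:9092, broker2:9092")
```

Only one of the listed servers needs to be reachable for the initial connection; the client discovers the rest of the cluster from it, which is exactly what the blocked test case should exercise.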
Parent of:
What should we do?
⚠ Disclaimer: because Kafka is an extremely advanced technology, some of the following statements are based on assumptions.
Allows sending data to a Kafka topic. See reference. This is the recommended API to implement first for the Camunda SaaS Connector.
Allows consuming data from Kafka. See reference. We need to validate with customers whether this API is required for the Camunda SaaS Connector, given the nature of SaaS Connectors: they are short-lived cloud functions, while a typical consumer polls data constantly.
Allows transforming data. See reference. It is likely not applicable to the Camunda SaaS use case.
Similar to the Consumer and Producer APIs, but designed for continuously pulling/pushing data from/into Kafka.
Challenge: we need to identify optimal / most common use-cases that customers opt for.
Currently, there are 3 known Kafka setup types:
Challenge: we need to identify optimal / most common use-cases that customers opt for.
It is highly recommended to use SSL transit encryption. However, there may be cases where Kafka stays unprotected, for example when it sits in the customer's DMZ.
There are currently 3 major ways to authenticate Kafka clients:
Kafka may use both OOTB and custom AuthZ plugins. The default OOTB ACL mechanism is rule-based and applies to both user/password and certificate CN (including wildcards) use cases. It is therefore out of scope for Camunda SaaS Connectors, as nothing needs to be configured on the client side for any of the three authn mechanisms.
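For the user/password (SASL/PLAIN) mechanism, the client-side configuration boils down to rendering the standard `sasl.jaas.config` string. A sketch of what that rendering could look like (the `jaas_config` helper is illustrative; the `PlainLoginModule` class name is Kafka's real SASL/PLAIN login module):

```python
def jaas_config(username, password):
    """Render the sasl.jaas.config value for SASL/PLAIN authentication.
    SSL client certificates and OAuth would use different settings entirely."""
    return (
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
        f"username='{username}' password='{password}';"
    )


cfg = jaas_config("KRFPEZMLK6EAR44Q", "my-secret")
```

Note the trailing semicolon: JAAS config entries are not valid without it, and a missing one is a common source of confusing client startup errors.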
What should we do?
Why should we do it?
Align repository setup
What should we do?
As per #3, we need to implement a Kafka producer.
The producer has to be a Kafka client compatible with Confluent Cloud, with the ability to connect directly via a broker.
# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers=pkc-6ojv2.us-west4.gcp.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';
sasl.mechanism=PLAIN
# Required for correctness in Apache Kafka clients prior to 2.6
client.dns.lookup=use_all_dns_ips
# Best practice for higher availability in Apache Kafka clients prior to 3.0
session.timeout.ms=45000
# Best practice for Kafka producer to prevent data loss
acks=all
# Required connection configs for Confluent Cloud Schema Registry
schema.registry.url=https://{{ SR_ENDPOINT }}
basic.auth.credentials.source=USER_INFO
basic.auth.user.info={{ SR_API_KEY }}:{{ SR_API_SECRET }}
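The config above is a template with `{{ ... }}` placeholders for the cluster and Schema Registry credentials. A sketch of how the producer implementation could fill the placeholders and parse the result into a key/value map (the `render` helper and the shortened `TEMPLATE` are illustrative, not the connector's actual code):

```python
import re

# Abbreviated copy of the template above; the full file works the same way.
TEMPLATE = """\
bootstrap.servers=pkc-6ojv2.us-west4.gcp.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';
"""


def render(template, values):
    """Replace {{ NAME }} placeholders, then parse key=value lines
    (skipping comments) into a plain dict of client properties."""
    filled = re.sub(r"\{\{\s*(\w+)\s*\}\}", lambda m: values[m.group(1)], template)
    props = {}
    for line in filled.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            key, _, value = line.partition("=")  # split on the first '=' only
            props[key] = value
    return props


props = render(TEMPLATE, {"CLUSTER_API_KEY": "key", "CLUSTER_API_SECRET": "secret"})
```

Splitting on the first `=` only matters here: the `sasl.jaas.config` value itself contains `=`-like fragments and must stay intact.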
What should we do?
Some customers want to use the Kafka Connector in a self-managed manner, so the connector may be placed in a Docker container inside the DMZ. This means username/password might not always be necessary.
Remove the mandatory flag from the username/password attributes for Kafka, both in the UI and on the backend.
Make sure it still works for Self-Managed.
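One way to back this on the connector side is to derive the security settings from whether credentials were supplied at all. A minimal sketch, assuming a hypothetical `security_settings` helper and a fallback to PLAINTEXT for unauthenticated brokers:

```python
def security_settings(username=None, password=None):
    """If credentials are present, use SASL_SSL with SASL/PLAIN; otherwise
    fall back to PLAINTEXT, e.g. for an unauthenticated broker that sits
    inside the customer's DMZ."""
    if username and password:
        return {
            "security.protocol": "SASL_SSL",
            "sasl.mechanism": "PLAIN",
            "sasl.jaas.config": (
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                f"username='{username}' password='{password}';"
            ),
        }
    # No credentials supplied: plain, unauthenticated connection.
    return {"security.protocol": "PLAINTEXT"}
```

With this shape, making username/password optional in the UI requires no extra toggle: omitting them simply produces the unauthenticated config.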