
Comments (5)

tomekl007 commented on September 16, 2024

When starting the connector, you need a proper key.converter and value.converter defined. If you have JSON data in the Kafka topic, you can base your configuration on this example:
https://github.com/datastax/kafka-examples/blob/master/producers/src/main/java/json/connect-distributed-json.properties
If you use Avro, see this one:
https://github.com/datastax/kafka-examples/blob/master/producers/src/main/java/avro/connect-distributed-avro.properties#L4
Please also read this blog post about setting converters:
https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained
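For reference, a minimal sketch of the converter settings in the Connect worker properties, assuming the standard JsonConverter and Confluent AvroConverter classes (the schema-registry URL is a placeholder):

```properties
# JSON records: use the JsonConverter (disable schemas if the topic
# contains plain JSON without an embedded schema envelope)
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

# Avro records: use the Confluent AvroConverter backed by Schema Registry
# key.converter=io.confluent.connect.avro.AvroConverter
# value.converter=io.confluent.connect.avro.AvroConverter
# value.converter.schema.registry.url=http://localhost:8081
```

The linked example files show the full working configurations for each format.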

from kafka-examples.

tomekl007 commented on September 16, 2024

Hi,
What do you mean by inserting into the payload column? Is payload a UDT with two fields,
first_name and last_name?


mradha1 commented on September 16, 2024

Hi,
Below is my requirement.

For example, if I am publishing the following record to Kafka (an Avro record):
Key = null
Value = {"id":"123", "first_name":"Mathew"}

Below is my Cassandra table structure:

CREATE TABLE kafka_examples.avro_udt_table (
    id text,
    payload text,
    PRIMARY KEY (id)
);

If I want to save the whole Value, {"id":"123", "first_name":"Mathew"}, to the "payload" column in Cassandra, what should the mapping value below be?

{
    "name": "dse-connector-avro-example",
    "config": {
        "connector.class": "com.datastax.kafkaconnector.DseSinkConnector",
        "tasks.max": "10",
        "topics": "avro-stream",
        "contactPoints": "xx.xx.xx.xx",
        "loadBalancing.localDc": "Cassandra",
        "topic.avro-stream.kafka_examples.avro_udt_table.mapping": "?????????"
    }
}

Hope the requirement is clear.

Thanks !!!

from kafka-examples.

tomekl007 commented on September 16, 2024

To make this work, you need to create a UDT in Cassandra that has id and first_name fields. Let's assume you create a column of this UDT type and name it payload.
Once you have that, you can use this mapping:
*.mapping = "payload=value"
The whole value will then be inserted into the payload column. For more details, see this example of saving a value as a UDT in Cassandra:
https://github.com/datastax/kafka-examples/tree/master/producers/src/main/java/json/udt
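The schema change described above can be sketched in CQL. This is a sketch under assumptions: the UDT name record_udt is my own placeholder, and the field names are taken from the example record in this thread.

```sql
-- UDT mirroring the fields of the Kafka record value
CREATE TYPE kafka_examples.record_udt (
    id text,
    first_name text
);

-- Table whose payload column is the UDT instead of text
CREATE TABLE kafka_examples.avro_udt_table (
    id text PRIMARY KEY,
    payload frozen<record_udt>
);

-- Hypothetical connector mapping: the primary-key column still needs
-- its own entry alongside the whole-value mapping, e.g.
-- "topic.avro-stream.kafka_examples.avro_udt_table.mapping":
--     "id=value.id, payload=value"
```

The linked udt example in kafka-examples shows the full working version of this setup.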

from kafka-examples.

mradha1 commented on September 16, 2024

Thanks for the help. I am seeing the exception below:
com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [TEXT <-> org.apache.kafka.connect.data.Struct]
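This exception typically means the target column is still declared as text while the connector is handing the driver a Connect Struct, so the driver has no codec to bridge TEXT <-> Struct. A hedged CQL sketch of the fix, following the UDT advice earlier in the thread (the UDT name record_udt is my own placeholder):

```sql
-- The record value arrives as a Struct, so the target column must be a
-- UDT that matches the record's fields, not text.
CREATE TYPE kafka_examples.record_udt (
    id text,
    first_name text
);

-- Recreate the table with a UDT-typed payload column
-- (changing an existing column's type in place is not supported).
DROP TABLE kafka_examples.avro_udt_table;
CREATE TABLE kafka_examples.avro_udt_table (
    id text PRIMARY KEY,
    payload frozen<record_udt>
);
```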
