
Comments (8)

jag959 commented on July 24, 2024

I am running into a similar issue (not sure if it's user error on my side).

Config:

{
  "name": "ElasticSource_RPA",
  "config": {
    "name": "ElasticSource_RPA",
    "connector.class": "com.github.dariobalinzo.ElasticSourceConnector",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "es.host": "REDACTED",
    "es.port": "9200",
    "index.names": "finance-invoices",
    "incrementing.field.name": "date_time",
    "poll.interval.ms": "60000",
    "topic.prefix": "rpa_"
  }
}

Error:

[2022-09-14 19:20:10,793] ERROR error (com.github.dariobalinzo.task.ElasticSourceTask)
ElasticsearchStatusException[Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]]
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:178)
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:2484)
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:2461)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:2184)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:2137)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:2105)
at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:1367)
at com.github.dariobalinzo.elastic.ElasticRepository.executeSearch(ElasticRepository.java:176)
at com.github.dariobalinzo.elastic.ElasticRepository.searchAfter(ElasticRepository.java:90)
at com.github.dariobalinzo.task.ElasticSourceTask.poll(ElasticSourceTask.java:205)
at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:307)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:263)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:200)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:255)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [http://REDACTED:9200], URI [/finance-invoices/_search?typed_keys=true&max_concurrent_shard_requests=5&search_type=query_then_fetch&batched_reduce_size=512], status line [HTTP/1.1 400 Bad Request]
{"error":{"root_cause":[{"type":"query_shard_exception","reason":"No mapping found for [] in order to sort on","index_uuid":"6Z3jeKV4Q5WUJ6p3zEgdXw","index":"finance-invoices"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"finance-invoices","node":"Wyczj20LRY6-6O8Vgi4qww","reason":{"type":"query_shard_exception","reason":"No mapping found for [] in order to sort on","index_uuid":"6Z3jeKV4Q5WUJ6p3zEgdXw","index":"finance-invoices"}}]},"status":400}
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:331)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:301)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:276)
at org.elasticsearch.client.RestHighLevelClient.performClientRequest(RestHighLevelClient.java:2699)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:2171)
... 15 more
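
The suppressed 400 response ("No mapping found for [] in order to sort on") indicates the configured incrementing field could not be resolved in the index mapping, so Elasticsearch has nothing to sort on. A quick way to check the field (assuming direct access to the cluster; host and field name taken from the config above):

# an empty "mappings" object in the response means the field is not mapped and cannot be sorted on
curl -s "http://REDACTED:9200/finance-invoices/_mapping/field/date_time?pretty"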


dahal4 commented on July 24, 2024

Hi there, I don't know what you are using this for, but in my case I used it to migrate from an old ES cluster to a new one. I was unaware of the Elasticsearch reindex feature, which makes migrating from one ES to another easy. Can you tell me why you need this connector?


jag959 commented on July 24, 2024

I am trying to read data from ES as part of an integration that lets non-ES users ingest data from ES via Kafka. I actually got it working on my side, so this is no longer an open issue for me.


dahal4 commented on July 24, 2024

Can you share how you managed to solve this issue?


DarioBalinzo commented on July 24, 2024

I think you probably need an incrementing field that supports sorting, so you need to configure one if that is not the case (e.g. the field needs to be indexed).
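
For example, a minimal sketch of mapping the field from the first config as a sortable date type (this only adds the field if it is not already mapped; an existing field's type cannot be changed in place):

curl -s -X PUT "http://REDACTED:9200/finance-invoices/_mapping" -H 'Content-Type: application/json' -d'
{
  "properties": {
    "date_time": { "type": "date" }
  }
}'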


jag959 commented on July 24, 2024

Yes, I added a timestamp field to the index that is automatically updated for each newly indexed document using an ingest pipeline in ES (a sketch of such a pipeline follows the config). My working config for this connector appears below:

{
  "name": "ElasticSource_RPA",
  "config": {
    "name": "ElasticSource_RPA",
    "connector.class": "com.github.dariobalinzo.ElasticSourceConnector",
    "tasks.max": "1",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "es.host": "REDACTED_IP_ADDRESS",
    "es.scheme": "http",
    "es.port": "9200",
    "index.names": "finance-invoices",
    "incrementing.field.name": "timestamp",
    "poll.interval.ms": "60000",
    "topic.prefix": "rpa_"
  }
}
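
The ingest pipeline itself is not shown here; a minimal sketch of one that stamps every incoming document at index time (the pipeline name is illustrative, the field name matches the config above):

# create a pipeline that sets "timestamp" from the ingest timestamp
curl -s -X PUT "http://REDACTED_IP_ADDRESS:9200/_ingest/pipeline/add-timestamp" -H 'Content-Type: application/json' -d'
{
  "description": "set a timestamp on every indexed document",
  "processors": [
    { "set": { "field": "timestamp", "value": "{{_ingest.timestamp}}" } }
  ]
}'

# apply it to the index by default (assumption: the thread does not say how the pipeline is attached)
curl -s -X PUT "http://REDACTED_IP_ADDRESS:9200/finance-invoices/_settings" -H 'Content-Type: application/json' -d'
{ "index.default_pipeline": "add-timestamp" }'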


DarioBalinzo commented on July 24, 2024

As discussed in #77, first create the index with the schema mapping and some data, and only then start the connector.
In the future I will try to implement better error handling that silently ignores a missing index schema mapping until the data is ready. For now, however, this workaround should fix the issue.
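
A sketch of that ordering, reusing the illustrative names from above (the Kafka Connect REST endpoint and the config file name are assumptions):

# 1. create the index with the incrementing field already mapped
curl -s -X PUT "http://REDACTED:9200/finance-invoices" -H 'Content-Type: application/json' -d'
{ "mappings": { "properties": { "timestamp": { "type": "date" } } } }'

# 2. index at least one (hypothetical) document so the connector has data and a mapping to sort on
curl -s -X POST "http://REDACTED:9200/finance-invoices/_doc?refresh=true" -H 'Content-Type: application/json' -d'
{ "invoice_id": "INV-0001", "timestamp": "2022-09-14T19:20:10Z" }'

# 3. only then register the connector with the Connect REST API
curl -s -X POST "http://localhost:8083/connectors" -H 'Content-Type: application/json' -d @ElasticSource_RPA.json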


dahal4 commented on July 24, 2024

I think this connector only works for data with a timestamp. What if I don't have a timestamp in my data, which is common in older datasets? Is there any way to handle that kind of data?

