Comments (8)
I am running into a similar issue (not sure if it's user error on my side).
Config:
{
  "name": "ElasticSource_RPA",
  "config": {
    "name": "ElasticSource_RPA",
    "connector.class": "com.github.dariobalinzo.ElasticSourceConnector",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "es.host": "REDACTED",
    "es.port": "9200",
    "index.names": "finance-invoices",
    "incrementing.field.name": "date_time",
    "poll.interval.ms": "60000",
    "topic.prefix": "rpa_"
  }
}
Error:
[2022-09-14 19:20:10,793] ERROR error (com.github.dariobalinzo.task.ElasticSourceTask)
ElasticsearchStatusException[Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]]
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:178)
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:2484)
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:2461)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:2184)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:2137)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:2105)
at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:1367)
at com.github.dariobalinzo.elastic.ElasticRepository.executeSearch(ElasticRepository.java:176)
at com.github.dariobalinzo.elastic.ElasticRepository.searchAfter(ElasticRepository.java:90)
at com.github.dariobalinzo.task.ElasticSourceTask.poll(ElasticSourceTask.java:205)
at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:307)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:263)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:200)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:255)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [http://REDACTED:9200], URI [/finance-invoices/_search?typed_keys=true&max_concurrent_shard_requests=5&search_type=query_then_fetch&batched_reduce_size=512], status line [HTTP/1.1 400 Bad Request]
{"error":{"root_cause":[{"type":"query_shard_exception","reason":"No mapping found for [] in order to sort on","index_uuid":"6Z3jeKV4Q5WUJ6p3zEgdXw","index":"finance-invoices"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"finance-invoices","node":"Wyczj20LRY6-6O8Vgi4qww","reason":{"type":"query_shard_exception","reason":"No mapping found for [] in order to sort on","index_uuid":"6Z3jeKV4Q5WUJ6p3zEgdXw","index":"finance-invoices"}}]},"status":400}
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:331)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:301)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:276)
at org.elasticsearch.client.RestHighLevelClient.performClientRequest(RestHighLevelClient.java:2699)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:2171)
... 15 more
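The suppressed 400 response above points at the actual cause: the field the connector sorts on has no usable mapping in the index ("No mapping found for [] in order to sort on"; the empty brackets suggest the sort field could not even be resolved). One way to check what is actually mapped, assuming access to the cluster (Kibana Dev Tools syntax, illustrative):

```
GET finance-invoices/_mapping
```

If `date_time` does not appear under `properties` with a sortable type such as `date`, the connector's sort-based pagination cannot work.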
from kafka-connect-elasticsearch-source.
Hi there, I don't know what you are using this for, but in my case I used it to migrate from an old ES cluster to a new one... I was unaware of Elasticsearch's reindex feature, which made migrating from one ES to another easy. Can you tell me why you need this connector?
I am trying to read data from ES as part of an integration that lets non-ES users ingest data from ES via Kafka. I actually got it working on my side, so this is no longer an open issue for me.
Can you share how you managed to solve this issue?
I think you probably need an incrementing field that supports sorting, so you need to configure one if that is not the case (e.g. the field needs to be indexed).
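If the field exists in documents but lacks a sortable mapping, one option (a sketch, not from the thread) is to add an explicit mapping before starting the connector, here assuming the `date_time` field from the config above is a date:

```
PUT finance-invoices/_mapping
{
  "properties": {
    "date_time": { "type": "date" }
  }
}
```

Note that a mapping added this way only takes effect for documents indexed afterwards; existing documents may need to be reindexed.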
Yes, I added a timestamp field to the index that is automatically set on each newly indexed document by an ES pipeline. My working config for this connector appears below:
{
  "name": "ElasticSource_RPA",
  "config": {
    "name": "ElasticSource_RPA",
    "connector.class": "com.github.dariobalinzo.ElasticSourceConnector",
    "tasks.max": "1",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "es.host": "REDACTED_IP_ADDRESS",
    "es.scheme": "http",
    "es.port": "9200",
    "index.names": "finance-invoices",
    "incrementing.field.name": "timestamp",
    "poll.interval.ms": "60000",
    "topic.prefix": "rpa_"
  }
}
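The pipeline mentioned above is presumably an Elasticsearch ingest pipeline. A minimal sketch of one that stamps each incoming document with the ingest time, plus setting it as the index's default pipeline (the pipeline name and details are assumptions, not taken from the comment):

```
PUT _ingest/pipeline/add-timestamp
{
  "description": "Stamp each document with the ingest time",
  "processors": [
    { "set": { "field": "timestamp", "value": "{{_ingest.timestamp}}" } }
  ]
}

PUT finance-invoices/_settings
{
  "index.default_pipeline": "add-timestamp"
}
```

With this in place, every newly indexed document gets a sortable `timestamp` field, matching the `incrementing.field.name` in the working config above.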
As discussed in #77, first create the index with the schema mapping and some data, and only then start the connector.
In the future I will try to implement better error handling that silently ignores the missing index schema mapping until the data is ready. For now, this workaround should fix the issue.
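A sketch of that workaround, reusing the `timestamp` field from the working config above (Kibana Dev Tools syntax; the sample document is illustrative): create the index with an explicit mapping and index at least one document before starting the connector.

```
PUT finance-invoices
{
  "mappings": {
    "properties": {
      "timestamp": { "type": "date" }
    }
  }
}

POST finance-invoices/_doc
{
  "timestamp": "2022-09-14T19:20:10Z"
}
```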
I think this connector only works with timestamp data. What if I don't have a timestamp in my data, which is common in older datasets? Is there any way to handle that kind of data?