
Comments (9)

aecio commented on August 16, 2024

This feature is not supported yet. Other users were able to modify the source code easily to support it (see e.g. issue #232 for details), but we unfortunately haven't had time to implement and test it yet.


aecio commented on August 16, 2024

I just added this to the dev branch. You configure the username and password by adding the following to the ache.yml file:

target_storage.data_format.elasticsearch.rest.username: myusername
target_storage.data_format.elasticsearch.rest.password: mypasswd

It should also be available for testing with the vidanyu/ache:dev Docker image.
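
For anyone curious how this works under the hood: these credentials are presumably handed to the Elasticsearch low-level REST client. Below is a minimal sketch of the usual way to attach HTTP Basic credentials to that client, assuming the standard RestClientBuilder callback API (illustrative only, not ACHE's actual code):

import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.nio.client.HttpAsyncClientBuilder;
import org.elasticsearch.client.RestClient;

public class BasicAuthClientSketch {
    public static void main(String[] args) throws Exception {
        // Credentials matching the ache.yml keys above (illustrative values)
        final CredentialsProvider credentials = new BasicCredentialsProvider();
        credentials.setCredentials(AuthScope.ANY,
                new UsernamePasswordCredentials("myusername", "mypasswd"));

        // Attach the provider to the low-level REST client so its requests
        // carry an "Authorization: Basic ..." header.
        try (RestClient client = RestClient
                .builder(new HttpHost("localhost", 9200, "http"))
                .setHttpClientConfigCallback(
                        (HttpAsyncClientBuilder b) -> b.setDefaultCredentialsProvider(credentials))
                .build()) {
            System.out.println("REST client configured with HTTP Basic auth: " + client);
        }
    }
}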


chanwitkepha commented on August 16, 2024

I just added this to the dev branch. You configure the username and password by adding the following to the ache.yml file:

target_storage.data_format.elasticsearch.rest.username: myusername
target_storage.data_format.elasticsearch.rest.password: mypasswd

It should also be available for testing with the vidanyu/ache:dev Docker image.

Thank you so much. I will try again with the ache:dev branch.


chanwitkepha commented on August 16, 2024

Before testing, I used curl to check whether it can access OpenSearch.

curl -XGET -u admin:aADSHAShihIznflkEkkyhIIkkk http://192.168.11.23:9200/_cat/indices?v

Result of curl:

health status index                                       uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   security-auditlog-2022.01.28                wH4VA8r3QEmySqW_leEkDQ   1   1        404            0    548.7kb        548.7kb
green  open   .opendistro_security                        X7BFYmEbQBeHaNm9Z5uogw   1   0          9            0     55.7kb         55.7kb
yellow open   kibana_sample_data_flights                  6lRPznIgTvylsXtavRLS1g   1   1      13059            0      6.8mb          6.8mb
yellow open   thailand_map                                _eoY2YyGTqiZQTn70Agr9g   1   1         77            0    191.6kb        191.6kb
yellow open   all_index                                   E1F1Fj_vTYGKZcGoWBj2Tw   1   1      31839            0    400.5mb        400.5mb
green  open   opensearch_dashboards_sample_data_ecommerce nlJazP0nSo27_vODeOfBNg   1   0       4675            0      4.4mb          4.4mb
green  open   .kibana_1560130546_devteam_1                6Pa4niatRCygZO-RWZLEEg   1   0         16            2     51.8kb         51.8kb
yellow open   backup_all_index                            cvQP9t2OQp2N8H6KtH2dIw   1   1      39892            0    244.6mb        244.6mb
green  open   .kibana_92668751_admin_1                    q1cXFM9vSCSu35Bh-r7biA   1   0          1            0        5kb            5kb
yellow open   all_index_accounts                          jea1f0TQTfG_jMElNkVXBw   1   1        616            0      2.4mb          2.4mb
yellow open   security-auditlog-2022.01.11                TzyR0KPzRMyD1M1bxE5Duw   1   1         25            0    115.2kb        115.2kb
green  open   .kibana_1                                   v6HllIqTThm3y39zQ3w6AA   1   0          3            1     31.6kb         31.6kb
yellow open   security-auditlog-2021.12.24                HpWDoaZzRcai3fd_I38ZfQ   1   1        134            0    292.1kb        292.1kb
yellow open   security-auditlog-2021.12.21                0oDaPjs6QCyR-2tF7Wh1mg   1   1        671            0    664.1kb        664.1kb
yellow open   security-auditlog-2021.12.20                sD1T8EFPSqGJbYHRV8aiuw   1   1       3083            0      1.6mb          1.6mb

Then I tested with the Docker image vidanyu/ache:dev.

Config file ache.yml (my system already uses a VPN service that supports Tor):

target_storage.data_formats:
   - ELASTICSEARCH

target_storage.data_format.elasticsearch.rest.hosts:
   - http://192.168.11.23:9200

target_storage.data_format.elasticsearch.rest.username: admin
target_storage.data_format.elasticsearch.rest.password: aADSHAShihIznflkEkkyhIIkkk

target_storage.data_format.elasticsearch.rest.connect_timeout: 30000
target_storage.data_format.elasticsearch.rest.socket_timeout: 30000
target_storage.data_format.elasticsearch.rest.max_retry_timeout_millis: 90000

link_storage.max_pages_per_domain: 10000
link_storage.link_strategy.use_scope: true
link_storage.link_strategy.outlinks: true
link_storage.scheduler.host_min_access_interval: 5000

link_storage.link_classifier.type: MaxDepthLinkClassifier
link_storage.link_classifier.max_depth: 2

The result is an error like this:

chanwit@test-server-1013:~/ache-tor-crawler$ docker-compose logs -f ache
Attaching to ache
ache         | ----------------------------
ache         | ACHE Crawler 0.14.0-SNAPSHOT
ache         | ----------------------------
ache         |
ache         | [2022-01-28 13:41:07,858] INFO [main] (Log.java:170) - Logging initialized @2018ms to org.eclipse.jetty.util.log.Slf4jLog
ache         | [2022-01-28 13:41:07,903] INFO [main] (JavalinLogger.kt:22) - Static file handler added: StaticFileConfig(hostedPath=/, directory=/public, location=CLASSPATH, precompress=false, aliasCheck=null, headers={Cache-Control=max-age=0}, skipFileFunction=Function1<javax.servlet.http.HttpServletRequest, java.lang.Boolean>). File system location: 'jar:file:/ache/lib/ache-0.14.0-SNAPSHOT.jar!/public'
ache         | [2022-01-28 13:41:08,303] INFO [main] (RestServer.java:137) - ---------------------------------------------
ache         | [2022-01-28 13:41:08,304] INFO [main] (RestServer.java:138) - ACHE server available at http://0.0.0.0:8080
ache         | [2022-01-28 13:41:08,304] INFO [main] (RestServer.java:139) - ---------------------------------------------
ache         | [2022-01-28 13:41:40,247] INFO [JettyServerThreadPool-17] (LinkFilter.java:115) - Loading link patterns from link_whitelist.txt and link_blacklist.txt at /data/DarknetLive-01/config
ache         | [2022-01-28 13:41:40,249] WARN [JettyServerThreadPool-17] (RegexMatcher.java:90) - Couldn't load patterns from file: /data/DarknetLive-01/config/link_whitelist.txt Using a empty list.
ache         | [2022-01-28 13:41:40,250] WARN [JettyServerThreadPool-17] (RegexMatcher.java:90) - Couldn't load patterns from file: /data/DarknetLive-01/config/link_blacklist.txt Using a empty list.
ache         | [2022-01-28 13:41:40,251] INFO [JettyServerThreadPool-17] (FrontierManagerFactory.java:36) - LINK_SELECTOR: achecrawler.link.frontier.selector.MaximizeWebsitesLinkSelector
ache         | [2022-01-28 13:41:40,281] INFO [JettyServerThreadPool-17] (CrawlScheduler.java:92) - Loading more links from frontier into the scheduler...
ache         | [2022-01-28 13:41:40,286] INFO [JettyServerThreadPool-17] (CrawlScheduler.java:177) - Loaded 0 links.
ache         | [2022-01-28 13:41:40,405] INFO [JettyServerThreadPool-17] (FrontierManager.java:236) - Adding 1 seed URL(s)...
ache         | [2022-01-28 13:41:40,499] INFO [JettyServerThreadPool-17] (FrontierManager.java:248) - Added seed URL: http://darkzzx4avcsuofgfez5zq75cqc4mprjvfqywo45dfcaxrwqg6qrlfid.onion/
ache         | WARNING: An illegal reflective access operation has occurred
ache         | WARNING: Illegal reflective access by com.esotericsoftware.kryo.util.UnsafeUtil (file:/ache/lib/kryo-4.0.2.jar) to constructor java.nio.DirectByteBuffer(long,int,java.lang.Object)
ache         | WARNING: Please consider reporting this to the maintainers of com.esotericsoftware.kryo.util.UnsafeUtil
ache         | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
ache         | WARNING: All illegal access operations will be denied in a future release
ache         | [2022-01-28 13:41:40,582] INFO [JettyServerThreadPool-17] (FrontierManager.java:256) - Number of seeds added: 1
ache         | [2022-01-28 13:41:40,582] INFO [JettyServerThreadPool-17] (FrontierManager.java:260) - Using scope of following domains:
ache         | [2022-01-28 13:41:40,582] INFO [JettyServerThreadPool-17] (FrontierManager.java:262) - darkzzx4avcsuofgfez5zq75cqc4mprjvfqywo45dfcaxrwqg6qrlfid.onion
ache         | [2022-01-28 13:41:40,589] INFO [JettyServerThreadPool-17] (TargetRepositoryFactory.java:57) - Loading repository with data_format=ELASTICSEARCH from /data/DarknetLive-01/data_pages
ache         | [2022-01-28 13:41:40,634] INFO [JettyServerThreadPool-17] (ElasticSearchClientFactory.java:43) - Configured Elasticsearch client to use HTTP BASIC auth credentials.
ache         | [2022-01-28 13:41:40,744] INFO [JettyServerThreadPool-17] (ElasticSearchClientFactory.java:55) - Initialized Elasticsearch REST client for hosts: [http://192.168.11.23:9200]
ache         | log4j:WARN No appenders could be found for logger (org.apache.http.impl.nio.client.MainClientExec).
ache         | log4j:WARN Please initialize the log4j system properly.
ache         | log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
ache         | [2022-01-28 13:41:40,898] INFO [JettyServerThreadPool-17] (ElasticSearchClientFactory.java:84) - [Content-Length: 577,Chunked: false]
ache         | [2022-01-28 13:41:40,911] INFO [JettyServerThreadPool-17] (ElasticSearchRestTargetRepository.java:65) - Elasticsearch version: 1
ache         | [2022-01-28 13:41:40,963]ERROR [JettyServerThreadPool-17] (CrawlerResource.java:143) - Failed to start crawler.
ache         | java.lang.RuntimeException: Failed to create index in Elasticsearch.
ache         |  at achecrawler.target.repository.ElasticSearchRestTargetRepository.createIndexMapping(ElasticSearchRestTargetRepository.java:122)
ache         |  at achecrawler.target.repository.ElasticSearchRestTargetRepository.<init>(ElasticSearchRestTargetRepository.java:47)
ache         |  at achecrawler.target.TargetRepositoryFactory.createRepository(TargetRepositoryFactory.java:87)
ache         |  at achecrawler.target.TargetRepositoryFactory.create(TargetRepositoryFactory.java:34)
ache         |  at achecrawler.target.TargetStorage.create(TargetStorage.java:131)
ache         |  at achecrawler.crawler.async.AsyncCrawler.create(AsyncCrawler.java:117)
ache         |  at achecrawler.crawler.CrawlersManager.createCrawler(CrawlersManager.java:104)
ache         |  at achecrawler.crawler.CrawlersManager.createCrawler(CrawlersManager.java:89)
ache         |  at achecrawler.crawler.CrawlersManager.createCrawler(CrawlersManager.java:70)
ache         |  at achecrawler.rest.resources.CrawlerResource.lambda$new$4(CrawlerResource.java:134)
ache         |  at io.javalin.core.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:20)
ache         |  at io.javalin.http.JavalinServlet.addHandler$lambda-5(JavalinServlet.kt:113)
ache         |  at io.javalin.http.JavalinServlet$service$tryBeforeAndEndpointHandlers$1.invoke(JavalinServlet.kt:44)
ache         |  at io.javalin.http.JavalinServlet$service$tryBeforeAndEndpointHandlers$1.invoke(JavalinServlet.kt:39)
ache         |  at io.javalin.http.JavalinServlet.service$tryWithExceptionMapper(JavalinServlet.kt:129)
ache         |  at io.javalin.http.JavalinServlet.service$tryBeforeAndEndpointHandlers(JavalinServlet.kt:39)
ache         |  at io.javalin.http.JavalinServlet.service(JavalinServlet.kt:87)
ache         |  at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
ache         |  at io.javalin.jetty.JavalinJettyServlet.service(JavalinJettyServlet.kt:58)
ache         |  at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
ache         |  at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
ache         |  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:550)
ache         |  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
ache         |  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
ache         |  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
ache         |  at io.javalin.jetty.JettyServer$start$wsAndHttpHandler$1.doHandle(JettyServer.kt:52)
ache         |  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
ache         |  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
ache         |  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
ache         |  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
ache         |  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349)
ache         |  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
ache         |  at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:179)
ache         |  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
ache         |  at org.eclipse.jetty.server.Server.handle(Server.java:516)
ache         |  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:400)
ache         |  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:645)
ache         |  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:392)
ache         |  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
ache         |  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
ache         |  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
ache         |  at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
ache         |  at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
ache         |  at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
ache         |  at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
ache         |  at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
ache         |  at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
ache         |  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
ache         |  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
ache         |  at java.base/java.lang.Thread.run(Unknown Source)
ache         | Caused by: org.elasticsearch.client.ResponseException: PUT http://192.168.11.23:9200/ache-data: HTTP/1.1 400 Bad Request
ache         | {"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters:  [page : {properties={isRelevant={index=not_analyzed, type=string}, crawlerId={index=not_analyzed, type=string}, domain={index=not_analyzed, type=string}, words={index=not_analyzed, type=string}, wordsMeta={index=not_analyzed, type=string}, retrieved={format=dateOptionalTime, type=date}, text={type=string}, title={type=string}, url={index=not_analyzed, type=string}, relevance={type=double}, topPrivateDomain={index=not_analyzed, type=string}}}]"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:  [page : {properties={isRelevant={index=not_analyzed, type=string}, crawlerId={index=not_analyzed, type=string}, domain={index=not_analyzed, type=string}, words={index=not_analyzed, type=string}, wordsMeta={index=not_analyzed, type=string}, retrieved={format=dateOptionalTime, type=date}, text={type=string}, title={type=string}, url={index=not_analyzed, type=string}, relevance={type=double}, topPrivateDomain={index=not_analyzed, type=string}}}]","caused_by":{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters:  [page : {properties={isRelevant={index=not_analyzed, type=string}, crawlerId={index=not_analyzed, type=string}, domain={index=not_analyzed, type=string}, words={index=not_analyzed, type=string}, wordsMeta={index=not_analyzed, type=string}, retrieved={format=dateOptionalTime, type=date}, text={type=string}, title={type=string}, url={index=not_analyzed, type=string}, relevance={type=double}, topPrivateDomain={index=not_analyzed, type=string}}}]"}},"status":400}
ache         |  at org.elasticsearch.client.RestClient$1.completed(RestClient.java:354)
ache         |  at org.elasticsearch.client.RestClient$1.completed(RestClient.java:343)
ache         |  at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:122)
ache         |  at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:177)
ache         |  at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:436)
ache         |  at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:326)
ache         |  at org.apache.http.impl.nio.client.InternalRequestExecutor.inputReady(InternalRequestExecutor.java:83)
ache         |  at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265)
ache         |  at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81)
ache         |  at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39)
ache         |  at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114)
ache         |  at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
ache         |  at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
ache         |  at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
ache         |  at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
ache         |  at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
ache         |  at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588)
ache         |  ... 1 common frames omitted


aecio commented on August 16, 2024

Your log shows that ACHE is reading the configs from the ache.yml, so I'm not sure what is going on...

ache         | [2022-01-28 13:41:40,634] INFO [JettyServerThreadPool-17] (ElasticSearchClientFactory.java:43) - Configured Elasticsearch client to use HTTP BASIC auth credentials.

I did some debugging here using tcpflow to see the raw request content being sent over the wire and found that ACHE is sending the correct credentials. For the same credentials in your config file, I see a request that includes the header Authorization: Basic YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s= (which decodes to admin:aADSHAShihIznflkEkkyhIIkkk). Here is the raw request I captured with tcpflow:

172.017.000.001.36774-172.017.000.002.09200: GET / HTTP/1.1
Content-Length: 0
Host: localhost:9200
Connection: Keep-Alive
User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.13)
Authorization: Basic YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=

Can you debug your setup to see if the correct request is being sent? Is there any chance you may have a proxy that is not forwarding the correct headers or something else?

Here I used the following command: sudo tcpflow -p -c -i docker0 port 9200, but you might need to modify it depending on your setup.
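
If you want to double-check the header yourself, the value after "Basic" is just the base64 encoding of username:password. A small self-contained check (a hypothetical snippet, not part of ACHE):

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeaderCheck {
    public static void main(String[] args) {
        // Encode the credentials from ache.yml the same way HTTP Basic auth does
        String encoded = Base64.getEncoder()
                .encodeToString("admin:aADSHAShihIznflkEkkyhIIkkk".getBytes(StandardCharsets.UTF_8));
        System.out.println(encoded);
        // Expected: YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=

        // Decode the header value captured on the wire to confirm it matches
        byte[] decoded = Base64.getDecoder()
                .decode("YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=");
        System.out.println(new String(decoded, StandardCharsets.UTF_8));
        // Expected: admin:aADSHAShihIznflkEkkyhIIkkk
    }
}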


chanwitkepha commented on August 16, 2024

Here is the result from tcpflow:

chanwit@test-server-1013:~/ache-tor-crawler$ sudo tcpflow -p -c -i br-d649aeb48662 port 9200
reportfilename: ./report.xml
tcpflow: listening on br-d649aeb48662
010.250.250.003.44324-192.168.011.023.09200: GET / HTTP/1.1
Content-Length: 0
Host: 192.168.11.23:9200
Connection: Keep-Alive
User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.13)
Authorization: Basic YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=


192.168.011.023.09200-010.250.250.003.44324: HTTP/1.1 200 OK
content-type: application/json; charset=UTF-8
content-length: 577

{
  "name" : "test-server-1123",
  "cluster_name" : "opensearch-cluster",
  "cluster_uuid" : "5cmPDozUSvuOp3lzcfvBVQ",
  "version" : {
    "distribution" : "opensearch",
    "number" : "1.2.4",
    "build_type" : "tar",
    "build_hash" : "e505b10357c03ae8d26d675172402f2f2144ef0f",
    "build_date" : "2022-01-14T03:38:06.881862Z",
    "build_snapshot" : false,
    "lucene_version" : "8.10.1",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "The OpenSearch Project: https://opensearch.org/"
}

010.250.250.003.44324-192.168.011.023.09200: HEAD /ache-data HTTP/1.1
Host: 192.168.11.23:9200
Connection: Keep-Alive
User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.13)
Authorization: Basic YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=


192.168.011.023.09200-010.250.250.003.44324: HTTP/1.1 404 Not Found
content-type: application/json; charset=UTF-8
content-length: 383


010.250.250.003.44324-192.168.011.023.09200: GET / HTTP/1.1
Content-Length: 0
Host: 192.168.11.23:9200
Connection: Keep-Alive
User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.13)
Authorization: Basic YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=


192.168.011.023.09200-010.250.250.003.44324: HTTP/1.1 200 OK
content-type: application/json; charset=UTF-8
content-length: 577

{
  "name" : "test-server-1123",
  "cluster_name" : "opensearch-cluster",
  "cluster_uuid" : "5cmPDozUSvuOp3lzcfvBVQ",
  "version" : {
    "distribution" : "opensearch",
    "number" : "1.2.4",
    "build_type" : "tar",
    "build_hash" : "e505b10357c03ae8d26d675172402f2f2144ef0f",
    "build_date" : "2022-01-14T03:38:06.881862Z",
    "build_snapshot" : false,
    "lucene_version" : "8.10.1",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "The OpenSearch Project: https://opensearch.org/"
}

010.250.250.003.44324-192.168.011.023.09200: PUT /ache-data HTTP/1.1
Content-Length: 721
Content-Type: application/json; charset=UTF-8
Host: 192.168.11.23:9200
Connection: Keep-Alive
User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.13)
Authorization: Basic YWRtaW46YUFEU0hBU2hpaEl6bmZsa0Vra3loSUlra2s=

{  "mappings": {    "page": {  "properties": {    "domain":           {"type": "string","index": "not_analyzed"},    "words":            {"type": "string","index": "not_analyzed"},    "wordsMeta":        {"type": "string","index": "not_analyzed"},    "retrieved":        {"type": "date","format": "dateOptionalTime"},    "text":             {"type": "string"},    "title":            {"type": "string"},    "url":              {"type": "string","index": "not_analyzed"},    "topPrivateDomain": {"type": "string","index": "not_analyzed"},    "isRelevant":       {"type": "string","index": "not_analyzed"},    "crawlerId":        {"type": "string","index": "not_analyzed"},    "relevance":        {"type": "double"}  }}  }}
192.168.011.023.09200-010.250.250.003.44324: HTTP/1.1 400 Bad Request
content-type: application/json; charset=UTF-8
content-length: 1722

{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters:  [page : {properties={isRelevant={index=not_analyzed, type=string}, crawlerId={index=not_analyzed, type=string}, domain={index=not_analyzed, type=string}, words={index=not_analyzed, type=string}, wordsMeta={index=not_analyzed, type=string}, retrieved={format=dateOptionalTime, type=date}, text={type=string}, title={type=string}, url={index=not_analyzed, type=string}, relevance={type=double}, topPrivateDomain={index=not_analyzed, type=string}}}]"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:  [page : {properties={isRelevant={index=not_analyzed, type=string}, crawlerId={index=not_analyzed, type=string}, domain={index=not_analyzed, type=string}, words={index=not_analyzed, type=string}, wordsMeta={index=not_analyzed, type=string}, retrieved={format=dateOptionalTime, type=date}, text={type=string}, title={type=string}, url={index=not_analyzed, type=string}, relevance={type=double}, topPrivateDomain={index=not_analyzed, type=string}}}]","caused_by":{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters:  [page : {properties={isRelevant={index=not_analyzed, type=string}, crawlerId={index=not_analyzed, type=string}, domain={index=not_analyzed, type=string}, words={index=not_analyzed, type=string}, wordsMeta={index=not_analyzed, type=string}, retrieved={format=dateOptionalTime, type=date}, text={type=string}, title={type=string}, url={index=not_analyzed, type=string}, relevance={type=double}, topPrivateDomain={index=not_analyzed, type=string}}}]"}},"status":400}


aecio commented on August 16, 2024

So it looks like the authentication is working for you as well: ES returned HTTP 200 and 400 codes, and none of them is 401 Unauthorized. The last request in your log is a PUT mapping request that might be failing for a different reason, perhaps due to issue #206. Is the OpenSearch version you are using compatible with Elasticsearch 6.x? If you're using a modified ElasticSearchRestTargetRepository, it might still have a bug in the createIndexMapping() method.
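
For reference, OpenSearch 1.x follows the Elasticsearch 7.x mapping API, which rejects both the type name (page) and the legacy string/not_analyzed field definitions shown in the error. A rough sketch of the kind of index-creation request a 7.x-compatible createIndexMapping() would need to send instead, using a typeless mapping with text/keyword fields (illustrative only, not the actual fix; auth setup from the earlier sketch is omitted):

import org.apache.http.HttpHost;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class CreateIndexSketch {
    public static void main(String[] args) throws Exception {
        // Typeless mapping: no "page" wrapper, keyword/text instead of string/not_analyzed
        String mapping = "{"
            + "\"mappings\": {"
            + "  \"properties\": {"
            + "    \"domain\":           {\"type\": \"keyword\"},"
            + "    \"words\":            {\"type\": \"keyword\"},"
            + "    \"wordsMeta\":        {\"type\": \"keyword\"},"
            + "    \"retrieved\":        {\"type\": \"date\", \"format\": \"date_optional_time\"},"
            + "    \"text\":             {\"type\": \"text\"},"
            + "    \"title\":            {\"type\": \"text\"},"
            + "    \"url\":              {\"type\": \"keyword\"},"
            + "    \"topPrivateDomain\": {\"type\": \"keyword\"},"
            + "    \"isRelevant\":       {\"type\": \"keyword\"},"
            + "    \"crawlerId\":        {\"type\": \"keyword\"},"
            + "    \"relevance\":        {\"type\": \"double\"}"
            + "  }"
            + "}}";

        try (RestClient client = RestClient
                .builder(new HttpHost("192.168.11.23", 9200, "http"))
                .build()) {
            Request request = new Request("PUT", "/ache-data");
            request.setJsonEntity(mapping);
            Response response = client.performRequest(request);
            System.out.println(response.getStatusLine());
        }
    }
}

Whether keyword is the right choice for each field would of course depend on how ACHE queries the index.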


chanwitkepha commented on August 16, 2024

So it looks like the authentication is working for you as well: ES returned HTTP 200 and 400 codes, and none of them is 401 Unauthorized. The last request in your log is a PUT mapping request that might be failing for a different reason, perhaps due to issue #206. Is the OpenSearch version you are using compatible with Elasticsearch 6.x? If you're using a modified ElasticSearchRestTargetRepository, it might still have a bug in the createIndexMapping() method.

I'm now using OpenSearch version 1.2.4. I understand that it is not compatible with either Elasticsearch 6.x or 7.x, because even when I use the modified ElasticSearchRestTargetRepository from issue #206, I get the same error.

As for the authentication feature, I also tested with Open Distro for Elasticsearch 1.13.3 (the legacy project preceding OpenSearch, based on Elasticsearch 7.10.2), and it works well with the modified ElasticSearchRestTargetRepository from issue #206.

So I think the authentication feature in the latest ACHE dev version works fine, and this issue can be closed. However, it would be better if ACHE could support both Elasticsearch 7.x and OpenSearch, because OpenSearch is the open-source project forked from Elasticsearch 7.x, which has since moved to a proprietary license.

Thank you.


aecio commented on August 16, 2024

Thanks for the feedback, I'll close this issue then. Support for 7.x+ and other distros can be discussed in the other issues.

