

cortexdocs's People

Contributors

1earch, 3c7, amr-cossi, arnydo, axpatito, christophetd, garanews, jeffrey-e, jeromeleonard, ldelavaissiere, mback2k, mcvic1rj, mdavis332, megan201296, nadouani, obikao, oxeeql, pettai, saadkadhi, shsauler, tnvo, to-om, tyliec, weslambert, xluek


cortexdocs's Issues

binary installation issue

Hi,
I'm trying to install Cortex on Ubuntu (VirtualBox), but when I launch the command bin/cortex -Dconfig.file=/etc/cortex/application.conf I get this error:

[info] a.e.s.Slf4jLogger - Slf4jLogger started
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Oops, cannot start the server.
Configuration error: Configuration error[
The application secret has not been set, and we are in prod mode. Your application is not secure.

Any tips to fix this?
Thanks
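A common fix for this error, assuming the standard config path used in the command above: generate a random application secret and add it to the Cortex configuration before starting the server. A minimal sketch:

```shell
# Generate a 64-hex-character secret and print the config line to add to
# /etc/cortex/application.conf (the path is taken from the command above).
secret=$(openssl rand -hex 32)
echo "play.http.secret.key=\"$secret\""
```

Appending the echoed line to the config file and restarting Cortex should get past the "application secret has not been set" error.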

How to enable/update Analyzers in training VM ?

It may seem obvious, but https://github.com/TheHive-Project/TheHiveDocs/blob/master/training-material.md indicates that "With the new version, analyzers are disabled by default. The training VM is delivered with Abuse Finder, File_Info, Msg_Parser and MaxMind GeoIP enabled."

Ok, so I have to update them (https://github.com/TheHive-Project/CortexDocs/blob/master/installation/install-guide.md#updating)

But I have to go to a specific directory and update the git code:
/opt/Cortex-Analyzers

But this directory doesn't exist in the VM
(screenshot omitted)

Ok, maybe the git directory is located elsewhere, let's locate it:
(screenshot omitted)

Maybe there is an alternative directory configured in Cortex:
(screenshot omitted)

In short I've used the training VM to play with the application, and I see only a few analyzers.

I suggest modifying the VM or updating the documentation so that the path in the VM matches the one in the docs.

Docker Container: Is it possible to use a docker socket proxy?

Hi,
I don't like the idea of mounting the docker.sock file into the container.
Is it possible to use a network Docker socket instead?
Traefik, for example, can use a Docker TCP socket exposed via a container: tecnativa/docker-socket-proxy.
If that is possible, which docker-compose settings must be set?

Kind regards
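For reference, an untested sketch of what such a setup could look like, assuming Cortex's Docker client honors the standard DOCKER_HOST variable (service names and the permission flags below are assumptions, not a confirmed configuration):

```yaml
# Untested sketch: front the Docker socket with tecnativa/docker-socket-proxy
# and point Cortex at it over TCP instead of mounting docker.sock directly.
version: "3"
services:
  docker-proxy:
    image: tecnativa/docker-socket-proxy
    environment:
      - CONTAINERS=1   # expose container endpoints
      - POST=1         # Cortex would need to create/start job containers
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
  cortex:
    image: thehiveproject/cortex:latest
    environment:
      - DOCKER_HOST=tcp://docker-proxy:2375
```

Whether this works depends on how the Cortex version in use initializes its Docker client, so treat it as a starting point only.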

"Add Custom Field" not working in cortex responder

I'm trying to add a custom field to a case from the operations function, but it's not working:

def operations(self, raw):
    return [self.build_operation("AddCustomField", name="ticket_gir", value=raw["numeroIncident"], tpe="string")]
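For reference, a minimal standalone sketch of what this call is expected to produce. The build_operation stub below mimics what cortexutils does (a dict of "type" plus the keyword parameters); in a real responder you would call self.build_operation instead. One common gotcha, stated as an assumption here: the type parameter really is spelled tpe, and the custom field (ticket_gir) must already be declared in TheHive for the operation to apply.

```python
def build_operation(op_type, **parameters):
    # Mirrors cortexutils' behavior: a dict with "type" plus the parameters.
    operation = {"type": op_type}
    operation.update(parameters)
    return operation

def operations(raw):
    # "tpe" (not "type") names the custom field's data type.
    return [build_operation("AddCustomField",
                            name="ticket_gir",
                            value=raw["numeroIncident"],
                            tpe="string")]
```

If the operation dict looks correct, the next thing to check is whether TheHive's side (field declaration, responder permissions) is rejecting it.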

Analyzer installation - confusion?!

Hi there,

I just installed Cortex and am trying to find my way through the documentation on configuring Cortex and its analyzers.

The Cortex Analyzer Requirements Guide says it outlines the installation of the analyzers, but no such instructions are to be found.
analyzers.md states that:

They are included in the Cortex binary, RPM and DEB packages

Looking at the installed package, I would assume that this is the right location:

dpkg -L cortex
...
/opt/cortex/lib
/opt/cortex/lib/org.scala-stm.scala-stm_2.11-0.7.jar
/opt/cortex/lib/com.typesafe.play.play-logback_2.11-2.5.9.jar
/opt/cortex/lib/io.netty.netty-transport-native-epoll-4.0.41.Final-linux-x86_64.jar
/opt/cortex/lib/com.typesafe.play.play-functional_2.11-2.5.9.jar

So when I configure the application.conf file, this should be the right setup:

  # Absolute path where you have pulled the Cortex-Analyzers repository.
  path = "/opt/cortex/lib"

Is this correct? Sorry, I just got a little bit confused by the various manuals.

Best regards,
Tom
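For comparison, /opt/cortex/lib holds Cortex's own JARs rather than the analyzers, and the documented setup points path at a clone of the Cortex-Analyzers repository. A sketch of what that configuration could look like (the clone location and the exact key shape are assumptions based on the docs, not a verified config):

```
# application.conf — point the analyzer path at a Cortex-Analyzers clone,
# not at Cortex's own lib directory:
analyzer {
  path = ["/opt/Cortex-Analyzers/analyzers"]
}
```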

Cannot find download_hashes.py in Virusshare directory

Hello!
I can't find the download_hashes.py script in the Virusshare analyzers directory (as mentioned here). I found get_hashes.sh and ran it, but at the beginning I got the following error:
curl: (35) error:1414D172:SSL routines:tls12_check_peer_sigalg:wrong signature type
Note: I am using Cortex 2.1.3 in Docker.

Cortex connection refused

Hello,
I am using elasticsearch 7.10 with cortex 3.1.0-1 and when I try to run it, I am getting these errors:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.inject.internal.cglib.core.$ReflectUtils$1 (file:/opt/cortex-3.0.1-1/lib/com.google.inject.guice-4.1.0.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[info] a.e.s.Slf4jLogger - Slf4jLogger started
[info] c.s.e.h.ElasticClient$ - Creating HTTP client on http://127.0.0.1:9200
[info] o.t.c.s.WorkerSrv - New worker list:

	

[warn] o.e.d.SearchWithScroll - Search error
com.sksamuel.elastic4s.http.JavaClientExceptionWrapper: java.net.ConnectException: Connexion refusée
	at com.sksamuel.elastic4s.http.ElasticsearchJavaRestClient$$anon$1.onFailure(ElasticsearchJavaRestClient.scala:63)
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onDefinitiveFailure(RestClient.java:850)
	at org.elasticsearch.client.RestClient$1.retryIfPossible(RestClient.java:588)
	at org.elasticsearch.client.RestClient$1.failed(RestClient.java:567)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:134)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.failed(AbstractClientExchangeHandler.java:419)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.connectionRequestFailed(AbstractClientExchangeHandler.java:335)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.access$100(AbstractClientExchangeHandler.java:62)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler$1.failed(AbstractClientExchangeHandler.java:378)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:134)
Caused by: java.net.ConnectException: Connexion refusée
	at java.base/sun.nio.ch.Net.pollConnect(Native Method)
	at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:589)
	at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:839)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:171)
	... 5 common frames omitted
[info] o.a.h.i.e.RetryExec - I/O exception (java.io.IOException) caught when processing request to {}->unix://localhost:80: No such file or directory
[info] o.a.h.i.e.RetryExec - Retrying request to {}->unix://localhost:80
[info] o.a.h.i.e.RetryExec - I/O exception (java.io.IOException) caught when processing request to {}->unix://localhost:80: No such file or directory
[info] o.a.h.i.e.RetryExec - Retrying request to {}->unix://localhost:80
[info] o.a.h.i.e.RetryExec - I/O exception (java.io.IOException) caught when processing request to {}->unix://localhost:80: No such file or directory
[info] o.a.h.i.e.RetryExec - Retrying request to {}->unix://localhost:80
[info] o.t.c.s.DockerJobRunnerSrv - Docker is not available
com.spotify.docker.client.exceptions.DockerException: java.util.concurrent.ExecutionException: javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
	at com.spotify.docker.client.DefaultDockerClient.propagate(DefaultDockerClient.java:2828)
	at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:2692)
	at com.spotify.docker.client.DefaultDockerClient.info(DefaultDockerClient.java:595)
	at org.thp.cortex.services.DockerJobRunnerSrv.$anonfun$isAvailable$2(DockerJobRunnerSrv.scala:47)
	at play.api.LoggerLike.info(Logger.scala:160)
	at play.api.LoggerLike.info$(Logger.scala:157)
	at play.api.Logger.info(Logger.scala:251)
	at org.thp.cortex.services.DockerJobRunnerSrv.$anonfun$isAvailable$1(DockerJobRunnerSrv.scala:47)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
Caused by: java.util.concurrent.ExecutionException: javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
	at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
	at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
	at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
	at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:2690)
	at com.spotify.docker.client.DefaultDockerClient.info(DefaultDockerClient.java:595)
	at org.thp.cortex.services.DockerJobRunnerSrv.$anonfun$isAvailable$2(DockerJobRunnerSrv.scala:47)
	at play.api.LoggerLike.info(Logger.scala:160)
	at play.api.LoggerLike.info$(Logger.scala:157)
	at play.api.Logger.info(Logger.scala:251)
	at org.thp.cortex.services.DockerJobRunnerSrv.$anonfun$isAvailable$1(DockerJobRunnerSrv.scala:47)
Caused by: javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
	at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:481)
	at org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at jersey.repackaged.com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:118)
	at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:50)
	at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:37)
	at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487)
	at org.glassfish.jersey.client.ClientRuntime$2.run(ClientRuntime.java:178)
Caused by: java.io.IOException: No such file or directory
	at jnr.unixsocket.UnixSocketChannel.doConnect(UnixSocketChannel.java:127)
	at jnr.unixsocket.UnixSocketChannel.connect(UnixSocketChannel.java:136)
	at jnr.unixsocket.UnixSocketChannel.connect(UnixSocketChannel.java:223)
	at com.spotify.docker.client.UnixConnectionSocketFactory.connectSocket(UnixConnectionSocketFactory.java:85)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:141)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
[warn] o.t.c.s.JobRunnerSrv - The package cortexutils for python hasn't been found
[warn] o.t.c.s.JobRunnerSrv - The package cortexutils for python2 hasn't been found
WARNING: Package(s) not found: cortexutils
[warn] o.t.c.s.JobRunnerSrv - The package cortexutils for python3 hasn't been found
[warn] o.e.d.SearchWithScroll - Search error
com.sksamuel.elastic4s.http.JavaClientExceptionWrapper: java.net.ConnectException: Connexion refusée
	at com.sksamuel.elastic4s.http.ElasticsearchJavaRestClient$$anon$1.onFailure(ElasticsearchJavaRestClient.scala:63)
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onDefinitiveFailure(RestClient.java:850)
	at org.elasticsearch.client.RestClient$1.retryIfPossible(RestClient.java:588)
	at org.elasticsearch.client.RestClient$1.failed(RestClient.java:567)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:134)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.failed(AbstractClientExchangeHandler.java:419)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.connectionRequestFailed(AbstractClientExchangeHandler.java:335)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.access$100(AbstractClientExchangeHandler.java:62)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler$1.failed(AbstractClientExchangeHandler.java:378)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:134)
Caused by: java.net.ConnectException: Connexion refusée
	at java.base/sun.nio.ch.Net.pollConnect(Native Method)
	at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:589)
	at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:839)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:171)
	... 5 common frames omitted
[info] play.api.Play - Application started (Prod)
[info] p.c.s.AkkaHttpServer - Enabling HTTP/2 on Akka HTTP server...
[info] p.c.s.AkkaHttpServer - Listening for HTTP on /[0:0:0:0:0:0:0:0]:9001
[info] o.t.c.s.ErrorHandler - GET /api/user/current returned 500
com.sksamuel.elastic4s.http.JavaClientExceptionWrapper: java.net.ConnectException: Connexion refusée
	at com.sksamuel.elastic4s.http.ElasticsearchJavaRestClient$$anon$1.onFailure(ElasticsearchJavaRestClient.scala:63)
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onDefinitiveFailure(RestClient.java:850)
	at org.elasticsearch.client.RestClient$1.retryIfPossible(RestClient.java:588)
	at org.elasticsearch.client.RestClient$1.failed(RestClient.java:567)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:134)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.failed(AbstractClientExchangeHandler.java:419)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.connectionRequestFailed(AbstractClientExchangeHandler.java:335)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.access$100(AbstractClientExchangeHandler.java:62)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler$1.failed(AbstractClientExchangeHandler.java:378)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:134)
Caused by: java.net.ConnectException: Connexion refusée
	at java.base/sun.nio.ch.Net.pollConnect(Native Method)
	at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:589)
	at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:839)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:171)
	... 5 common frames omitted
^C[info] p.c.s.AkkaHttpServer - Stopping server...

My configuration file is:

play.http.secret.key="Kx78DfIk0r2v6yozFzefA7wmSGs7KndLqkl1GKm7RtosrrU4MWSHMNKCuMSMpk54"
# Elasticsearch
search {
  # Name of the index
  index = cortex
  # Name of the Elasticsearch cluster
  cluster = "SIEM-ELK"
  # Address of the Elasticsearch instance
  host = ["X.X.X.X:9300"]
  # Scroll keepalive
  keepalive = 1m
  # Size of the page for scroll
  pagesize = 50
  # Number of shards
  nbshards = 5
  # Number of replicas
  nbreplicas = 1
  # Arbitrary settings
  settings {
    # Maximum number of nested fields
    mapping.nested_fields.limit = 100
  }

  ### XPack SSL configuration
  # Username for XPack authentication
  user = "elastic"
  # Password for XPack authentication
  password =  "password"
  # Enable SSL to connect to ElasticSearch
  ssl.enabled = true
  # Path to certificate authority file
  ssl.ca = "/opt/cortex-3.0.1-1/conf/certs/ca.crt"
  # Path to certificate file
  ssl.certificate = "/opt/cortex-3.0.1-1/conf/certs/CORTEX.crt"
  # Path to key file
  ssl.key = "/opt/cortex-3.0.1-1/conf/certs/CORTEX.key"

  ### SearchGuard configuration
  # Path to JKS file containing client certificate
  #guard.keyStore.path = ""
  # Password of the keystore
  #guard.keyStore.password = "" 
  # Path to JKS file containing certificate authorities
  #guard.trustStore.path = ""
  ## Password of the truststore
  #guard.trustStore.password = "" 
  # Enforce hostname verification
  #guard.hostVerification = ""
  # If hostname verification is enabled specify if hostname should be resolved
  #guard.hostVerificationResolveHostname = "" 
}

Thanks for your help
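One thing worth checking, offered as an assumption about Cortex 3's configuration keys rather than a confirmed fix: the log shows Cortex creating an HTTP client on http://127.0.0.1:9200, while the configuration above lists the Elasticsearch transport port 9300 under host. Cortex 3 expects an HTTP endpoint, typically configured along these lines:

```
search {
  index = cortex
  # Cortex 3 talks to Elasticsearch over HTTP (port 9200),
  # not the transport port 9300 used by older client versions.
  uri = "https://X.X.X.X:9200"
}
```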

Unable to update

Hi,

I installed TheHive and Cortex using RPM. When I go to the local web UI of both TheHive and Cortex, I get the same error:

My setup:
Linux lab.centos7 3.10.0-862.el7.x86_64 #1 SMP Fri Apr 20 16:44:24 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
cortex.noarch 2.0.4-1
elasticsearch.noarch 6.3.2-1
thehive.noarch 3.0.10-1

2018-08-20 17:37:40,950 [INFO] from org.thp.cortex.services.ErrorHandler in application-akka.actor.default-dispatcher-5 - POST /api/maintenance/migrate returned 400
org.elasticsearch.transport.RemoteTransportException: [fQRBr16][127.0.0.1:9300][indices:admin/create]
Caused by: java.lang.IllegalArgumentException: Rejecting mapping update to [cortex_1] as the final mapping would have more than 1 type: [artifact, dblist, data, audit, analyzer, organization, report, job, user, analyzerConfig]
        at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:408)
        at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:356)
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:280)
        at org.elasticsearch.cluster.metadata.MetaDataCreateIndexService$IndexCreationTask.execute(MetaDataCreateIndexService.java:443)
        at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:45)
        at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:630)
        at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:267)
        at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:197)
        at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:132)
        at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150)
        at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:188)
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:626)
        at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:244)
        at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:207)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

Looks like the removal of multiple mapping types in ES 6+ broke the migration and, with it, any use of TheHive and Cortex. Any ideas? Thank you for the help.

Better Describe Proxy Scenarios

I just spent the better part of 2 hours trying to figure this out and didn't see it anywhere in the documentation, so I'm putting it here in the hopes that it might help someone else going forward.

If your Cortex analyzer is behind a proxy that uses a custom certificate (e.g. for SSL breakout), the "CA Certs" field in the analyzer options does NOT accept a raw PEM block:

-----BEGIN CERTIFICATE-----
<base_64_blob>
-----END CERTIFICATE-----

Instead, you need to add that blob to a file (in my case I added it to /opt/Cortex-Analyzers/certs/cacert.pem) and you must then put that full path in the "CA Certs" field of the configuration.

If you don't do this and you're using SSL breakout, you'll keep getting an HTTPSConnectionPool error: SSLError - CERTIFICATE_VERIFY_FAILED.
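The workaround described above can be sketched as follows. The /tmp path is used here only so the example is self-contained; the issue itself used /opt/Cortex-Analyzers/certs/cacert.pem, and the <base_64_blob> placeholder stands in for your proxy's actual certificate body:

```shell
# Write the proxy's CA certificate into a bundle file, then reference
# that file's full path in the analyzer's "CA Certs" field.
mkdir -p /tmp/certs
cat > /tmp/certs/cacert.pem <<'EOF'
-----BEGIN CERTIFICATE-----
<base_64_blob>
-----END CERTIFICATE-----
EOF
```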

ElasticSearch Docker instructions don't work on Ubuntu 20.04

Copying, pasting, and modifying the ELK instructions:

cat elastic-install.sh
docker run \
  --name elasticsearch \
  --hostname elasticsearch \
  --rm \
  --publish 127.0.0.1:9200:9200 \
  --volume /data/elastic:/usr/share/elasticsearch/data \
  -e "http.host=0.0.0.0" \
  -e "xpack.security.enabled=false" \
  -e "cluster.name=hive" \
  -e "script.inline=true" \
  -e "thread_pool.search.queue_size=100000" \
  docker.elastic.co/elasticsearch/elasticsearch:7.9.1

Gives the following (note the "java.lang.IllegalArgumentException: unknown setting [script.inline] please check that any required plugins are installed, or check the breaking changes documentation for removed settings" part):

aaron@NANU:~/work/projects/cortex$ sh elastic-install.sh 
Unable to find image 'docker.elastic.co/elasticsearch/elasticsearch:7.9.1' locally
7.9.1: Pulling from elasticsearch/elasticsearch
f1feca467797: Pull complete 
dcfca94e7428: Pull complete 
d2bf8b28bdf5: Pull complete 
5efd10fdc328: Pull complete 
71948c71bf56: Pull complete 
3d79fd8021d0: Pull complete 
3561742200e5: Pull complete 
2811408f56d0: Pull complete 
cb5a557b51ee: Pull complete 
Digest: sha256:0a5308431aee029636858a6efe07e409fa699b02549a78d7904eb931b8c46920
Status: Downloaded newer image for docker.elastic.co/elasticsearch/elasticsearch:7.9.1
{"type": "server", "timestamp": "2021-01-17T08:32:26,812Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "hive", "node.name": "elasticsearch", "message": "version[7.9.1], pid[7], build[default/docker/083627f112ba94dffc1232e8b42b73492789ef91/2020-09-01T21:22:21.964974Z], OS[Linux/5.8.0-36-generic/amd64], JVM[AdoptOpenJDK/OpenJDK 64-Bit Server VM/14.0.1/14.0.1+7]" }
{"type": "server", "timestamp": "2021-01-17T08:32:26,814Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "hive", "node.name": "elasticsearch", "message": "JVM home [/usr/share/elasticsearch/jdk]" }
{"type": "server", "timestamp": "2021-01-17T08:32:26,814Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "hive", "node.name": "elasticsearch", "message": "JVM arguments [-Xshare:auto, -Des.networkaddress.cache.ttl=60, -Des.networkaddress.cache.negative.ttl=10, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -XX:+ShowCodeDetailsInExceptionMessages, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dio.netty.allocator.numDirectArenas=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.locale.providers=SPI,COMPAT, -Xms1g, -Xmx1g, -XX:+UseG1GC, -XX:G1ReservePercent=25, -XX:InitiatingHeapOccupancyPercent=30, -Djava.io.tmpdir=/tmp/elasticsearch-13677390955176096374, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=data, -XX:ErrorFile=logs/hs_err_pid%p.log, -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m, -Des.cgroups.hierarchy.override=/, -XX:MaxDirectMemorySize=536870912, -Des.path.home=/usr/share/elasticsearch, -Des.path.conf=/usr/share/elasticsearch/config, -Des.distribution.flavor=default, -Des.distribution.type=docker, -Des.bundled_jdk=true]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [aggs-matrix-stats]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [analysis-common]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [constant-keyword]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [flattened]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [frozen-indices]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [ingest-common]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [ingest-geoip]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,834Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [ingest-user-agent]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [kibana]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [lang-expression]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [lang-mustache]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [lang-painless]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [mapper-extras]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [parent-join]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [percolator]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [rank-eval]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,835Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [reindex]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [repository-url]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [search-business-rules]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [searchable-snapshots]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [spatial]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [tasks]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [transform]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [transport-netty4]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [vectors]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [wildcard]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,836Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-analytics]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-async]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-async-search]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-autoscaling]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-ccr]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-core]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-data-streams]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-deprecation]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-enrich]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-eql]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-graph]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,837Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-identity-provider]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-ilm]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-logstash]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-ml]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-monitoring]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-ql]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-rollup]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-security]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-sql]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-stack]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,838Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-voting-only-node]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,839Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "loaded module [x-pack-watcher]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,839Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "hive", "node.name": "elasticsearch", "message": "no plugins loaded" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,863Z", "level": "INFO", "component": "o.e.e.NodeEnvironment", "cluster.name": "hive", "node.name": "elasticsearch", "message": "using [1] data paths, mounts [[/usr/share/elasticsearch/data (/dev/mapper/vgubuntu-root)]], net usable_space [1.5tb], net total_space [1.7tb], types [ext4]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,863Z", "level": "INFO", "component": "o.e.e.NodeEnvironment", "cluster.name": "hive", "node.name": "elasticsearch", "message": "heap size [1gb], compressed ordinary object pointers [true]" }
{"type": "server", "timestamp": "2021-01-17T08:32:27,884Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "hive", "node.name": "elasticsearch", "message": "node name [elasticsearch], node ID [b06lrNuHTd6PRWCIceZZKw], cluster name [hive]" }
{"type": "server", "timestamp": "2021-01-17T08:32:29,075Z", "level": "ERROR", "component": "o.e.b.ElasticsearchUncaughtExceptionHandler", "cluster.name": "hive", "node.name": "elasticsearch", "message": "uncaught exception in thread [main]", 
"stacktrace": ["org.elasticsearch.bootstrap.StartupException: java.lang.IllegalArgumentException: unknown setting [script.inline] please check that any required plugins are installed, or check the breaking changes documentation for removed settings",
"at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:174) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:161) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:86) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:127) ~[elasticsearch-cli-7.9.1.jar:7.9.1]",
"at org.elasticsearch.cli.Command.main(Command.java:90) ~[elasticsearch-cli-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:126) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:92) ~[elasticsearch-7.9.1.jar:7.9.1]",
"Caused by: java.lang.IllegalArgumentException: unknown setting [script.inline] please check that any required plugins are installed, or check the breaking changes documentation for removed settings",
"at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:544) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:489) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:460) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:431) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.common.settings.SettingsModule.<init>(SettingsModule.java:149) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.node.Node.<init>(Node.java:385) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.node.Node.<init>(Node.java:277) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Bootstrap$5.<init>(Bootstrap.java:227) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:227) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:393) ~[elasticsearch-7.9.1.jar:7.9.1]",
"at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:170) ~[elasticsearch-7.9.1.jar:7.9.1]",
"... 6 more"] }
uncaught exception in thread [main]
java.lang.IllegalArgumentException: unknown setting [script.inline] please check that any required plugins are installed, or check the breaking changes documentation for removed settings
	at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:544)
	at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:489)
	at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:460)
	at org.elasticsearch.common.settings.AbstractScopedSettings.validate(AbstractScopedSettings.java:431)
	at org.elasticsearch.common.settings.SettingsModule.<init>(SettingsModule.java:149)
	at org.elasticsearch.node.Node.<init>(Node.java:385)
	at org.elasticsearch.node.Node.<init>(Node.java:277)
	at org.elasticsearch.bootstrap.Bootstrap$5.<init>(Bootstrap.java:227)
	at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:227)
	at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:393)
	at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:170)
	at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:161)
	at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:86)
	at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:127)
	at org.elasticsearch.cli.Command.main(Command.java:90)
	at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:126)
	at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:92)
For complete error details, refer to the log at /usr/share/elasticsearch/logs/hive.log

Onyphe_Threats_1_0

Error:

{
  "errorMessage": "Invalid output\n",
  "input": null,
  "success": false
}

sbt and elasticsearch problem

Hi,
I can't install Cortex; I have a problem with the sbt command: it downloads the jar file but does not create the target/universal/stage directory. Also, when I try to install Elasticsearch, the package cannot be found.

Can you help me please?
Thank you

ImportError: No module named 'cortexutils'

New install, I keep getting this error:

image

I've rerun ' pip install -r requirements.txt' in the analyzers folder.

I've gotten this on a couple analyzers now, but this doesn't happen on all of them. Out of about 5 I'm testing right now the main ones that are giving this error are Onyphe and Censys.

Here's a snapshot of permissions -
image

Thanks and holy cow this looks promising!

Unable to create/delete organization via API

Hi,

I was trying to create/delete an organization via the API, as described here:

https://github.com/TheHive-Project/CortexDocs/blob/master/api/api-guide.md#create
https://github.com/TheHive-Project/CortexDocs/blob/master/api/api-guide.md#delete

curl -XPOST -H 'Authorization: Bearer **API_KEY**' 'https://CORTEX_APP_URL:9001/api/organization' -d '{
  "name": "demo",
  "description": "Demo organization",
  "status": "Active"
}'

However, when executing the above command, I get the following error:

{
  "tableName": "organization",
  "type": "AttributeCheckingError",
  "errors": [
    {
      "name": "organization.{\n  \"name\": \"demo\",\n  \"description\": \"Demo organization\",\n  \"status\": \"Active\"\n}",
      "type": "UnknownAttributeError",
      "message": "Unknown attribute organization.{\n  \"name\": \"demo\",\n  \"description\": \"Demo organization\",\n  \"status\": \"Active\"\n}: {\"type\":\"StringInputValue\",\"value\":[\"\"]}",
      "value": {
        "type": "StringInputValue",
        "value": [
          ""
        ]
      }
    }
  ]
}

Could I be missing something simple?
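One likely cause (an assumption, based on the error showing the whole JSON body being treated as a single attribute name): curl sends `-d` data as `application/x-www-form-urlencoded` unless told otherwise, so the server may be parsing the payload as a form field rather than JSON. Adding an explicit Content-Type header might help:

```
curl -XPOST -H 'Authorization: Bearer **API_KEY**' \
     -H 'Content-Type: application/json' \
     'https://CORTEX_APP_URL:9001/api/organization' -d '{
  "name": "demo",
  "description": "Demo organization",
  "status": "Active"
}'
```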

docker-compose sample file in the installation br0ken

There seem to be some blockers for the docker-compose.yml example file to get it working smoothly and with minimal changes:

  • you seem to be missing an ENV file with the job_directory being set
  • the environment vars which you recommend in the ES setup are not 100% the same as in the docker-compose file
  • in any case, the # - script.allowed_types=inline needs to be commented out.
  • no mention of the -e secret=... which should be set in ES
  • the docker-compose file terminates with:
root@bee:~# docker-compose up 
Pulling cortex (thehiveproject/cortex:3.1.0-0.3RC1)...
ERROR: manifest for thehiveproject/cortex:3.1.0-0.3RC1 not found: manifest unknown: manifest unknown
  • you can solve the last issue by specifying :latest
  • the docker-compose file does not seem to set up the ES-to-Cortex network connection properly.

big logs from cortex

Hello,
I have a problem with Cortex and am not sure how to fix it. In /var/log/cortex/application.log I keep getting:
[WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-5 - Illegal request, responding with status '505 HTTP Version Not Supported': The server does not support the HTTP protocol version used in the request.

What should I do? Sorry if this is a lame question... I am a beginner with this...

Installation DEB

cncs@soar:/opt/cortex$ bin/cortex -Dconfig.file=/etc/cortex/application.conf
Oops, cannot start the server.
java.nio.file.AccessDeniedException: /opt/cortex/RUNNING_PID
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:434)
at java.nio.file.Files.newOutputStream(Files.java:216)
at play.core.server.ProdServerStart$.createPidFile(ProdServerStart.scala:148)
at play.core.server.ProdServerStart$.start(ProdServerStart.scala:46)
at play.core.server.ProdServerStart$.main(ProdServerStart.scala:30)
at play.core.server.ProdServerStart.main(ProdServerStart.scala)
cncs@soar:/opt/cortex$
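The AccessDeniedException means the account running the command cannot write /opt/cortex/RUNNING_PID. Two common workarounds (assuming a package install where /opt/cortex is owned by the cortex user): run the binary as that user, or skip the PID file the way the bundled systemd unit shown in other reports does:

```
# Option 1: run the binary as the cortex service user
sudo -u cortex /opt/cortex/bin/cortex -Dconfig.file=/etc/cortex/application.conf

# Option 2: don't write a PID file at all (what the systemd unit does)
/opt/cortex/bin/cortex -Dconfig.file=/etc/cortex/application.conf -Dpidfile.path=/dev/null
```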

Only Users showing up in my organization

After the initial standup of the Docker image, the only tab I can see in my organization is Users. I cannot see the analyzers or responders, nor the tabs to configure them.

Error Analizer VirusTotal - PIP3 StringIO

When running an analysis with VirusTotal, it returned the following error:

Invalid output
Traceback (most recent call last):
  File "VirusTotal/virustotal.py", line 7, in <module>
    from virustotal_api import PublicApi as VirusTotalPublicApi
  File "/opt/Cortex-Analyzers/analyzers/VirusTotal/virustotal_api.py", line 25, in <module>
    import StringIO
ImportError: No module named 'StringIO'

I have apparently installed cortexutils for Python 3:

image

Thank you,
Regards.-
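The traceback shows a Python 2-only import (`import StringIO`) being executed under Python 3, where the class moved to the `io` module. A compatibility guard like the following usually resolves that specific line (a sketch; the rest of `virustotal_api.py` may need further porting):

```python
try:
    # Python 2: StringIO is a top-level module
    from StringIO import StringIO
except ImportError:
    # Python 3: the class lives in the io module
    from io import StringIO

buf = StringIO()
buf.write("hello")
print(buf.getvalue())
```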

Docker installation guide 404

The link to the Docker installation guide 404's in the main installation file.
This file doesn't appear in the source code tree at all.

You can also use TheHive docker-compose file which contains TheHive, Cortex and Elasticsearch, as documented in TheHive's Docker installation instructions.

Add webhook documentation for Cortex

After a chat with Dadokkio, I was made aware Cortex has webhook access like TheHive. I couldn't find any documentation however. This would be a good quality of life change - thanks!

Cortex - UserMgmtCtrl control error

Hi, I just recently installed Cortex 3.0.1-1, but it seems to have an issue updating the database; TheHive is working properly.
Refer to the screenshot below: it keeps giving a UserMgmtCtrl error.
Screenshot from 2020-10-23 07-23-55

docker-compose up - free space missing?

ERROR: failed to register layer: Error processing tar file(exit status 1): write /usr/lib/jvm/java-11-openjdk-amd64/jmods/java.base.jmod: no space left on device

but there is free disk space and free RAM available...?
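"No space left on device" during layer extraction refers to the filesystem backing Docker's data root (usually /var/lib/docker), which may not be the one you checked. A few commands to verify and reclaim space (assuming a default Docker setup):

```
docker info --format '{{.DockerRootDir}}'   # where layers are actually stored
df -h /var/lib/docker                       # free space on that filesystem
docker system prune -a                      # remove unused images, containers and layers
```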

instructions for TheHive config

Hello there,
I am running the latest version of TheHive with Cortex2.
The documents don't explain how to edit the hive configuration to add the user role just created in Cortex.

If you are using TheHive, create a new account inside your organisation with the read, analyze role and generate an API key that you will need to add to TheHive's configuration.

Any help is appreciated.
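For reference, wiring the Cortex API key into TheHive happens in TheHive's own /etc/thehive/application.conf, not in Cortex. A minimal fragment for TheHive 3 (the server name, URL and key below are placeholders; check the current TheHive documentation for your version):

```
play.modules.enabled += connectors.cortex.CortexConnector

cortex {
  "CORTEX-SERVER-ID" {
    url = "http://127.0.0.1:9001"
    key = "API_KEY_OF_THE_READ_ANALYZE_USER"
  }
}
```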

Fortiguard SSL error

I am getting the following error using the Fortiguard analyzer:

I only get an SSL error on this analyzer. All the other ones work fine. Proxy settings are set.

Invalid output
Traceback (most recent call last):
  File "Fortiguard/urlcategory.py", line 53, in <module>
    else:
  File "Fortiguard/urlcategory.py", line 41, in run
    'https': 'http://xxxxxxx:8080',
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 514, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.fortiguard.com', port=443): Max retries exceeded with url: /webfilter?q=sitesumo.com (Caused by SSLError(SSLEOFError(8, u'EOF occurred in violation of protocol (_ssl.c:590)'),))

cortex service exited status

Hello there,
I have installed TheHive with the Elasticsearch backend from binaries, then installed the Cortex binaries as well and followed the instructions.
When I start the cortex service I get this error:

pino@optimus:/opt/cortex$ sudo service cortex status
● cortex.service - cortex
Loaded: loaded (/usr/lib/systemd/system/cortex.service; enabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Tue 2018-11-13 18:54:23 GMT; 45s ago
Docs: https://thehive-project.org
Process: 19464 ExecStart=/opt/cortex/bin/cortex -Dconfig.file=/etc/cortex/application.conf -Dlogger.file=/etc/cortex/logback.xml -Dpidfile.path=/dev/null (code=exited, status=25
Main PID: 19464 (code=exited, status=255)

Nov 13 18:54:22 optimus systemd[1]: Started cortex.
Nov 13 18:54:23 optimus systemd[1]: cortex.service: Main process exited, code=exited, status=255/n/a
Nov 13 18:54:23 optimus systemd[1]: cortex.service: Failed with result 'exit-code'.

My application.conf is pretty basic:

search {
  # Index name
  index = cortex
  # ElasticSearch cluster name
  cluster = hive
  # ElasticSearch instance address
  host = ["127.0.0.1:9300"]
  # Scroll keepalive
  keepalive = 1m
  # Scroll page size
  pagesize = 50
}

play.http.secret.key="blah"

analyzer.path = ["/opt/Cortex-Analyzers/analyzers"]

Elasticsearch is up and running and of course TheHive is working perfectly. I am not sure what that error message means; any help is appreciated.

Cheers.

Clear Jobs History?

Maybe I've missed it somewhere but is there a way to clear the 'Jobs History' ?

cortex-latest.zip 404

--2021-03-26 17:34:34-- https://download.thehive-project.org/cortex-latest.zip
Resolving download.thehive-project.org (download.thehive-project.org)... 135.125.31.131
Connecting to download.thehive-project.org (download.thehive-project.org)|135.125.31.131|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2021-03-26 17:34:36 ERROR 404: Not Found.

GPG Keys Error

Not able to install Cortex from DEB because the repo key is not being downloaded.

sudo apt-key adv --keyserver hkp://pgp.mit.edu --recv-key 562CBC1C

Executing: /tmp/apt-key-gpghome.52eLRlobjR/gpg.1.sh --keyserver hkp://pgp.mit.edu --recv-key 562CBC1C
gpg: keyserver receive failed: No data
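pgp.mit.edu is frequently unreachable. The same key is usually retrievable from another keyserver, or over HTTPS directly from the project (the URL below is the one documented by TheHive Project, but verify it against the current installation guide):

```
# Try an alternative keyserver
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com --recv-key 562CBC1C

# Or fetch the key over HTTPS and add it directly
curl https://raw.githubusercontent.com/TheHive-Project/TheHive/master/PGP-PUBLIC-KEY | sudo apt-key add -
```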

sorry there are no analyzers for the selected observables

So I have everything configured for TheHive and Cortex to talk to each other. I then enabled the VirusTotal_GetReport_3_0 analyzer and tested it with an MD5 file hash in the Cortex job panel, which worked fine.

Now from TheHive I create a new case and add the same IOC (different MD5, SHA1 and SHA256 hashes), but a popup keeps saying "Sorry, there are currently no analyzers for the selected observable type(s)", which is obviously a bug of some sort.

I have checked the application log and I can see TheHive authenticating with Cortex, so it must be something in the communication logic?

Any way to debug this?

cortex unable to find analyzers and responders

Hello guys,
for some reason Cortex is unable to find the analyzers and responders.

Application log shows nothing under New Worker list:

2018-11-13 23:05:39,648 [INFO] from org.thp.cortex.services.WorkerSrv in main - New worker list:

	

2018-11-13 23:05:40,092 [INFO] from play.api.Play in main - Application started (Prod)

My configuration:

analyzer {
  # Directory that holds analyzers
  path = ["/etc/cortex/analyzers"]

  fork-join-executor {
    # Min number of threads available for analyze
    parallelism-min = 2
    # Parallelism (threads) ... ceil(available processors * factor)
    parallelism-factor = 2.0
    # Max number of threads available for analyze
    parallelism-max = 4
  }
}

responder {
  # Directory that holds responders
  path = ["/etc/cortex/responders"]

  fork-join-executor {
    # Min number of threads available for analyze
    parallelism-min = 2
    # Parallelism (threads) ... ceil(available processors * factor)
    parallelism-factor = 2.0
    # Max number of threads available for analyze
    parallelism-max = 4
  }
}

Permissions for those 2 folders:

drw-r-----  66 cortex cortex  4096 Nov 13 22:37 analyzers
drw-r-----   3 cortex cortex  4096 Nov 13 22:39 responders

The folders were populated by the git clone command.

Any idea why it is not finding them?
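One thing that stands out in the permissions listing (an observation, not a confirmed fix): both directories show `drw-r-----`, i.e. the execute (search) bit is missing, so even the cortex user cannot traverse them to enumerate analyzers. Restoring it may be enough:

```
# Add the directory search bit; capital X applies execute only to directories
chmod -R u+rwX,g+rX /etc/cortex/analyzers /etc/cortex/responders
```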

Akka error at the launch of Cortex

Hello
When installing cortex I got this error
"from akka.dispatch.Dispatcher in application-akka.actor.default-dispatcher-4 - null java.security.PrivilegedActionException: null"

I am currently on Rocky OS 8.4

`2021-09-08 09:05:58,800 [INFO] from org.reflections.Reflections in main - Reflections took 121 ms to scan 2 urls, producing 99 keys and 968 values
2021-09-08 09:05:58,827 [INFO] from module in main - Loading model class org.elastic4play.services.AttachmentModel
2021-09-08 09:05:58,829 [INFO] from module in main - Loading model class org.thp.cortex.models.OrganizationModel
2021-09-08 09:05:58,829 [INFO] from module in main - Loading model class org.thp.cortex.models.UserModel
2021-09-08 09:05:58,830 [INFO] from module in main - Loading model class org.thp.cortex.models.JobModel
2021-09-08 09:05:58,830 [INFO] from module in main - Loading model class org.elastic4play.services.DBListModel
2021-09-08 09:05:58,830 [INFO] from module in main - Loading model class org.thp.cortex.models.ReportModel
2021-09-08 09:05:58,830 [INFO] from module in main - Loading model class org.thp.cortex.models.ArtifactModel
2021-09-08 09:05:58,830 [INFO] from module in main - Loading model class org.thp.cortex.models.WorkerModel
2021-09-08 09:05:58,831 [INFO] from module in main - Loading model class org.thp.cortex.models.WorkerConfigModel
2021-09-08 09:05:58,831 [INFO] from module in main - Loading model class org.thp.cortex.models.AuditModel
2021-09-08 09:05:58,838 [INFO] from module in main - Loading authentication module class org.elastic4play.services.auth.ADAuthSrv
2021-09-08 09:05:58,838 [INFO] from module in main - Loading authentication module class org.thp.cortex.services.KeyAuthSrv
2021-09-08 09:05:58,838 [INFO] from module in main - Loading authentication module class org.thp.cortex.services.LocalAuthSrv
2021-09-08 09:05:58,839 [INFO] from module in main - Loading authentication module class org.elastic4play.services.auth.LdapAuthSrv
2021-09-08 09:05:58,839 [INFO] from module in main - Loading authentication module class org.thp.cortex.services.OAuth2Srv
2021-09-08 09:06:00,130 [DEBUG] from play.api.libs.concurrent.ActorSystemProvider in main - Starting application default Akka system: application
2021-09-08 09:06:00,495 [INFO] from akka.event.slf4j.Slf4jLogger in application-akka.actor.default-dispatcher-4 - Slf4jLogger started
2021-09-08 09:06:01,938 [DEBUG] from play.shaded.ahc.io.netty.util.internal.logging.InternalLoggerFactory in main - Using SLF4J as the default logging framework
2021-09-08 09:06:01,941 [DEBUG] from play.shaded.ahc.io.netty.util.ResourceLeakDetector in main - -Dplay.shaded.ahc.io.netty.leakDetection.level: simple
2021-09-08 09:06:01,941 [DEBUG] from play.shaded.ahc.io.netty.util.ResourceLeakDetector in main - -Dplay.shaded.ahc.io.netty.leakDetection.targetRecords: 4
2021-09-08 09:06:01,961 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - -Dio.netty.noUnsafe: false
2021-09-08 09:06:01,962 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - Java version: 8
2021-09-08 09:06:01,962 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - sun.misc.Unsafe.theUnsafe: available
2021-09-08 09:06:01,963 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - sun.misc.Unsafe.copyMemory: available
2021-09-08 09:06:01,963 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - java.nio.Buffer.address: available
2021-09-08 09:06:01,964 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - direct buffer constructor: available
2021-09-08 09:06:01,964 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - java.nio.Bits.unaligned: available, true
2021-09-08 09:06:01,964 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable prior to Java9
2021-09-08 09:06:01,964 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent0 in main - java.nio.DirectByteBuffer.(long, int): available
2021-09-08 09:06:01,964 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - sun.misc.Unsafe: available
2021-09-08 09:06:01,965 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
2021-09-08 09:06:01,965 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - -Dio.netty.bitMode: 64 (sun.arch.data.model)
2021-09-08 09:06:01,965 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - -Dio.netty.maxDirectMemory: 1810366464 bytes
2021-09-08 09:06:01,965 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - -Dio.netty.uninitializedArrayAllocationThreshold: -1
2021-09-08 09:06:01,966 [DEBUG] from play.shaded.ahc.io.netty.util.internal.CleanerJava6 in main - java.nio.ByteBuffer.cleaner(): available
2021-09-08 09:06:01,966 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - -Dio.netty.noPreferDirect: false
2021-09-08 09:06:01,967 [DEBUG] from play.shaded.ahc.io.netty.util.ResourceLeakDetectorFactory in main - Loaded default ResourceLeakDetector: play.shaded.ahc.io.netty.util.ResourceLeakDetector@7a388990
2021-09-08 09:06:01,973 [DEBUG] from play.shaded.ahc.io.netty.util.internal.PlatformDependent in main - org.jctools-core.MpscChunkedArrayQueue: available
2021-09-08 09:06:01,999 [DEBUG] from play.shaded.ahc.io.netty.util.internal.InternalThreadLocalMap in main - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
2021-09-08 09:06:01,999 [DEBUG] from play.shaded.ahc.io.netty.util.internal.InternalThreadLocalMap in main - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
2021-09-08 09:06:02,004 [DEBUG] from play.shaded.ahc.io.netty.channel.MultithreadEventLoopGroup in main - -Dio.netty.eventLoopThreads: 8
2021-09-08 09:06:02,012 [DEBUG] from play.shaded.ahc.io.netty.channel.nio.NioEventLoop in main - -Dio.netty.noKeySetOptimization: false
2021-09-08 09:06:02,012 [DEBUG] from play.shaded.ahc.io.netty.channel.nio.NioEventLoop in main - -Dio.netty.selectorAutoRebuildThreshold: 512
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.numHeapArenas: 8
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.numDirectArenas: 8
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.pageSize: 8192
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.maxOrder: 11
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.chunkSize: 16777216
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.tinyCacheSize: 512
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.smallCacheSize: 256
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.normalCacheSize: 64
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.cacheTrimInterval: 8192
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.cacheTrimIntervalMillis: 0
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.useCacheForAllThreads: true
2021-09-08 09:06:02,156 [DEBUG] from play.shaded.ahc.io.netty.buffer.PooledByteBufAllocator in main - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
2021-09-08 09:06:02,164 [DEBUG] from play.shaded.ahc.io.netty.buffer.ByteBufUtil in main - -Dio.netty.allocator.type: pooled
2021-09-08 09:06:02,165 [DEBUG] from play.shaded.ahc.io.netty.buffer.ByteBufUtil in main - -Dio.netty.threadLocalDirectBufferSize: 0
2021-09-08 09:06:02,165 [DEBUG] from play.shaded.ahc.io.netty.buffer.ByteBufUtil in main - -Dio.netty.maxThreadLocalCharBufferSize: 16384
2021-09-08 09:06:03,272 [INFO] from org.thp.cortex.services.DockerJobRunnerSrv in application-akka.actor.default-dispatcher-5 - Docker is available:
Info{architecture=x86_64, clusterStore=null, cgroupDriver=cgroupfs, containers=1, containersRunning=0, containersStopped=1, containersPaused=0, cpuCfsPeriod=true, cpuCfsQuota=true, debug=false, dockerRootDir=/var/lib/docker, storageDriver=overlay2, driverStatus=[[Backing Filesystem, xfs], [Supports d_type, true], [Native Overlay Diff, true], [userxattr, false]], executionDriver=null, experimentalBuild=false, httpProxy=, httpsProxy=, id=ZKN3:JVQR:HJKR:WHAV:ZZWI:PXZF:SSIO:FDWL:Q5ZV:3ISV:LPUE:MJ6D, ipv4Forwarding=true, images=1, indexServerAddress=https://index.docker.io/v1/, initPath=null, initSha1=null, kernelMemory=true, kernelVersion=4.18.0-305.12.1.el8_4.x86_64, labels=[], memTotal=8144953344, memoryLimit=true, cpus=4, eventsListener=0, fileDescriptors=27, goroutines=36, name=FRPARCOROSOCMON1, noProxy=, oomKillDisable=true, operatingSystem=Rocky Linux 8.4 (Green Obsidian), osType=linux, plugins=Plugins{volumes=[local], networks=[bridge, host, ipvlan, macvlan, null, overlay]}, registryConfig=RegistryConfig{indexConfigs={docker.io=IndexConfig{name=docker.io, mirrors=[], secure=true, official=true}}, insecureRegistryCidrs=[127.0.0.0/8]}, serverVersion=20.10.8, swapLimit=true, swarm=SwarmInfo{cluster=null, controlAvailable=false, error=, localNodeState=inactive, nodeAddr=, nodeId=, nodes=null, managers=null, remoteManagers=null}, systemStatus=[], systemTime=Wed Sep 08 09:06:03 EDT 2021}
2021-09-08 09:06:03,291 [WARN] from org.thp.cortex.services.JobRunnerSrv in application-akka.actor.default-dispatcher-5 - The package cortexutils for python hasn't been found
2021-09-08 09:06:03,291 [WARN] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python hasn't been found
2021-09-08 09:06:03,681 [INFO] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python2 has valid version: 2.1.0
2021-09-08 09:06:03,681 [INFO] from org.thp.cortex.services.JobRunnerSrv in application-akka.actor.default-dispatcher-5 - The package cortexutils for python2 has valid version: 2.1.0
2021-09-08 09:06:04,090 [INFO] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python3 has valid version: 2.1.0
2021-09-08 09:06:04,096 [INFO] from org.thp.cortex.services.JobRunnerSrv in application-akka.actor.default-dispatcher-5 - The package cortexutils for python3 has valid version: 2.1.0
2021-09-08 09:06:04,103 [INFO] from org.thp.cortex.services.WorkerSrv in application-akka.actor.default-dispatcher-4 - New worker list:

    IPVoid 1.0
    SEKOIAIntelligenceCenter_Indicators 1.0
    Shuffle 1.0
    SEKOIAIntelligenceCenter_Context 1.0
    HIBP_Query 2.0
    CheckPoint_Unlock 1.0
    DNSSinkhole 1.0
    DomainToolsIris_Investigate 1.0
    Autofocus_SearchJSON 1.0
    DomainTools_Reputation 2.0
    VirusTotal_GetReport 3.0
    AMPforEndpoints_SCDAdd 1.0
    MaxMind_GeoIP 4.0
    Crowdstrike_Falcon_Custom_IOC_API 1.0
    FileInfo 8.0
    PaloAltoNGFW_block_external_user 1.0.0
    CheckPoint_Lock 1.0
    FireEyeiSight 1.0
    Malwares_GetReport 1.0
    Mnemonic_pDNS_Public 3.0
    THOR_Thunderstorm_ScanSample 0.3.1
    VirusTotal_Rescan 3.0
    DomainTools_Risk 2.0
    PassiveTotal_Osint 2.0
    CIRCLPassiveDNS 2.0
    CyberChef_FromHex 1.0
    PaloAltoNGFW_block_port_for_external_communication 2.0.0
    PassiveTotal_Passive_Dns 2.1
    Shodan_Host 1.0
    SendGrid 1.0
    DomainTools_WhoisLookupUnparsed 2.0
    Hashdd_Detail 2.0
    PassiveTotal_Host_Pairs 2.0
    Hunterio_DomainSearch 1.0
    CyberChef_FromCharCode 1.0
    MISPWarningLists 2.0
    Gmail_BlockDomain 1.0
    DomainTools_ReverseIPWhois 2.0
    AbuseIPDB 1.0
    TorProject 1.0
    Redmine_Issue 1.0
    CIRCLPassiveSSL 2.0
    Inoitsu 1.0
    Fortiguard_URLCategory 2.1
    Splunk_Search_User_Agent 3.0
    Yara 2.0
    EmergingThreats_DomainInfo 1.0
    DNSDB_DomainName 2.0
    PhishTank_CheckURL 2.1
    DNS-RPZ 1.0
    MailIncidentStatus 1.0
    CIRCLHashlookup 1.0
    PaloAltoNGFW_block_internal_domain 2.0.0
    StamusNetworks_HostID 1.0
    IPinfo_Hosted_Domains 1.0
    SpamhausDBL 1.0
    SophosIntelix_GetReport 0.3
    PassiveTotal_Trackers 2.0
    ThreatResponse 1.0
    VirusTotal_DownloadSample 3.0
    Gmail_BlockSender 1.0
    Maltiverse_Report 1.0
    SophosIntelix_Submit_Dynamic 0.1
    BackscatterIO_GetObservations 1.0
    OTXQuery 2.0
    Investigate_Sample 1.0
    PaloAltoNGFW_unblock_port_for_internal_communication 1.0.0
    MetaDefenderCloud_Reputation 1.0
    Autofocus_SearchIOC 1.0
    Splunk_Search_Mail_Email 3.0
    LastInfoSec 1.0
    Patrowl_GetReport 1.0
    NSRL 1.0
    AMPforEndpoints_MoveGUID 1.0
    RT4-CreateTicket 1.0
    PhishingInitiative_Scan 1.0
    Mailer 1.0
    C1fApp 1.0
    Diario_Scan 1.0
    RecordedFuture_risk 1.0
    OpenCTI_SearchObservables 2.0
    Nessus 2.0
    KnowBe4 1.0
    SecurityTrails_Passive_DNS 1.0
    JoeSandbox_File_Analysis_Inet 2.0
    Virusshare 2.0
    Velociraptor_Flow 0.1
    EmlParser 2.0
    DomainTools_ReverseIP 2.0
    Yeti 1.0
    StaxxSearch 1.0
    PaloAltoNGFW_unblock_external_domain 1.0.0
    SinkDB 1.1
    PaloAltoNGFW_unblock_external_IP_address 1.0.0
    MalwareBazaar 1.0
    DomainToolsIris_AddRiskyDNSTag 1.0
    Robtex_Forward_PDNS_Query 1.0
    WOT_Lookup 2.0
    Elasticsearch_Analysis 1.0
    Splunk_Search_Hash 3.0
    Autofocus_GetSampleAnalysis 1.0
    Virustotal_Downloader 0.1
    DuoUnlockUserAccount 1.0
    PaloAltoNGFW_unblock_internal_user 1.0.0
    VirusTotal_Scan 3.0
    EmergingThreats_IPInfo 1.0
    Shodan_ReverseDNS 1.0
    Shodan_Host_History 1.0
    Wazuh 1.0
    PassiveTotal_Whois_Details 2.0
    Urlscan.io_Search 0.1.1
    PaloAltoNGFW_block_external_IP_address 2.0.0
    DomainTools_WhoisLookup 2.0
    PaloAltoNGFW_block_internal_IP_address 2.0.0
    GRR 0.1
    Cyberprotect_ThreatScore 3.0
    PaloAltoNGFW_block_external_domain 2.0.0
    ZEROFOX_Close_alert 1.0
    Minemeld 1.0
    PassiveTotal_Malware 2.0
    DomainTools_ReverseNameServer 2.0
    IntezerCommunity 1.0
    DNSDB_IPHistory 2.0
    Ldap_Query 2.0
    PaloAltoNGFW_unblock_internal_domain 1.0.0
    GoogleSafebrowsing 2.0
    PassiveTotal_Enrichment 2.0
    PayloadSecurity_File_Analysis 1.0
    Triage 1.0
    Msg_Parser 3.0
    DomainMailSPFDMARC_Analyzer 1.1
    PassiveTotal_Unique_Resolutions 2.0
    Splunk_Search_User 3.0
    CuckooSandbox_Url_Analysis 1.2
    BackscatterIO_Enrichment 1.0
    DomainTools_ReverseWhois 2.0
    SophosIntelix_Submit_Static 0.1
    Threatcrowd 1.0
    Umbrella_Blacklister 1.1
    ZEROFOX_Takedown_request 1.0
    CyberCrime-Tracker 1.0
    Gmail_DeleteMessage 1.0
    EmailRep 1.0
    URLhaus 2.0
    MISP 2.1
    TeamCymruMHR 1.0
    DShield_lookup 1.0
    EmergingThreats_MalwareInfo 1.0
    StopForumSpam 1.0
    DomainTools_HostingHistory 2.0
    CyberChef_FromBase64 1.0
    Abuse_Finder 3.0
    Investigate_Categorization 1.0
    SecurityTrails_Whois 1.0
    DomainTools_WhoisHistory 2.0
    MetaDefenderCloud_Scan 1.0
    PassiveTotal_Ssl_Certificate_History 2.0
    Splunk_Search_Other 3.0
    Malpedia 1.0
    MetaDefenderCore_Scan 1.0
    Splunk_Search_Registry 3.0
    Crt_sh_Transparency_Logs 1.0
    IPinfo_Details 1.0
    CERTatPassiveDNS 2.0
    Urlscan.io_Scan 0.1.0
    DomainToolsIris_CheckMaliciousTags 1.0
    ProofPoint_Lookup 1.0
    PayloadSecurity_Url_Analysis 1.0
    Shodan_DNSResolve 1.0
    Splunk_Search_Mail_Subject 3.0
    VMRay 4.1
    GoogleDNS_resolve 1.0.0
    DomainToolsIris_Pivot 1.0
    MetaDefenderCloud_GetReport 1.0
    OpenCTI_SearchExactObservable 2.0
    Hipposcore 2.0
    Shodan_InfoDomain 1.0
    CuckooSandbox_File_Analysis_Inet 1.2
    DNS_Lookingglass 1.0
    JoeSandbox_File_Analysis_Noinet 2.0
    GoogleVisionAPI_WebDetection 1.0.0
    Valhalla_GetRuleMatches 0.3.1
    TalosReputation 1.0
    Vulners_CVE 1.0
    Splunk_Search_IP 3.0
    TorBlutmagie 1.0
    SpamAssassin 1.0
    Splunk_Search_Domain_FQDN 3.0
    FireHOLBlocklists 2.0
    Vulners_IOC 1.0
    NERD 1.0
    ThreatGrid 1.0
    Robtex_Reverse_PDNS_Query 1.0
    Gmail_UnblockDomain 1.0
    PassiveTotal_Ssl_Certificate_Details 2.0
    PaloAltoNGFW_block_internal_user 1.0.0
    AMPforEndpoints_IsolationStart 1.0
    Hashdd_Status 2.0
    PaloAltoNGFW_unblock_port_for_external_communication 1.0.0
    PaloAltoNGFW_unblock_internal_IP_address 1.0.0
    DNSDB_NameHistory 2.0
    PhishingInitiative_Lookup 2.0
    AMPforEndpoints_IsolationStop 1.0
    SoltraEdge 1.0
    Pulsedive_GetIndicator 1.0
    QRadar_Auto_Closing_Offense 1.0
    IBMXForce_Lookup 1.0
    Gmail_UnblockSender 1.0
    Splunk_Search_URL_URI_Path 3.0
    IVRE 1.0
    JoeSandbox_Url_Analysis 2.0
    GreyNoise 3.1
    Censys 1.0
    Malwares_Scan 1.0
    Robtex_IP_Query 1.0
    HippoMore 2.0
    PaloAltoNGFW_unblock_external_user 1.0.0
    HybridAnalysis_GetReport 1.0
    DuoLockUserAccount 1.0
    AMPforEndpoints_SCDRemove 1.0
    ClamAV_FileInfo 1.1
    PaloAltoNGFW_block_port_for_internal_communication 2.0.0
    ForcepointWebsensePing 1.0
    Shodan_Search 2.0
    Umbrella_Report 1.0
    PassiveTotal_Components 2.0
    AzureTokenRevoker 1.0
    MetaDefenderCore_GetReport 1.0
    Diario_GetReport 1.0
    MalwareClustering_Search 1.0
    Mnemonic_pDNS_Closed 3.0
    Splunk_Search_File_Filename 3.0
    UnshortenLink 1.2
    Onyphe_Summary 1.0
    AnyRun_Sandbox_Analysis 1.0

2021-09-08 09:06:04,211 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in application-akka.actor.default-dispatcher-4 - Creating HTTP client on http://10.84.40.21:9200,http://10.84.40.22:9200,http://10.84.40.23:9200
2021-09-08 09:06:04,230 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in main - Creating HTTP client on http://10.84.40.21:9200,http://10.84.40.22:9200,http://10.84.40.23:9200
2021-09-08 09:06:04,234 [ERROR] from akka.dispatch.Dispatcher in application-akka.actor.default-dispatcher-4 - null
java.security.PrivilegedActionException: null
at java.security.AccessController.doPrivileged(Native Method)
at org.elasticsearch.client.RestClientBuilder.build(RestClientBuilder.java:191)
at com.sksamuel.elastic4s.http.JavaClient$.apply(JavaClient.scala:132)
at org.elastic4play.database.DBConfiguration.$anonfun$getClient$1(DBConfiguration.scala:132)
at scala.collection.immutable.Map$EmptyMap$.getOrElse(Map.scala:110)
at org.elastic4play.database.DBConfiguration.getClient(DBConfiguration.scala:132)
at org.elastic4play.database.DBConfiguration.execute(DBConfiguration.scala:149)
at org.elastic4play.database.SearchWithScroll.(DBFind.scala:139)
at org.elastic4play.database.DBFind.searchWithScroll(DBFind.scala:62)
at org.elastic4play.database.DBFind.apply(DBFind.scala:105)
at org.elastic4play.services.FindSrv.apply(FindSrv.scala:58)
at org.thp.cortex.services.WorkerSrv.find(WorkerSrv.scala:128)
at org.thp.cortex.services.WorkerSrv.$anonfun$rescan$4(WorkerSrv.scala:137)
at org.thp.cortex.services.UserSrv.inInitAuthContext(UserSrv.scala:102)
at org.thp.cortex.services.WorkerSrv.$anonfun$rescan$3(WorkerSrv.scala:136)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:56)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:93)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:93)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:48)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.nio.file.AccessDeniedException: /etc/elasticsearch/certs/elasticsearch.jks
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.Files.newByteChannel(Files.java:361)
at java.nio.file.Files.newByteChannel(Files.java:407)
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
at java.nio.file.Files.newInputStream(Files.java:152)
at org.elastic4play.database.DBConfiguration.$anonfun$sslContextMaybe$1(DBConfiguration.scala:80)
at scala.Option.map(Option.scala:230)
at org.elastic4play.database.DBConfiguration.sslContextMaybe$lzycompute(DBConfiguration.scala:76)
at org.elastic4play.database.DBConfiguration.sslContextMaybe(DBConfiguration.scala:76)
at org.elastic4play.database.DBConfiguration.$anonfun$httpClientConfig$1(DBConfiguration.scala:117)
at org.elasticsearch.client.RestClientBuilder.createHttpClient(RestClientBuilder.java:215)
... 27 common frames omitted
2021-09-08 09:06:04,248 [WARN] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python hasn't been found
2021-09-08 09:06:04,606 [INFO] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python2 has valid version: 2.1.0
2021-09-08 09:06:05,000 [INFO] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python3 has valid version: 2.1.0
2021-09-08 09:06:05,001 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in main - Creating HTTP client on http://10.84.40.21:9200,http://10.84.40.22:9200,http://10.84.40.23:9200
2021-09-08 09:06:05,034 [WARN] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python hasn't been found
2021-09-08 09:06:05,397 [INFO] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python2 has valid version: 2.1.0
2021-09-08 09:06:05,795 [INFO] from org.thp.cortex.services.JobRunnerSrv in main - The package cortexutils for python3 has valid version: 2.1.0
2021-09-08 09:06:05,796 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in main - Creating HTTP client on http://10.84.40.21:9200,http://10.84.40.22:9200,http://10.84.40.23:9200
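The actual failure in the trace above is `java.nio.file.AccessDeniedException: /etc/elasticsearch/certs/elasticsearch.jks`: the account running Cortex cannot read the keystore referenced in its Elasticsearch configuration. A minimal sketch of the permission fix, demonstrated on a scratch file (the group name and real keystore path are assumptions; apply the same mode to `/etc/elasticsearch/certs/elasticsearch.jks` and make sure the Cortex user is in the file's group):

```python
import os, stat, tempfile

# Demonstrate the mode change on a scratch copy of the keystore; the real
# fix is the same chmod (plus a chgrp to a group containing the Cortex
# user) applied to /etc/elasticsearch/certs/elasticsearch.jks and its
# parent directory.
path = os.path.join(tempfile.mkdtemp(), "elasticsearch.jks")
open(path, "w").close()
os.chmod(path, 0o640)                   # owner rw, group r
os.chmod(os.path.dirname(path), 0o750)  # directory traversable by group
print(oct(stat.S_IMODE(os.stat(path).st_mode)))  # 0o640
```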

Yeti - SSL error with self-signed certificates

Hi :)

I've been connecting our Yeti instance to the newly installed Cortex one, and an error occurred with the Yeti analyzer:

Invalid output
Traceback (most recent call last):
File "Yeti/yeti.py", line 45, in
YetiAnalyzer().run()
File "Yeti/yeti.py", line 28, in run
api = pyeti.YetiApi("{}/api/".format(self.url))
File "/usr/local/lib/python2.7/dist-packages/pyeti/api.py", line 23, in init
self._test_connection()
File "/usr/local/lib/python2.7/dist-packages/pyeti/api.py", line 329, in _test_connection
if self._make_post("observablesearch/"): # replace this with a more meaningful URL
File "/usr/local/lib/python2.7/dist-packages/pyeti/api.py", line 335, in _make_post
return self._make_request(url, method="POST", **kwargs)
File "/usr/local/lib/python2.7/dist-packages/pyeti/api.py", line 354, in _make_request
verify=self.verify_ssl, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 116, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='yeti.login-securite.com', port=443): Max retries exceeded with url: /api/observablesearch/ (Caused by SSLError(SSLError(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)'),))

Our certificate is signed by Gandi, but I don't know why the tools we use do not recognize it.
Anyway, I'd like to be able to set the verify_ssl attribute to False in my Yeti instance configuration (if possible through the UI).

I temporarily modified the run method of the YetiAnalyzer class so my analyzer works:
def run(self):
    api = pyeti.YetiApi("{}/api/".format(self.url), verify_ssl=False)
    data = self.get_data()

I will take a look at how to add a parameter in the UI to set it to False without modifying the code.
If you could give me a few pointers on where to look, as I have just started to use/configure Cortex, that would be appreciated.
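For the configurable flag, one approach is to declare a `verify_ssl` boolean in the analyzer's JSON definition (its configuration items) and read it with cortexutils' get_param, defaulting to True. The `verify_ssl` name is an assumption, not an existing Yeti analyzer option; the snippet below mimics the dotted-name lookup that `Analyzer.get_param` performs on the job input:

```python
# Sketch of a cortexutils-style dotted config lookup: resolve
# "config.verify_ssl" against the job input, defaulting to True so that
# certificate verification stays on unless explicitly disabled.
def get_param(job_input, name, default=None):
    node = job_input
    for part in name.split("."):
        if not isinstance(node, dict) or part not in node:
            return default
        node = node[part]
    return node

job_input = {"config": {"url": "https://yeti.example.org"}}
verify_ssl = get_param(job_input, "config.verify_ssl", True)
print(verify_ssl)  # True
# The analyzer would then call:
#   pyeti.YetiApi("{}/api/".format(url), verify_ssl=verify_ssl)
```

Exposed as a configuration item, the flag becomes editable per organization from the Cortex UI instead of requiring a code change.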

Best regards,
Guillaume

3.1.0: Update documentation (branch 3.1.0)

  • Update the Migration Guide Page to include links to the full migration guides of 3.1.0
  • Cortex 3.1.0: copy TH 3.5.0 migration Guide
  • Test migration guide
  • Update dedicated migration guide with dedicated command and output to Cortex
  • Cortex: update Installation guide with ES 7 (configuration of ES)
  • Cortex docker image instructions: run with volumes /tmp and /var/share/docker.sock
  • Document how to run Cortex as a container with analyzers & responders running as programs inside the container: how to build a Dockerfile with FROM thehiveproject/cortex:latest
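For the last bullet, a minimal sketch of such a Dockerfile (the package list and paths are illustrative assumptions, not tested instructions):

```dockerfile
# Start from the official Cortex image and bake the analyzers in,
# so they run as programs inside the container instead of as Docker jobs.
FROM thehiveproject/cortex:latest

# Dependencies the bundled analyzers typically need (illustrative list)
RUN apt-get update && \
    apt-get install -y --no-install-recommends git python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Fetch the analyzers/responders and install their Python requirements
RUN git clone https://github.com/TheHive-Project/Cortex-Analyzers /opt/Cortex-Analyzers && \
    for req in /opt/Cortex-Analyzers/analyzers/*/requirements.txt; do \
        pip3 install --no-cache-dir -r "$req" || true; \
    done

# Then point Cortex at the local definitions in application.conf, e.g.
# analyzer.urls = ["/opt/Cortex-Analyzers/analyzers"]
```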

Cannot deserialize ES results after migration to Cortex 3.1

Hello,

I tried to migrate from Cortex 3.0.1 to Cortex 3.1.0, but I'm facing deserialization errors when accessing the UI and trying to authenticate:

com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of 'long' out of START_OBJECT token

I can confirm that the ES instance is running 7.9.3 and was created in 6.8:

~$ curl -X GET 'http://elasticsearch:9200/cortex_4?pretty'
{
  "cortex_4" : {
    "aliases" : { },
    "mappings" : {
      "date_detection" : false,
      "numeric_detection" : false,
      "properties" : {...}
    },
    "settings" : {
      "index" : {
        "mapping" : {
          "nested_fields" : {
            "limit" : "100"
          }
        },
        "number_of_shards" : "5",
        "provided_name" : "cortex_4",
        "creation_date" : "1595607504954",
        "number_of_replicas" : "1",
        "uuid" : "nENE3MbbTSa1szb3gE83oQ",
        "version" : {
          "created" : "6080099",
          "upgraded" : "7090399"
        }
      }
    }
  }
}

Any clue?

Create Cortex administrator account

I am trying to create the Cortex admin user and I am receiving the following exception. Please advise.

2022-02-04 12:01:25,424 [WARN] from org.elastic4play.database.SearchWithScroll in application-akka.actor.default-dispatcher-6 - Search error
com.sksamuel.elastic4s.http.JavaClientExceptionWrapper: java.net.ConnectException: Connection refused
	at com.sksamuel.elastic4s.http.JavaClient$$anon$1.onFailure(JavaClient.scala:69)
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onDefinitiveFailure(RestClient.java:617)
	at org.elasticsearch.client.RestClient$1.failed(RestClient.java:375)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.executionFailed(DefaultClientExchangeHandlerImpl.java:101)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.failed(AbstractClientExchangeHandler.java:426)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.connectionRequestFailed(AbstractClientExchangeHandler.java:348)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.access$100(AbstractClientExchangeHandler.java:62)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler$1.failed(AbstractClientExchangeHandler.java:392)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager$1.failed(PoolingNHttpClientConnectionManager.java:316)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.nio.pool.RouteSpecificPool.failed(RouteSpecificPool.java:162)
	at org.apache.http.nio.pool.AbstractNIOConnPool.requestFailed(AbstractNIOConnPool.java:609)
	at org.apache.http.nio.pool.AbstractNIOConnPool$InternalSessionRequestCallback.failed(AbstractNIOConnPool.java:889)
	at org.apache.http.impl.nio.reactor.SessionRequestImpl.failed(SessionRequestImpl.java:162)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:176)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148)
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221)
	at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.net.ConnectException: Connection refused
	at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:777)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:174)
	... 5 common frames omitted
2022-02-04 12:01:25,432 [WARN] from org.elastic4play.database.SearchWithScroll in application-akka.actor.default-dispatcher-10 - Search error
com.sksamuel.elastic4s.http.JavaClientExceptionWrapper: java.net.ConnectException: Connection refused
	at com.sksamuel.elastic4s.http.JavaClient$$anon$1.onFailure(JavaClient.scala:69)
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onDefinitiveFailure(RestClient.java:617)
	at org.elasticsearch.client.RestClient$1.failed(RestClient.java:375)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.executionFailed(DefaultClientExchangeHandlerImpl.java:101)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.failed(AbstractClientExchangeHandler.java:426)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.connectionRequestFailed(AbstractClientExchangeHandler.java:348)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.access$100(AbstractClientExchangeHandler.java:62)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler$1.failed(AbstractClientExchangeHandler.java:392)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager$1.failed(PoolingNHttpClientConnectionManager.java:316)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.nio.pool.RouteSpecificPool.failed(RouteSpecificPool.java:162)
	at org.apache.http.nio.pool.AbstractNIOConnPool.requestFailed(AbstractNIOConnPool.java:609)
	at org.apache.http.nio.pool.AbstractNIOConnPool$InternalSessionRequestCallback.failed(AbstractNIOConnPool.java:889)
	at org.apache.http.impl.nio.reactor.SessionRequestImpl.failed(SessionRequestImpl.java:162)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:176)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148)
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221)
	at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.net.ConnectException: Connection refused
	at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:777)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:174)
	... 5 common frames omitted
2022-02-04 12:01:25,482 [INFO] from play.api.Play in main - Application started (Prod) (no global state)
2022-02-04 12:01:26,079 [INFO] from play.core.server.AkkaHttpServer in main - Enabling HTTP/2 on Akka HTTP server...
2022-02-04 12:01:26,080 [INFO] from play.core.server.AkkaHttpServer in main - Listening for HTTP on /0:0:0:0:0:0:0:0:9001
2022-02-04 14:19:12,274 [ERROR] from org.elastic4play.controllers.Authenticated in application-akka.actor.default-dispatcher-14 - Authentication failure:
	session: AuthenticationError User session not found
	pki: AuthenticationError Certificate authentication is not configured
	key: AuthenticationError Authentication header not found
	init: AuthenticationError Use of initial user is forbidden because users exist in database
2022-02-04 14:19:12,277 [INFO] from org.thp.cortex.services.ErrorHandler in application-akka.actor.default-dispatcher-14 - POST /api/stream returned 401
org.elastic4play.AuthenticationError: Authentication failure
	at org.elastic4play.controllers.Authenticated.$anonfun$getContext$4(Authenticated.scala:272)
	at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
	at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:56)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:93)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:93)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:48)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
	at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
2022-02-04 14:21:37,382 [ERROR] from org.elastic4play.database.DBConfiguration in application-akka.actor.default-dispatcher-23 - ElasticSearch request failure: POST:/cortex_6/_search?
StringEntity({"query":{"match":{"relations":{"query":"user"}}},"size":0},Some(application/json))
 => ElasticError(index_not_found_exception,no such index,Some(_na_),Some(cortex_6),None,List(ElasticError(index_not_found_exception,no such index,Some(_na_),Some(cortex_6),None,null,None,None,None,List())),None,None,None,List())
2022-02-04 14:21:37,436 [ERROR] from org.elastic4play.database.DBConfiguration in application-akka.actor.default-dispatcher-14 - ElasticSearch request failure: POST:/cortex_6/_search?
StringEntity({"query":{"match":{"relations":{"query":"user"}}},"size":0},Some(application/json))
 => ElasticError(index_not_found_exception,no such index,Some(_na_),Some(cortex_6),None,List(ElasticError(index_not_found_exception,no such index,Some(_na_),Some(cortex_6),None,null,None,None,None,List())),None,None,None,List())
2022-02-04 14:21:38,707 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in application-akka.actor.default-dispatcher-21 - Creating HTTP client on http://127.0.0.1:9200
2022-02-04 14:21:38,752 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in application-akka.actor.default-dispatcher-21 - Creating HTTP client on http://127.0.0.1:9200
2022-02-04 14:21:38,770 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in application-akka.actor.default-dispatcher-21 - Creating HTTP client on http://127.0.0.1:9200
2022-02-04 14:21:38,787 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in application-akka.actor.default-dispatcher-21 - Creating HTTP client on http://127.0.0.1:9200
2022-02-04 14:21:38,829 [INFO] from com.sksamuel.elastic4s.http.JavaClient$ in application-akka.actor.default-dispatcher-21 - Creating HTTP client on http://127.0.0.1:9200
2022-02-04 14:21:38,846 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Create a new empty database
2022-02-04 14:21:38,847 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrate database from version 0, add operations for version 2
2022-02-04 14:21:38,851 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrate database from version 0, add operations for version 3
2022-02-04 14:21:38,852 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrate database from version 0, add operations for version 4
2022-02-04 14:21:38,852 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrate database from version 0, add operations for version 5
2022-02-04 14:21:38,852 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrate database from version 0, add operations for version 6
2022-02-04 14:21:39,248 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from sequence
2022-02-04 14:21:39,277 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from artifact
2022-02-04 14:21:39,279 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from audit
2022-02-04 14:21:39,281 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from data
2022-02-04 14:21:39,283 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from dblist
2022-02-04 14:21:39,284 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from job
2022-02-04 14:21:39,285 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from organization
2022-02-04 14:21:39,286 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from report
2022-02-04 14:21:39,288 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from user
2022-02-04 14:21:39,289 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from worker
2022-02-04 14:21:39,292 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-21 - Migrating 0 entities from workerConfig
2022-02-04 14:21:40,109 [INFO] from org.elastic4play.services.MigrationSrv in application-akka.actor.default-dispatcher-23 - End of migration
2022-02-04 14:30:07,229 [ERROR] from org.elastic4play.database.DBConfiguration in application-akka.actor.default-dispatcher-14 - ElasticSearch request failure: POST:/cortex_6/_update/admin?_source=true&refresh=wait_for&routing=admin&retry_on_conflict=5
StringEntity({"script":{"source":"ctx._source[\"password\"]=params.param0;ctx._source[\"updatedBy\"]=params.param1;ctx._source[\"updatedAt\"]=params.param2","params":{"param0":"ڭ៥僣龚픥誚Ꞟ퓻ƺ橒,64a6a4f9446b9c0ab685af5985e6c7f6673eae2b5a5791b6cfc438fca727df94","param1":"init","param2":1643985007202}}},Some(application/json))
 => ElasticError(illegal_argument_exception,request [/cortex_6/_update/admin] contains unrecognized parameters: [_source], [retry_on_conflict],None,None,None,List(ElasticError(illegal_argument_exception,request [/cortex_6/_update/admin] contains unrecognized parameters: [_source], [retry_on_conflict],None,None,None,null,None,None,None,List())),None,None,None,List())
2022-02-04 14:30:07,230 [INFO] from org.thp.cortex.services.ErrorHandler in application-akka.actor.default-dispatcher-14 - POST /api/user returned 500
org.elastic4play.InternalError: Unknown error: ElasticError(illegal_argument_exception,request [/cortex_6/_update/admin] contains unrecognized parameters: [_source], [retry_on_conflict],None,None,None,List(ElasticError(illegal_argument_exception,request [/cortex_6/_update/admin] contains unrecognized parameters: [_source], [retry_on_conflict],None,None,None,null,None,None,None,List())),None,None,None,List())
	at org.elastic4play.database.DBConfiguration.$anonfun$execute$2(DBConfiguration.scala:158)
	at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
	at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:56)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:93)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:93)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:48)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
	at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
2022-02-04 14:30:07,779 [INFO] from org.thp.cortex.services.ErrorHandler in application-akka.actor.default-dispatcher-20 - GET /api/stream/rqjqbnfJUb returned 401
org.elastic4play.AuthenticationError: Authentication header not found
	at org.elastic4play.controllers.Authenticated.$anonfun$getFromApiKey$1(Authenticated.scala:143)
	at scala.Option.fold(Option.scala:251)
	at org.elastic4play.controllers.Authenticated.getFromApiKey(Authenticated.scala:143)
	at org.thp.cortex.controllers.StreamCtrl$$anonfun$1.applyOrElse(StreamCtrl.scala:101)
	at org.thp.cortex.controllers.StreamCtrl$$anonfun$1.applyOrElse(StreamCtrl.scala:101)
	at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
	at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:56)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:93)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:93)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:48)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
	at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)

Get job report artifacts clarification

Hello,

Thank you for this repo, it helped me a ton to write a Cortex client library in Go! And of course thanks for Cortex itself too :)

While writing tests, I noticed that the example output in the documentation for getting a job report differs from the real Cortex output.

The page linked above describes an Artifact inside a Report object as a flat object with two attributes:

{
  "type": "sha1",
  "value": "cd1c2da4de388a4b5b60601f8b339518fe8fbd31"
}

But in fact the schema of an Artifact is the same as that of an Observable Artifact:

{
  "attributes": {
    "dataType": "ip"
  },
  "data": "8.8.4.4"
}

This seems wrong to me, so maybe it's actually a Cortex bug.
What do you think?
