
dalton's People

Contributors

0x120102181f0a040a01181c, dc-secureworks, jsoref, mundruid, scribbles, urbanski, whartond

dalton's Issues

tls entries in eve.log

Based on the note in dalton.py, tls logging in the EVE output is disabled regardless of the version of Suricata being used.

Is it possible to only disable this output when the version of Suricata being invoked doesn't support multiple tls loggers? (Suricata 4 and >= 3.1 do, per the notes.)
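
A hedged sketch of such a version gate (function name hypothetical; the 3.1 boundary is taken from the notes):

def eve_tls_must_be_disabled(suri_version):
    """Return True only for Suricata builds older than 3.1, which
    (per the notes) cannot handle multiple tls loggers."""
    major, minor = (int(p) for p in suri_version.split(".")[:2])
    return (major, minor) < (3, 1)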

Suricata agent invalid repos

The URL used to build the Suricata agent, http://downloads.suricata-ids.org/, is no longer valid. See the error below:

curl http://downloads.suricata-ids.org/
curl: (6) Could not resolve host: downloads.suricata-ids.org

I suggest the following install for the Suricata agent:

# assumption: a prior WORKDIR /src, so the source tree lands under /src/suricata
RUN mkdir suricata
WORKDIR /src/suricata
# fetch and unpack the release tarball from the OISF site
RUN wget http://www.openinfosecfoundation.org/download/suricata-${SURI_VERSION}.tar.gz
RUN tar -xvzf suricata-${SURI_VERSION}.tar.gz
WORKDIR /src/suricata/suricata-${SURI_VERSION}
# configure, make, and install
RUN ./configure --prefix=/usr --sysconfdir=/etc --localstatedir=/var && make && make install && ldconfig

Happy to work on this issue if it gets approved. Thanks!

support for lua based rules

It would be cool if a user could provide Lua scripts along with a job. If a Lua script is uploaded with a job, automatically modifying the config to enable Lua scripts and pointing 'scripts:' at a temp file in the agent's temp job directory would be handy as well.

This might get tricky, as the rule references the script file by name, so renaming the scripts might not be possible.
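
For reference, a Suricata detection rule invokes a script by filename with the lua keyword, so the name has to survive intact; a minimal illustrative rule (sid and script name hypothetical):

alert http any any -> any any (msg:"lua script example"; lua:myscript.lua; sid:1000001; rev:1;)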

Validating that a rule leveraging the provided Lua script actually loads would be a nice feature to include too.

Install through a proxy...

Hello,
I'm trying to run the install through a proxy. My current environment has the internet completely blocked. The proxy option in the README only works for apt-get and doesn't apply to the rest of the install process.

Thanks,

Dalton on-premise release

Good morning,

Are there any plans to release a version for deployment in on-premise or virtualized environments?

Regards

start-dalton.sh fails to build Suricata 7.x (Dockerfile_suricata needs libpcre2-dev)

Hello!

I'd like to report a small issue with Dalton: dalton-agent/Dockerfiles/Dockerfile_suricata needs to be updated -- somewhere between lines 10 and 15, would you very kindly add libpcre2-dev to the list of installed packages?

The Docker image for Suricata "latest" (7.0.0) failed to compile Suricata because libpcre2-dev was missing from the image. To resolve the issue I updated the Dockerfile, added libpcre2-dev next to libpcre3, re-ran start-dalton.sh, and everything was right as rain.

Host operating system: Ubuntu 22.04.3
Docker compose version info:
docker-compose version 1.29.2, build unknown
docker-py version: 5.0.3
CPython version: 3.10.12
OpenSSL version: OpenSSL 3.0.2 15 Mar 2022

Proposed change (PR #168):

RUN apt-get update -y && DEBIAN_FRONTEND=noninteractive apt-get install -y \
    python3.8 \
    tcpdump \
    libpcre2-dev libpcre3 libpcre3-dbg libpcre3-dev libnss3-dev \
    build-essential autoconf automake libtool libpcap-dev libnet1-dev \
    libyaml-0-2 libyaml-dev zlib1g zlib1g-dev libcap-ng-dev libcap-ng0 \
    make libmagic-dev libjansson-dev libjansson4 pkg-config rustc cargo \
    liblua5.1-dev libevent-dev

dump buffers breaks really old version of snort

The WebUI presents dump buffers as an optional selection; however, that does not appear to be the case with Snort.

Looking at

https://github.com/secureworks/dalton/blob/master/dalton-agent/dalton-agent.py#L898-L900

we can see that dump-buffers is enforced on all Snort runs; however, this feature was only introduced in Snort 2.9.9.0.

Is it possible to add a check that only adds this option depending on the version of Snort being run?
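
A minimal sketch of such a gate, assuming the agent already has the Snort version as a dotted string:

def supports_dump_buffers(snort_version):
    """True for Snort builds new enough for the dump-buffers option,
    which (per the above) first appeared in 2.9.9.0."""
    parts = tuple(int(p) for p in snort_version.split("."))
    return parts >= (2, 9, 9)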

dalton_controller constantly restarting.

Cause:

Traceback (most recent call last):
  File "/opt/dalton/run.py", line 2, in <module>
    from flask_cache import Cache
  File "/usr/local/lib/python2.7/site-packages/flask_cache/__init__.py", line 24, in <module>
    from werkzeug import import_string
ImportError: cannot import name import_string

Workaround:
Add werkzeug==0.16.1 to requirements.txt
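
For context, werkzeug 1.0 removed the top-level re-exports that flask_cache depends on; from 1.0 onward the symbol lives under werkzeug.utils:

from werkzeug.utils import import_string

Pinning to 0.16.1, the last release with the old import path, sidesteps this without patching the long-unmaintained flask_cache package.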

dalton controller erroring on jinja import

I see some other "dalton controller constantly restarting" issues, but none of them seem to relate to mine. This is a clean git pull (I deleted my old Dalton install, upgraded Docker, etc.), and everything starts except dalton_controller, which repeats this error:

Traceback (most recent call last):
  File "/opt/dalton/run.py", line 1, in <module>
    from flask import Flask
  File "/usr/local/lib/python3.9/site-packages/flask/__init__.py", line 14, in <module>
    from jinja2 import escape
ImportError: cannot import name 'escape' from 'jinja2' (/usr/local/lib/python3.9/site-packages/jinja2/__init__.py)

From my googling (https://stackoverflow.com/questions/71718167/importerror-cannot-import-name-escape-from-jinja2), it appears the recommended fix is to upgrade to a newer version of Flask, which doesn't seem to be an option here.
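
One stopgap that also comes up: Jinja2 3.1 is the release that removed the escape re-export, so pinning it below that in requirements.txt keeps the old Flask import working:

Jinja2<3.1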

Is there something else that I'm missing? The host system is macOS; Docker reports "Docker version 20.10.14, build a224086".

TIA.

Ability to retain pcaps on the controller for future use

I have a standard set of pcaps I'd like to test rules against for false positives, regressions, etc.

Instead of uploading the pcaps in a zip every time, I'd like the ability, when submitting a job, to opt to "retain the pcap", and then allow the retained pcaps to be used when submitting future jobs (even teapot jobs).

Retained pcaps would ideally be available to all IDS engines.

include raw log files in Job Zipfile

In the Job Zipfile, it would be nice if all of the log files produced were attached, regardless of whether Dalton itself supports displaying them.

My problem stems from the EVE JSON file: as displayed in the browser it has a bunch of escaping applied, and I'd rather just download the file as generated by the engine.

I'm not sure how the results are stored on the controller, but perhaps a "download" button for each file displayed in the job results would work too.

Dalton seems to not recognize IPv6 packets as having a session

I uploaded the attached pcap, which has a session (including a TCP SYN packet), and it gives me the following error:

As Dalton says, "pain don't hurt." But an incomplete pcap sure can.

The pcap file 'alert_file_IPv6.pcap' contains TCP traffic but does not contain any TCP packets with the SYN flag set.

Almost all IDS rules that look for TCP traffic require an established connection.
You will need to provide a more complete pcap if you want accurate results.

If you need help crafting a pcap, Flowsynth may be able to help --
https://github.com/secureworks/flowsynth

And, "there's always barber college...."

alert_file_IPv6.pcap.zip
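
A version-agnostic SYN check would match the TCP layer directly rather than assuming IPv4; a minimal sketch with scapy (illustrative, not Dalton's actual code):

from scapy.all import rdpcap, TCP

def pcap_has_tcp_syn(path):
    # matching the TCP layer directly covers IPv4 and IPv6 alike
    for pkt in rdpcap(path):
        if TCP in pkt and pkt[TCP].flags & 0x02:  # SYN bit
            return True
    return False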

Build and upload automatically docker images

I followed the installation tutorial in the README, and when I build the Docker images they take a long time to compile.

The idea of this issue is to suggest using GitHub Actions to build all the images and auto-upload them to Docker Hub, cutting down on image build time.

It would then only be necessary to download them, changing the command:

docker compose build && docker compose up -d

to:

docker compose pull && docker compose up -d

Or even add a pull_policy setting (e.g. pull_policy: missing) so you don't need two commands; with this, the images are downloaded automatically if they are not found locally:

docker compose up -d
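
A hedged sketch of such a workflow (action versions, image name, build context, and secrets are placeholders):

name: build-images
on:
  push:
    branches: [master]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: example/dalton:latest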

Install failing

Hello,

For some reason the install is failing. I'm getting the error below, and immediately afterwards the setup exits with the error code shown. I'm not behind a proxy; it's inside a VM with internet connectivity and no proxy.

ERROR: Service 'controller' failed to build: The command '/bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get -y install wireshark-common p7zip-full' returned a non-zero code: 100

Can anyone help, please?

Thank you

Please add tcp handshake for tcp/http pcap generation

When I generate a pcap in Flowsynth and pass it to Dalton in the web interface, it doesn't work because the TCP handshake is absent.
Suricata doesn't work with handshake-less TCP sessions, so please add tcp.initialize, changing

synth = "flow default %s %s:%s > %s:%s%s;" % (

to

synth = "flow default %s %s:%s > %s:%s%s ( tcp.initialize; ) ; " % (

or the case remains broken.
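
For reference, the flowsynth input this change would produce looks roughly like the following (addresses and content illustrative):

flow default tcp 192.168.1.2:1024 > 10.0.0.1:80 (tcp.initialize;);
default > (content:"GET / HTTP/1.1\x0d\x0a";);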

use suricata socket

Problem: Running 1000 pcaps is slow.

Idea: Start Suricata when the agent receives its first job, load the requested ruleset, start listening on a socket, and send the pcaps through the socket. If the second job uses the same ruleset (hash compare?), use the socket to run the pcap without spending the time to reload Suricata. If the third job uses a different ruleset, either restart Suricata or use the socket command ruleset-reload-rules.

Outcome: Using the same ruleset consecutively across jobs is significantly faster, while the original speed is maintained when a different ruleset is desired.
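
A rough sketch of the socket flow using the suricatasc Python client that ships with Suricata (paths illustrative; treat the API details as assumptions):

from suricatasc import SuricataSC

sc = SuricataSC("/var/run/suricata/suricata-command.socket")
sc.connect()
# run a pcap through the already-loaded ruleset; per-job logs go to output-dir
sc.send_command("pcap-file", {"filename": "/tmp/job/input.pcap",
                              "output-dir": "/tmp/job/logs"})
# when a job brings a different ruleset, swap rules without a restart
sc.send_command("ruleset-reload-rules")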

Display number of alerts on the Queue page

Feature request for a display enhancement.

If a job is completed, it would be handy to show the number of alerts for the job on the Queue page. This would provide a quick way to narrow down the runs that do/don't have alerts.

Allow same version of IDS with different build options

I would like to have multiple configurations of the same engine and version.

For example, I'd like to have both Suricata 4.1.4 without Rust and Suricata 4.1.4 with Rust.

At first thought, this would be as simple as adding another sensor Dockerfile. But in practice, because of the way the SENSOR_TECHNOLOGY string is auto-determined, it's more complex.

My first thought was to create an ENV variable in the Dockerfile containing the SENSOR_TECHNOLOGY string. At image build time, the engine type and version are known, with the exception of support for "current".
This can probably be handled in dalton-agent.py by checking for the env variable first and defaulting to the current "auto" method if it is not set.
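
A minimal sketch of that precedence (the fallback function is hypothetical and stands in for the existing auto-detection):

import os

# prefer an explicit override baked into the image, e.g.
#   ENV SENSOR_TECHNOLOGY="suricata-4.1.4-rust"
sensor_tech = os.environ.get("SENSOR_TECHNOLOGY")
if not sensor_tech:
    sensor_tech = autodetect_sensor_technology()  # hypothetical name for the current "auto" path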

I'm interested in hearing the best way to approach this, and I'm more than willing to attempt a PR with some direction.
