hasecuritysolutions / VulnWhisperer
Create actionable data from your Vulnerability Scans
Home Page: https://twitter.com/VulnWhisperer
License: Apache License 2.0
Maybe find a way, on both Windows and Linux, to create an error log so that failures can be tracked.
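A minimal sketch of what such an error log could look like, using only Python's stdlib logging module (the directory and logger names are illustrative, not VulnWhisperer's actual configuration):

```python
import logging
import os
import tempfile

def setup_error_log(log_dir):
    # Cross-platform error log: logging.FileHandler and os.path behave
    # the same on Windows and Linux, so one code path covers both.
    if not os.path.isdir(log_dir):
        os.makedirs(log_dir)
    log_file = os.path.join(log_dir, 'vulnwhisperer_errors.log')
    handler = logging.FileHandler(log_file)
    handler.setLevel(logging.ERROR)  # only failures land in the file
    handler.setFormatter(
        logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
    logger = logging.getLogger('vulnwhisperer')
    logger.addHandler(handler)
    return logger, log_file

# Demo: a failed login would leave a timestamped trace on disk.
log, log_file = setup_error_log(os.path.join(tempfile.gettempdir(), 'vw_logs'))
log.error('[FAIL] Could not login to Nessus')
```

Since the handler is attached per-logger, the rest of the code only needs `logging.getLogger('vulnwhisperer').error(...)` wherever a `[FAIL]` is printed today.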
Hi, first of all, credit to this wonderful tool. May I suggest adding support for parsing Burp scan XML files in your next update?
I am unable to get past the report generation stage - it seems an invalid template ID is being used to generate the Qualys reports.
Processing 1/399
[ACTION] - Generating report for 10794322
[FAIL] Could not process report ID: <?xml version="1.0" encoding="UTF-8"?>
<ServiceResponse xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="https://qualysapi.qg2.apps.qualys.com/qps/xsd/3.0/was/report.xsd">
<responseCode>INVALID_REQUEST</responseCode>
<responseErrorDetails>
<errorMessage>Template with id 126024 was not found. Provide a different template id.</errorMessage>
</responseErrorDetails>
</ServiceResponse>
I am unclear where the template ID of 126024 is coming from. Is anyone else having a similar issue?
Importing vulnWhispererBaseVisuals.json adds the 'VulnWhisperer - Risk: Critical' visualization (also Info, Low, Medium, High, and Total) with the 'goal' visualization type, which prevents the visualizations from working unless altered.
Manually changing the type to 'gauge' in the JSON file seems to resolve this.
Track when scans are running, when they are supposed to run, and their latest results. This should be a separate scan index.
It should include all the data about a scan result, without the history information, including:
status, control, uuid, name, read, enabled, owner, creation_date, user_permissions, folder_id, starttime, timezone, last_modification_date, shared, type, id, rrules
Hello,
I'm hoping this is a simple question. But for the moment I am unable to resolve.
I have lots of data in Qualys VM and would like to try VulnWhisperer. However, it appears the default Qualys pull is for the WAS application. How/where do I set up VulnWhisperer for Qualys VM?
Thanks!
Richard
Hi all,
I need help. I have configured VulnWhisperer on my Windows 7 64-bit machine.
I set it up and ran it per the Git readme.
Now I have the Nessus data in CSV.
Can you guide me on what to do with the CSV next, so I can get the complete picture on one page?
I want the graph and top-10 data as shown in the screenshot.
Thanks in advance.
Hi, it'd be helpful if
https://github.com/austin-taylor/VulnWhisperer#vulnerability-frameworks
linked to the home pages of the supported frameworks.
Please add support for OpenVAS
I have a fresh install of Elasticsearch, Logstash, Kibana, and Nessus in the default locations on the same Ubuntu 16.04 server VM. I managed to import the Nessus scans and got the CSV files in the '/opt/vulnwhisp/nessus/My Scans' folder. I need some guidance on importing the CSV files into Logstash or Elasticsearch as a new index. Sorry, I am new to the Elastic stack; once I get VulnWhisperer operational, I will be happy to contribute instructions on how to set up the entire stack.
Thanks in advance for your help.
Singh
Allowing arguments instead of an ini file would reduce file dependence when running in Docker containers.
Alternatively, all ini settings could be passed in the docker run command or in a docker-compose file.
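A sketch of what an argument/environment-driven CLI could look like. The flag and environment-variable names are hypothetical (today's CLI only accepts -c and -s); the point is the precedence: explicit flag, then environment variable, then the ini file as a fallback.

```python
import argparse
import os

# Hypothetical flags -- none of these exist in VulnWhisperer today.
# Environment-variable defaults suit Docker: `docker run -e NESSUS_HOSTNAME=...`
parser = argparse.ArgumentParser(description='ini-free VulnWhisperer run (sketch)')
parser.add_argument('--nessus-hostname',
                    default=os.environ.get('NESSUS_HOSTNAME'))
parser.add_argument('--nessus-username',
                    default=os.environ.get('NESSUS_USERNAME'))
parser.add_argument('--nessus-password',
                    default=os.environ.get('NESSUS_PASSWORD'))

# Demo invocation with an explicit argv list:
args = parser.parse_args(['--nessus-hostname', 'scanner.local',
                          '--nessus-username', 'scanner'])
```

Anything still unset after this step would fall back to the ini file, so secrets never need to live on disk inside the container.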
Is it possible for VulnWhisperer to read NBE or XML export files instead of reading from OpenVAS directly, so reports from other systems (e.g. a customer's) can be imported? Maybe only some small modifications are needed.
@austin-taylor Thanks for this cool project.
#! Deprecation: match_mapping_type [float] is invalid and will be ignored: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [byte] is invalid and will be ignored: No field type matched on [byte], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [short] is invalid and will be ignored: No field type matched on [short], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [integer] is invalid and will be ignored: No field type matched on [integer], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [geo_point] is invalid and will be ignored: No field type matched on [geo_point], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [source]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [synopsis]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [see_also]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [cve]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [solution]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [@Version]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [risk]
#! Deprecation: [omit_norms] is deprecated, please use [norms] instead with the opposite boolean value
{
"acknowledged": true
}
Current Versions:
elasticsearch v5.5.1
kibana v5.5.1
logstash v5.5.1
When I run the VulnWhisperer command against my Nessus instance, it connects but shows 0 scans to be processed.
We have Tenable SecurityCenter and 5 Nessus scanners. I'm not sure if the Nessus integration works when the scanners are managed by SecurityCenter. For example, I know you can no longer log in to the Nessus boxes and view scan history or policies, since all of that is done at the SecurityCenter level and pushed out to all scanners.
This is what I get when I run vulnwhisp against SecurityCenter:
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[FAIL] Could not login to Nessus
[FAIL] Could not properly load your config!
Reason: [FAIL] Could not connect to nessus -- Please verify your settings in <vulnwhisp.base.config.vwConfig object at 0x7f8b8b68ea50> are correct and try again.
Reason: [FAIL] Could not login to Nessus
SecurityCenter's API can be found at https://docs.tenable.com/sccv/api/index.html
I would be willing to offer assistance to get this working.
Hi,
The JSON result file from an OpenVAS scan is not processed by Logstash:
[2018-03-05T12:36:52,893][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:53,894][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:54,896][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:55,897][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:56,899][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
This is due to the file not having an end-of-line character at the end. If I append a blank line to the JSON file, Logstash processes it fine.
Thanks.
Best regards.
Hello. I am unable to get the JSON output for my Qualys scans.
[ACTION] - Generating report for 12206112
[INFO] - Successfully generated report! ID: 12206112
[INFO] - New Report ID: 813288
[ACTION] - File written to /opt/vulnwhisp/qualys/813288.csv
[ACTION] - Downloading file ID: 813288
[FAIL] - Could not process 12206112 - to_json() got an unexpected keyword argument 'lines'
The CSV reports generate just fine, but not the JSON - the JSON files are empty due to the above error. As far as I can tell, the Logstash filters for Qualys are set up to ingest the JSON, not the CSV, correct? Can I do anything with the CSV reports that are generated, or do they exist only as a step in creating the final JSON report format?
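The error message suggests a pandas version mismatch: `to_json(orient='records', lines=True)` only exists in pandas 0.19.0 and later, so upgrading pandas is the cleanest fix. On an older pandas, the same newline-delimited output can be built by hand; a sketch (function name is illustrative):

```python
import json

def records_to_json_lines(records):
    # Equivalent of df.to_json(orient='records', lines=True) for pandas
    # versions that predate the 'lines' keyword: serialize each record
    # (as returned by df.to_dict(orient='records')) on its own line.
    return '\n'.join(json.dumps(rec, sort_keys=True) for rec in records)
```

Usage would be something like `open(out_path, 'w').write(records_to_json_lines(scan_df.to_dict(orient='records')))`, producing the line-delimited JSON the Logstash filters expect.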
should be zlib1g-dev rather than zlibg1-dev... the "g" and "1" are transposed.
Adding the 'plugin family' to the Logstash output would help with grouping vulns.
Happy to work on this if you like.
Since the given template uses mappings deprecated in Elasticsearch 6.0+, has someone created a new mapping?
root@scanbox:/opt/vulnwhisperer/configs# cat example.ini
[nessus]
enabled=true
hostname=192.168.1.54
port=8834
username=scanner
password=!myP@SSWORDisPASSW0RD
write_path=/opt/vulnwhisperer/nessus/
db_path=/opt/vulnwhisperer/database
trash=false
verbose=true
root@sanbox:/bin# vuln_whisperer -c /opt/vulnwhisperer/configs/example.ini -s nessus
[INFO] Creating file /opt/vulnwhisperer/database/report_tracker.db
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on 192.168.1.54:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 0 scans to be processed
[INFO] No new scans to process. Exiting...
I have over 200 scans on this server
I am able to connect to Nessus per the documentation. However, after my scans are identified, the next line says:
[ERROR] 'token'
local variable 'token_id' referenced before assignment.
I am trying to debug as well.
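The traceback pattern ("local variable referenced before assignment") usually means the login path only assigns `token_id` on success. A defensive sketch of that pattern (names are illustrative, not VulnWhisperer's actual code):

```python
def extract_token(response_json):
    # Initialize token_id before the conditional so a failed login raises
    # a clear, actionable error instead of an UnboundLocalError about
    # "local variable 'token_id' referenced before assignment".
    token_id = None
    if isinstance(response_json, dict) and 'token' in response_json:
        token_id = response_json['token']
    if token_id is None:
        raise ValueError("login response has no 'token' field; "
                         "check credentials and Nessus API availability")
    return token_id
```

With this shape, a bad credential or an unexpected API response surfaces as a readable message rather than a Python internals error.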
It would be nice if this project supported Docker deployment.
Thanks.
Allow overriding the Nessus username/password in the URL arguments. That way, the ini file does not have to include sensitive information.
In the "VulnWhisperer - Reporting" and "VulnWhisperer - Reporting Qualys Scoring" dashboards, I have the following errors:
In the "VulnWhisperer - Risk Mitigation" dashboard, I have the following errors:
In the "VulnWhisperer - Risk Mitigation Qualys Web Scoring" dashboard, I have the following errors:
Fields in the logstash-vulnwhisperer-* index are:
Index was loaded:
curl -XPUT 'http://192.168.160.5:9200/logstash-vulnwhisperer-template' -d@/home/ansible/VulnWhisperer/elasticsearch/logstash-vulnwhisperer-template.json
I'm not sure if this is a bug, an incorrectly loaded index, or data that needs to be configured in OpenVAS.
This is with ELK version 5.6.9.
Allow the ability to pass a date, or the last X days, so that it will only pull Nessus scans from that period to the current date.
For instance, be able to run VulnWhisperer and tell it to pull only the last 3 days.
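A sketch of the filtering step such a flag would need, assuming the scan list carries `last_modification_date` as epoch seconds (as the Nessus scan list does); the function name and a `--days` flag are hypothetical:

```python
import time

def scans_within_days(scans, days):
    # Keep only scans whose last_modification_date (epoch seconds) falls
    # inside the requested window, e.g. days=3 for "the last 3 days".
    cutoff = time.time() - days * 86400
    return [s for s in scans
            if s.get('last_modification_date', 0) >= cutoff]
```

The processing loop would then iterate over `scans_within_days(scan_list, 3)` instead of the full scan list.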
OWASP ZAP is one of the most popular open-source web application vulnerability scanners.
It would be really cool to support it; I notice it isn't on the list of supported scanners.
Otherwise, how could VulnWhisperer be extended to support it?
I'm attempting to set up VulnWhisperer on an Ubuntu machine and getting the following error: "Could not connect to database at /opt/vulnwhisp/database/report_tracker.db". I'm not sure how to get around this problem; can anyone please help?
Create a different error message for when the config ini file does not exist. The current error says "Could not properly load your config! Reason: No Section: 'nessus'".
Stating that the file does not exist would help with troubleshooting.
An authenticated scan of 1000 hosts generated a file around 500 MB in size when exported as a CSV manually. The tool fails to download this scan, but after I removed it, the tool succeeded at downloading a much smaller scan that targeted only one host.
Client: Archlinux 4.16.10-1-ARCH
VulnWhisperer: 1.5 and master
Python: 2.7.15
Server: Centos 7.5.1804
Nessus: Professional 7.1.0
(root@endor-vm) vulnwhisperer # python2 bin/vuln_whisperer -c configs/frameworks_example.ini -s nessus
[INFO] Creating directory /opt/vulnwhisperer/database
[INFO] Creating file /opt/vulnwhisperer/database/report_tracker.db
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on sherlock.strsoh.org:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 2 scans to be processed
Download for file id 1720763221.
..............................
...............
[FAIL] ERROR:
(root@endor-vm) vulnwhisperer # python2 bin/vuln_whisperer -c configs/frameworks_example.ini -s nessus
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on sherlock:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 1 scans to be processed
Download for file id 1331664840.
.
Processing 1/1 for scan: easton-mbx1
[INFO] 522 records written to easton-mbx1_55_56_1527889658.csv
The latest versions of requests and urllib3 no longer support the workaround used to suppress the insecure-request warning. Execution will fail with the following:
...(snip)...
from requests.packages.urllib3.exceptions import InsecureRequestWarning
ImportError: cannot import name InsecureRequestWarning
Fixed by removing the imports. I can push a fix, or if you guys just want to fix it, that's cool too. Thanks for a great tool!
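Instead of removing the imports outright, a compatibility shim keeps the warning suppression working on both old and new library layouts; a sketch:

```python
# requests stopped vendoring urllib3, so the old import path can fail
# with "cannot import name InsecureRequestWarning". Try the legacy
# requests.packages path first, then fall back to standalone urllib3.
try:
    from requests.packages.urllib3.exceptions import InsecureRequestWarning
    from requests.packages import urllib3
except ImportError:
    from urllib3.exceptions import InsecureRequestWarning
    import urllib3

# Same effect as the original workaround, on either layout.
urllib3.disable_warnings(InsecureRequestWarning)
```

This way the self-signed-certificate warnings stay suppressed without pinning an old requests version.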
Hi,
I still can't get it to do anything beyond converting the Nessus file to CSV. Both the Logstash and Kibana service statuses show running OK. I'm not sure what I'm doing wrong, but it doesn't generate the report and/or output the visualization to Kibana. I followed the video sample step by step, but it isn't getting me anywhere.
Please advise.
Below is the output; attached are the logstash yml, logstash conf, and .ini files:
logstash_conf.txt
logstash_yml.txt
ini file.txt
OUTPUT
test@test-virtual-machine:~/Desktop/VulnWhisperer-master$ sudo vuln_whisperer -c configs/frameworks_example.ini -s nessus
[INFO] Connected to database at /opt/vulnwhisp/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on localhost:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 9 scans to be processed
[INFO] Directory already exist for /opt/vulnwhisp/nessus/Trash - Skipping creation
[INFO] Directory already exist for /opt/vulnwhisp/nessus/My Scans - Skipping creation
Download for file id 1168345030.
.
Processing 1/9 for scan: test
[INFO] 78 records written to test_25_26_1521411373.csv
Download for file id 819727134.
.
Processing 2/9 for scan: scan
[INFO] 86 records written to scan_22_23_1520385708.csv
Download for file id 1969498333.
.
Processing 3/9 for scan: attemptthird
[INFO] 79 records written to attemptthird_19_20_1519954223.csv
Download for file id 79218840.
.
Processing 5/9 for scan: attempt 1
[INFO] 79 records written to attempt_1_14_15_1519952202.csv
Download for file id 950718717.
.
Processing 8/9 for scan: test
[INFO] 79 records written to test_7_8_1519946993.csv
Logging should be implemented for nightly runs to show the status of jobs.
When I open one of your dashboard templates, my Kibana crashes with the following messages:
Version: 6.2.2
Build: 16588
Error: Uncaught TypeError: gaugeTypes[_this.gaugeConfig.type] is not a constructor (http://172.18.48.110/bundles/vendors.bundle.js?v=16588:12)
at window.onerror (http://172.18.48.110/bundles/commons.bundle.js?v=16588:21:468094)
Adding .if(eq,0,null).fit(carry)
to a Timelion function makes it carry over the last result until the next one. Since there is more time between scans, carrying over results between two scans seems much more appealing, IMO.
Hi,
I've downloaded the tool and I'm currently in the process of setting it up with Nessus. However, I was running into some issues with Vulnwhisperer not picking up the report_tracker.db file. As it turns out, Nessus needs to create this database file and then the path must be supplied in the Vulnwhisperer configuration file. Could you please update the configuration screenshot to maybe say something like "write_path=/path/to/nessus/report_tracker.db" or something like that? As a new user to Nessus and Vulnwhisperer this took me a while to understand on my own.
I'm also curious what the next steps are with a brand new installation? Once I point Vulnwhisperer to the right database, what comes next? My understanding is that the ELK stack should be started?
I have placed filebeat.yml, the filebeat input, and the Nessus Logstash config in the correct locations. Logstash doesn't seem to be creating the index. I noticed that there is a Logstash template, but it isn't clear whether it needs to be imported or not.
Hi,
I have managed to get the command working with "touch report_tracker.db"; however, the script runs and acknowledges the Nessus scan but doesn't go further than that. I'm not sure what is going on. Can anyone help?
->/opt/VulnWhisperer$ vuln_whisperer -c configs/nessus.ini -s nessus
[INFO] Connected to database at /opt/VulnWhisperer/vulnwhisp/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on localhost:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 1 scans to be processed
I've updated vulnwhisp.py, added the openvas section to the config file, and added openvas.py to the frameworks folder. When I run vuln_whisperer -c configs/frameworks_example.ini -s openvas
there is no output and I am immediately returned to the command prompt.
"Next you'll need to import the visualizations into Kibana and setup your logstash config"
Well, you are right that this is needed, because I'm stuck here.
/Regards
Is VulnWhisperer designed to support multiple Nessus scanners?
I've read another comment about Tenable SecurityCenter saying the integration only works when you address a scanner directly, but that would not be a big issue if I were able to address all of them.
On Debian 8, the following additional packages must be installed and aren't getting picked up anywhere else:
zlib1g-dev
libxml2-dev
libxslt1-dev
Update the docs to reflect this. Again, I can do it and push if you want. Thanks!
All piped reports are getting the same parsing error:
Error parsing csv {:field=>"message", :source=>" Recommended upgrade : Server 2008 R2 Service Pack 1
\" \"79638\",\"CVE-2014-6321\",\"10.0\",\"Critical\",\"x.x.x.x\",\"tcp\",\"3389\",
\"MS14-066: Vulnerability in Schannel Could ", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}
How does VulnWhisperer run scans with OpenVAS?
Thank you.
I'm using the standard Nessus conf file to pull my scans through, but all scans are timestamped 5 hours ahead of their last-modified time.
So if I executed a scan at 12pm, the CSV file would have a Unix timestamp of 5pm on the same day.
Has anyone come across a similar issue?
Is it necessary to normalize the 'last_modification_date'? Wouldn't it be simpler to leave it in UTC and let Kibana handle it?
https://github.com/austin-taylor/VulnWhisperer/blob/e4e9ed7f28b379e1add848441c9448963258e8b6/vulnwhisp/vulnwhisp.py#L257
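A minimal sketch of the UTC-only approach suggested above (the function name is illustrative, not the linked code's):

```python
import datetime

def epoch_to_utc_iso(epoch_seconds):
    # Keep the Nessus epoch timestamp in UTC. Kibana already converts
    # @timestamp to the browser's timezone at display time, so applying
    # a local-time offset before indexing shifts every scan (e.g. the
    # 5-hour skew reported above).
    return datetime.datetime.utcfromtimestamp(epoch_seconds).strftime(
        '%Y-%m-%dT%H:%M:%SZ')
```

Emitting the trailing 'Z' marks the value as UTC, so Elasticsearch parses it unambiguously regardless of where the indexing host runs.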