
wivet's Introduction

wivet

Web Input Vector Extractor Teaser

WIVET is a benchmarking project that aims to statistically analyze web link extractors. In general, web application vulnerability scanners fall into this category. Given one or more URLs, these scanners try to extract as many input vectors as they can in order to increase their coverage of the attack surface.

WIVET presents a broad set of input vectors to any extractor and reports the results. For an extractor to run against WIVET meaningfully, it has to provide some kind of session handling, which nearly all decent crawlers do.

Here's the Cheers List

I no longer maintain this archaic repo. If anyone is interested, I would be happy to see someone improve it. I also have a document of future work ideas somewhere :)

wivet's People

Contributors

andresriancho, bedirhan


wivet's Issues

Incorrect path to links.txt

Migrated from https://code.google.com/p/wivet/issues/detail?id=14

[Fri Mar 21 17:31:09 2014] PHP Warning: file(/tmp/wivet-read-only/wivet/innerpages/links.txt): failed to open stream: No such file or directory in /tmp/wivet-read-only/wivet/functions.php on line 361

The file is actually inside ./wivet/offscanpages/statistics/links.txt

Found while running wivet with:
php -S localhost:8054 -t wivet/
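
A minimal sketch of the one-line correction this implies in functions.php; the variable names are hypothetical, and only the file() call and the two paths come from the warning above:

<?php
// Hypothetical fix: read links.txt from where it actually lives,
// instead of the innerpages/ path shown in the warning (~line 361).
$linksPath = dirname(__FILE__) . '/offscanpages/statistics/links.txt';
$links = is_file($linksPath)
    ? file($linksPath, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : array();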

Incorrect path to bots.txt

Migrated from https://code.google.com/p/wivet/issues/detail?id=15

[Fri Mar 21 17:31:09 2014] PHP Warning: file(/tmp/wivet-read-only/wivet/offscanpages/bots.txt): failed to open stream: No such file or directory in /tmp/wivet-read-only/wivet/functions.php on line 374

The file is actually inside ./wivet/offscanpages/statistics/bots.txt

Found while running wivet with:
php -S localhost:8054 -t wivet/
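
The same kind of one-line path correction applies here, again with a hypothetical variable name:

<?php
// Hypothetical fix for the bots.txt read in functions.php (~line 374).
$botsPath = dirname(__FILE__) . '/offscanpages/statistics/bots.txt';
$bots = is_file($botsPath) ? file($botsPath, FILE_IGNORE_NEW_LINES) : array();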

Various issues in functions.php and statistics.php break coverage calculation

Migrated from https://code.google.com/p/wivet/issues/detail?id=17

[Fri Mar 21 17:31:09 2014] PHP Warning: Invalid argument supplied for foreach() in /tmp/wivet-read-only/wivet/functions.php on line 362
[Fri Mar 21 17:31:09 2014] PHP Notice: Undefined index: pageVisits in /tmp/wivet-read-only/wivet/offscanpages/statistics.php on line 145
[Fri Mar 21 17:31:09 2014] PHP Warning: Division by zero in /tmp/wivet-read-only/wivet/offscanpages/statistics.php on line 145
[Fri Mar 21 17:31:09 2014] PHP Warning: Invalid argument supplied for foreach() in /tmp/wivet-read-only/wivet/offscanpages/statistics.php on line 151

With all of these, the reported coverage is always 0%.
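
A hedged sketch of the defensive checks that would silence these warnings; the variable names and array layout below are assumptions, since only the warning locations and the pageVisits index come from the log:

<?php
// functions.php (~line 362): only iterate when file() actually returned an array.
$lines = is_file($linksPath) ? file($linksPath, FILE_IGNORE_NEW_LINES) : false;
if (is_array($lines)) {
    foreach ($lines as $line) {
        // ... process each expected link ...
    }
}

// statistics.php (~line 145): guard the missing index and the division by zero.
$visited = (isset($scan['pageVisits']) && is_array($scan['pageVisits']))
    ? count($scan['pageVisits'])
    : 0;
$coverage = ($totalPages > 0) ? round(100 * $visited / $totalPages) : 0;

Once the path issues above are fixed these guards become no-ops, but they keep a missing file from zeroing the coverage again.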

Need an open source software LICENSE

Wivet is great, but it has no license file, which causes trouble when I try to use it. Please add a standard open source software license.

I've contributed a pull request that creates a LICENSE file with the MIT license as content. Please pull, or choose a different popular OSS license from "http://opensource.org/licenses".

Thanks!!

Installation Documentation

My apologies if this information is documented somewhere, but is there any installation documentation available?
I have placed the code into my /var/www folder, created a wivet database, and run the SQL script. I have also edited the config.sample.php file to contain the database credentials and saved it as config.php. I am still receiving errors stating "Could not save Record", so I think I am missing a step.
Thanks,
Mark
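
A hedged sketch of the kind of values config.php is expected to carry; every constant name here is a guess except DATASTORE, which appears verbatim in a later issue, so compare against config.sample.php for the real names:

<?php
// Hypothetical config.php: copy config.sample.php and fill in the real values.
define('DB_HOST', 'localhost');  // assumed constant name
define('DB_USER', 'wivet');      // assumed constant name
define('DB_PASS', 'secret');     // assumed constant name
define('DB_NAME', 'wivet');      // assumed constant name
define('DATASTORE', 'db');       // name confirmed by the "Using db to store results" issue below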

Timezone issues

Warning: date(): It is not safe to rely on the system's timezone settings. You are required to use the date.timezone setting or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected the timezone 'UTC' for now, but please set date.timezone to select your timezone. in /var/www/html/offscanpages/statistics.php on line 38
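
The warning itself names the two standard fixes; a minimal sketch of the in-code variant, placed before any date() call:

<?php
// Either set date.timezone in php.ini, or call this early in statistics.php
// (replace 'UTC' with the timezone you actually want to report in).
date_default_timezone_set('UTC');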

long2ip is unnecessary in statistics.php

<td>:&nbsp;&nbsp;<b><?php echo htmlentities(long2ip($scan['ipaddress']), ENT_QUOTES);?></b></td>

echo '<a href="statistics.php?id='.$scan['record'] .'">'.long2ip($scan['ipaddress']).' started on '.date('M dS Y h:i:s A',$scan['starttime']).'</a>&nbsp;&nbsp;(Coverage: '.htmlentities($scan['score'], ENT_QUOTES).')'.' &nbsp;&nbsp;'.htmlentities($uagent, ENT_QUOTES) ;

$scan['ipaddress'] already seems to be an IP string, so long2ip() may not be needed.
In PHP 8.0, this may result in an error: Uncaught TypeError: long2ip(): Argument #1 ($ip) must be of type int
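
A hedged sketch of the fix this implies, assuming the stored value really is already a dotted-quad string; the guard also tolerates any old records that might still hold integers:

<?php
// Hypothetical helper: accept both legacy integer values and plain IP strings.
$ip = is_numeric($scan['ipaddress'])
    ? long2ip((int) $scan['ipaddress'])
    : $scan['ipaddress'];
echo '<td>:&nbsp;&nbsp;<b>' . htmlentities($ip, ENT_QUOTES) . '</b></td>';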

Using db to store results

Hello,

I was able to use wivet successfully when storing results in .dat files, but I run into some issues when using a SQL database to store results. I have set up the wivet database, configured the credentials, and set the following:

define('DATASTORE', 'db');

However, crawling results are not being stored and the home page with the main table is blank. Is there anything else that must be done to make wivet work using a database?

The reason I am trying to use a SQL database instead of .dat files is that I think there might be a race condition when crawling wivet: using ZAP's built-in Spider, I see GET requests being sent successfully to wivet/innerpages/test# but not all of them are recorded in the wivet statistics as a pass.

Thank you
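
If the .dat store performs an unlocked read-modify-write per request, concurrent crawler hits can indeed overwrite each other's updates and drop passes. A hedged illustration of that lost-update race and an flock()-based fix; the file layout and function are assumptions, not wivet's actual code:

<?php
// Hypothetical serialized update of a shared .dat results file.
function record_visit($datFile, $pageId)
{
    $fp = fopen($datFile, 'c+');      // create if missing, keep existing contents
    if ($fp === false) {
        return false;
    }
    flock($fp, LOCK_EX);              // serialize concurrent writers
    $raw = stream_get_contents($fp);
    $visits = ($raw !== '' && $raw !== false) ? unserialize($raw) : array();
    if (!is_array($visits)) {
        $visits = array();
    }
    $visits[$pageId] = true;          // mark this inner page as visited
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, serialize($visits));
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}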

Wrong coverage

Migrated from https://code.google.com/p/wivet/issues/detail?id=13

I have a coverage of 100% but one test failed.

This is because in offscanpages/current.php, $_SESSION['currenturlsvisited']['logginclude'] exists but is not present in $descEntries.

The page logginclude must either be added to $descEntries or not added to $_SESSION['currenturlsvisited'].

This bug is triggered by the crawler I used, which looks at .svn/entries files and found that logginclude.php exists.

Regards
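
A hedged sketch of the second option above: when scoring in offscanpages/current.php, only count visited entries that actually appear in $descEntries, so a stray page such as logginclude cannot distort the result (the assumption here is that both arrays are keyed by page name):

<?php
// Hypothetical guard: ignore visited pages with no matching $descEntries entry.
$visited  = array_intersect_key($_SESSION['currenturlsvisited'], $descEntries);
$coverage = count($descEntries) > 0
    ? round(100 * count($visited) / count($descEntries))
    : 0;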
