rhinosecuritylabs / pacu

The AWS exploitation framework, designed for testing the security of Amazon Web Services environments.

Home Page: https://rhinosecuritylabs.com/aws/pacu-open-source-aws-exploitation-framework/

License: BSD 3-Clause "New" or "Revised" License

Python 99.90% Dockerfile 0.03% Makefile 0.06%
aws-security penetration-testing aws security python

pacu's Introduction

Update 5/4/2021: We recently added support for installing via pip; take a look at the Installation section for running Pacu when it is installed with pip. If you want to run Pacu directly from the checked-out repo, use ./cli.py instead of python3 pacu.py.

With this change, files that were previously written to ./sessions/<session> are now output to ~/.local/share/pacu/<session>.

Quick reference

What is Pacu?

Pacu is an open-source AWS exploitation framework, designed for offensive security testing against cloud environments. Created and maintained by Rhino Security Labs, Pacu allows penetration testers to exploit configuration flaws within an AWS account, using modules to easily expand its functionality. Current modules enable a range of attacks, including user privilege escalation, backdooring of IAM users, attacking vulnerable Lambda functions, and much more.

Installation

Pacu is a fairly lightweight program: it requires only Python 3.7+ and pip3 to install a handful of Python libraries.

Quick Installation

  > pip3 install -U pip
  > pip3 install -U pacu
  > pacu

For a more detailed and user-friendly set of installation instructions, please check out the Wiki's installation guide.

How to use Pacu's Docker image

Try in PWD

Option 1: Run with default entrypoint which directly runs Pacu

$ docker run -it rhinosecuritylabs/pacu:latest

Option 2: Run without default entrypoint

$ docker run -it --entrypoint /bin/sh rhinosecuritylabs/pacu:latest

Option 3: Run with AWS config and credentials

Warning: Running this command will mount your local AWS configuration files into the Docker container when it is launched. This means that any user with access to the container will have access to your host computer's AWS credentials.

$ docker run -it -v ~/.aws:/root/.aws rhinosecuritylabs/pacu:latest

Getting Started

The first time Pacu is launched, you will be prompted to start and name a new session. This session will be used to store AWS key pairs, as well as any data obtained from running various modules. You can have any number of different sessions in Pacu, each with their own sets of AWS keys and data, and resume a session at any time (though a restart is currently required to switch between sessions).

Modules require an AWS key, which grants you minimal access to an AWS environment and consists of an access key ID and a secret access key. To set your session's keys, use the set_keys command, then follow the prompts to supply a key alias (a nickname for reference), an AWS access key ID, an AWS secret access key, and an AWS session token (if you are using one).

If you are ever stuck, help will bring up a list of available commands.

Basic Commands in Pacu

  • list will list the available modules for the regions that were set in the current session.
  • help module_name will return the applicable help information for the specified module.
  • run module_name will run the specified module with its default parameters.
  • run module_name --regions eu-west-1,us-west-1 will run the specified module against the eu-west-1 and us-west-1 regions (for modules that support the --regions argument)

Running Pacu From the CLI

  • pacu --help will display the help menu
  • pacu --session <session name> sets the session to use for commands that require one
  • pacu --list-modules will list all modules available (does not require session)
  • pacu --pacu-help will list the pacu help window (does not require session)
  • pacu --module-name <module name> specifies the module to act on; combine it with --exec or --module-info to execute the module or get information on it
  • pacu --exec executes the module provided in --module-name
  • pacu --module-info gets information on the module provided in --module-name
  • pacu --data <service name || all> queries the local SQLAlchemy database to retrieve enumerated information
  • pacu --module-args="<arg1> <value> <arg2> <value>" supplies optional arguments to the module being executed
  • pacu --set-regions <region1 region2 || all> sets the regions to use in the session; separate regions with a space, or enter all for all regions
  • pacu --whoami gets information about the current user

Pacu's Modular Power

Pacu uses a range of plug-in modules to assist an attacker in enumeration, privilege escalation, data exfiltration, service exploitation, and log manipulation within AWS environments. Contributions or ideas for new modules are welcome.

To keep pace with ongoing AWS product developments, we've designed Pacu from the ground up with extensibility in mind. A common syntax and data structure keep modules easy to build and expand on - no need to specify AWS regions or make redundant permission checks between modules. A local SQLite database is used to manage and manipulate retrieved data, minimizing API calls (and associated logs). Reporting and attack auditing are also built into the framework; Pacu assists the documentation process through command logging and exporting, helping build a timeline for the testing process.
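
For illustration only (this is not Pacu's actual schema or API), the idea of caching enumerated data locally so later modules can reuse it without repeating API calls might look something like this in plain sqlite3:

  import json
  import sqlite3

  # Illustrative toy cache only; Pacu's real storage is the per-session SQLAlchemy database
  # that the --data command queries.
  conn = sqlite3.connect("session_cache.db")
  conn.execute("CREATE TABLE IF NOT EXISTS enum_data (service TEXT PRIMARY KEY, data TEXT)")

  def cache_service_data(service, data):
      # Store enumerated data as JSON so other modules can reuse it without new API calls.
      conn.execute("INSERT OR REPLACE INTO enum_data (service, data) VALUES (?, ?)",
                   (service, json.dumps(data)))
      conn.commit()

  def get_cached_service_data(service):
      # Return previously enumerated data, or None if that service has not been enumerated yet.
      row = conn.execute("SELECT data FROM enum_data WHERE service = ?", (service,)).fetchone()
      return json.loads(row[0]) if row else None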

Community

We're always happy to receive bug reports for the Pacu framework itself, testing and feedback on the different modules, and critical feedback generally to help refine the framework. Any support for Pacu through use, testing, improvement, or just by spreading the word would be very much appreciated.

If you're interested in contributing directly to the Pacu Framework itself, please read our contribution guidelines for code conventions and git-flow notes.

Developing Pacu Modules

If you're interested in writing your own modules for Pacu, check out our Module Development wiki page. As you develop new capabilities please reach out to us -- we'd love to add your new modules into the core collection that comes with Pacu.

Pacu Framework Development Goals

  • Improve interface formatting
  • Database forward-migrations and version tracking
  • "Attack Playbooks" to allow for easier use of complex module execution chains
  • Colored console output
  • Module Dry-Run functionality
  • Allow use of standalone config files
  • Plugin architecture improvements

Notes

  • Pacu is officially supported on macOS (OSX) and Linux.
  • Pacu is Open-Source Software and is distributed with a BSD-3-Clause License.

Submitting Requests / Bug Reports

  • Report vulnerabilities in Pacu directly to us via email: [email protected].
  • Pacu creates error logs within each session's folder, as well as a global error log for out-of-session errors which is created in the main directory. If you can, please include these logs with your bug reports, as it will dramatically simplify the debugging process.
  • If you have a feature request, an idea, or a bug to report, please submit them here.
    • Please include a description sufficient to reproduce the bug you found, including tracebacks and reproduction steps, and check for other reports of your bug before filing a new bug report. Don't submit duplicates.

Wiki

For walkthroughs and full documentation, please visit the Pacu wiki.

Contact Us

Disclaimers, and the AWS Acceptable Use Policy

  • To the best of our knowledge Pacu's capabilities are compliant with the AWS Acceptable Use Policy, but as a flexible and modular tool, we cannot guarantee this will be true in every situation. It is entirely your responsibility to ensure that how you use Pacu is compliant with the AWS Acceptable Use Policy.
  • Depending on what AWS services you use and what your planned testing entails, you may need to review the AWS Customer Support Policy for Penetration Testing before actually running Pacu against your infrastructure.
  • As with any penetration testing tool, it is your responsibility to get proper authorization before using Pacu outside of your environment.
  • Pacu is software that comes with absolutely no warranties whatsoever. By using Pacu, you take full responsibility for any and all outcomes that result.

pacu's People

Contributors

agroyz, alexanderinsa, benfriedland-rhino, benfromkc, berney, daveyesland, dependabot[bot], eduardschwarzkopf, evastanaccount, ishmandoo, jack-ganbold, jaywon, jdearmas, jyenduri-uptycs, manasmbellani, mgeeky, naikordian, rhino-nick, rhinoassessments, rjulian, ryanjarv, sebastian-mora, sgn00, shivamagg97, spencer-doak, spengietz, tenebrae93, webbinroot, y4nush, za


pacu's Issues

enum_cloudtrails found in README but not as the actual run option

Pacu (XXXX:XXXX) > run enum_cloudtrails
Module not found. Is it spelled correctly? Try using the module search function.
Pacu (XXXX:XXXX) > ls
  backdoor_ec2_sec_groups
  cloudtrail_csv_injection
  download_lightsail_ssh_keys
  add_ec2_startup_sh_script
  dl_cloudtrail_event_history
  disrupt_monitoring
  enum_monitoring
  enum_ebs_volumes_snapshots
  s3_bucket_dump
  enum_users_roles_policies_groups
  enum_glue
  enum_ec2
  enum_elb_logging
  download_ec2_userdata
  get_credential_report
  confirm_permissions
  enum_ec2_termination_protection
  backdoor_users_password
  backdoor_users_keys
  s3_enum
  privesc_scan
  backdoor_assume_role

Not Able to Fetch Permissions Attached to the User

I've created a user with a bunch of permission policies attached to it. But the whoami command doesn't show the corresponding permissions.

Pacu (pacu:pacu01) > whoami
{
"UserName": null,
"RoleName": null,
"Arn": "arn:aws:iam::******:user/pacu01",
"AccountId": "*
*
*
*
*
",
"UserId": "AIDA56TULUO6JBGE4GZAC",
"Roles": null,
"Groups": null,
"Policies": null,
"AccessKeyId": "AKIA56TULUO6HTREKK43",
"SecretAccessKey": "wAn8lgO6FxYhNU6OH7/G********************",
"SessionToken": null,
"KeyAlias": "pacu01",
"PermissionsConfirmed": null,
"Permissions": {
"Allow": {},
"Deny": {}
}
}

Feature Request - store ec2 userdata one instance per file

Rather than putting all the enumerated userdata into one file with the instance_id@region on the line above, create one file per userdata, and convert the \n and \t escapes into real newlines and tabs. This would make it much easier to read and determine what devs are doing with their user-data files...
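
A hedged sketch of the requested behavior (not the module's actual code), assuming plain boto3 and a single hard-coded region: write one decoded user-data file per instance.

  import base64
  import pathlib

  import boto3

  region = "us-east-1"  # assumption for the sketch; iterate the session's regions in practice
  ec2 = boto3.client("ec2", region_name=region)
  out_dir = pathlib.Path("userdata")
  out_dir.mkdir(exist_ok=True)

  for reservation in ec2.describe_instances()["Reservations"]:
      for instance in reservation["Instances"]:
          instance_id = instance["InstanceId"]
          attr = ec2.describe_instance_attribute(InstanceId=instance_id, Attribute="userData")
          encoded = attr.get("UserData", {}).get("Value")
          if not encoded:
              continue
          # DescribeInstanceAttribute returns user data base64-encoded; decoding it restores
          # real newlines and tabs instead of literal \n and \t sequences.
          decoded = base64.b64decode(encoded).decode("utf-8", errors="replace")
          (out_dir / f"{instance_id}@{region}.txt").write_text(decoded)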

Exception connecting to ap-southeast-2 "Could not connect to the endpoint URL"

Pacu (xxxx:xxxx) > run enum_ebs_volumes_snapshots
    Running module enum_ebs_volumes_snapshots...
      This module will enumerate all of the Elastic Block Store volumes and snapshots in the account and save the data to the current session. It will also note whether or not each volume/snapshot is encrypted, then write a list of the unencrypted volumes to ./sessions/[current_session_name]/downloads/unencrypted_ebs_volumes_[timestamp].csv and unencrypted snapshots to ./sessions/[current_session_name]/downloads/unencrypted_ebs_snapshots_[timestamp].csv in .CSV format.

  Targeting regions ['us-east-2', 'us-east-1', 'us-west-1', 'us-west-2', 'ap-northeast-1', 'ap-northeast-2', 'ap-south-1', 'ap-southeast-1', 'ap-southeast-2', 'ca-central-1', 'eu-central-1', 'eu-west-1', 'eu-west-2', 'eu-west-3', 'sa-east-1'].
  No account IDs were passed in as arguments and the account ID for the current user has not been stored in this session yet. An account ID is required to get valid results from the snapshot enumeration portion of this module. If you know the current users account ID then enter it now, otherwise, enter y to try and fetch it, or enter n to skip EBS snapshot enumeration. ([account_id]/y/n) y
  Starting region us-east-2 (this may take a while if there are thousands of EBS volumes/snapshots)...
    3 total volume(s) found in us-east-2.
    2 total snapshot(s) found in us-east-2.
  Starting region us-east-1 (this may take a while if there are thousands of EBS volumes/snapshots)...
    1 total volume(s) found in us-east-1.
    6 total snapshot(s) found in us-east-1.
  Starting region us-west-1 (this may take a while if there are thousands of EBS volumes/snapshots)...
    0 total volume(s) found in us-west-1.
    0 total snapshot(s) found in us-west-1.
  Starting region us-west-2 (this may take a while if there are thousands of EBS volumes/snapshots)...
    0 total volume(s) found in us-west-2.
    0 total snapshot(s) found in us-west-2.
  Starting region ap-northeast-1 (this may take a while if there are thousands of EBS volumes/snapshots)...
    0 total volume(s) found in ap-northeast-1.
    0 total snapshot(s) found in ap-northeast-1.
  Starting region ap-northeast-2 (this may take a while if there are thousands of EBS volumes/snapshots)...
    0 total volume(s) found in ap-northeast-2.
    0 total snapshot(s) found in ap-northeast-2.
  Starting region ap-south-1 (this may take a while if there are thousands of EBS volumes/snapshots)...
    0 total volume(s) found in ap-south-1.
    0 total snapshot(s) found in ap-south-1.
  Starting region ap-southeast-1 (this may take a while if there are thousands of EBS volumes/snapshots)...
    0 total volume(s) found in ap-southeast-1.
    0 total snapshot(s) found in ap-southeast-1.
  Starting region ap-southeast-2 (this may take a while if there are thousands of EBS volumes/snapshots)...

[2018-06-27 18:53:57] Pacu encountered an error while running the previous command. Check sessions/test1/error_log.txt for technical details. [LOG LEVEL: LOW]

    <class 'botocore.exceptions.EndpointConnectionError'>: Could not connect to the endpoint URL: "https://ec2.ap-southeast-2.amazonaws.com/"
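
A minimal sketch (plain boto3, outside of Pacu) of keeping the enumeration loop going when a regional endpoint is unreachable, instead of aborting the whole module:

  import boto3
  from botocore.exceptions import EndpointConnectionError

  for region in ["us-east-1", "ap-southeast-2"]:  # example regions
      ec2 = boto3.client("ec2", region_name=region)
      try:
          volumes = ec2.describe_volumes()["Volumes"]
      except EndpointConnectionError as error:
          # e.g. blocked egress, DNS issues, or a region not reachable from this network
          print(f"  Skipping {region}: {error}")
          continue
      print(f"  {len(volumes)} total volume(s) found in {region}.")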

Ability to Run Inline Shell Commands from the Pacu Operator Workstation

Is your feature request related to a problem? Please describe.
I want to be able to have a subcommand like msfconsole execute so I can run local shell commands.

Describe the solution you'd like
I would like to be able to run cat, ls, grep, or jq especially considering the extensive returns of aws cli output in JSON format.

Describe alternatives you've considered
There are plenty but this would increase the ergonomics and flexibility of the tool.


Bug: run privesc_scan fails with UnboundLocalError

Ubuntu 16.x
Configured API creds, ran confirm_privs, tried to run privesc_scan... Fails, error below:

[2018-07-25 18:07:54] (check):
Traceback (most recent call last):
File "./pacu.py", line 1572, in run
self.idle()
File "./pacu.py", line 1480, in idle
self.idle()
File "./pacu.py", line 1478, in idle
self.parse_command(command)
File "./pacu.py", line 525, in parse_command
self.parse_exec_module_command(command)
File "./pacu.py", line 757, in parse_exec_module_command
self.exec_module(command)
File "./pacu.py", line 955, in exec_module
summary_data = module.main(command[2:], self)
File "/home/ubuntu/pacu-master/modules/privesc_scan/main.py", line 466, in main
summary_data['success'] = escalated
<class 'UnboundLocalError'>: local variable 'escalated' referenced before assignment

EKS - Permissions

I would love for this to check the EKS permissions for managed k8s.

Describe the solution you'd like
During the existing checks, or as a new check, see if there are any k8s clusters associated with the keys. If so, it should be possible to generate the kubectl config and gain full cluster control (see the sketch after this issue).

AVAILABLE COMMANDS
       o create-cluster

       o delete-cluster

       o describe-cluster

       o help

       o list-clusters

       o update-kubeconfig

Describe alternatives you've considered
manually checking eks
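
A hedged sketch of the requested check, using plain boto3 (the region list and output format are assumptions): list any EKS clusters reachable with the current keys and print the endpoint a kubeconfig would need.

  import boto3

  for region in ["us-east-1", "us-west-2"]:  # example regions
      eks = boto3.client("eks", region_name=region)
      for name in eks.list_clusters().get("clusters", []):
          cluster = eks.describe_cluster(name=name)["cluster"]
          # The endpoint and certificate data from DescribeCluster are what a kubeconfig needs.
          print(f"{region}: {name} -> {cluster.get('endpoint')} (status: {cluster.get('status')})")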

KeyError: User when using AssumedRole credentials

Describe the bug
When using keys from an AssumedRole I am unable to run the iam__enum_permissions module. If I didn't read the directions, please school me. I suspect the root of the issue is that my user and username aren't being populated in the identity because the ARN attached to the credentials doesn't fit the same format as a standard IAM user.

To Reproduce
Steps to reproduce the behavior:

  1. Run the command 'set_keys' and set with Access Key, Secret Key, and Token for an Assumed Role
  2. Run the command 'run iam__enum_permissions'
  3. See error

The output of whoami is:

{
  "UserName": null,
  "UserArn": null,
  "AccountId": null,
  "UserId": null,
  "Roles": null,
  "Groups": null,
  "Policies": null,
  "AccessKeyId": "*****************",
  "SecretAccessKey": "*****************",
  "SessionToken": "*****************",
  "KeyAlias": "foo",
  "PermissionsConfirmed": null,
  "Permissions": {
    "Allow": {},
    "Deny": {}
  }
}

Expected behavior
The iam__enum_permissions module should complete successfully.

Operating System (please complete the following information):

  • OS + Version: OSX 10.11
  • Python version: 3.6.6

Error log

Traceback (most recent call last):
  File "pacu.py", line 1634, in run
    self.idle()
  File "pacu.py", line 1567, in idle
    self.parse_command(command)
  File "pacu.py", line 514, in parse_command
    self.parse_exec_module_command(command)
  File "pacu.py", line 828, in parse_exec_module_command
    self.exec_module(command)
  File "pacu.py", line 1049, in exec_module
    summary_data = module.main(command[2:], self)
  File "/Users/jwarren6/Apps/pacu/modules/iam__enum_permissions/main.py", line 126, in main
    user_name=identity['User']['UserName'],
<class 'KeyError'>: 'User'

Error not a valid load balancer ARN

Error below, I masked AWS account ID.

Pacu (xxxx:xxx) > run enum_elb_logging
    Running module enum_elb_logging...
      This module will enumerate all EC2 Elastic Load Balancers and save their data to the current session, as well as write a list of ELBs with logging disabled to ./sessions/[current_session_name]/downloads/elbs_no_logs_[timestamp].csv.

  Starting region us-east-2...
    3 total load balancer(s) found in us-east-2.
  Starting region us-east-1...
    0 total load balancer(s) found in us-east-1.
  Starting region us-west-1...
    0 total load balancer(s) found in us-west-1.
  Starting region us-west-2...
    0 total load balancer(s) found in us-west-2.
  Starting region ap-northeast-1...
    0 total load balancer(s) found in ap-northeast-1.
  Starting region ap-northeast-2...
    0 total load balancer(s) found in ap-northeast-2.
  Starting region ap-south-1...
    0 total load balancer(s) found in ap-south-1.
  Starting region ap-southeast-1...
    0 total load balancer(s) found in ap-southeast-1.
  Starting region ap-southeast-2...
    0 total load balancer(s) found in ap-southeast-2.
  Starting region ca-central-1...
    0 total load balancer(s) found in ca-central-1.
  Starting region eu-central-1...
    0 total load balancer(s) found in eu-central-1.
  Starting region eu-west-1...
    0 total load balancer(s) found in eu-west-1.
  Starting region eu-west-2...
    0 total load balancer(s) found in eu-west-2.
  Starting region eu-west-3...
    0 total load balancer(s) found in eu-west-3.
  Starting region sa-east-1...
    0 total load balancer(s) found in sa-east-1.
  3 total load balancer(s) found.
{
  "LoadBalancerArn": "arn:aws:elasticloadbalancing:us-east-2:NNNNNNNNN:loadbalancer/net/test-cluster-data/0dc10c937e173033",
  "DNSName": "test-cluster-data-0dc10c937e173025.elb.us-east-2.amazonaws.com",
  "CanonicalHostedZoneId": "ZLMOA37VPKARR",
  "CreatedTime": "Sat, 17 Feb 2018 19:14:27",
  "LoadBalancerName": "test-cluster-data",
  "Scheme": "internet-facing",
  "VpcId": "vpc-8cc221a5",
  "State": {
    "Code": "active"
  },
  "Type": "network",
  "AvailabilityZones": [
    {
      "ZoneName": "us-east-2b",
      "SubnetId": "subnet-2f82a833",
      "LoadBalancerAddresses": [
        {}
      ]
    },
    {
      "ZoneName": "us-east-2a",
      "SubnetId": "subnet-5d718622",
      "LoadBalancerAddresses": [
        {}
      ]
    }
  ],
  "IpAddressType": "ipv4",
  "Region": "us-east-2"
}

[2018-06-27 18:58:37] Pacu encountered an error while running the previous command. Check sessions/xxx/error_log.txt for technical details. [LOG LEVEL: LOW]

    <class 'botocore.exceptions.ClientError'>: An error occurred (ValidationError) when calling the DescribeLoadBalancerAttributes operation: 'arn:aws:elasticloadbalancing:us-east-2:NNNNNNNNN:loadbalancer/net/test-cluster-data/0dc10c937e173025' is not a valid load balancer ARN

ReadTimeoutError on iam__bruteforce_permissions

Describe the bug
Got an import error while running iam__bruteforce_permissions

To Reproduce
Steps to reproduce the behavior:

  1. Run the command run iam__bruteforce_permissions
  2. See error

Expected behavior
No errors.

Screenshots

Traceback (most recent call last):
  File "pacu.py", line 1704, in run
    self.idle()
  File "pacu.py", line 1639, in idle
    self.idle()
  File "pacu.py", line 1637, in idle
    self.parse_command(command)
  File "pacu.py", line 518, in parse_command
    self.parse_exec_module_command(command)
  File "pacu.py", line 879, in parse_exec_module_command
    self.exec_module(command)
  File "pacu.py", line 1109, in exec_module
    summary_data = module.main(command[2:], self)
  File "/home/eth/tools/pacu/modules/iam__bruteforce_permissions/main.py", line 331, in main
    current_client = pacu_main.get_boto3_client(service, region)
  File "pacu.py", line 1508, in get_boto3_client
    config=boto_config
  File "/home/eth/tools/pacu/boto3/__init__.py", line 91, in client
    return _get_default_session().client(*args, **kwargs)
  File "/home/eth/tools/pacu/boto3/session.py", line 263, in client
    aws_session_token=aws_session_token, config=config)
  File "/home/eth/tools/pacu/botocore/session.py", line 861, in create_client
    client_config=config, api_version=api_version)
  File "/home/eth/tools/pacu/botocore/client.py", line 70, in create_client
    cls = self._create_client_class(service_name, service_model)
  File "/home/eth/tools/pacu/botocore/client.py", line 95, in _create_client_class
    base_classes=bases)
  File "/home/eth/tools/pacu/botocore/hooks.py", line 227, in emit
    return self._emit(event_name, kwargs)
  File "/home/eth/tools/pacu/botocore/hooks.py", line 210, in _emit
    response = handler(**kwargs)
  File "/home/eth/tools/pacu/boto3/utils.py", line 61, in _handler
    module = import_module(module)
  File "/home/eth/tools/pacu/boto3/utils.py", line 52, in import_module
    __import__(name)
  File "/home/eth/tools/pacu/boto3/s3/inject.py", line 15, in <module>
    from boto3.s3.transfer import create_transfer_manager
  File "/home/eth/tools/pacu/boto3/s3/transfer.py", line 129, in <module>
    from s3transfer.manager import TransferConfig as S3TransferConfig
  File "/home/eth/.local/lib/python3.6/site-packages/s3transfer/manager.py", line 21, in <module>
    from s3transfer.utils import get_callbacks
  File "/home/eth/.local/lib/python3.6/site-packages/s3transfer/utils.py", line 27, in <module>
    from botocore.exceptions import ReadTimeoutError
<class 'ImportError'>: cannot import name 'ReadTimeoutError'

Operating System (please complete the following information):

  • Python version: Python 3.6.8

Using AWS AssumeRole functionality with Modules - How?

Currently trying to get the iam__privesc_scan module working and running into some assume-role problems. Within our accounts, we have a centralised IAM account that contains users, and then roles which enable access to the other accounts.

I have configured my ~/.aws/credentials file with the required access keys etc., and it works with other tooling that uses the STS service, but every time I run iam__privesc_scan it keeps returning permission-related problems for the central IAM account and not the intended AssumeRole account.

Is there a flag I am missing to direct pacu to execute the module as a specific role?

China region not added?

Tried to update the regions through the latest version of botocore, but cn-north-1 and cn-northwest-1 are not there.

Is there a way to manually add the region to the session? Some modules fail because there is no --region attribute on them.

Thanks!
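
A small sketch of why the China regions don't show up by default: botocore keeps them in the separate aws-cn partition, so they have to be requested explicitly (plain boto3, outside of Pacu):

  import boto3

  session = boto3.session.Session()
  print(session.get_available_regions("ec2"))                           # standard 'aws' partition
  print(session.get_available_regions("ec2", partition_name="aws-cn"))  # cn-north-1, cn-northwest-1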

pacu/install.sh incorrectly catches python 3.7.7 as not meeting the minimum requirement

pacu/install.sh incorrectly catches python 3.7.7 as not meeting the minimum python 3 requirement of 3.5

sh pacu/install.sh
[ + ] Checking Python version . . .
[ $ ] python3 --version
Python 3.7.7
pacu/install.sh: 16: [[: not found
[ - ] Pacu requires Python to be installed at version 3.5 or higher. Your version is: Python 3.7.7
Please install Python version 3.5 or higher. https://www.python.org/downloads/

Remove old sessions

This tool is awesome, truly learning a lot with it.

I would like the ability to remove old sessions, especially when dealing with clients, I would prefer that old keys and information could be purged completely.

Thanks!

run privesc_scan can't find the enum_users_roles_policies

I assume because the module was renamed to enum_users_roles_policies_groups

Pacu (sandbox-no-perms:blah) > run privesc_scan
    Running module privesc_scan...

This module will scan for permission misconfigurations to see where privilege escalation will be possible. Available attack paths will be presented to the user and executed on if chosen.


  CONFIRMED: CreateAccessKey

  Attempting confirmed privilege escalation methods...

    Starting method CreateAccessKey...
      Is there a specific user you want to target? They must not already have two sets of access keys created for their user. Enter their user name now or just hit enter to enumerate users and view a list of options:
  The required data (IAM > Users) has not been found in this session, do you want to run the module "enum_users_roles_policies" to fetch that data? If not, re-run this module with the correct argument(s) specifying the values for this data. (y/n) y
**Module not found. Is it spelled correctly? Try using the module search function.**
  Found 0 user(s). Choose a user below.
    [0] Other (Manually enter user name)
  Choose an option:

s3__bucket_finder fails when calling aws cli

Describe the bug
I tried to run the s3__bucket_finder module, and I found out that it was silently failing when calling the check_output in this line:

output = subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT).decode('utf-8')

The only thing that I found out was that the command was exiting with an exit code of 1.

I called the same command via Python3 cli with no problem so I can't really find out what's happening here.

  • macOS High Sierra, Python 3.7.0

I was trying Pacu on a public "bounty": www.lambdashell.com

So my exact command and output can be shared here:

Pacu (LambdaShellSession:LambdaShellKeys) > run s3__bucket_finder -d lambdashell.com
  Running module s3__bucket_finder...
[s3__bucket_finder] This module requires external dependencies: ['https://github.com/aboul3la/Sublist3r.git', 'https://raw.githubusercontent.com/RhinoSecurityLabs/Security-Research/master/tools/aws-pentest-tools/s3/Buckets.txt']

Install them now? (y/n)

[s3__bucket_finder] Installing 2 total dependencies...
[s3__bucket_finder]   Dependency aboul3la/Sublist3r already installed.
[s3__bucket_finder]   Dependency Buckets.txt already installed.
[s3__bucket_finder] Dependencies finished installing.
[s3__bucket_finder] Generating bucket permutations list...
[s3__bucket_finder] Generated 2 bucket permutations. Beginning search across 16 regions.
Buckets searched: 100.0% (32/32)
[s3__bucket_finder] [+] Results:
[s3__bucket_finder]     Number of Buckets that Exist: 0
[s3__bucket_finder]     Number of Buckets that are Listable: 0
[s3__bucket_finder] s3__bucket_finder completed.

[s3__bucket_finder] MODULE SUMMARY:

  0 total buckets were found.
  0 buckets were found with viewable contents.

It was supposed to find one listable bucket called www.lambdashell.com on us-west-1.
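
A hedged sketch of one way to surface why a wrapped CLI call fails instead of only seeing a silent exit code 1 (the example command is hypothetical): capture stdout/stderr and include them in the raised error.

  import subprocess

  command = "aws s3 ls s3://www.lambdashell.com --no-sign-request"  # hypothetical example command
  result = subprocess.run(command, shell=True, capture_output=True, text=True)
  if result.returncode != 0:
      # Surfacing the CLI's own output makes the failure reason visible to the user.
      raise RuntimeError(f"command exited with {result.returncode}:\n{result.stdout}\n{result.stderr}")
  print(result.stdout)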

Pacu on CloudGoat#1

Can anyone help - I can't for the life of me figure out what I'm doing wrong with Pacu. I'm running in the Ubuntu 20.04 WSL deployment.

I deploy CloudGoat iam_privesc_by_rollback (and can manually exploit it directly through aws cli.....)

Now I want to check out PACU. I start up pacu and set_keys as provided with the scenario, and also set us_west_2 as the region.
I run aws__enum_account and it results in ".....the security token included in the request is invalid".

It appears that pacu is not providing the keys....and the error log file shows a call to get identity with no arguments. What am I doing wrong?

P.S. I assume confirm_permissions is no longer an active module as it's not in the Pacu module list?

AccessDeniedException not handled throughout enum_monitoring module

Using a key that doesn't have permissions to make those calls will result in an AccessDeniedException, and since the exception isn't handled in the module, the enumeration stops.

For example:
<class 'botocore.exceptions.ClientError'>: An error occurred (AccessDeniedException) when calling the ListDetectors operation: User: arn:aws:iam::xxx:user/janedoe is not authorized to perform: guardduty:ListDetectors on resource: arn:aws:guardduty:ap-southeast-1:xxx:detector/*
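
A minimal sketch (plain boto3, not the module's actual code) of catching the denied call so enumeration can continue with the remaining services and regions:

  import boto3
  from botocore.exceptions import ClientError

  guardduty = boto3.client("guardduty", region_name="ap-southeast-1")
  try:
      detectors = guardduty.list_detectors().get("DetectorIds", [])
  except ClientError as error:
      if error.response["Error"]["Code"] in ("AccessDeniedException", "AccessDenied"):
          # Missing guardduty:ListDetectors permission; skip this check instead of aborting.
          print("  No permission to list GuardDuty detectors, skipping.")
          detectors = []
      else:
          raise
  print(f"  {len(detectors)} GuardDuty detector(s) found.")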

iam__enum_permissions crashes if a policy has no Resource key

Traceback (most recent call last):
  File "pacu.py", line 1561, in run_gui
    self.idle()
  File "pacu.py", line 1441, in idle
    self.idle()
  File "pacu.py", line 1441, in idle
    self.idle()
  File "pacu.py", line 1441, in idle
    self.idle()
  [Previous line repeated 1 more time]
  File "pacu.py", line 1439, in idle
    self.parse_command(command)
  File "pacu.py", line 463, in parse_command
    self.parse_exec_module_command(command)
  File "pacu.py", line 585, in parse_exec_module_command
    self.exec_module(command)
  File "pacu.py", line 859, in exec_module
    summary_data = module.main(command[2:], self)
  File "/pacu/modules/iam__enum_permissions/main.py", line 275, in main
    role = parse_attached_policies(client, attached_policies, role)
  File "/pacu/modules/iam__enum_permissions/main.py", line 523, in parse_attached_policies
    user = parse_document(document, user)
  File "/pacu/modules/iam__enum_permissions/main.py", line 670, in parse_document
    if isinstance(statement['Resource'], list):
<class 'KeyError'>: 'Resource'
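
A hedged sketch (not the module's actual code) of reading a statement's resources without assuming a Resource key is present; some statements use NotResource instead:

  def statement_resources(statement):
      # Return the statement's resources as a list, whichever of Resource/NotResource is present.
      resources = statement.get("Resource", statement.get("NotResource", []))
      return resources if isinstance(resources, list) else [resources]

  print(statement_resources({"Effect": "Deny", "Action": "s3:*", "NotResource": "arn:aws:s3:::keep/*"}))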

iam__privesc_scan not running all potential escalation methods

Bug Description
When running iam__privesc_scan, it attempts the first escalation method but then doesn't perform any more. The only method that gets executed is CreateNewPolicyVersion; once this is complete it tries SetExistingDefaultPolicyVersion but then goes back to a prompt (Is there a specific policy you want to target?).

To Reproduce
Steps to reproduce the behavior:

  1. run iam__privesc_scan
  2. No permissions detected yet message appears so hit Y to run module "iam__enum_permissions" to fetch them
  3. Enter the policy to target
  4. See error

Expected behavior
The module continues through all of the 22 methods of escalation.

Screenshots
https://gist.github.com/nmarchini/b99923987721632e50192810d67c137a#file-gistfile1-txt

Operating System (please complete the following information):

  • Mac OS 10.14.4
  • Python version:

Error log
[2019-04-08 21:45:11] (session2):
Traceback (most recent call last):
File "pacu.py", line 1680, in run
self.idle()
File "pacu.py", line 1613, in idle
self.parse_command(command)
File "pacu.py", line 516, in parse_command
self.parse_exec_module_command(command)
File "pacu.py", line 858, in parse_exec_module_command
self.exec_module(command)
File "pacu.py", line 1087, in exec_module
summary_data = module.main(command[2:], self)
File "/Users/user.name/OneDrive - Acme/scratch/pacu/modules/iam__privesc_scan/main.py", line 583, in main
response = methods[potential_method](pacu_main, print, input, fetch_data)
File "/Users/user.name/OneDrive - Acme/scratch/pacu/modules/iam__privesc_scan/main.py", line 816, in SetExistingDefaultPolicyVersion
PolicyArn=policy_arn
File "/Users/user.name/OneDrive - Acme/scratch/pacu/botocore/client.py", line 314, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/Users/user.name/OneDrive - Acme/scratch/pacu/botocore/client.py", line 586, in _make_api_call
api_params, operation_model, context=request_context)
File "/Users/user.name/OneDrive - Acme/scratch/pacu/botocore/client.py", line 621, in _convert_to_request_dict
api_params, operation_model)
File "/Users/user.name/OneDrive - Acme/scratch/pacu/botocore/validate.py", line 291, in serialize_to_request
raise ParamValidationError(report=report.generate_report())
<class 'botocore.exceptions.ParamValidationError'>: Parameter validation failed:
Invalid length for parameter PolicyArn, value: 4, valid range: 20-inf

sqlite3 no such table pacu session error

sh-4.2$ python3 pacu.py --list-modules
/usr/local/lib/python3.7/site-packages/sqlalchemy/sql/functions.py:67: SAWarning: The GenericFunction 'array_agg' is already registered and is going to be overriden.
"is going to be overriden.".format(identifier)
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
cursor.execute(statement, parameters)
sqlite3.OperationalError: no such table: pacu_session

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "pacu.py", line 1635, in
Main().run()
File "pacu.py", line 1629, in run
self.run_cli(args)
File "pacu.py", line 1445, in run_cli
sessions = self.database.query(PacuSession).all()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3174, in all
return list(self)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3330, in iter
return self._execute_and_instances(context)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3355, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 988, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 287, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1107, in _execute_clauseelement
distilled_params,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1248, in _execute_context
e, statement, parameters, cursor, context
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1466, in _handle_dbapi_exception
util.raise_from_cause(sqlalchemy_exception, exc_info)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 398, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 152, in reraise
raise value.with_traceback(tb)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: pacu_session
[SQL: SELECT pacu_session.id AS pacu_session_id, pacu_session.created AS pacu_session_created, pacu_session.is_active AS pacu_session_is_active, pacu_session.name AS pacu_session_name, pacu_session.boto_user_agent AS pacu_session_boto_user_agent, pacu_session.key_alias AS pacu_session_key_alias, pacu_session.access_key_id AS pacu_session_access_key_id, pacu_session.secret_access_key AS pacu_session_secret_access_key, pacu_session.session_token AS pacu_session_session_token, pacu_session.session_regions AS pacu_session_session_regions, pacu_session."APIGateway" AS "pacu_session_APIGateway", pacu_session."CloudTrail" AS "pacu_session_CloudTrail", pacu_session."CloudWatch" AS "pacu_session_CloudWatch", pacu_session."CodeBuild" AS "pacu_session_CodeBuild", pacu_session."Config" AS "pacu_session_Config", pacu_session."DataPipeline" AS "pacu_session_DataPipeline", pacu_session."DynamoDB" AS "pacu_session_DynamoDB", pacu_session."EC2" AS "pacu_session_EC2", pacu_session."ECS" AS "pacu_session_ECS", pacu_session."Glue" AS "pacu_session_Glue", pacu_session."GuardDuty" AS "pacu_session_GuardDuty", pacu_session."IAM" AS "pacu_session_IAM", pacu_session."Inspector" AS "pacu_session_Inspector", pacu_session."Lambda" AS "pacu_session_Lambda", pacu_session."Lightsail" AS "pacu_session_Lightsail", pacu_session."S3" AS "pacu_session_S3", pacu_session."SecretsManager" AS "pacu_session_SecretsManager", pacu_session."SSM" AS "pacu_session_SSM", pacu_session."Shield" AS "pacu_session_Shield", pacu_session."VPC" AS "pacu_session_VPC", pacu_session."WAF" AS "pacu_session_WAF", pacu_session."WAFRegional" AS "pacu_session_WAFRegional", pacu_session."Account" AS "pacu_session_Account", pacu_session."AccountSpend" AS "pacu_session_AccountSpend"
FROM pacu_session]
(Background on this error at: http://sqlalche.me/e/e3q8)

sh-4.2$ python3 --version
Python 3.7.8

sh-4.2$ pip3 --version
pip 9.0.3 from /usr/lib/python3.7/site-packages (python 3.7)

sh-4.2$ aws --version
aws-cli/1.18.127 Python/3.7.8 Linux/4.14.186-146.268.amzn2.x86_64 botocore/1.17.50

Improve installation/packaging process

Setup seems to use an unconventional installation method using a shell script to run pip install. Then you must call python3 pacu.py to start it up. It would be ideal to have the project be completely installed using pip e.g.

pip install git+https://github.com/RhinoSecurityLabs/pacu.git

SQLITE error

Describe the bug

Pacu (decathalon:dec) > whoami

[2018-11-13 09:50:21] Pacu encountered an error while running the previous command. Check sessions/decathalon/error_log.txt for technical details. [LOG LEVEL: MINIMAL]

<class 'sqlalchemy.exc.OperationalError'>: (sqlite3.OperationalError) no such column: aws_key.role_name [SQL: 'SELECT aws_key.id AS aws_key_id, aws_key.session_id AS aws_key_session_id, aws_key.user_name AS aws_key_user_name, aws_key.role_name AS aws_key_role_name, aws_key.arn AS aws_key_arn, aws_key.account_id AS aws_key_account_id, aws_key.user_id AS aws_key_user_id, aws_key.roles AS aws_key_roles, aws_key.groups AS aws_key_groups, aws_key.policies AS aws_key_policies, aws_key.access_key_id AS aws_key_access_key_id, aws_key.secret_access_key AS aws_key_secret_access_key, aws_key.session_token AS aws_key_session_token, aws_key.key_alias AS aws_key_key_alias, aws_key.permissions_confirmed AS aws_key_permissions_confirmed, aws_key.allow_permissions AS aws_key_allow_permissions, aws_key.deny_permissions AS aws_key_deny_permissions \nFROM aws_key \nWHERE aws_key.session_id = ? AND aws_key.key_alias = ?'] [parameters: (20, 'dec')] (Background on this error at: http://sqlalche.me/e/e3q8)

To Reproduce
Steps to reproduce the behavior:

  1. whoami

Expected behavior
To list the IAM role details.

Operating System (please complete the following information):

  • OS + Version: Ubuntu 18.04
  • Python version: 3.5

Error log

[2018-11-13 09:50:23] (decathalon):
Traceback (most recent call last):
File "pacu.py", line 1676, in run
self.idle()
File "pacu.py", line 1609, in idle
self.parse_command(command)
File "pacu.py", line 504, in parse_command
self.parse_data_command(command)
File "pacu.py", line 577, in parse_data_command
session.print_all_data_in_session()
File "/tmp/pacu/core/models.py", line 199, in print_all_data_in_session
owned_keys = column.value.all()
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/orm/query.py", line 2783, in all
return list(self)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/orm/dynamic.py", line 247, in iter
return iter(self._clone(sess))
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/orm/query.py", line 2935, in iter
return self._execute_and_instances(context)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/orm/query.py", line 2958, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/engine/base.py", line 948, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/sql/elements.py", line 269, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/engine/base.py", line 1060, in _execute_clauseelement
compiled_sql, distilled_params
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/engine/base.py", line 1200, in _execute_context
context)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/engine/base.py", line 1413, in _handle_dbapi_exception
exc_info
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/util/compat.py", line 186, in reraise
raise value.with_traceback(tb)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/engine/base.py", line 1193, in _execute_context
context)
File "/usr/local/lib/python3.5/dist-packages/sqlalchemy/engine/default.py", line 508, in do_execute
cursor.execute(statement, parameters)
<class 'sqlalchemy.exc.OperationalError'>: (sqlite3.OperationalError) no such column: aws_key.role_name [SQL: 'SELECT aws_key.id AS aws_key_id, aws_key.session_id AS aws_key_session_id, aws_key.user_name AS aws_key_user_name, aws_key.role_name AS aws_key_role_name, aws_key.arn AS aws_key_arn, aws_key.account_id AS aws_key_account_id, aws_key.user_id AS aws_key_user_id, aws_key.roles AS aws_key_roles, aws_key.groups AS aws_key_groups, aws_key.policies AS aws_key_policies, aws_key.access_key_id AS aws_key_access_key_id, aws_key.secret_access_key AS aws_key_secret_access_key, aws_key.session_token AS aws_key_session_token, aws_key.key_alias AS aws_key_key_alias, aws_key.permissions_confirmed AS aws_key_permissions_confirmed, aws_key.allow_permissions AS aws_key_allow_permissions, aws_key.deny_permissions AS aws_key_deny_permissions \nFROM aws_key \nWHERE ? = aws_key.session_id'] [parameters: (20,)] (Background on this error at: http://sqlalche.me/e/e3q8
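
This error appears when newer Pacu code (which expects an aws_key.role_name column) runs against an older database file. Deleting the old database (losing existing sessions) and letting Pacu recreate it avoids the mismatch; a hypothetical manual alternative, assuming the only change is the added text column and adjusting the database path to your install, would be:

  import sqlite3

  conn = sqlite3.connect("sqlite.db")  # path to the existing Pacu database (assumption)
  columns = {row[1] for row in conn.execute("PRAGMA table_info(aws_key)")}
  if "role_name" not in columns:
      # Add the column the newer code expects; back up the database first.
      conn.execute("ALTER TABLE aws_key ADD COLUMN role_name TEXT")
      conn.commit()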

Pacu cannot find AWS CLI

"I'm getting a aws: not found error."
Pacu is unable to find the AWS CLI binary if it is not in bourne shell profile path.

To Reproduce
Steps to reproduce the behavior:

  1. Run any aws command within Pacu

Expected behavior
AWS commands should be run against the installed AWS CLI.

Operating System (please complete the following information):

  • OS + Version: Kali
  • Python version: 2.7, 3.6

Discover AWS elastic container registry permissions

The following permissions are not being discovered via bruteforce;

"Effect": "Allow",
"Action": "ecr:GetAuthorizationToken"

as well as

"Effect": "Allow",
"Action": [
"ecr:GetAuthorizationToken",
"ecr:BatchCheckLayerAvailability",
"ecr:GetDownloadUrlForLayer",
"ecr:GetRepositoryPolicy",
"ecr:DescribeRepositories",
"ecr:ListImages",
"ecr:DescribeImages",
"ecr:BatchGetImage",
"ecr:InitiateLayerUpload",
"ecr:UploadLayerPart",
"ecr:CompleteLayerUpload",
"ecr:PutImage"

and probably others.

Consider in development roadmap.

-Alex
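
A hedged sketch (plain boto3, independent of the bruteforce module) of confirming a couple of the ECR permissions listed above directly:

  import boto3
  from botocore.exceptions import ClientError

  ecr = boto3.client("ecr", region_name="us-east-1")  # example region
  checks = [
      ("ecr:GetAuthorizationToken", ecr.get_authorization_token),
      ("ecr:DescribeRepositories", ecr.describe_repositories),
  ]
  for permission, call in checks:
      try:
          call()
          print(f"ALLOWED: {permission}")
      except ClientError:
          print(f"DENIED:  {permission}")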

Installation Error on windows 10 (Python 3.4)

I am trying to install Pacu on my Windows 10 machine. I have Python 3.4 installed on my system.

The requirements related to Pacu are installed on my system, but when I try to execute python pacu.py I get the error below:

C:\Python34\pacu>c:\Python34\python.exe pacu.py
File "pacu.py", line 45
session_tag = f'({session.name})'
^
SyntaxError: invalid syntax

Kindly look into this

ec2__download_userdata: add filter for tags

Is your feature request related to a problem? Please describe.
Not related to a problem, as the data (filtered EC2 instances) can be obtained by the aws cli and added to pacu (after some massaging). I have tested accounts with a very large number of instances (many being out of scope). An EC2 tag filter would help limit the scope of ec2__download_userdata pulls to specific (in scope) instances.

Describe the solution you'd like
Add an optional filter for tag names and their associated values to the ec2__download_userdata module. This would require the functionality of the following aws cli query: aws ec2 describe-instances --filters "Name=tag:NAME,Values=VALUE" --region SPECIFIED-REGION

This would allow for filtering on specific tags in large AWS accounts.

Describe alternatives you've considered
Use aws cli to get specific instances based on tags:

$ aws ec2 describe-instances --filters "Name=tag:TAG-NAME,Values=TAG-VALUE" --query "Reservations[*].Instances[*].InstanceId[]" --region SPECIFIED-REGION --output text > tagged.txt
$ sed -i $'s/\t/@SPECIFIED-REGION,/g' tagged.txt

This will output all tagged instances in comma-separated format with region-name for pacu consumption
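
A hedged boto3 sketch of the same tag filter, mirroring the aws cli query above (TAG-NAME, TAG-VALUE, and the region are placeholders):

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")  # SPECIFIED-REGION placeholder
  paginator = ec2.get_paginator("describe_instances")
  pages = paginator.paginate(Filters=[{"Name": "tag:TAG-NAME", "Values": ["TAG-VALUE"]}])
  for page in pages:
      for reservation in page["Reservations"]:
          for instance in reservation["Instances"]:
              # Same instance_id@region format the alternative above produces for pacu consumption.
              print(f"{instance['InstanceId']}@us-east-1")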

Can't run module and can't understand the issue

After setting up cloudgoat, trying to run iam__enum_permissions and getting the following errors:

Pacu (yes:bob) > run iam__enum_permissions
Running module iam__enum_permissions...
[iam__enum_permissions] Failed to discover the current users username, enter it now or Ctrl+C to exit the module: bob
[iam__enum_permissions] Confirming permissions for users:
[iam__enum_permissions] bob...
[iam__enum_permissions] List groups for user failed
[iam__enum_permissions] FAILURE: MISSING REQUIRED AWS PERMISSIONS
[iam__enum_permissions] List user policies failed
[iam__enum_permissions] FAILURE: MISSING REQUIRED AWS PERMISSIONS
[iam__enum_permissions] List attached user policies failed
[iam__enum_permissions] FAILURE: MISSING REQUIRED AWS PERMISSIONS
[iam__enum_permissions] Confirmed Permissions for bob
[iam__enum_permissions] iam__enum_permissions completed.

[iam__enum_permissions] MODULE SUMMARY:

Confirmed permissions for 0 user(s).
Confirmed permissions for 0 role(s).

Add "data" command output to the cmd logs

Is your feature request related to a problem? Please describe.
I couldn't find a way to export the information gathered through Pacu.

Describe the solution you'd like
Add the "data" command output to "cmd_log.txt" so it is easier to export the results out of Pacu.

Describe alternatives you've considered
Implement a "reporting" module which lets you save a file on the filesystem with the information of certain pairs of keys or a session.

Feature Request: Recon Mode

Many of the modules make changes to the AWS account (as per the wiki). Although it SHOULD be assumed that no one will use this without RTFM, it might be a good idea to add a "Recon" toggle that will only permit you to execute "safe" checks, similar to the nmap script tagging concept.

Script to autogenerate `service_regions.json` from web

AWS publishes a list of endpoints, protocols, regions, and services (and mappings between them all) online. Noticing that the service_regions.json seems to be exactly that, is there a script to generate that list as new endpoints come online or old ones are offlined?

Parsing HTML is not ideal, but at least it's in tables, and there's the ec2:DescribeRegions API call to get the current region list, so you could probably regex out all of the API endpoint URLs simply enough.
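
A minimal sketch of the API-based approach mentioned above (plain boto3): ec2:DescribeRegions gives the current region list, though it only covers one service, so it's a starting point rather than a full replacement for service_regions.json.

  import json

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")
  # AllRegions=True also includes regions that are not enabled for the account.
  regions = sorted(r["RegionName"] for r in ec2.describe_regions(AllRegions=True)["Regions"])
  print(json.dumps(regions, indent=2))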

<class 'KeyError'>: 'url'

I'm getting <class 'KeyError'>: 'url' while trying to download AWS inspector report using the following command

run inspector__get_reports --download-reports

OS: Linux mint 19.2 tina
Python: 3.6.8

pacu console output

[inspector__get_reports] Report saved to: sessions/Session_name/downloads/inspector_assessments/0-XYZ.html
[inspector__get_reports] Report saved to: sessions/Session_name/downloads/inspector_assessments/0-ABC.html

[2019-08-20 08:28:16] Pacu encountered an error while running the previous command. Check sessions/Session_name/error_log.txt for technical details. [LOG LEVEL: MINIMAL]

<class 'KeyError'>: 'url'

error.log

Traceback (most recent call last):
File "pacu.py", line 1704, in run
self.idle()
File "pacu.py", line 1637, in idle
self.parse_command(command)
File "pacu.py", line 518, in parse_command
self.parse_exec_module_command(command)
File "pacu.py", line 879, in parse_exec_module_command
self.exec_module(command)
File "pacu.py", line 1109, in exec_module
summary_data = module.main(command[2:], self)
File "/home/hammad/pacu/pacu/modules/inspector__get_reports/main.py", line 73, in main
with urllib.request.urlopen(response['url']) as response, open(file_name, 'a') as out_file:
<class 'KeyError'>: 'url'

Need support for Assumed Role

I've assumed a SAML role that is tied to my corp AD. I entered the key id, secret and session token. Attempting to confirm permissions results in this error:

Pacu (mysession) > run confirm_permissions
    Running module confirm_permissions...
      This module will attempt to use IAM APIs to enumerate a confirmed list of IAM permissions for the current user. This is done by checking attached and inline policies for the user and the groups they are in.

[2018-07-14 19:48:50] Pacu encountered an error while running the previous command. Check sessions/mysession/error_log.txt for technical details. [LOG LEVEL: MINIMAL]

    <class 'botocore.exceptions.ClientError'>: An error occurred (ValidationError) when calling the GetUser operation: Must specify userName when calling with non-User credentials

Error in the Log file is:

[2018-07-14 19:39:40] (mysession): 
Traceback (most recent call last):
  File "pacu.py", line 1394, in run
    self.idle()
  File "pacu.py", line 1303, in idle
    self.idle()
  File "pacu.py", line 1303, in idle
    self.idle()
  File "pacu.py", line 1303, in idle
    self.idle()
  [Previous line repeated 2 more times]
  File "pacu.py", line 1301, in idle
    self.parse_command(command)
  File "pacu.py", line 693, in parse_command
    self.exec_module(command)
  File "pacu.py", line 866, in exec_module
    module.main(command[2:], self)
  File "/Users/chrisfarris/InfoSec/pacu/modules/confirm_permissions/main.py", line 85, in main
    user = client.get_user()
  File "/Users/chrisfarris/InfoSec/pacu/botocore/client.py", line 314, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Users/chrisfarris/InfoSec/pacu/botocore/client.py", line 612, in _make_api_call
    raise error_class(parsed_response, operation_name)
<class 'botocore.exceptions.ClientError'>: An error occurred (ValidationError) when calling the GetUser operation: Must specify userName when calling with non-User credentials

Basic issue is that my keys are not tied to an IAM User. They're tied to a role which I assumed via federated identity. This is probably also the case when using Instance Profile keys.

The aws sts get-caller-identity call should probably be made first to determine what form of identity the obtained keys represent:

Patton:pacu chrisfarris$ aws sts get-caller-identity --profile myprofile
{
    "Account": "530000000074",
    "UserId": "AROACENSORED:[email protected]",
    "Arn": "arn:aws:sts::530000000074:assumed-role/my-company-role-name/[email protected]"
}

Note the "assumed-role" in the ARN above.
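
A hedged sketch of the suggestion above (plain boto3): call sts:GetCallerIdentity first, and only fall back to iam:GetUser when the keys actually belong to an IAM user.

  import boto3

  identity = boto3.client("sts").get_caller_identity()
  account_id, arn = identity["Account"], identity["Arn"]

  if ":assumed-role/" in arn or ":federated-user/" in arn:
      print(f"Keys belong to a role or federated identity in account {account_id}: {arn}")
  else:
      # Safe to call GetUser only for plain IAM user credentials.
      user = boto3.client("iam").get_user()["User"]
      print(f"Keys belong to IAM user {user['UserName']} in account {account_id}")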

Feature - grab snapshot permissions

Snapshots can be shared between accounts or made public.

Patton chrisfarris$ aws ec2 describe-snapshot-attribute --attribute createVolumePermission --snapshot-id snap-9e123456
{
    "SnapshotId": "snap-9e123456",
    "CreateVolumePermissions": [
        {
            "UserId": "05nnnnnnnn8"
        }
    ]
}

This would require iterating down all the snaps and making this second call, but it would be really good to pull a list of public or shared snaps.
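
A hedged boto3 sketch of the requested check (region and output format are assumptions): walk the account's own snapshots and flag any that are shared with other accounts or made public.

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")  # example region
  paginator = ec2.get_paginator("describe_snapshots")
  for page in paginator.paginate(OwnerIds=["self"]):
      for snapshot in page["Snapshots"]:
          attr = ec2.describe_snapshot_attribute(
              SnapshotId=snapshot["SnapshotId"], Attribute="createVolumePermission"
          )
          permissions = attr["CreateVolumePermissions"]
          if permissions:
              # A "UserId" entry means shared with that account; {"Group": "all"} means public.
              print(snapshot["SnapshotId"], permissions)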

`disrupt_monitoring.py` should have disruptions for all of the same services that `enum_monitoring.py` enumerates

Disrupting monitoring applies to a variety of services: VPC flow logs, CloudTrail, AWS Config, CloudWatch alarms, S3 events, etc. CloudTrail and GuardDuty detectors are great, but there are other things that qualify too; essentially anything that takes an automated action based on the state of something, or on an event, can function as an alarm, depending on what we're trying to do/mutate.

Monitoring intelligence beyond enumeration

Being able to enumerate the monitoring options in place via enum_monitoring.py is great (once it covers the full set), but intelligent analysis of them is also going to be really important. There are some complicated (and not-so-complicated) setups that are highly secure, or at least have specific nuances worthy of care that the user should be aware of, but those only really come out of being able to detect and analyze the logging and monitoring environment.

This is definitely a later-stage kind of thing, but the ability to automatically enumerate these things and infer these insights will enable all sorts of cool stuff, because we'll have specific resource names we can then feed into other bulk/blind enumeration for more targeted enumeration of, and action against, those resources.

Things like:

  • CloudTrail is logging to an in-account bucket, but there's an escalation path to getting delete access to that bucket.
    • The bucket has notifications turned on for any deletes, and there's no way to disable that. Deleting CloudTrail objects will clean the history but alert someone/something; use as a last resort.

or

  • CloudTrail is logging to an out-of-account bucket, and there's no way to sanitize logs.
    • This is a great time for flashing red warnings to the user saying use caution, your actions are logged durably and we don't know what analysis they're doing on them.

enum_ebs_volumes_snapshots fails with AssumeRole

Pacu (mysession) > run enum_ebs_volumes_snapshots
    Running module enum_ebs_volumes_snapshots...
      This module will enumerate all of the Elastic Block Store volumes and snapshots in the account and save the data to the current session. It will also note whether or not each volume/snapshot is encrypted, then write a list of the unencrypted volumes to ./sessions/[current_session_name]/downloads/unencrypted_ebs_volumes_[timestamp].csv and unencrypted snapshots to ./sessions/[current_session_name]/downloads/unencrypted_ebs_snapshots_[timestamp].csv in .CSV format.

  Targeting regions ['ap-northeast-1', 'ap-northeast-2', 'ap-south-1', 'ap-southeast-1', 'ap-southeast-2', 'ca-central-1', 'eu-central-1', 'eu-west-1', 'eu-west-2', 'eu-west-3', 'sa-east-1', 'us-east-1', 'us-east-2', 'us-west-1', 'us-west-2'].
  No account IDs were passed in as arguments and the account ID for the current user has not been stored in this session yet. An account ID is required to get valid results from the snapshot enumeration portion of this module. If you know the current users account ID then enter it now, otherwise, enter y to try and fetch it, or enter n to skip EBS snapshot enumeration. ([account_id]/y/n) y
  Error running get_user. It is possible that the account ID has been returned in this error: An error occurred (ValidationError) when calling the GetUser operation: Must specify userName when calling with non-User credentials
  If the AWS account ID was returned in the previous error, enter it now to continue, or enter n to skip EBS snapshot enumeration. ([account_id]/n)

sts get-caller-identity is a better method to get the AccountID

module not installed

Ran the install script, then ran the script; also tried installing the module manually.

python3 pacu.py
Traceback (most recent call last):
  File "pacu.py", line 21, in <module>
    import boto3
  File "/opt/Cloud/AWS/pacu/boto3/__init__.py", line 16, in <module>
    from boto3.session import Session
  File "/opt/Cloud/AWS/pacu/boto3/session.py", line 17, in <module>
    import botocore.session
  File "/opt/Cloud/AWS/pacu/botocore/session.py", line 27, in <module>
    import botocore.client
  File "/opt/Cloud/AWS/pacu/botocore/client.py", line 16, in <module>
    from botocore import waiter, xform_name
  File "/opt/Cloud/AWS/pacu/botocore/waiter.py", line 13, in <module>
    import jmespath
<class 'ModuleNotFoundError'>: No module named 'jmespath'

Pacu was not able to start because a required Python package was not found.
Run sh install.sh to check and install Pacu's Python requirements.

sudo pip install jmespath
Requirement already satisfied: jmespath in /root/.local/lib/python2.7/site-packages (0.9.3)

Download Public IP for All Ec2

Describe the solution you'd like
To be able to grab all EC2 public IPs and save them to a file.

Describe alternatives you've considered

aws ec2 describe-instances --filter "Name=instance-state-name,Values=running" --query "Reservations[].Instances[].[PublicIpAddress, Tags[?Key=='Name'].Value|[0]]" --output text

Additional context
This would allow scanning all the other IPs to see if any other systems are vulnerable.
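
A hedged boto3 sketch of what such a module could do (region and output file are assumptions): collect the public IPs of running instances and write them to a file, one per line.

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")  # example region
  paginator = ec2.get_paginator("describe_instances")
  pages = paginator.paginate(Filters=[{"Name": "instance-state-name", "Values": ["running"]}])

  public_ips = []
  for page in pages:
      for reservation in page["Reservations"]:
          for instance in reservation["Instances"]:
              ip = instance.get("PublicIpAddress")
              if ip:
                  public_ips.append(ip)

  with open("ec2_public_ips.txt", "w") as fh:
      fh.write("\n".join(public_ips) + "\n")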

project organization question

I'm looking into addressing #72

I have a few questions:

  1. Is there a purpose to boto3 and botocore being included in the project?
    • i.e. is it modified, or were the dependencies just checked in?
    • is it still needed if it is installed via requirements.txt?
  2. Most Python projects that are installed via pip have a top-level module, i.e.
pacu/
└── pacu
    ├── core
    ├── modules
    └── pp_modules

vs the current

pacu/
├── core
├── modules
├── pp_modules

and are you guys open to this type of change?
  3. Project versioning would need to be maintained; it doesn't look like you are currently versioning or maintaining changelog updates upon merging to master. Is this something you would be willing to start doing?
