ScoutSuite Cheat Sheet
Overview
ScoutSuite is an open-source, multi-cloud security-auditing tool for assessing the security posture of cloud environments. It gathers configuration data through each provider's APIs and highlights risk areas in an interactive HTML report. ScoutSuite supports multiple cloud providers, including AWS, Azure, Google Cloud Platform, Alibaba Cloud, and Oracle Cloud Infrastructure, making it well suited to multi-cloud security assessments.
⚠️ Warning: Only use ScoutSuite against cloud environments you own or have explicit permission to audit. Unauthorized cloud security scanning may violate terms of service or local laws.
Installation
Python Package Installation
bash
# Install via pip (recommended)
pip3 install scoutsuite
# Install with all dependencies
pip3 install scoutsuite[all]
# Install specific cloud provider support
pip3 install scoutsuite[aws]
pip3 install scoutsuite[azure]
pip3 install scoutsuite[gcp]
# Verify installation
scout --help
Docker Installation
bash
# Pull ScoutSuite Docker image
docker pull nccgroup/scoutsuite
# Run ScoutSuite in Docker
docker run -it --rm \
-v ~/.aws:/root/.aws \
-v $(pwd):/opt/scoutsuite-report \
nccgroup/scoutsuite
# Create alias for easier usage
# Create alias for easier usage (escape $(pwd) so it expands at invocation, not when .bashrc is sourced)
echo "alias scout='docker run -it --rm -v ~/.aws:/root/.aws -v \$(pwd):/opt/scoutsuite-report nccgroup/scoutsuite'" >> ~/.bashrc
source ~/.bashrc
Manual Installation
bash
# Clone repository
git clone https://github.com/nccgroup/ScoutSuite.git
cd ScoutSuite
# Install dependencies
pip3 install -r requirements.txt
# Install ScoutSuite (python3 setup.py install is deprecated)
pip3 install .
# Or run directly
python3 scout.py --help
Virtual Environment Installation
bash
# Create virtual environment
python3 -m venv scoutsuite-env
source scoutsuite-env/bin/activate
# Install ScoutSuite
pip install scoutsuite
# Verify installation
scout --help
# Deactivate when done
deactivate
AWS Configuration
AWS Credentials Setup
bash
# Install AWS CLI
pip3 install awscli
# Configure AWS credentials
aws configure
# Enter Access Key ID, Secret Access Key, Region, Output format
# Alternative: Use environment variables
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_DEFAULT_REGION="us-east-1"
# Use AWS profiles
aws configure --profile production
aws configure --profile development
# List configured profiles
aws configure list-profiles
IAM Permissions for ScoutSuite
json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"acm:DescribeCertificate",
"acm:ListCertificates",
"cloudformation:DescribeStacks",
"cloudformation:GetTemplate",
"cloudformation:ListStacks",
"cloudtrail:DescribeTrails",
"cloudtrail:GetTrailStatus",
"cloudwatch:DescribeAlarms",
"config:DescribeConfigRules",
"config:DescribeConfigurationRecorders",
"config:DescribeDeliveryChannels",
"directconnect:DescribeConnections",
"ec2:Describe*",
"ecs:Describe*",
"ecs:List*",
"elasticloadbalancing:Describe*",
"iam:Generate*",
"iam:Get*",
"iam:List*",
"kms:Describe*",
"kms:Get*",
"kms:List*",
"lambda:Get*",
"lambda:List*",
"logs:Describe*",
"rds:Describe*",
"redshift:Describe*",
"route53:Get*",
"route53:List*",
"s3:Get*",
"s3:List*",
"ses:Get*",
"ses:List*",
"sns:Get*",
"sns:List*",
"sqs:Get*",
"sqs:List*"
],
"Resource": "*"
}
]
}
Basic AWS Scanning
bash
# Basic AWS scan
scout aws
# Scan with specific profile
scout aws --profile production
# Scan specific regions
scout aws --regions us-east-1,us-west-2
# All regions are scanned by default when --regions is omitted
scout aws
# Exclude specific services
scout aws --skip-services s3,iam
# Include only specific services
scout aws --services ec2,s3,iam
# Custom output directory
scout aws --report-dir /tmp/aws-audit
Azure Configuration
Azure Credentials Setup
bash
# Install Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
# Login to Azure
az login
# List subscriptions
az account list
# Set default subscription
az account set --subscription "subscription-id"
# Create service principal for ScoutSuite (newer Azure CLI versions require --scopes with --role)
az ad sp create-for-rbac --name "ScoutSuite" --role "Reader" --scopes "/subscriptions/subscription-id"
# Use service principal
export AZURE_CLIENT_ID="client-id"
export AZURE_CLIENT_SECRET="client-secret"
export AZURE_TENANT_ID="tenant-id"
export AZURE_SUBSCRIPTION_ID="subscription-id"
Basic Azure Scanning
bash
# Basic Azure scan using Azure CLI credentials (az login)
scout azure --cli
# Scan with a service principal
scout azure --service-principal --client-id $AZURE_CLIENT_ID \
--client-secret $AZURE_CLIENT_SECRET \
--tenant-id $AZURE_TENANT_ID
# Scan specific subscription
scout azure --subscription-ids subscription-id
# Scan all subscriptions
scout azure --all-subscriptions
# Custom output directory
scout azure --report-dir /tmp/azure-audit
# Exclude specific services
scout azure --skip-services keyvault,storage
Google Cloud Platform Configuration
GCP Credentials Setup
bash
# Install Google Cloud SDK
curl https://sdk.cloud.google.com | bash
exec -l $SHELL
# Initialize gcloud
gcloud init
# Authenticate
gcloud auth login
# Set default project
gcloud config set project PROJECT_ID
# Create service account for ScoutSuite
gcloud iam service-accounts create scoutsuite \
--display-name="ScoutSuite Service Account"
# Grant necessary roles
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:scoutsuite@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/viewer"
# Create and download key
gcloud iam service-accounts keys create scoutsuite-key.json \
--iam-account=scoutsuite@PROJECT_ID.iam.gserviceaccount.com
# Set environment variable
export GOOGLE_APPLICATION_CREDENTIALS="scoutsuite-key.json"
Basic GCP Scanning
bash
# Basic GCP scan using gcloud user credentials
scout gcp --user-account
# Scan with service account key
scout gcp --service-account scoutsuite-key.json
# Scan specific project
scout gcp --project-id PROJECT_ID
# Scan all projects
scout gcp --all-projects
# Custom output directory
scout gcp --report-dir /tmp/gcp-audit
# Exclude specific services
scout gcp --skip-services compute,storage
Advanced Scanning Options
Multi-Cloud Scanning
bash
# ScoutSuite audits one provider per invocation; run it once per cloud
scout aws
scout azure --cli
scout gcp --user-account
# Pass provider-specific options on each run, with a shared report directory
scout aws --profile prod --report-dir /tmp/multi-cloud-audit/aws
scout azure --cli --subscription-ids sub1 --report-dir /tmp/multi-cloud-audit/azure
scout gcp --user-account --project-id proj1 --report-dir /tmp/multi-cloud-audit/gcp
# Parallel scanning
scout aws --report-dir /tmp/aws-audit &
scout azure --report-dir /tmp/azure-audit &
scout gcp --report-dir /tmp/gcp-audit &
wait
Custom Rule Sets
bash
# Use custom ruleset
scout aws --ruleset custom-rules.json
# Create custom ruleset
cat > custom-aws-rules.json << 'EOF'
{
"rules": {
"ec2-security-group-opens-all-ports-to-all": {
"enabled": true,
"level": "danger"
},
"s3-bucket-no-logging": {
"enabled": true,
"level": "warning"
}
}
}
EOF
# Use custom ruleset
scout aws --ruleset custom-aws-rules.json
# Disable specific rules
scout aws --skip-rules ec2-security-group-opens-all-ports-to-all
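Because rulesets are plain JSON, variants can also be generated programmatically. Below is a minimal sketch (the `disable_rule` helper name is an assumption, and it assumes the simple `{"rules": {name: {"enabled": ...}}}` layout from the example above; real ScoutSuite rulesets may nest rule settings differently):

```shell
# disable_rule RULESET_FILE RULE_NAME
# Print a copy of the ruleset with the named rule disabled.
disable_rule() {
    python3 - "$1" "$2" << 'PYEOF'
import json
import sys

path, rule = sys.argv[1], sys.argv[2]
with open(path) as f:
    data = json.load(f)

# Rules are keyed under "rules"; fail loudly on unknown rule names
rules = data.get("rules", {})
if rule not in rules:
    sys.exit(f"rule not found: {rule}")

rules[rule]["enabled"] = False
print(json.dumps(data, indent=2))
PYEOF
}
```

Usage: `disable_rule custom-aws-rules.json s3-bucket-no-logging > relaxed-rules.json`, then pass the new file to `--ruleset`.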
Output Customization
bash
# Generate the report without opening a browser
scout aws --no-browser
# Custom report name
scout aws --report-name "production-audit-$(date +%Y%m%d)"
# Include additional metadata
scout aws --timestamp
# Verbose output
scout aws --debug
# Quiet mode
scout aws --quiet
# Force overwrite existing reports
scout aws --force
Automation Scripts
Automated Multi-Account AWS Audit
bash
#!/bin/bash
# Automated multi-account AWS security audit
ACCOUNTS_FILE="aws_accounts.txt"
OUTPUT_BASE_DIR="aws_audits_$(date +%Y%m%d_%H%M%S)"
REPORT_SUMMARY="$OUTPUT_BASE_DIR/audit_summary.html"
# Create accounts file if it doesn't exist
if [ ! -f "$ACCOUNTS_FILE" ]; then
cat > "$ACCOUNTS_FILE" << 'EOF'
# AWS Accounts Configuration
# Format: PROFILE_NAME|ACCOUNT_ID|DESCRIPTION
production|123456789012|Production Environment
staging|123456789013|Staging Environment
development|123456789014|Development Environment
EOF
echo "Created $ACCOUNTS_FILE - please configure with your AWS accounts"
exit 1
fi
mkdir -p "$OUTPUT_BASE_DIR"
# Function to audit single account
audit_account() {
local profile="$1"
local account_id="$2"
local description="$3"
local output_dir="$OUTPUT_BASE_DIR/$profile"
echo "[+] Auditing account: $profile ($account_id)"
# Run ScoutSuite audit
scout aws \
--profile "$profile" \
--report-dir "$output_dir" \
--report-name "$profile-audit" \
--force \
--quiet
if [ $? -eq 0 ]; then
echo " ✓ Audit completed: $profile"
return 0
else
echo " ✗ Audit failed: $profile"
return 1
fi
}
# Function to generate summary report
generate_summary() {
echo "[+] Generating summary report"
# Unquoted heredoc delimiter so $(date) below is expanded
cat > "$REPORT_SUMMARY" << EOF
<!DOCTYPE html>
<html>
<head>
<title>AWS Multi-Account Security Audit Summary</title>
<style>
body { font-family: Arial, sans-serif; margin: 20px; }
.header { background-color: #f0f0f0; padding: 20px; border-radius: 5px; }
.account { margin: 20px 0; padding: 15px; border: 1px solid #ddd; border-radius: 5px; }
.danger { border-color: #f44336; background-color: #ffebee; }
.warning { border-color: #ff9800; background-color: #fff3e0; }
.success { border-color: #4caf50; background-color: #e8f5e8; }
table { border-collapse: collapse; width: 100%; margin: 10px 0; }
th, td { border: 1px solid #ddd; padding: 8px; text-align: left; }
th { background-color: #f2f2f2; }
.high { color: #d32f2f; font-weight: bold; }
.medium { color: #f57c00; font-weight: bold; }
.low { color: #388e3c; }
</style>
</head>
<body>
<div class="header">
<h1>AWS Multi-Account Security Audit Summary</h1>
<p>Generated: $(date)</p>
</div>
EOF
# Process each account
while IFS='|' read -r profile account_id description; do
# Skip comments and empty lines
[[ "$profile" =~ ^#.*$ || -z "$profile" ]] && continue
echo " <div class=\"account\">" >> "$REPORT_SUMMARY"
echo " <h2>$profile - $description</h2>" >> "$REPORT_SUMMARY"
echo " <p>Account ID: $account_id</p>" >> "$REPORT_SUMMARY"
# Check if audit was successful
report_file="$OUTPUT_BASE_DIR/$profile/scoutsuite-report/scoutsuite_results_aws-$profile.js"
if [ -f "$report_file" ]; then
# Extract findings summary (simplified)
danger_count=$(grep -o '"danger"' "$report_file" 2>/dev/null | wc -l)
warning_count=$(grep -o '"warning"' "$report_file" 2>/dev/null | wc -l)
echo " <table>" >> "$REPORT_SUMMARY"
echo " <tr><th>Severity</th><th>Count</th></tr>" >> "$REPORT_SUMMARY"
echo " <tr><td class=\"high\">High Risk</td><td>$danger_count</td></tr>" >> "$REPORT_SUMMARY"
echo " <tr><td class=\"medium\">Medium Risk</td><td>$warning_count</td></tr>" >> "$REPORT_SUMMARY"
echo " </table>" >> "$REPORT_SUMMARY"
# Link to full report
echo " <p><a href=\"$profile/scoutsuite-report/report.html\" target=\"_blank\">View Full Report</a></p>" >> "$REPORT_SUMMARY"
else
echo " <p style=\"color: red;\">❌ Audit failed or report not found</p>" >> "$REPORT_SUMMARY"
fi
echo " </div>" >> "$REPORT_SUMMARY"
done < "$ACCOUNTS_FILE"
cat >> "$REPORT_SUMMARY" << 'EOF'
</body>
</html>
EOF
echo "[+] Summary report generated: $REPORT_SUMMARY"
}
# Function to generate consolidated findings
generate_consolidated_findings() {
echo "[+] Generating consolidated findings report"
local findings_file="$OUTPUT_BASE_DIR/consolidated_findings.json"
local csv_file="$OUTPUT_BASE_DIR/consolidated_findings.csv"
# Create CSV header
echo "Account,Service,Finding,Severity,Resource,Description" > "$csv_file"
# Process each account's findings
while IFS='|' read -r profile account_id description; do
[[ "$profile" =~ ^#.*$ || -z "$profile" ]] && continue
report_file="$OUTPUT_BASE_DIR/$profile/scoutsuite-report/scoutsuite_results_aws-$profile.js"
if [ -f "$report_file" ]; then
# Extract findings (simplified - would need proper JSON parsing in production)
echo "[+] Processing findings for $profile"
# This is a simplified example - in practice, you'd use jq or python to parse JSON
grep -o '"danger"\|"warning"' "$report_file" | while read severity; do
echo "$profile,Unknown,Security Finding,$severity,Unknown,Automated finding from ScoutSuite" >> "$csv_file"
done
fi
done < "$ACCOUNTS_FILE"
echo "[+] Consolidated findings saved: $csv_file"
}
# Main execution
echo "[+] Starting multi-account AWS security audit"
# Check dependencies
if ! command -v scout &> /dev/null; then
echo "[-] ScoutSuite not found. Please install ScoutSuite first."
exit 1
fi
if ! command -v aws &> /dev/null; then
echo "[-] AWS CLI not found. Please install AWS CLI first."
exit 1
fi
# Audit each account
success_count=0
total_count=0
while IFS='|' read -r profile account_id description; do
# Skip comments and empty lines
[[ "$profile" =~ ^#.*$ || -z "$profile" ]] && continue
total_count=$((total_count + 1))
if audit_account "$profile" "$account_id" "$description"; then
success_count=$((success_count + 1))
fi
# Wait between audits to avoid rate limiting
sleep 10
done < "$ACCOUNTS_FILE"
echo "[+] Audit completed: $success_count/$total_count accounts successful"
# Generate reports
generate_summary
generate_consolidated_findings
echo "[+] Multi-account audit completed"
echo "[+] Results saved in: $OUTPUT_BASE_DIR"
echo "[+] Open $REPORT_SUMMARY for summary"
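The grep-based severity counts in the script above count every occurrence of the string `"danger"`, not actual findings; the script itself notes that proper JSON parsing is needed. A closer approximation is sketched below (the `count_findings` helper name is mine; it assumes the results file is a single `scoutsuite_results = {...}` assignment whose finding entries carry `level` and `flagged_items` fields):

```shell
# count_findings RESULTS_JS LEVEL
# Count findings of the given level ("danger", "warning", ...) that flagged
# at least one resource, by parsing the JSON embedded in the results .js file.
count_findings() {
    python3 - "$1" "$2" << 'PYEOF'
import json
import sys

path, level = sys.argv[1], sys.argv[2]
with open(path) as f:
    raw = f.read()

# The file is a JS assignment: strip everything before the first "{"
# and any trailing semicolon, leaving plain JSON
data = json.loads(raw[raw.index("{"):].rstrip().rstrip(";"))

count = 0
for service in data.get("services", {}).values():
    for finding in service.get("findings", {}).values():
        # flagged_items may be a count or a list; both are truthy when non-empty
        if finding.get("level") == level and finding.get("flagged_items"):
            count += 1
print(count)
PYEOF
}
```

Usage inside the script: `danger_count=$(count_findings "$report_file" danger)`.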
Continuous Cloud Security Monitoring
bash
#!/bin/bash
# Continuous cloud security monitoring with ScoutSuite
CONFIG_FILE="cloud_monitoring.conf"
LOG_DIR="cloud_monitoring_logs"
ALERT_EMAIL="security@company.com"
SCAN_INTERVAL=86400 # 24 hours
mkdir -p "$LOG_DIR"
# Create default configuration
if [ ! -f "$CONFIG_FILE" ]; then
cat > "$CONFIG_FILE" << 'EOF'
# Cloud Security Monitoring Configuration
# AWS Configuration
AWS_ENABLED=true
AWS_PROFILES="production,staging"
AWS_REGIONS="us-east-1,us-west-2"
# Azure Configuration
AZURE_ENABLED=true
AZURE_SUBSCRIPTIONS="sub1,sub2"
# GCP Configuration
GCP_ENABLED=false
GCP_PROJECTS="project1,project2"
# Alerting Configuration
ALERT_ON_NEW_FINDINGS=true
ALERT_ON_SEVERITY_INCREASE=true
CRITICAL_THRESHOLD=10
HIGH_THRESHOLD=50
# Reporting
KEEP_HISTORICAL_REPORTS=true
HISTORICAL_RETENTION_DAYS=30
EOF
echo "Created $CONFIG_FILE - please configure monitoring settings"
exit 1
fi
source "$CONFIG_FILE"
# Function to run AWS monitoring
monitor_aws() {
if [ "$AWS_ENABLED" != "true" ]; then
return 0
fi
echo "[+] Running AWS security monitoring"
IFS=',' read -ra PROFILES <<< "$AWS_PROFILES"
for profile in "${PROFILES[@]}"; do
timestamp=$(date +%Y%m%d_%H%M%S)
output_dir="$LOG_DIR/aws_${profile}_$timestamp"
echo "[+] Scanning AWS profile: $profile"
scout aws \
--profile "$profile" \
--regions "$AWS_REGIONS" \
--report-dir "$output_dir" \
--report-name "aws-$profile-$timestamp" \
--force \
--quiet
if [ $? -eq 0 ]; then
echo " ✓ AWS scan completed: $profile"
analyze_findings "$output_dir" "aws" "$profile"
else
echo " ✗ AWS scan failed: $profile"
fi
done
}
# Function to run Azure monitoring
monitor_azure() {
if [ "$AZURE_ENABLED" != "true" ]; then
return 0
fi
echo "[+] Running Azure security monitoring"
IFS=',' read -ra SUBSCRIPTIONS <<< "$AZURE_SUBSCRIPTIONS"
for subscription in "${SUBSCRIPTIONS[@]}"; do
timestamp=$(date +%Y%m%d_%H%M%S)
output_dir="$LOG_DIR/azure_${subscription}_$timestamp"
echo "[+] Scanning Azure subscription: $subscription"
scout azure \
--subscription-ids "$subscription" \
--report-dir "$output_dir" \
--report-name "azure-$subscription-$timestamp" \
--force \
--quiet
if [ $? -eq 0 ]; then
echo " ✓ Azure scan completed: $subscription"
analyze_findings "$output_dir" "azure" "$subscription"
else
echo " ✗ Azure scan failed: $subscription"
fi
done
}
# Function to run GCP monitoring
monitor_gcp() {
if [ "$GCP_ENABLED" != "true" ]; then
return 0
fi
echo "[+] Running GCP security monitoring"
IFS=',' read -ra PROJECTS <<< "$GCP_PROJECTS"
for project in "${PROJECTS[@]}"; do
timestamp=$(date +%Y%m%d_%H%M%S)
output_dir="$LOG_DIR/gcp_${project}_$timestamp"
echo "[+] Scanning GCP project: $project"
scout gcp \
--project-id "$project" \
--report-dir "$output_dir" \
--report-name "gcp-$project-$timestamp" \
--force \
--quiet
if [ $? -eq 0 ]; then
echo " ✓ GCP scan completed: $project"
analyze_findings "$output_dir" "gcp" "$project"
else
echo " ✗ GCP scan failed: $project"
fi
done
}
# Function to analyze findings and send alerts
analyze_findings() {
local output_dir="$1"
local cloud_provider="$2"
local account="$3"
echo "[+] Analyzing findings for $cloud_provider:$account"
# Find the results file
results_file=$(find "$output_dir" -name "*.js" -type f | head -1)
if [ ! -f "$results_file" ]; then
echo "[-] Results file not found for $cloud_provider:$account"
return 1
fi
# Count findings by severity (simplified)
critical_count=$(grep -o '"danger"' "$results_file" 2>/dev/null | wc -l)
high_count=$(grep -o '"warning"' "$results_file" 2>/dev/null | wc -l)
echo " Critical findings: $critical_count"
echo " High findings: $high_count"
# Check thresholds and send alerts
if [ "$critical_count" -ge "$CRITICAL_THRESHOLD" ]; then
send_alert "CRITICAL" "$cloud_provider" "$account" "$critical_count" "critical findings detected"
fi
if [ "$high_count" -ge "$HIGH_THRESHOLD" ]; then
send_alert "HIGH" "$cloud_provider" "$account" "$high_count" "high-severity findings detected"
fi
# Compare with previous scan if available
if [ "$ALERT_ON_NEW_FINDINGS" = "true" ]; then
compare_with_previous "$cloud_provider" "$account" "$results_file"
fi
}
# Function to send alerts
send_alert() {
local severity="$1"
local cloud_provider="$2"
local account="$3"
local count="$4"
local message="$5"
local subject="[$severity] Cloud Security Alert: $cloud_provider:$account"
local body="Security alert for $cloud_provider account '$account': $count $message at $(date)"
echo "$body" | mail -s "$subject" "$ALERT_EMAIL" 2>/dev/null || \
echo "Alert: $subject - $body (email failed)"
}
# Function to compare with previous scan
compare_with_previous() {
local cloud_provider="$1"
local account="$2"
local current_results="$3"
# Find previous results file
# Match the timestamped scan directories, then take the second-newest results file
previous_results=$(find "$LOG_DIR" -path "*${cloud_provider}_${account}_*" -name '*.js' -type f | sort | tail -2 | head -1)
if [ -f "$previous_results" ] && [ "$previous_results" != "$current_results" ]; then
echo "[+] Comparing with previous scan"
# Simple comparison (in practice, would need proper JSON diff)
current_critical=$(grep -o '"danger"' "$current_results" 2>/dev/null | wc -l)
previous_critical=$(grep -o '"danger"' "$previous_results" 2>/dev/null | wc -l)
if [ "$current_critical" -gt "$previous_critical" ]; then
new_critical=$((current_critical - previous_critical))
send_alert "NEW FINDINGS" "$cloud_provider" "$account" "$new_critical" "new critical findings since last scan"
fi
fi
}
# Function to cleanup old reports
cleanup_old_reports() {
if [ "$KEEP_HISTORICAL_REPORTS" = "true" ]; then
echo "[+] Cleaning up reports older than $HISTORICAL_RETENTION_DAYS days"
find "$LOG_DIR" -type d -mtime +$HISTORICAL_RETENTION_DAYS -exec rm -rf {} + 2>/dev/null || true
fi
}
# Function to generate trend report
generate_trend_report() {
echo "[+] Generating security trend report"
local trend_file="$LOG_DIR/security_trends.csv"
local html_report="$LOG_DIR/security_trends.html"
# Create CSV header if file doesn't exist
if [ ! -f "$trend_file" ]; then
echo "Date,Cloud,Account,Critical,High,Medium,Low" > "$trend_file"
fi
# Add current data (simplified)
current_date=$(date +%Y-%m-%d)
echo "$current_date,AWS,production,5,15,25,10" >> "$trend_file"
# Generate trend chart (requires matplotlib: pip3 install matplotlib)
python3 << EOF
import csv
import matplotlib.pyplot as plt

# Read trend data
dates = []
critical_counts = []
try:
    with open('$trend_file', 'r') as f:
        reader = csv.DictReader(f)
        for row in reader:
            dates.append(row['Date'])
            critical_counts.append(int(row['Critical']))

    # Create trend chart
    plt.figure(figsize=(12, 6))
    plt.plot(dates, critical_counts, marker='o', linewidth=2, markersize=6)
    plt.title('Critical Security Findings Trend')
    plt.xlabel('Date')
    plt.ylabel('Critical Findings Count')
    plt.xticks(rotation=45)
    plt.grid(True, alpha=0.3)
    plt.tight_layout()
    plt.savefig('$LOG_DIR/trend_chart.png', dpi=150, bbox_inches='tight')
    plt.close()
    print("Trend chart generated: $LOG_DIR/trend_chart.png")
except Exception as e:
    print(f"Error generating trend chart: {e}")
EOF
}
# Main monitoring loop
echo "[+] Starting continuous cloud security monitoring"
echo "[+] Scan interval: $SCAN_INTERVAL seconds"
while true; do
echo "[+] Starting monitoring cycle at $(date)"
# Run monitoring for each cloud provider
monitor_aws
monitor_azure
monitor_gcp
# Generate reports and cleanup
generate_trend_report
cleanup_old_reports
echo "[+] Monitoring cycle completed at $(date)"
echo "[+] Next scan in $((SCAN_INTERVAL / 3600)) hours"
sleep "$SCAN_INTERVAL"
done
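`compare_with_previous` in the script above compares only total counts, so one fixed finding plus one new finding cancel out. A finer-grained sketch (the `new_findings` helper name is mine; same results-file layout assumption as elsewhere on this page) diffs the sets of flagged finding IDs instead:

```shell
# new_findings OLD_RESULTS_JS NEW_RESULTS_JS
# Print "service:finding_id" for findings flagged in the new scan but not
# in the old one.
new_findings() {
    python3 - "$1" "$2" << 'PYEOF'
import json
import sys

def flagged_ids(path):
    with open(path) as f:
        raw = f.read()
    # Strip the "scoutsuite_results =" prefix and any trailing semicolon
    data = json.loads(raw[raw.index("{"):].rstrip().rstrip(";"))
    ids = set()
    for name, service in data.get("services", {}).items():
        for fid, finding in service.get("findings", {}).items():
            if finding.get("flagged_items"):
                ids.add(f"{name}:{fid}")
    return ids

for fid in sorted(flagged_ids(sys.argv[2]) - flagged_ids(sys.argv[1])):
    print(fid)
PYEOF
}
```

Usage: `new_findings "$previous_results" "$current_results"` and alert only when the output is non-empty.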
ScoutSuite Report Parser
bash
#!/bin/bash
# Parse ScoutSuite reports and extract key findings
REPORT_DIR="$1"
OUTPUT_FORMAT="${2:-json}" # json, csv, html
if [ -z "$REPORT_DIR" ] || [ ! -d "$REPORT_DIR" ]; then
echo "Usage: $0 <report_directory> [output_format]"
echo "Output formats: json, csv, html"
exit 1
fi
# Function to parse ScoutSuite JavaScript results
parse_scoutsuite_results() {
local results_file="$1"
local output_file="$2"
local format="$3"
echo "[+] Parsing ScoutSuite results: $results_file"
python3 << EOF
import json
import re
import csv
import sys
from datetime import datetime

# Read the JavaScript file and extract the embedded JSON data
with open('$results_file', 'r') as f:
    content = f.read()

# The results file is a JavaScript variable assignment; the trailing
# semicolon is optional in some ScoutSuite versions
json_match = re.search(r'scoutsuite_results\s*=\s*({.*});?\s*$', content, re.DOTALL)
if not json_match:
    print("Error: Could not extract JSON from ScoutSuite results")
    sys.exit(1)

try:
    data = json.loads(json_match.group(1))
except json.JSONDecodeError as e:
    print(f"Error parsing JSON: {e}")
    sys.exit(1)

# Extract findings
findings = []

def extract_findings(obj, path="", service=""):
    if isinstance(obj, dict):
        for key, value in obj.items():
            if key == "findings" and isinstance(value, dict):
                for finding_id, finding_data in value.items():
                    if isinstance(finding_data, dict):
                        # flagged_items is a count in recent ScoutSuite
                        # versions and a list in older ones; normalize it
                        flagged = finding_data.get("flagged_items", 0)
                        findings.append({
                            "service": service,
                            "finding_id": finding_id,
                            "description": finding_data.get("description", ""),
                            "level": finding_data.get("level", ""),
                            "path": path,
                            "flagged_items": flagged if isinstance(flagged, int) else len(flagged),
                            "items": finding_data.get("items", [])
                        })
            elif key == "services" and isinstance(value, dict):
                for svc_name, svc_data in value.items():
                    extract_findings(svc_data, f"{path}/{key}/{svc_name}", svc_name)
            else:
                extract_findings(value, f"{path}/{key}", service)
    elif isinstance(obj, list):
        for i, item in enumerate(obj):
            extract_findings(item, f"{path}[{i}]", service)

# Extract findings from the data
extract_findings(data)

# Output based on format
if '$format' == 'json':
    with open('$output_file', 'w') as f:
        json.dump(findings, f, indent=2)
    print(f"JSON output saved: $output_file")
elif '$format' == 'csv':
    with open('$output_file', 'w', newline='') as f:
        if findings:
            writer = csv.DictWriter(f, fieldnames=['service', 'finding_id', 'description', 'level', 'flagged_items'])
            writer.writeheader()
            for finding in findings:
                writer.writerow({
                    'service': finding['service'],
                    'finding_id': finding['finding_id'],
                    'description': finding['description'],
                    'level': finding['level'],
                    'flagged_items': finding['flagged_items']
                })
    print(f"CSV output saved: $output_file")
elif '$format' == 'html':
    html_content = f"""
    <!DOCTYPE html>
    <html>
    <head>
    <title>ScoutSuite Findings Report</title>
    <style>
    body {{ font-family: Arial, sans-serif; margin: 20px; }}
    table {{ border-collapse: collapse; width: 100%; }}
    th, td {{ border: 1px solid #ddd; padding: 8px; text-align: left; }}
    th {{ background-color: #f2f2f2; }}
    .danger {{ background-color: #ffebee; }}
    .warning {{ background-color: #fff3e0; }}
    .info {{ background-color: #e3f2fd; }}
    </style>
    </head>
    <body>
    <h1>ScoutSuite Security Findings</h1>
    <p>Generated: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}</p>
    <p>Total Findings: {len(findings)}</p>
    <table>
    <tr>
    <th>Service</th>
    <th>Finding</th>
    <th>Description</th>
    <th>Severity</th>
    <th>Affected Items</th>
    </tr>
    """
    for finding in findings:
        severity_class = finding['level'] if finding['level'] in ['danger', 'warning', 'info'] else 'info'
        html_content += f"""
    <tr class="{severity_class}">
    <td>{finding['service']}</td>
    <td>{finding['finding_id']}</td>
    <td>{finding['description']}</td>
    <td>{finding['level']}</td>
    <td>{finding['flagged_items']}</td>
    </tr>
    """
    html_content += """
    </table>
    </body>
    </html>
    """
    with open('$output_file', 'w') as f:
        f.write(html_content)
    print(f"HTML output saved: $output_file")

# Print summary
danger_count = sum(1 for f in findings if f['level'] == 'danger')
warning_count = sum(1 for f in findings if f['level'] == 'warning')
info_count = sum(1 for f in findings if f['level'] == 'info')

print(f"\\nSummary:")
print(f"  Critical (danger): {danger_count}")
print(f"  High (warning): {warning_count}")
print(f"  Info: {info_count}")
print(f"  Total: {len(findings)}")
EOF
}
# Main execution
echo "[+] Parsing ScoutSuite reports in: $REPORT_DIR"
# Find all ScoutSuite result files
results_files=$(find "$REPORT_DIR" -name "scoutsuite_results_*.js" -type f)
if [ -z "$results_files" ]; then
echo "[-] No ScoutSuite result files found in $REPORT_DIR"
exit 1
fi
# Parse each results file
for results_file in $results_files; do
# Extract provider and account from filename
basename=$(basename "$results_file" .js)
provider_account=$(echo "$basename" | sed 's/scoutsuite_results_//')
# Generate output filename
case "$OUTPUT_FORMAT" in
json)
output_file="${REPORT_DIR}/${provider_account}_findings.json"
;;
csv)
output_file="${REPORT_DIR}/${provider_account}_findings.csv"
;;
html)
output_file="${REPORT_DIR}/${provider_account}_findings.html"
;;
*)
echo "[-] Unsupported output format: $OUTPUT_FORMAT"
exit 1
;;
esac
# Parse the results
parse_scoutsuite_results "$results_file" "$output_file" "$OUTPUT_FORMAT"
done
echo "[+] Report parsing completed"
Integration with Other Tools
CI/CD Pipeline Integration
yaml
# GitLab CI example
stages:
- security-audit
cloud-security-audit:
stage: security-audit
image: python:3.9
before_script:
- pip install scoutsuite
- aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
- aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
- aws configure set region $AWS_DEFAULT_REGION
script:
- scout aws --report-dir ./security-audit --force
- python parse_results.py ./security-audit
artifacts:
reports:
junit: security-audit/junit-report.xml
paths:
- security-audit/
expire_in: 1 week
only:
- main
- develop
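The pipeline above references a `parse_results.py` helper that is not shown. One way to implement that gating step is a function that fails the job whenever a critical finding is flagged; the sketch below makes the same results-file layout assumptions as the parsing examples earlier on this page, and the `scan_gate` name and directory layout are illustrative:

```shell
# scan_gate REPORT_DIR
# Exit non-zero if the ScoutSuite results in REPORT_DIR contain any flagged
# "danger"-level finding -- suitable as a CI pass/fail step.
scan_gate() {
    local results
    results=$(find "$1" -name 'scoutsuite_results_*.js' -type f | head -1)
    if [ -z "$results" ]; then
        echo "no ScoutSuite results found in $1" >&2
        return 2
    fi
    python3 - "$results" << 'PYEOF'
import json
import sys

with open(sys.argv[1]) as f:
    raw = f.read()
# Strip the JS assignment prefix and any trailing semicolon
data = json.loads(raw[raw.index("{"):].rstrip().rstrip(";"))

bad = [fid
       for service in data.get("services", {}).values()
       for fid, finding in service.get("findings", {}).items()
       if finding.get("level") == "danger" and finding.get("flagged_items")]

for fid in bad:
    print(f"critical finding: {fid}")
sys.exit(1 if bad else 0)
PYEOF
}
```

In the `script:` section, `scan_gate ./security-audit` then fails the job on critical findings.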
Terraform Integration
bash
# Run ScoutSuite after Terraform apply
terraform apply
scout aws --report-dir ./post-terraform-audit
# Compare before and after
scout aws --report-dir ./pre-terraform-audit
terraform apply
scout aws --report-dir ./post-terraform-audit
diff -r ./pre-terraform-audit ./post-terraform-audit
SIEM Integration
bash
# Send ScoutSuite findings to SIEM
parse_and_send_to_siem() {
local results_file="$1"
local siem_endpoint="$2"
python3 << EOF
import json
import requests
import re

# Parse ScoutSuite results
with open('$results_file', 'r') as f:
    content = f.read()

json_match = re.search(r'scoutsuite_results\s*=\s*({.*});?\s*$', content, re.DOTALL)
if json_match:
    data = json.loads(json_match.group(1))
    # Convert to SIEM format
    siem_events = []
    # ... process findings and convert to SIEM format

    # Send each event to the SIEM endpoint
    for event in siem_events:
        response = requests.post('$siem_endpoint', json=event)
        print(f"Sent event to SIEM: {response.status_code}")
EOF
}
Troubleshooting
Common Issues
Authentication Problems
bash
# Check AWS credentials
aws sts get-caller-identity
# Check Azure authentication
az account show
# Check GCP authentication
gcloud auth list
# Verify permissions
aws iam get-user
az role assignment list --assignee $(az account show --query user.name -o tsv)
gcloud projects get-iam-policy PROJECT_ID
Installation Issues
bash
# Update pip
pip3 install --upgrade pip
# Install with verbose output
pip3 install -v scoutsuite
# Install from source
git clone https://github.com/nccgroup/ScoutSuite.git
cd ScoutSuite
pip3 install -e .
# Check dependencies
pip3 check scoutsuite
Report Generation Issues
bash
# Check disk space
df -h
# Verify write permissions
ls -la $(dirname $(pwd))
# Run with debug output
scout aws --debug
# Check for corrupted reports
find . -name "*.js" -size 0
Performance Issues
bash
# Limit regions
scout aws --regions us-east-1
# Skip large services
scout aws --skip-services s3,cloudtrail
# Use specific services only
scout aws --services ec2,iam,s3
# Reduce concurrency to avoid API throttling
scout aws --max-workers 5
Debugging and Logging
bash
# Enable debug logging
scout aws --debug
# Capture console output for later inspection
scout aws --debug 2>&1 | tee scout-debug.log
Resources
- ScoutSuite GitHub Repository
- ScoutSuite Documentation
- AWS Security Best Practices
- Azure Security Documentation
- Google Cloud Security
- Cloud Security Alliance
- NIST Cloud Computing Security
This cheat sheet provides a comprehensive reference for using ScoutSuite for multi-cloud security auditing. Always ensure you have proper authorization before using this tool in any environment.