
S3Scanner

S3Scanner is a security reconnaissance tool that probes for open and misconfigured AWS S3 buckets. It can check whether buckets exist, enumerate their contents and permissions, and surface sensitive data exposed through overly permissive bucket policies. It is intended for authorized cloud security assessments and AWS penetration testing.

Key Capabilities:

  • Scan for bucket existence and accessibility
  • Enumerate bucket contents and permissions
  • Test for common misconfiguration patterns
  • Find buckets with public read/write access
  • Validate bucket policies and ACLs
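The existence/accessibility check in the list above amounts to interpreting the HTTP status of an unauthenticated request against the bucket's virtual-hosted endpoint. A minimal sketch of that logic (the `classify_status` helper is illustrative, not part of S3Scanner):

```shell
# Map the HTTP status of an unauthenticated HEAD request against
# https://<bucket>.s3.amazonaws.com/ to a bucket state:
#   200 -> bucket exists and is publicly listable
#   403 -> bucket exists but access is denied
#   404 -> bucket does not exist
classify_status() {
  case "$1" in
    200) echo "exists-public" ;;
    403) echo "exists-private" ;;
    404) echo "nonexistent" ;;
    *)   echo "unknown" ;;
  esac
}

# Example probe (requires network access, shown commented):
# status=$(curl -s -o /dev/null -w '%{http_code}' "https://bucket-name.s3.amazonaws.com/")
# classify_status "$status"
```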
Installation:

# Install from source
git clone https://github.com/sa7mon/S3Scanner.git
cd S3Scanner
python3 -m pip install -r requirements.txt

# Or install from PyPI
pip3 install s3scanner

# Verify the installation
s3scanner --version
python3 -m s3scanner --help

# Scan a single bucket by name
s3scanner -b bucket-name
s3scanner --bucket my-company-bucket
# Check if bucket exists and is publicly readable
s3scanner -b target-bucket --format json

# Enumerate bucket contents (if accessible)
s3scanner -b target-bucket --enumerate
# Create a wordlist of bucket names to test
cat > bucket_names.txt << 'EOF'
company-backups
company-logs
company-documents
company-test
company-prod
EOF

s3scanner -l bucket_names.txt
s3scanner --list bucket_names.txt
Command          Purpose
-b, --bucket     Scan a specific bucket name
-l, --list       Scan multiple buckets from a file
-o, --out-file   Save results to an output file
--format json    Output results as JSON
--enumerate      List bucket contents if accessible
--threads        Set the number of scanning threads
-v, --verbose    Enable verbose output
--dump           Download all accessible files
--max-keys       Limit enumeration results
--region         Specify the AWS region to test
# Test common naming conventions
for name in backup logs data archive test staging prod; do
  s3scanner -b "company-$name" --format json
done
s3scanner -l bucket_names.txt --out-file scan_results.json --format json
# Find accessible buckets and list their contents
s3scanner -b target-bucket --enumerate --max-keys 100
# Test for public-read permission
s3scanner -b bucket-name --verbose
# Scan specific AWS region
s3scanner -b bucket-name --region us-east-1
s3scanner -b bucket-name --region eu-west-1
# Scan multiple buckets with 10 threads
s3scanner -l bucket_list.txt --threads 10 --out-file results.json
# Download files from accessible bucket
s3scanner -b vulnerable-bucket --enumerate --dump --out-file downloaded_files/
# After S3Scanner identifies accessible bucket
aws s3 ls s3://bucket-name/
aws s3 cp s3://bucket-name/object local_file
# Scan buckets across different regions
for region in us-east-1 us-west-2 eu-west-1 ap-southeast-1; do
  s3scanner -b company-data --region $region
done
s3scanner -b example-bucket --format json | jq .
Field          Meaning
bucket         The S3 bucket name tested
exists         Whether the bucket exists
public         Whether the bucket is publicly accessible
access_level   public-read, authenticated-read, or private
owner_id       AWS account ID of the bucket owner
key_count      Number of objects in the bucket
region         AWS region where the bucket resides
acl            Bucket ACL permissions
policy         Bucket policy details
# Bucket exists but not accessible
{"bucket": "target", "exists": true, "public": false}

# Bucket exists and publicly readable
{"bucket": "target", "exists": true, "public": true, "access_level": "public-read"}

# Bucket doesn't exist
{"bucket": "target", "exists": false}
# Use common naming patterns
cat > generate_buckets.sh << 'EOF'
#!/bin/bash
company="mycompany"
patterns=("backup" "backup-" "backups" "bak" "data" "db" "database" 
          "logs" "log-" "prod" "production" "staging" "test" "dev" "tmp")

for pattern in "${patterns[@]}"; do
  echo "${company}-${pattern}"
  echo "${company}${pattern}"
  echo "${pattern}-${company}"
done
EOF

chmod +x generate_buckets.sh
./generate_buckets.sh > bucket_wordlist.txt
# S3 bucket name wordlists from security research
wget https://raw.githubusercontent.com/sa7mon/S3Scanner/master/wordlists/common.txt
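Generated names and downloaded wordlists can be merged before scanning; sorting and deduplicating the combined list avoids probing the same name twice. A small sketch with illustrative names:

```shell
# Combine a generated wordlist with a downloaded one (names are
# illustrative), then sort and deduplicate before scanning.
printf '%s\n' mycompany-backups mycompany-logs > generated.txt
printf '%s\n' mycompany-logs mycompany-data   > downloaded.txt

cat generated.txt downloaded.txt | sort -u > combined_wordlist.txt
cat combined_wordlist.txt
```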
# Set AWS credentials for authenticated testing
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_DEFAULT_REGION="us-east-1"

s3scanner -b target-bucket --enumerate
# Use specific IAM role credentials
AWS_PROFILE=penetration-test-role s3scanner -l bucket_list.txt
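One way to keep assessment credentials isolated from day-to-day AWS credentials is a dedicated shared-credentials file selected via `AWS_SHARED_CREDENTIALS_FILE`. The profile name and key values below are placeholders (the keys are AWS's documented example values):

```shell
# Write a throwaway credentials file instead of editing ~/.aws/credentials.
# Profile name and key values are placeholders.
cat > ./pentest-credentials << 'EOF'
[penetration-test-role]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
EOF
chmod 600 ./pentest-credentials

# Point the CLI/SDK at it for the scan session (network call, shown commented):
# AWS_SHARED_CREDENTIALS_FILE=./pentest-credentials AWS_PROFILE=penetration-test-role s3scanner -l bucket_list.txt
```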
# Scan common bucket patterns
s3scanner -l common_bucket_names.txt --format json --out-file initial_scan.json
# Test confirmed accessible buckets manually
aws s3 ls s3://confirmed-bucket/
# Create detailed report of vulnerable buckets
cat initial_scan.json | jq '.[] | select(.public == true)'
# Examine bucket policies of vulnerable buckets
aws s3api get-bucket-policy --bucket vulnerable-bucket
aws s3api get-bucket-acl --bucket vulnerable-bucket
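The ACL output can be checked mechanically for grants to the AllUsers group, which is what makes a bucket public. The sample JSON below mimics the shape of `get-bucket-acl` output with illustrative data:

```shell
# Sample ACL shaped like `aws s3api get-bucket-acl` output (illustrative data)
acl='{"Grants":[{"Grantee":{"Type":"Group","URI":"http://acs.amazonaws.com/groups/global/AllUsers"},"Permission":"READ"}]}'

# Emit any permission granted to the public AllUsers group
echo "$acl" | jq -r '.Grants[]
  | select(.Grantee.URI? == "http://acs.amazonaws.com/groups/global/AllUsers")
  | .Permission'
```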
# Verify AWS credentials are set correctly
aws sts get-caller-identity

# Check credential file permissions
chmod 600 ~/.aws/credentials
# Reduce thread count for unreliable connections
s3scanner -l bucket_list.txt --threads 2
# For very large scans, use a single thread to pace requests
s3scanner -l huge_wordlist.txt --threads 1
# Update CA certificates if needed
pip3 install --upgrade certifi
  • Always obtain written authorization before scanning AWS resources
  • Use separate AWS accounts for penetration testing
  • Document all test parameters and results
  • Follow AWS responsible disclosure policies
  • Maintain separate wordlists for different assessment targets
  • Combine common patterns with company-specific naming conventions
  • Update wordlists based on discovered bucket naming schemes
  • Organize results by date and target organization
  • Test during agreed-upon maintenance windows
  • Limit enumeration to minimize API calls and costs
  • Use minimal threads to avoid overwhelming target infrastructure
  • Remove or disable test buckets after assessment completion
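The rate and thread limits in the list above can also be enforced with a thin wrapper around the scanner. In this sketch `probe` is a stand-in for the real `s3scanner -b` call so the pacing logic is self-contained:

```shell
# probe() stands in for `s3scanner -b "$1" --format json`; swap the real
# command in for live scans. The delay paces requests against the target.
probe() { echo "scanned: $1"; }
delay=0.2   # seconds between probes; raise for large wordlists

printf '%s\n' company-backups company-logs company-test > throttle_list.txt
: > throttled.log
while read -r bucket; do
  probe "$bucket" >> throttled.log
  sleep "$delay"
done < throttle_list.txt
```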
# Create comprehensive report
s3scanner -l bucket_list.txt \
  --format json \
  --out-file report_$(date +%Y%m%d).json \
  --verbose
# Find all publicly accessible buckets
jq '.[] | select(.public == true) | .bucket' results.json

# Count vulnerable buckets
jq '[.[] | select(.public == true)] | length' results.json
# Get bucket regions from S3Scanner results
jq -r '.[] | select(.public == true) | .region' results.json

# Get policy details for vulnerable buckets
while read bucket; do
  echo "=== $bucket ==="
  aws s3api get-bucket-policy --bucket "$bucket" 2>/dev/null
done < vulnerable_buckets.txt
# Export URLs for web proxy analysis
jq -r '.[] | select(.public == true) | "https://\(.bucket).s3.amazonaws.com/"' results.json
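URLs in the bare `bucket.s3.amazonaws.com` form resolve through the global endpoint; when the results include a region field, region-qualified virtual-hosted URLs can be built instead. A sketch over illustrative sample data:

```shell
# Illustrative results array in the same shape as the earlier examples
results='[{"bucket":"demo-data","public":true,"region":"eu-west-1"},
          {"bucket":"demo-internal","public":false,"region":"us-east-1"}]'

# Build region-qualified URLs for the public buckets only
echo "$results" | jq -r '.[]
  | select(.public == true)
  | "https://\(.bucket).s3.\(.region).amazonaws.com/"'
```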
  • S3Scanner is designed for authorized security testing only
  • Unauthorized access to S3 buckets violates AWS terms of service and may violate laws like the Computer Fraud and Abuse Act (CFAA)
  • Always operate within the scope of written penetration testing agreements
  • Report findings through proper channels and remediation processes
  • Maintain confidentiality of discovered sensitive data
  • Follow responsible disclosure timelines
  • AWS S3 Security Best Practices Documentation
  • AWS Bucket Policy Examples and IAM Policies
  • OWASP Cloud Security Testing Guide
  • AWS Penetration Testing Authorization and Guidelines