CloudSploit Cheat Sheet
Overview
CloudSploit is an open-source Cloud Security Posture Management (CSPM) tool that scans cloud infrastructure for security misconfigurations and compliance violations. Maintained by Aqua Security, it supports AWS, Azure, and Google Cloud Platform, providing comprehensive security assessments with over 100 built-in plugins that detect common security issues.
**Key Features:** Multi-cloud security scanning, 100+ security plugins, compliance frameworks (CIS, PCI DSS, HIPAA), automated remediation suggestions, CI/CD integration, and detailed security reports.
Installation and Setup
Node.js Installation
```bash
# Install Node.js (required)
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs

# Verify Node.js installation
node --version
npm --version

# Clone CloudSploit repository
git clone https://github.com/aquasecurity/cloudsploit.git
cd cloudsploit

# Install dependencies
npm install

# Verify installation
node index.js --help
```
Docker Installation
```bash
# Pull CloudSploit Docker image
docker pull aquasec/cloudsploit

# Run CloudSploit in Docker
docker run --rm -it aquasec/cloudsploit --help

# Create Docker alias for easier usage
echo 'alias cloudsploit="docker run --rm -it -v ~/.aws:/root/.aws -v ~/.azure:/root/.azure -v ~/.config/gcloud:/root/.config/gcloud aquasec/cloudsploit"' >> ~/.bashrc
source ~/.bashrc

# Run with volume mounts for cloud credentials
docker run --rm -it \
  -v ~/.aws:/root/.aws \
  -v ~/.azure:/root/.azure \
  -v ~/.config/gcloud:/root/.config/gcloud \
  -v $(pwd)/output:/app/output \
  aquasec/cloudsploit \
  --cloud aws \
  --format json \
  --output /app/output/aws_scan.json

# Create Docker Compose file
cat > docker-compose.yml << 'EOF'
version: '3.8'
services:
  cloudsploit:
    image: aquasec/cloudsploit
    volumes:
      - ~/.aws:/root/.aws
      - ~/.azure:/root/.azure
      - ~/.config/gcloud:/root/.config/gcloud
      - ./output:/app/output
    environment:
      - AWS_PROFILE=default
      - AZURE_SUBSCRIPTION_ID=${AZURE_SUBSCRIPTION_ID}
      - GOOGLE_APPLICATION_CREDENTIALS=/root/.config/gcloud/application_default_credentials.json
EOF

# Run with Docker Compose
docker-compose run cloudsploit --cloud aws --format json
```
Package Manager Installation
```bash
# Install via npm globally
npm install -g cloudsploit

# Verify global installation
cloudsploit --help

# Alternative: install from source
git clone https://github.com/aquasecurity/cloudsploit.git
cd cloudsploit
npm install -g .

# Create symbolic link for easier access
sudo ln -sf $(pwd)/index.js /usr/local/bin/cloudsploit
```
Cloud Provider Setup
AWS Configuration
```bash
# Install AWS CLI
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

# Configure AWS credentials
aws configure
# Enter: Access Key ID, Secret Access Key, Region, Output format

# Alternative: use environment variables
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_DEFAULT_REGION="us-east-1"

# Alternative: use IAM roles (recommended for EC2)
# Attach an IAM role with the required permissions to the EC2 instance

# Test AWS configuration
aws sts get-caller-identity
# Required AWS permissions for CloudSploit
cat > cloudsploit-policy.json << 'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "acm:DescribeCertificate", "acm:ListCertificates",
        "cloudformation:DescribeStacks", "cloudformation:GetTemplate", "cloudformation:ListStacks",
        "cloudtrail:DescribeTrails", "cloudtrail:GetTrailStatus",
        "cloudwatch:DescribeAlarms",
        "ec2:DescribeAddresses", "ec2:DescribeImages", "ec2:DescribeInstances",
        "ec2:DescribeNetworkAcls", "ec2:DescribeRegions", "ec2:DescribeRouteTables",
        "ec2:DescribeSecurityGroups", "ec2:DescribeSnapshots", "ec2:DescribeSubnets",
        "ec2:DescribeVolumes", "ec2:DescribeVpcs",
        "iam:GenerateCredentialReport", "iam:GetAccountPasswordPolicy", "iam:GetCredentialReport",
        "iam:GetGroup", "iam:GetGroupPolicy", "iam:GetLoginProfile", "iam:GetPolicy",
        "iam:GetPolicyVersion", "iam:GetRole", "iam:GetRolePolicy", "iam:GetUser",
        "iam:GetUserPolicy", "iam:ListAccessKeys", "iam:ListAttachedGroupPolicies",
        "iam:ListAttachedRolePolicies", "iam:ListAttachedUserPolicies",
        "iam:ListEntitiesForPolicy", "iam:ListGroupPolicies", "iam:ListGroups",
        "iam:ListGroupsForUser", "iam:ListMFADevices", "iam:ListPolicies",
        "iam:ListRolePolicies", "iam:ListRoles", "iam:ListServerCertificates",
        "iam:ListUserPolicies", "iam:ListUsers", "iam:ListVirtualMFADevices",
        "kms:DescribeKey", "kms:GetKeyPolicy", "kms:GetKeyRotationStatus",
        "kms:ListAliases", "kms:ListGrants", "kms:ListKeys",
        "rds:DescribeDBInstances", "rds:DescribeDBParameterGroups", "rds:DescribeDBParameters",
        "rds:DescribeDBSecurityGroups", "rds:DescribeDBSnapshots", "rds:DescribeDBSubnetGroups",
        "route53:ListHostedZones", "route53:ListResourceRecordSets",
        "s3:GetBucketAcl", "s3:GetBucketLocation", "s3:GetBucketLogging",
        "s3:GetBucketPolicy", "s3:GetBucketVersioning", "s3:ListAllMyBuckets",
        "ses:GetIdentityDkimAttributes", "ses:GetIdentityVerificationAttributes",
        "ses:ListIdentities", "ses:ListVerifiedEmailAddresses",
        "sns:GetTopicAttributes", "sns:ListSubscriptions", "sns:ListTopics",
        "sqs:GetQueueAttributes", "sqs:ListQueues"
      ],
      "Resource": "*"
    }
  ]
}
EOF
# Create IAM policy
aws iam create-policy \
  --policy-name CloudSploitPolicy \
  --policy-document file://cloudsploit-policy.json

# Attach policy to user
aws iam attach-user-policy \
  --user-name your-username \
  --policy-arn arn:aws:iam::ACCOUNT-ID:policy/CloudSploitPolicy
```
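Since CloudSploit only reads configuration, the policy above should contain exclusively read-only actions. A quick sanity check can catch accidental write permissions before the policy is attached; this is a sketch (the `audit_policy` helper and its prefix list are illustrative, not part of CloudSploit or an official AWS classification):

```python
import json

# Action verbs a read-only scanner should be limited to (an assumption,
# not an official AWS classification).
READ_ONLY_PREFIXES = ("Describe", "Get", "List", "Generate")

def audit_policy(policy):
    """Return actions in an IAM policy document that are not read-only."""
    violations = []
    for statement in policy.get("Statement", []):
        actions = statement.get("Action", [])
        if isinstance(actions, str):  # a single action may be a bare string
            actions = [actions]
        for action in actions:
            _, _, verb = action.partition(":")  # "ec2:DescribeInstances" -> "DescribeInstances"
            if not verb.startswith(READ_ONLY_PREFIXES):
                violations.append(action)
    return violations

# Inline sample; in practice, load the heredoc file written above, e.g.
# audit_policy(json.load(open("cloudsploit-policy.json")))
sample = {"Statement": [{"Effect": "Allow",
                         "Action": ["ec2:DescribeInstances", "s3:PutObject"]}]}
```

An empty return value means every action matched a read-only prefix.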
Azure Configuration
```bash
# Install Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

# Login to Azure
az login

# Set subscription
az account set --subscription "your-subscription-id"

# Create service principal for CloudSploit
az ad sp create-for-rbac \
  --name "CloudSploitSP" \
  --role "Security Reader" \
  --scopes "/subscriptions/your-subscription-id"
# Note the output: appId, password, tenant

# Set environment variables
export AZURE_CLIENT_ID="app-id-from-output"
export AZURE_CLIENT_SECRET="password-from-output"
export AZURE_TENANT_ID="tenant-from-output"
export AZURE_SUBSCRIPTION_ID="your-subscription-id"

# Alternative: use Azure CLI authentication
az login --service-principal \
  --username $AZURE_CLIENT_ID \
  --password $AZURE_CLIENT_SECRET \
  --tenant $AZURE_TENANT_ID

# Test Azure configuration
az account show
```
Google Cloud Configuration
```bash
# Install Google Cloud SDK
curl https://sdk.cloud.google.com | bash
exec -l $SHELL

# Initialize gcloud
gcloud init

# Authenticate
gcloud auth login

# Set project
gcloud config set project your-project-id

# Create service account for CloudSploit
gcloud iam service-accounts create cloudsploit-sa \
  --display-name="CloudSploit Service Account"

# Grant required roles
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:cloudsploit-sa@your-project-id.iam.gserviceaccount.com" \
  --role="roles/iam.securityReviewer"

gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:cloudsploit-sa@your-project-id.iam.gserviceaccount.com" \
  --role="roles/viewer"

# Create and download service account key
gcloud iam service-accounts keys create cloudsploit-key.json \
  --iam-account=cloudsploit-sa@your-project-id.iam.gserviceaccount.com

# Set environment variable
export GOOGLE_APPLICATION_CREDENTIALS="$(pwd)/cloudsploit-key.json"

# Test GCP configuration
gcloud auth list
gcloud projects list
```
Basic Usage and Scanning
Simple Scans
```bash
# Basic AWS scan
node index.js --cloud aws

# Basic Azure scan
node index.js --cloud azure

# Basic GCP scan
node index.js --cloud gcp

# Scan a specific region
node index.js --cloud aws --region us-east-1

# Scan multiple regions
node index.js --cloud aws --region us-east-1,us-west-2,eu-west-1

# Scan with a specific profile
node index.js --cloud aws --profile production

# Scan with a custom output format
node index.js --cloud aws --format json
node index.js --cloud aws --format csv
node index.js --cloud aws --format table
```
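Whatever format is chosen, the JSON output is the easiest to post-process: it is a flat list of per-check results. A small sketch that tallies a scan (the `status` and `severity` field names match the result structure used elsewhere on this page):

```python
import json
from collections import Counter

def summarize(results):
    """Tally CloudSploit results by status, plus failures by severity."""
    status = Counter(r.get("status", "UNKNOWN") for r in results)
    failed_by_severity = Counter(
        r.get("severity", "UNKNOWN") for r in results if r.get("status") == "FAIL"
    )
    return {"status": dict(status), "failed_by_severity": dict(failed_by_severity)}

# Inline sample; in practice: summarize(json.load(open("aws_scan_results.json")))
sample = [
    {"status": "OK", "severity": "LOW"},
    {"status": "FAIL", "severity": "CRITICAL"},
    {"status": "FAIL", "severity": "HIGH"},
]
```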
Advanced Scan Options
```bash
# Scan with specific plugins
node index.js --cloud aws --plugins ec2,s3,iam

# Exclude specific plugins
node index.js --cloud aws --ignore-plugins cloudtrail,vpc

# Scan with a compliance framework
node index.js --cloud aws --compliance cis

# Scan with specific severity levels
node index.js --cloud aws --severity high,critical

# Scan with an output file
node index.js --cloud aws --format json --output aws_scan_results.json

# Scan with detailed output
node index.js --cloud aws --verbose

# Scan with a custom timeout
node index.js --cloud aws --timeout 300

# Scan with parallel execution
node index.js --cloud aws --parallel 10
```
Multi-Cloud Scanning
```bash
# Scan all configured clouds
node index.js --cloud all

# Scan multiple clouds with output
node index.js --cloud aws,azure,gcp --format json --output multi_cloud_scan.json

# Create multi-cloud scanning script
cat > multi_cloud_scan.sh << 'EOF'
#!/bin/bash
# Multi-cloud security scanning with CloudSploit

TIMESTAMP=$(date +%Y%m%d_%H%M%S)
OUTPUT_DIR="scans/$TIMESTAMP"
mkdir -p "$OUTPUT_DIR"

echo "Starting multi-cloud security scan..."

# AWS scan
if aws sts get-caller-identity > /dev/null 2>&1; then
    echo "Scanning AWS..."
    node index.js --cloud aws --format json --output "$OUTPUT_DIR/aws_scan.json"
    echo "AWS scan completed"
else
    echo "AWS credentials not configured, skipping..."
fi

# Azure scan
if az account show > /dev/null 2>&1; then
    echo "Scanning Azure..."
    node index.js --cloud azure --format json --output "$OUTPUT_DIR/azure_scan.json"
    echo "Azure scan completed"
else
    echo "Azure credentials not configured, skipping..."
fi

# GCP scan (grep -q succeeds only if there is an active account)
if gcloud auth list --filter=status:ACTIVE --format="value(account)" | grep -q .; then
    echo "Scanning GCP..."
    node index.js --cloud gcp --format json --output "$OUTPUT_DIR/gcp_scan.json"
    echo "GCP scan completed"
else
    echo "GCP credentials not configured, skipping..."
fi

echo "Multi-cloud scan completed. Results in: $OUTPUT_DIR"

# Generate summary report. TIMESTAMP/OUTPUT_DIR are exported so the
# quoted Python heredoc can read them from the environment.
export TIMESTAMP OUTPUT_DIR
python3 << 'PYTHON'
import glob
import json
import os

def generate_summary(output_dir):
    summary = {
        "timestamp": os.environ["TIMESTAMP"],
        "clouds": {},
        "total_findings": 0,
        "severity_breakdown": {"CRITICAL": 0, "HIGH": 0, "MEDIUM": 0, "LOW": 0}
    }
    for scan_file in glob.glob(f"{output_dir}/*.json"):
        cloud_name = os.path.basename(scan_file).replace("_scan.json", "")
        try:
            with open(scan_file, 'r') as f:
                data = json.load(f)
            findings = 0
            for result in data:
                if result.get("status") == "FAIL":
                    findings += 1
                    severity = result.get("severity", "UNKNOWN")
                    if severity in summary["severity_breakdown"]:
                        summary["severity_breakdown"][severity] += 1
            summary["clouds"][cloud_name] = {
                "findings": findings,
                "total_checks": len(data)
            }
            summary["total_findings"] += findings
        except Exception as e:
            print(f"Error processing {scan_file}: {e}")
    with open(f"{output_dir}/summary.json", 'w') as f:
        json.dump(summary, f, indent=2)
    print(f"Summary report generated: {output_dir}/summary.json")

generate_summary(os.environ["OUTPUT_DIR"])
PYTHON
EOF

chmod +x multi_cloud_scan.sh

# Run multi-cloud scan
./multi_cloud_scan.sh
```
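Beyond a single snapshot, it is often useful to compare two scan runs and surface only newly introduced failures. A sketch, assuming findings can be keyed by their `title`, `region`, and `resource` fields:

```python
def new_findings(previous, current):
    """Return failures in the current scan that were not failing previously.

    Findings are keyed by (title, region, resource) — an assumption that
    matches the per-result fields CloudSploit reports emit.
    """
    def fail_keys(results):
        return {(r.get("title"), r.get("region"), r.get("resource"))
                for r in results if r.get("status") == "FAIL"}

    introduced = fail_keys(current) - fail_keys(previous)
    return [r for r in current
            if (r.get("title"), r.get("region"), r.get("resource")) in introduced]

# Inline sample: one pre-existing failure, one newly introduced.
old = [{"title": "S3 public", "region": "us-east-1", "resource": "b1", "status": "FAIL"}]
new = old + [{"title": "Root keys", "region": "global", "resource": "root", "status": "FAIL"}]
```

Running this against consecutive `summary`-style scan files gives a drift report rather than a full re-listing.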
Plugin Management and Customization
Available Plugins
```bash
# List all available plugins
node index.js --list-plugins

# List plugins by cloud provider
node index.js --list-plugins --cloud aws
node index.js --list-plugins --cloud azure
node index.js --list-plugins --cloud gcp

# Show plugin details
node index.js --describe-plugin ec2/instancesInPublicSubnet

# List plugins by category
grep -r "category:" plugins/ | sort | uniq

# List plugins by severity
grep -r "severity:" plugins/ | sort | uniq
```
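The grep one-liners above only print raw matches; a few lines of scripting turn the plugin tree into a structured inventory. A sketch (the regex assumes fields written as `category: 'EC2'` / `severity: 'High'` in the plugin source, as in the custom plugin example on this page):

```python
import re
from collections import defaultdict

# Matches lines like "    category: 'EC2'," in a plugin source file.
FIELD_RE = re.compile(r"^\s*(category|severity):\s*'([^']+)'", re.MULTILINE)

def inventory(sources):
    """Group plugin names by (category, severity) parsed from source text."""
    groups = defaultdict(list)
    for name, text in sources.items():
        fields = dict(FIELD_RE.findall(text))
        groups[(fields.get("category", "?"), fields.get("severity", "?"))].append(name)
    return dict(groups)

# Inline sample; in practice, read each file under plugins/ into `sources`.
sample = {"ec2/openSecurityGroups": "category: 'EC2',\n    severity: 'High',"}
```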
Custom Plugin Development
```javascript
// Create custom plugin: plugins/aws/ec2/customSecurityCheck.js

var async = require('async');
var helpers = require('../../../helpers/aws');

module.exports = {
    title: 'Custom EC2 Security Check',
    category: 'EC2',
    description: 'Ensures EC2 instances follow custom security requirements',
    more_info: 'Custom security check for EC2 instances based on organizational policies.',
    link: 'https://docs.aws.amazon.com/ec2/',
    recommended_action: 'Review and remediate EC2 instances that do not meet security requirements.',
    apis: ['EC2:describeInstances'],
    compliance: {
        cis: '2.1.1'
    },
    severity: 'High',

    run: function(cache, settings, callback) {
        var results = [];
        var source = {};
        var regions = helpers.regions(settings);

        async.each(regions.ec2, function(region, rcb){
            var describeInstances = helpers.addSource(cache, source,
                ['ec2', 'describeInstances', region]);

            if (!describeInstances) return rcb();

            if (describeInstances.err || !describeInstances.data) {
                helpers.addResult(results, 3,
                    'Unable to query for instances: ' + helpers.addError(describeInstances),
                    region);
                return rcb();
            }

            if (!describeInstances.data.length) {
                helpers.addResult(results, 0, 'No instances found', region);
                return rcb();
            }

            describeInstances.data.forEach(function(reservation){
                reservation.Instances.forEach(function(instance){
                    var resource = instance.InstanceId;

                    // Custom security checks
                    var securityIssues = [];

                    // Check 1: Instance should have specific tags
                    var requiredTags = ['Environment', 'Owner', 'Project'];
                    var instanceTags = instance.Tags || [];
                    var tagNames = instanceTags.map(tag => tag.Key);

                    requiredTags.forEach(function(requiredTag) {
                        if (!tagNames.includes(requiredTag)) {
                            securityIssues.push(`Missing required tag: ${requiredTag}`);
                        }
                    });

                    // Check 2: Instance should not be in the default VPC
                    if (instance.VpcId && instance.VpcId.includes('default')) {
                        securityIssues.push('Instance is running in default VPC');
                    }

                    // Check 3: Instance should have detailed monitoring enabled
                    if (!instance.Monitoring || instance.Monitoring.State !== 'enabled') {
                        securityIssues.push('Detailed monitoring is not enabled');
                    }

                    // Check 4: Instance should not have a public IP in production
                    var envTag = instanceTags.find(tag => tag.Key === 'Environment');
                    if (envTag && envTag.Value.toLowerCase() === 'production' && instance.PublicIpAddress) {
                        securityIssues.push('Production instance has public IP address');
                    }

                    if (securityIssues.length > 0) {
                        helpers.addResult(results, 2,
                            `Instance has security issues: ${securityIssues.join(', ')}`,
                            region, resource);
                    } else {
                        helpers.addResult(results, 0,
                            'Instance meets custom security requirements',
                            region, resource);
                    }
                });
            });

            rcb();
        }, function(){
            callback(null, results, source);
        });
    }
};
```
Plugin Configuration
```javascript
// Create plugin configuration file: config/plugins.js

module.exports = {
    // Global plugin settings
    settings: {
        // Timeout for API calls (seconds)
        timeout: 120,
// Maximum number of parallel API calls
parallelism: 10,
// Retry configuration
retries: 3,
retryDelay: 1000,
// Custom severity mappings
severityMapping: {
'CRITICAL': 4,
'HIGH': 3,
'MEDIUM': 2,
'LOW': 1,
'INFO': 0
}
},
// Plugin-specific configurations
plugins: {
// AWS EC2 plugins
'ec2/instancesInPublicSubnet': {
enabled: true,
severity: 'HIGH',
settings: {
// Allow specific instance types in public subnets
allowedInstanceTypes: ['t3.nano', 't3.micro']
}
},
// S3 plugins
's3/bucketAllUsersPolicy': {
enabled: true,
severity: 'CRITICAL',
settings: {
// Whitelist specific buckets
whitelistedBuckets: ['public-website-bucket']
}
},
// IAM plugins
'iam/rootAccessKeys': {
enabled: true,
severity: 'CRITICAL'
},
// Custom plugins
'ec2/customSecurityCheck': {
enabled: true,
severity: 'HIGH',
settings: {
requiredTags: ['Environment', 'Owner', 'Project', 'CostCenter'],
allowPublicIpEnvironments: ['development', 'staging']
}
}
},
// Compliance framework mappings
compliance: {
cis: {
enabled: true,
version: '1.4.0',
plugins: [
'iam/rootAccessKeys',
'iam/mfaEnabled',
'cloudtrail/cloudtrailEnabled',
's3/bucketAllUsersPolicy'
]
},
pci: {
enabled: true,
version: '3.2.1',
plugins: [
'ec2/instancesInPublicSubnet',
's3/bucketAllUsersPolicy',
'rds/rdsEncryptionEnabled'
]
},
hipaa: {
enabled: true,
plugins: [
'rds/rdsEncryptionEnabled',
's3/bucketEncryptionInTransit',
'kms/kmsKeyRotation'
]
}
}
};
```
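With a configuration like this, the set of plugins a given compliance run executes is just the framework's plugin list filtered by the enabled flags. A sketch over the same config shape (plugins the framework references but the config never mentions are assumed to run with defaults):

```python
def plugins_for_framework(config, framework):
    """Return the framework's plugins that are enabled in the plugin config."""
    fw = config["compliance"].get(framework)
    if not fw or not fw.get("enabled", False):
        return []
    enabled = {name for name, p in config["plugins"].items() if p.get("enabled", False)}
    # Unmentioned plugins are assumed enabled by default.
    return [p for p in fw["plugins"] if p in enabled or p not in config["plugins"]]

# Minimal config in the same shape as config/plugins.js above.
sample = {
    "plugins": {"iam/rootAccessKeys": {"enabled": True},
                "s3/bucketAllUsersPolicy": {"enabled": False}},
    "compliance": {"cis": {"enabled": True,
                           "plugins": ["iam/rootAccessKeys", "s3/bucketAllUsersPolicy"]}},
}
```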
Running Custom Plugins
```bash
# Run with custom plugin configuration
node index.js --cloud aws --config config/plugins.js

# Run a specific custom plugin
node index.js --cloud aws --plugins ec2/customSecurityCheck

# Test custom plugin
node index.js --cloud aws --plugins ec2/customSecurityCheck --verbose --dry-run

# Validate plugin syntax
node --check plugins/aws/ec2/customSecurityCheck.js

# Create a plugin testing helper
cat > test_plugin.js << 'EOF'
// Plugin testing helper
const plugin = require('./plugins/aws/ec2/customSecurityCheck.js');
const mockCache = require('./test/mock_cache.js');

// Test plugin with mock data
plugin.run(mockCache, {}, function(err, results, source) {
    if (err) {
        console.error('Plugin error:', err);
        return;
    }
    console.log('Plugin results:');
    results.forEach(function(result) {
        console.log(`${result.status}: ${result.message} (${result.region})`);
    });
});
EOF

node test_plugin.js
```
Compliance and Reporting
Compliance Framework Scanning
```bash
# CIS Benchmark scanning
node index.js --cloud aws --compliance cis --format json --output cis_aws_scan.json

# PCI DSS compliance scanning
node index.js --cloud aws --compliance pci --format json --output pci_aws_scan.json

# HIPAA compliance scanning
node index.js --cloud aws --compliance hipaa --format json --output hipaa_aws_scan.json

# SOC 2 compliance scanning
node index.js --cloud aws --compliance soc2 --format json --output soc2_aws_scan.json

# Custom compliance scanning
node index.js --cloud aws --compliance custom --config compliance/custom_framework.js

# Multi-framework compliance scanning
cat > compliance_scan.sh << 'EOF'
#!/bin/bash
# Comprehensive compliance scanning

TIMESTAMP=$(date +%Y%m%d_%H%M%S)
COMPLIANCE_DIR="compliance_reports/$TIMESTAMP"
mkdir -p "$COMPLIANCE_DIR"

echo "Starting compliance scanning..."

# CIS Benchmark
echo "Running CIS Benchmark scan..."
node index.js --cloud aws --compliance cis --format json --output "$COMPLIANCE_DIR/cis_report.json"

# PCI DSS
echo "Running PCI DSS scan..."
node index.js --cloud aws --compliance pci --format json --output "$COMPLIANCE_DIR/pci_report.json"

# HIPAA
echo "Running HIPAA scan..."
node index.js --cloud aws --compliance hipaa --format json --output "$COMPLIANCE_DIR/hipaa_report.json"

# Generate compliance summary. Variables are exported so the quoted
# Python heredoc can read them from the environment.
export TIMESTAMP COMPLIANCE_DIR
python3 << 'PYTHON'
import json
import os

def generate_compliance_summary(compliance_dir):
    frameworks = ['cis', 'pci', 'hipaa']
    summary = {
        "timestamp": os.environ["TIMESTAMP"],
        "frameworks": {},
        "overall_score": 0
    }
    total_score = 0
    framework_count = 0
    for framework in frameworks:
        report_file = f"{compliance_dir}/{framework}_report.json"
        if os.path.exists(report_file):
            with open(report_file, 'r') as f:
                data = json.load(f)
            total_checks = len(data)
            passed_checks = sum(1 for result in data if result.get("status") == "OK")
            failed_checks = total_checks - passed_checks
            score = (passed_checks / total_checks * 100) if total_checks > 0 else 0
            summary["frameworks"][framework] = {
                "total_checks": total_checks,
                "passed_checks": passed_checks,
                "failed_checks": failed_checks,
                "compliance_score": round(score, 2)
            }
            total_score += score
            framework_count += 1
    if framework_count > 0:
        summary["overall_score"] = round(total_score / framework_count, 2)
    with open(f"{compliance_dir}/compliance_summary.json", 'w') as f:
        json.dump(summary, f, indent=2)
    print(f"Compliance summary generated: {compliance_dir}/compliance_summary.json")

generate_compliance_summary(os.environ["COMPLIANCE_DIR"])
PYTHON

echo "Compliance scanning completed. Reports in: $COMPLIANCE_DIR"
EOF

chmod +x compliance_scan.sh
./compliance_scan.sh
```
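The compliance score computed in the summary script is just the passed-to-total ratio; the same number can serve as a hard gate in automation. A minimal sketch (the `gate` helper and its default threshold are illustrative, not a CloudSploit feature):

```python
def compliance_score(results):
    """Percentage of checks with status OK; 0.0 when there are no checks."""
    if not results:
        return 0.0
    passed = sum(1 for r in results if r.get("status") == "OK")
    return round(passed / len(results) * 100, 2)

def gate(results, minimum=85.0):
    """True when the scan meets the minimum score (usable as a CI gate)."""
    return compliance_score(results) >= minimum

# Inline sample: 9 passing checks and 1 failure -> 90.0%.
sample = [{"status": "OK"}] * 9 + [{"status": "FAIL"}]
```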
Custom Compliance Frameworks
```javascript
// Create custom compliance framework: compliance/custom_framework.js

module.exports = {
    name: 'Custom Security Framework',
    version: '1.0.0',
    description: 'Organization-specific security compliance framework',
controls: {
'CSF-001': {
title: 'Data Encryption at Rest',
description: 'All data must be encrypted at rest',
plugins: [
's3/bucketEncryption',
'rds/rdsEncryptionEnabled',
'ec2/encryptedEbsSnapshots'
],
severity: 'CRITICAL'
},
'CSF-002': {
title: 'Network Security',
description: 'Network access must be properly controlled',
plugins: [
'ec2/instancesInPublicSubnet',
'vpc/defaultSecurityGroup',
'ec2/openSecurityGroups'
],
severity: 'HIGH'
},
'CSF-003': {
title: 'Access Management',
description: 'Access must follow principle of least privilege',
plugins: [
'iam/rootAccessKeys',
'iam/mfaEnabled',
'iam/unusedCredentials'
],
severity: 'HIGH'
},
'CSF-004': {
title: 'Monitoring and Logging',
description: 'All activities must be logged and monitored',
plugins: [
'cloudtrail/cloudtrailEnabled',
'cloudwatch/logGroupEncryption',
'vpc/flowLogsEnabled'
],
severity: 'MEDIUM'
},
'CSF-005': {
title: 'Backup and Recovery',
description: 'Critical data must be backed up regularly',
plugins: [
'rds/rdsBackupEnabled',
'ec2/ebsSnapshotPublic',
's3/bucketVersioning'
],
severity: 'MEDIUM'
}
},
// Scoring methodology
scoring: {
'CRITICAL': 10,
'HIGH': 7,
'MEDIUM': 4,
'LOW': 1
},
// Compliance thresholds
thresholds: {
'excellent': 95,
'good': 85,
'fair': 70,
'poor': 50
}
};
```
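The `scoring` and `thresholds` sections imply a weighted rating: each control contributes its severity weight, and the resulting percentage maps to a band. A sketch, assuming each control either passes or fails as a whole:

```python
# Weights and bands mirror the scoring/thresholds sections of the framework above.
SCORING = {"CRITICAL": 10, "HIGH": 7, "MEDIUM": 4, "LOW": 1}
THRESHOLDS = [("excellent", 95), ("good", 85), ("fair", 70), ("poor", 50)]

def framework_rating(controls):
    """Each control dict has 'severity' and 'passed'; returns (score, band)."""
    total = sum(SCORING[c["severity"]] for c in controls)
    earned = sum(SCORING[c["severity"]] for c in controls if c["passed"])
    score = earned / total * 100 if total else 0.0
    for label, minimum in THRESHOLDS:
        if score >= minimum:
            return score, label
    return score, "failing"

# Inline sample: 17 of 21 weighted points earned -> ~81%, "fair" band.
sample = [{"severity": "CRITICAL", "passed": True},
          {"severity": "HIGH", "passed": True},
          {"severity": "MEDIUM", "passed": False}]
```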
Advanced Reporting
```python
#!/usr/bin/env python3
# Advanced CloudSploit reporting and analysis

import argparse
import csv
import datetime
import json
import os
import smtplib
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

import matplotlib.pyplot as plt
import pandas as pd
from jinja2 import Template

class CloudSploitReporter:
    """Advanced reporting for CloudSploit scan results"""
def __init__(self):
self.timestamp = datetime.datetime.now().strftime('%Y-%m-%d_%H-%M-%S')
def load_scan_results(self, file_path):
"""Load CloudSploit scan results from JSON file"""
try:
with open(file_path, 'r') as f:
return json.load(f)
except Exception as e:
print(f"Error loading scan results: {e}")
return None
def analyze_results(self, results):
"""Analyze scan results and generate statistics"""
analysis = {
'total_checks': len(results),
'passed_checks': 0,
'failed_checks': 0,
'unknown_checks': 0,
'severity_breakdown': {'CRITICAL': 0, 'HIGH': 0, 'MEDIUM': 0, 'LOW': 0, 'INFO': 0},
'category_breakdown': {},
'region_breakdown': {},
'top_issues': []
}
for result in results:
status = result.get('status', 'UNKNOWN')
severity = result.get('severity', 'UNKNOWN')
category = result.get('category', 'UNKNOWN')
region = result.get('region', 'UNKNOWN')
# Count by status
if status == 'OK':
analysis['passed_checks'] += 1
elif status == 'FAIL':
analysis['failed_checks'] += 1
else:
analysis['unknown_checks'] += 1
# Count by severity
if severity in analysis['severity_breakdown']:
analysis['severity_breakdown'][severity] += 1
# Count by category
if category not in analysis['category_breakdown']:
analysis['category_breakdown'][category] = {'total': 0, 'failed': 0}
analysis['category_breakdown'][category]['total'] += 1
if status == 'FAIL':
analysis['category_breakdown'][category]['failed'] += 1
# Count by region
if region not in analysis['region_breakdown']:
analysis['region_breakdown'][region] = {'total': 0, 'failed': 0}
analysis['region_breakdown'][region]['total'] += 1
if status == 'FAIL':
analysis['region_breakdown'][region]['failed'] += 1
# Collect failed checks for top issues
if status == 'FAIL':
analysis['top_issues'].append({
'title': result.get('title', 'Unknown'),
'severity': severity,
'category': category,
'region': region,
'resource': result.get('resource', 'Unknown')
})
# Sort top issues by severity
severity_order = {'CRITICAL': 4, 'HIGH': 3, 'MEDIUM': 2, 'LOW': 1, 'INFO': 0}
analysis['top_issues'].sort(key=lambda x: severity_order.get(x['severity'], 0), reverse=True)
return analysis
def generate_html_report(self, analysis, scan_results, output_file):
"""Generate HTML report"""
        html_template = """
        <html>
        <head><title>CloudSploit Security Report</title></head>
        <body>
        <h1>CloudSploit Security Report</h1>
        <p>Generated on: {{ timestamp }}</p>
        <p>Total Checks: {{ analysis.total_checks }} |
           Passed: {{ analysis.passed_checks }} |
           Failed: {{ analysis.failed_checks }}</p>

        <h2>Severity Breakdown</h2>
        <table border="1">
        <tr><th>Severity</th><th>Count</th><th>Percentage</th></tr>
        {% for severity, count in analysis.severity_breakdown.items() %}
        <tr><td>{{ severity }}</td><td>{{ count }}</td>
            <td>{{ "%.1f"|format((count / analysis.total_checks * 100) if analysis.total_checks > 0 else 0) }}%</td></tr>
        {% endfor %}
        </table>

        <h2>Top Security Issues</h2>
        <table border="1">
        <tr><th>Issue</th><th>Severity</th><th>Category</th><th>Region</th><th>Resource</th></tr>
        {% for issue in analysis.top_issues[:20] %}
        <tr><td>{{ issue.title }}</td><td>{{ issue.severity }}</td>
            <td>{{ issue.category }}</td><td>{{ issue.region }}</td><td>{{ issue.resource }}</td></tr>
        {% endfor %}
        </table>

        <h2>Category Breakdown</h2>
        <table border="1">
        <tr><th>Category</th><th>Total Checks</th><th>Failed Checks</th><th>Success Rate</th></tr>
        {% for category, stats in analysis.category_breakdown.items() %}
        <tr><td>{{ category }}</td><td>{{ stats.total }}</td><td>{{ stats.failed }}</td>
            <td>{{ "%.1f"|format(((stats.total - stats.failed) / stats.total * 100) if stats.total > 0 else 0) }}%</td></tr>
        {% endfor %}
        </table>

        <h2>Regional Analysis</h2>
        <table border="1">
        <tr><th>Region</th><th>Total Checks</th><th>Failed Checks</th><th>Success Rate</th></tr>
        {% for region, stats in analysis.region_breakdown.items() %}
        <tr><td>{{ region }}</td><td>{{ stats.total }}</td><td>{{ stats.failed }}</td>
            <td>{{ "%.1f"|format(((stats.total - stats.failed) / stats.total * 100) if stats.total > 0 else 0) }}%</td></tr>
        {% endfor %}
        </table>
        </body>
        </html>
        """
template = Template(html_template)
html_content = template.render(
timestamp=self.timestamp,
analysis=analysis
)
with open(output_file, 'w') as f:
f.write(html_content)
print(f"HTML report generated: {output_file}")
return output_file
def generate_csv_report(self, scan_results, output_file):
"""Generate CSV report"""
with open(output_file, 'w', newline='') as csvfile:
fieldnames = ['title', 'status', 'severity', 'category', 'region', 'resource', 'message']
writer = csv.DictWriter(csvfile, fieldnames=fieldnames)
writer.writeheader()
for result in scan_results:
writer.writerow({
'title': result.get('title', ''),
'status': result.get('status', ''),
'severity': result.get('severity', ''),
'category': result.get('category', ''),
'region': result.get('region', ''),
'resource': result.get('resource', ''),
'message': result.get('message', '')
})
print(f"CSV report generated: {output_file}")
return output_file
def generate_charts(self, analysis, output_dir):
"""Generate charts and visualizations"""
import os
os.makedirs(output_dir, exist_ok=True)
# Severity breakdown pie chart
plt.figure(figsize=(10, 6))
# Filter out zero values
severity_data = {k: v for k, v in analysis['severity_breakdown'].items() if v > 0}
if severity_data:
plt.subplot(1, 2, 1)
plt.pie(severity_data.values(), labels=severity_data.keys(), autopct='%1.1f%%')
plt.title('Findings by Severity')
# Pass/Fail breakdown
plt.subplot(1, 2, 2)
status_data = [analysis['passed_checks'], analysis['failed_checks']]
status_labels = ['Passed', 'Failed']
colors = ['#4CAF50', '#f44336']
plt.pie(status_data, labels=status_labels, colors=colors, autopct='%1.1f%%')
plt.title('Overall Compliance Status')
plt.tight_layout()
chart_file = f"{output_dir}/security_overview.png"
plt.savefig(chart_file)
plt.close()
print(f"Charts generated: {chart_file}")
return chart_file
def send_email_report(self, report_files, email_config):
"""Send report via email"""
msg = MIMEMultipart()
msg['From'] = email_config['from']
msg['To'] = ', '.join(email_config['to'])
msg['Subject'] = f"CloudSploit Security Report - {self.timestamp}"
body = f"""
CloudSploit Security Report
Generated on: {self.timestamp}
Please find the attached security scan reports for review.
Best regards, CloudSploit Automation """
msg.attach(MIMEText(body, 'plain'))
# Attach report files
for report_file in report_files:
with open(report_file, 'rb') as attachment:
part = MIMEBase('application', 'octet-stream')
part.set_payload(attachment.read())
encoders.encode_base64(part)
part.add_header(
'Content-Disposition',
f'attachment; filename= {os.path.basename(report_file)}'
)
msg.attach(part)
# Send email
try:
server = smtplib.SMTP(email_config['smtp_server'], email_config['smtp_port'])
server.starttls()
server.login(email_config['username'], email_config['password'])
text = msg.as_string()
server.sendmail(email_config['from'], email_config['to'], text)
server.quit()
print("Email sent successfully")
except Exception as e:
print(f"Error sending email: {e}")
def main():
    parser = argparse.ArgumentParser(description='CloudSploit Advanced Reporter')
    parser.add_argument('scan_file', help='CloudSploit scan results JSON file')
    parser.add_argument('--output-dir', default='reports', help='Output directory for reports')
    parser.add_argument('--email-config', help='Email configuration file (JSON)')
    parser.add_argument('--charts', action='store_true', help='Generate charts and visualizations')
args = parser.parse_args()
reporter = CloudSploitReporter()
# Load scan results
scan_results = reporter.load_scan_results(args.scan_file)
if not scan_results:
return
# Analyze results
analysis = reporter.analyze_results(scan_results)
# Create output directory
import os
os.makedirs(args.output_dir, exist_ok=True)
# Generate reports
report_files = []
# HTML report
html_file = f"{args.output_dir}/security_report_{reporter.timestamp}.html"
reporter.generate_html_report(analysis, scan_results, html_file)
report_files.append(html_file)
# CSV report
csv_file = f"{args.output_dir}/security_report_{reporter.timestamp}.csv"
reporter.generate_csv_report(scan_results, csv_file)
report_files.append(csv_file)
# Charts
if args.charts:
chart_file = reporter.generate_charts(analysis, args.output_dir)
report_files.append(chart_file)
# Send email if configuration provided
if args.email_config:
with open(args.email_config, 'r') as f:
email_config = json.load(f)
reporter.send_email_report(report_files, email_config)
print(f"Report generation completed. Files: {report_files}")
if __name__ == "__main__":
    main()
```
CI/CD Integration and Automation
GitHub Actions Integration
```yaml
# .github/workflows/cloudsploit-security-scan.yml
name: CloudSploit Security Scan

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  schedule:
    # Run daily at 2 AM UTC
    - cron: '0 2 * * *'

jobs:
  security-scan:
    runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Node.js
uses: actions/setup-node@v3
with:
node-version: '18'
- name: Install CloudSploit
run: |
git clone https://github.com/aquasecurity/cloudsploit.git
cd cloudsploit
npm install
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Run CloudSploit scan
run: |
cd cloudsploit
node index.js --cloud aws --format json --output ../aws_scan_results.json
- name: Generate security report
run: |
python3 -m pip install jinja2 matplotlib pandas
python3 scripts/generate_report.py aws_scan_results.json --output-dir reports --charts
- name: Upload scan results
uses: actions/upload-artifact@v3
with:
name: security-scan-results
path: |
aws_scan_results.json
reports/
- name: Check for critical issues
run: |
python3 << 'EOF'
import json
import sys
with open('aws_scan_results.json', 'r') as f:
results = json.load(f)
critical_issues = [r for r in results if r.get('status') == 'FAIL' and r.get('severity') == 'CRITICAL']
if critical_issues:
print(f"Found {len(critical_issues)} critical security issues!")
for issue in critical_issues[:5]: # Show first 5
print(f"- {issue.get('title', 'Unknown')}: {issue.get('message', 'No message')}")
sys.exit(1)
else:
print("No critical security issues found.")
EOF
- name: Comment PR with results
if: github.event_name == 'pull_request'
uses: actions/github-script@v6
with:
script: |
const fs = require('fs');
const results = JSON.parse(fs.readFileSync('aws_scan_results.json', 'utf8'));
const totalChecks = results.length;
const failedChecks = results.filter(r => r.status === 'FAIL').length;
const passedChecks = totalChecks - failedChecks;
const complianceScore = ((passedChecks / totalChecks) * 100).toFixed(1);
const criticalIssues = results.filter(r => r.status === 'FAIL' && r.severity === 'CRITICAL').length;
const highIssues = results.filter(r => r.status === 'FAIL' && r.severity === 'HIGH').length;
const comment = `## 🔒 CloudSploit Security Scan Results
**Compliance Score:** ${complianceScore}%
**Summary:**
- ✅ Passed: ${passedChecks}
- ❌ Failed: ${failedChecks}
- 🔴 Critical: ${criticalIssues}
- 🟠 High: ${highIssues}
${criticalIssues > 0 ? '⚠️ **Critical security issues found! Please review and remediate.**' : '✅ No critical security issues found.'}
[View detailed report](https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }})`;
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: comment
});
```
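The PR-comment step above derives its compliance score and severity counts from the raw findings list. The same aggregation can be sketched as a standalone Python function; the `status`/`severity` field names follow the scan-output shape assumed throughout the CI examples and are not a documented CloudSploit schema guarantee:

```python
from collections import Counter

def summarize_findings(results):
    """Aggregate CloudSploit-style findings into a scorecard.

    Assumes each finding is a dict with 'status' ('OK'/'FAIL') and
    'severity' ('LOW'/'MEDIUM'/'HIGH'/'CRITICAL') keys, matching the
    JSON output consumed by the CI examples above.
    """
    failed = [r for r in results if r.get("status") == "FAIL"]
    by_severity = Counter(r.get("severity", "UNKNOWN") for r in failed)
    total = len(results)
    # Compliance score: share of passed checks, rounded to one decimal
    score = round(100.0 * (total - len(failed)) / total, 1) if total else 100.0
    return {
        "total": total,
        "failed": len(failed),
        "critical": by_severity.get("CRITICAL", 0),
        "high": by_severity.get("HIGH", 0),
        "compliance_score": score,
    }

# Hypothetical sample findings for illustration
sample = [
    {"status": "OK", "severity": "LOW"},
    {"status": "FAIL", "severity": "CRITICAL"},
    {"status": "FAIL", "severity": "HIGH"},
    {"status": "OK", "severity": "MEDIUM"},
]
print(summarize_findings(sample))
# → {'total': 4, 'failed': 2, 'critical': 1, 'high': 1, 'compliance_score': 50.0}
```

The same dictionary can feed both the gate step (exit on `critical > 0`) and the PR comment body.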
GitLab CI Integration
```yaml
# .gitlab-ci.yml
stages:
  - security-scan
  - report
  - notify

variables:
  CLOUDSPLOIT_VERSION: "latest"

security-scan:
  stage: security-scan
  image: node:18
  before_script:
    - apt-get update && apt-get install -y python3 python3-pip
    - git clone https://github.com/aquasecurity/cloudsploit.git
    - cd cloudsploit && npm install
  script:
    - cd cloudsploit
    - node index.js --cloud aws --format json --output ../aws_scan_results.json
    - node index.js --cloud azure --format json --output ../azure_scan_results.json || true
    - node index.js --cloud gcp --format json --output ../gcp_scan_results.json || true
  artifacts:
    reports:
      junit: scan_results.xml
    paths:
      - "*_scan_results.json"
    expire_in: 1 week
  only:
    - main
    - develop
    - merge_requests

generate-report:
  stage: report
  image: python:3.9
  dependencies:
    - security-scan
  before_script:
    - pip install jinja2 matplotlib pandas
  script:
    - python3 scripts/generate_report.py aws_scan_results.json --output-dir reports --charts
    - python3 scripts/generate_compliance_report.py aws_scan_results.json --framework cis
  artifacts:
    paths:
      - reports/
    expire_in: 1 month

security-notification:
  stage: notify
  image: alpine:latest
  dependencies:
    - security-scan
  before_script:
    - apk add --no-cache curl jq
  script:
    - |
      CRITICAL_COUNT=$(jq '[.[] | select(.status == "FAIL" and .severity == "CRITICAL")] | length' aws_scan_results.json)
      HIGH_COUNT=$(jq '[.[] | select(.status == "FAIL" and .severity == "HIGH")] | length' aws_scan_results.json)
      if [ "$CRITICAL_COUNT" -gt 0 ] || [ "$HIGH_COUNT" -gt 5 ]; then
        curl -X POST -H 'Content-type: application/json' \
          --data "{\"text\": \"🚨 Security Alert: $CRITICAL_COUNT critical and $HIGH_COUNT high severity issues found in $CI_PROJECT_NAME\"}" \
          $SLACK_WEBHOOK_URL
      fi
  only:
    - main
```
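The GitLab job above declares a `junit: scan_results.xml` artifact, but no step in the pipeline actually produces that file. A minimal converter could be added as a script step; this is a sketch, and the `title`/`status`/`message` field names are assumptions about the scan output rather than a documented schema:

```python
import xml.etree.ElementTree as ET

def findings_to_junit(findings, suite_name="cloudsploit"):
    """Convert CloudSploit-style findings into a minimal JUnit XML tree.

    FAIL findings become <failure> elements, so GitLab can surface them
    on the merge-request widget. Field names ('title', 'status',
    'message') are assumed, not guaranteed by CloudSploit.
    """
    suite = ET.Element("testsuite", name=suite_name, tests=str(len(findings)))
    for f in findings:
        case = ET.SubElement(suite, "testcase", name=f.get("title", "unknown"))
        if f.get("status") == "FAIL":
            failure = ET.SubElement(case, "failure", message=f.get("message", ""))
            failure.text = f.get("message", "")
    return ET.ElementTree(suite)

# Hypothetical findings for illustration
findings = [
    {"title": "S3 bucket public", "status": "FAIL", "message": "Bucket allows public read"},
    {"title": "Root MFA enabled", "status": "OK", "message": "MFA is enabled"},
]
print(ET.tostring(findings_to_junit(findings).getroot()).decode())
```

In the pipeline, the tree would be written with `tree.write("scan_results.xml", encoding="utf-8", xml_declaration=True)` before the job ends.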
Jenkins Pipeline
```groovy
// Jenkinsfile
pipeline {
    agent any
parameters {
choice(
name: 'CLOUD_PROVIDER',
choices: ['aws', 'azure', 'gcp', 'all'],
description: 'Cloud provider to scan'
)
booleanParam(
name: 'GENERATE_CHARTS',
defaultValue: true,
description: 'Generate charts and visualizations'
)
}
environment {
CLOUDSPLOIT_DIR = 'cloudsploit'
REPORTS_DIR = 'reports'
TIMESTAMP = sh(script: 'date +%Y%m%d_%H%M%S', returnStdout: true).trim()
}
stages {
stage('Setup') {
steps {
script {
// Clean workspace
deleteDir()
// Clone CloudSploit
sh '''
git clone https://github.com/aquasecurity/cloudsploit.git
cd cloudsploit
npm install
'''
}
}
}
stage('Security Scan') {
parallel {
stage('AWS Scan') {
when {
    anyOf {
        expression { params.CLOUD_PROVIDER == 'aws' }
        expression { params.CLOUD_PROVIDER == 'all' }
    }
}
steps {
withCredentials([
[$class: 'AmazonWebServicesCredentialsBinding',
credentialsId: 'aws-credentials']
]) {
sh '''
cd cloudsploit
node index.js --cloud aws --format json --output ../aws_scan_${TIMESTAMP}.json
'''
}
}
}
stage('Azure Scan') {
when {
    anyOf {
        expression { params.CLOUD_PROVIDER == 'azure' }
        expression { params.CLOUD_PROVIDER == 'all' }
    }
}
steps {
withCredentials([
azureServicePrincipal('azure-credentials')
]) {
sh '''
cd cloudsploit
node index.js --cloud azure --format json --output ../azure_scan_${TIMESTAMP}.json
'''
}
}
}
stage('GCP Scan') {
when {
    anyOf {
        expression { params.CLOUD_PROVIDER == 'gcp' }
        expression { params.CLOUD_PROVIDER == 'all' }
    }
}
steps {
withCredentials([
file(credentialsId: 'gcp-service-account', variable: 'GOOGLE_APPLICATION_CREDENTIALS')
]) {
sh '''
cd cloudsploit
node index.js --cloud gcp --format json --output ../gcp_scan_${TIMESTAMP}.json
'''
}
}
}
}
}
stage('Generate Reports') {
steps {
script {
// Use a Groovy-interpolated string so params.GENERATE_CHARTS resolves;
// shell variables are escaped with \$
def chartsFlag = params.GENERATE_CHARTS ? '--charts' : ''
sh """
    mkdir -p ${REPORTS_DIR}
    # Install Python dependencies
    pip3 install jinja2 matplotlib pandas
    # Generate scan and compliance reports for each result file
    for scan_file in *_scan_${TIMESTAMP}.json; do
        if [ -f "\$scan_file" ]; then
            echo "Generating report for \$scan_file"
            python3 scripts/generate_report.py "\$scan_file" \\
                --output-dir "${REPORTS_DIR}" ${chartsFlag}
            python3 scripts/generate_compliance_report.py "\$scan_file" \\
                --framework cis \\
                --output-dir "${REPORTS_DIR}"
        fi
    done
"""
}
}
}
stage('Security Gate') {
steps {
script {
    // Check each scan result and fail the build on critical findings
    sh '''
        for scan_file in *_scan_${TIMESTAMP}.json; do
            if [ -f "$scan_file" ]; then
                critical=$(jq '[.[] | select(.status == "FAIL" and .severity == "CRITICAL")] | length' "$scan_file")
                high=$(jq '[.[] | select(.status == "FAIL" and .severity == "HIGH")] | length' "$scan_file")
echo "File: $scan_file - Critical: $critical, High: $high"
# Fail build if critical issues found
if [ "$critical" -gt 0 ]; then
echo "CRITICAL SECURITY ISSUES FOUND!"
exit 1
fi
# Warn if too many high issues
if [ "$high" -gt 10 ]; then
echo "WARNING: High number of high-severity issues found!"
fi
fi
done
'''
}
}
}
stage('Archive Results') {
steps {
archiveArtifacts artifacts: '*_scan_*.json, reports/**/*', fingerprint: true
publishHTML([
allowMissing: false,
alwaysLinkToLastBuild: true,
keepAll: true,
reportDir: 'reports',
reportFiles: '*.html',
reportName: 'CloudSploit Security Report'
])
}
}
stage('Notify') {
steps {
script {
// Send Slack notification
def scanResults = readJSON file: "aws_scan_${env.TIMESTAMP}.json"
def totalChecks = scanResults.size()
def failedChecks = scanResults.count { it.status == 'FAIL' }
def complianceScore = ((totalChecks - failedChecks) / totalChecks * 100).round(1)
slackSend(
channel: '#security',
color: complianceScore > 90 ? 'good' : complianceScore > 70 ? 'warning' : 'danger',
message: """CloudSploit Security Scan Completed
Project: ${env.JOB_NAME}
Build: ${env.BUILD_NUMBER}
Compliance Score: ${complianceScore}%
Failed Checks: ${failedChecks}/${totalChecks}
Report: ${env.BUILD_URL}CloudSploit_Security_Report/""".trim()
                    )
                }
            }
        }
    }
post {
always {
cleanWs()
}
failure {
emailext(
subject: "CloudSploit Security Scan Failed - ${env.JOB_NAME} #${env.BUILD_NUMBER}",
body: """The CloudSploit security scan has failed.

Project: ${env.JOB_NAME}
Build Number: ${env.BUILD_NUMBER}
Build URL: ${env.BUILD_URL}

Please check the build logs for more details.""",
                to: "${env.CHANGE_AUTHOR_EMAIL ?: 'security-team@company.com'}"
            )
        }
    }
}
```
Performance Optimization and Troubleshooting
Performance Optimization
```bash
#!/bin/bash
# CloudSploit performance optimization

optimize_cloudsploit_performance() {
    echo "Optimizing CloudSploit performance..."
# 1. Node.js optimization
export NODE_OPTIONS="--max-old-space-size=4096"
export UV_THREADPOOL_SIZE=128
# 2. Create optimized configuration
cat > config/performance.js << 'EOF'
module.exports = {
    // Performance settings
    settings: {
        // Increase timeout for slow APIs
        timeout: 120000,
// Increase parallelism
parallelism: 20,
// Retry configuration
retries: 3,
retryDelay: 2000,
// Memory optimization
maxMemoryUsage: '4GB',
// Cache settings
enableCache: true,
cacheTimeout: 300000
},
// Region optimization
regions: {
aws: [
'us-east-1', 'us-west-2', 'eu-west-1' // Limit to essential regions
],
azure: [
'eastus', 'westus2', 'westeurope'
],
gcp: [
'us-central1', 'us-west1', 'europe-west1'
]
},
// Plugin optimization
plugins: {
// Disable slow or unnecessary plugins
disabled: [
'cloudtrail/cloudtrailFileValidation', // Slow plugin
'ec2/classicInstances', // Legacy check
's3/bucketDnsCompliantName' // Non-critical
],
// Plugin-specific timeouts
timeouts: {
'iam/accessKeys': 60000,
'ec2/instances': 120000,
's3/buckets': 90000
}
}
};
EOF
# 3. Create performance monitoring script
cat > monitor_performance.sh << 'EOF'
#!/bin/bash
# Monitor CloudSploit performance

SCAN_START=$(date +%s)
MEMORY_LOG="memory_usage.log"
CPU_LOG="cpu_usage.log"

# Sample node process resource usage every 5 seconds
monitor_resources() {
    while true; do
        # Memory usage (RSS in KB)
        ps aux | grep "node.*index.js" | grep -v grep | awk '{print $6}' >> "$MEMORY_LOG"
        # CPU usage (%)
        ps aux | grep "node.*index.js" | grep -v grep | awk '{print $3}' >> "$CPU_LOG"
        sleep 5
    done
}
# Start monitoring in the background
monitor_resources &
MONITOR_PID=$!
# Run CloudSploit scan
node index.js --cloud aws --config config/performance.js --format json --output performance_scan.json

# Stop monitoring
kill $MONITOR_PID

SCAN_END=$(date +%s)
SCAN_DURATION=$((SCAN_END - SCAN_START))
echo "Scan completed in $SCAN_DURATION seconds"

# Analyze performance
python3 << 'PYTHON'
import matplotlib.pyplot as plt
import numpy as np

try:
    # Read memory usage (RSS in KB, one sample per line)
    with open('memory_usage.log', 'r') as f:
        memory_data = [int(line.strip()) for line in f if line.strip()]

    # Convert to MB
    memory_mb = [m / 1024 for m in memory_data]

    # Read CPU usage
    with open('cpu_usage.log', 'r') as f:
        cpu_data = [float(line.strip()) for line in f if line.strip()]

    # Create performance charts
    fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(12, 8))

    # Memory usage chart
    ax1.plot(memory_mb)
    ax1.set_title('Memory Usage During Scan')
    ax1.set_ylabel('Memory (MB)')
    ax1.grid(True)

    # CPU usage chart
    ax2.plot(cpu_data)
    ax2.set_title('CPU Usage During Scan')
    ax2.set_ylabel('CPU (%)')
    ax2.set_xlabel('Time (5-second intervals)')
    ax2.grid(True)

    plt.tight_layout()
    plt.savefig('performance_analysis.png')

    # Print statistics
    print(f"Average Memory Usage: {np.mean(memory_mb):.1f} MB")
    print(f"Peak Memory Usage: {np.max(memory_mb):.1f} MB")
    print(f"Average CPU Usage: {np.mean(cpu_data):.1f}%")
    print(f"Peak CPU Usage: {np.max(cpu_data):.1f}%")
except Exception as e:
    print(f"Error analyzing performance: {e}")
PYTHON

# Cleanup
rm -f memory_usage.log cpu_usage.log
EOF
chmod +x monitor_performance.sh
echo "Performance optimization setup complete"
}
# Memory optimization
optimize_memory_usage() {
    echo "Optimizing memory usage..."
# Create memory-optimized scanning script
cat > memory_optimized_scan.sh << 'EOF'
#!/bin/bash
# Memory-optimized CloudSploit scanning

# Set memory limits
export NODE_OPTIONS="--max-old-space-size=2048 --optimize-for-size"

# Scan in batches to reduce memory usage
REGIONS=("us-east-1" "us-west-2" "eu-west-1" "ap-southeast-1")
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
OUTPUT_DIR="scans_$TIMESTAMP"

mkdir -p "$OUTPUT_DIR"

echo "Starting memory-optimized scanning..."

for region in "${REGIONS[@]}"; do
    echo "Scanning region: $region"
# Scan single region to limit memory usage
node index.js \
--cloud aws \
--region "$region" \
--format json \
--output "$OUTPUT_DIR/aws_${region}_scan.json"
# Small delay to allow garbage collection
sleep 10
done
# Combine per-region results into a single file
# (unquoted heredoc so the shell expands $OUTPUT_DIR)
echo "Combining scan results..."
python3 << PYTHON
import json
import glob

combined_results = []

for scan_file in glob.glob("$OUTPUT_DIR/aws_*_scan.json"):
    try:
        with open(scan_file, 'r') as f:
            data = json.load(f)
        combined_results.extend(data)
    except Exception as e:
        print(f"Error reading {scan_file}: {e}")

# Save combined results
with open("$OUTPUT_DIR/aws_combined_scan.json", 'w') as f:
    json.dump(combined_results, f, indent=2)

print(f"Combined {len(combined_results)} results")
PYTHON

echo "Memory-optimized scan completed. Results in: $OUTPUT_DIR"
EOF
chmod +x memory_optimized_scan.sh
echo "Memory optimization complete"
}
# Run optimizations
optimize_cloudsploit_performance
optimize_memory_usage
```
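One side effect of merging per-region scans, as the script above does, is that global checks (IAM, root account, CloudFront) report the same finding once per region. A small deduplication pass can be sketched as follows; the `(plugin, region, resource, status)` key is an assumption about the result shape, not a CloudSploit guarantee:

```python
def dedupe_findings(findings):
    """Drop duplicate findings produced by merging per-region scans.

    Keys on (plugin, region, resource, status) — assumed fields, chosen
    so that genuinely distinct resources are never collapsed.
    """
    seen = set()
    unique = []
    for f in findings:
        key = (f.get("plugin"), f.get("region"), f.get("resource"), f.get("status"))
        if key not in seen:
            seen.add(key)
            unique.append(f)  # keep first occurrence, preserve order
    return unique

# Hypothetical merged results: the global IAM check appears twice
merged = [
    {"plugin": "rootAccessKeys", "region": "global", "resource": "root", "status": "OK"},
    {"plugin": "rootAccessKeys", "region": "global", "resource": "root", "status": "OK"},
    {"plugin": "bucketEncryption", "region": "us-east-1", "resource": "logs", "status": "FAIL"},
]
print(len(dedupe_findings(merged)))  # → 2
```

Running this after the combine step keeps severity counts (and therefore compliance scores) from being inflated by region fan-out.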
Troubleshooting Guide
```bash
#!/bin/bash
# CloudSploit troubleshooting guide

troubleshoot_cloudsploit() {
    echo "CloudSploit Troubleshooting Guide"
    echo "================================="
# Check Node.js installation
if ! command -v node &> /dev/null; then
echo "❌ Node.js not found"
echo "Solution: Install Node.js 14 or later"
echo " curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -"
echo " sudo apt-get install -y nodejs"
return 1
fi
node_version=$(node --version | sed 's/v//')
echo "✅ Node.js found: v$node_version"
# Check npm installation
if ! command -v npm &> /dev/null; then
echo "❌ npm not found"
echo "Solution: Install npm"
echo " sudo apt-get install -y npm"
return 1
fi
echo "✅ npm found: $(npm --version)"
# Check CloudSploit installation
if [ ! -f "index.js" ]; then
echo "❌ CloudSploit not found in current directory"
echo "Solution: Clone CloudSploit repository"
echo " git clone https://github.com/aquasecurity/cloudsploit.git"
echo " cd cloudsploit && npm install"
return 1
fi
echo "✅ CloudSploit found"
# Check dependencies
if [ ! -d "node_modules" ]; then
echo "⚠️ Dependencies not installed"
echo "Solution: Install dependencies"
echo " npm install"
else
echo "✅ Dependencies installed"
fi
# Check cloud credentials
echo ""
echo "Checking cloud credentials..."
# AWS credentials
if aws sts get-caller-identity > /dev/null 2>&1; then
echo "✅ AWS credentials configured"
else
echo "⚠️ AWS credentials not configured or invalid"
echo "Solution: Configure AWS credentials"
echo " aws configure"
echo " or set environment variables:"
echo " export AWS_ACCESS_KEY_ID=your-key"
echo " export AWS_SECRET_ACCESS_KEY=your-secret"
fi
# Azure credentials
if az account show > /dev/null 2>&1; then
echo "✅ Azure credentials configured"
else
echo "⚠️ Azure credentials not configured"
echo "Solution: Login to Azure"
echo " az login"
fi
# GCP credentials
if [ -n "$(gcloud auth list --filter=status:ACTIVE --format='value(account)' 2>/dev/null | head -n1)" ]; then
echo "✅ GCP credentials configured"
else
echo "⚠️ GCP credentials not configured"
echo "Solution: Authenticate with GCP"
echo " gcloud auth login"
echo " or set service account key:"
echo " export GOOGLE_APPLICATION_CREDENTIALS=path/to/key.json"
fi
# Test basic functionality
echo ""
echo "Testing basic functionality..."
if node index.js --help > /dev/null 2>&1; then
echo "✅ CloudSploit help command works"
else
echo "❌ CloudSploit help command failed"
echo "Solution: Check Node.js version and dependencies"
fi
# Check system resources
echo ""
echo "Checking system resources..."
available_memory=$(free -m | awk 'NR==2{printf "%.1f", $7/1024}')
if (( $(echo "$available_memory < 1.0" | bc -l) )); then
echo "⚠️ Low available memory: ${available_memory}GB"
echo "Recommendation: Ensure at least 2GB available memory for large scans"
else
echo "✅ Available memory: ${available_memory}GB"
fi
# Check disk space
disk_usage=$(df . | tail -1 | awk '{print $5}' | sed 's/%//')
if [ "$disk_usage" -gt 90 ]; then
    echo "⚠️ High disk usage: ${disk_usage}%"
    echo "Solution: Free up disk space"
else
    echo "✅ Disk usage: ${disk_usage}%"
fi
echo ""
echo "Troubleshooting completed"
}
# Common error solutions
fix_common_errors() {
    echo "Common CloudSploit Errors and Solutions"
    echo "======================================"

    cat << 'EOF'
- "Error: Cannot find module 'xyz'"
  Solution:
  - Run: npm install
  - If still failing: rm -rf node_modules && npm install

- "AWS credentials not configured"
  Solution:
  - Run: aws configure
  - Or set environment variables:
    export AWS_ACCESS_KEY_ID=your-key
    export AWS_SECRET_ACCESS_KEY=your-secret

- "Request timeout" or "Connection timeout"
  Solution:
  - Increase timeout in configuration
  - Check internet connectivity
  - Reduce parallelism: --parallel 5

- "Memory allocation failed" or "Out of memory"
  Solution:
  - Increase Node.js memory: export NODE_OPTIONS="--max-old-space-size=4096"
  - Scan fewer regions at once
  - Use memory-optimized scanning script

- "Permission denied" errors
  Solution:
  - Check IAM permissions for AWS
  - Ensure service principal has required roles for Azure
  - Verify service account permissions for GCP

- "Plugin not found" or "Plugin failed"
  Solution:
  - Check plugin name spelling
  - Update CloudSploit: git pull && npm install
  - Disable problematic plugins: --ignore-plugins plugin-name

- "JSON parse error" in output
  Solution:
  - Check for mixed output formats
  - Ensure clean JSON output: --format json
  - Redirect stderr: 2>/dev/null

- "Rate limiting" or "API throttling"
  Solution:
  - Reduce parallelism: --parallel 5
  - Add delays between API calls
  - Use multiple API keys/accounts

- "SSL/TLS certificate errors"
  Solution:
  - Update Node.js and npm
  - Set NODE_TLS_REJECT_UNAUTHORIZED=0 (not recommended for production)
  - Update system certificates

- "Scan takes too long" or "Hangs"
  Solution:
  - Use region filtering: --region us-east-1
  - Disable slow plugins
  - Monitor with timeout: timeout 1800 node index.js ...
EOF
}
# Performance diagnostics
diagnose_performance() {
    echo "Diagnosing CloudSploit Performance"
    echo "=================================="
# Test scan performance
echo "Running performance test..."
start_time=$(date +%s.%N)
# Run a simple scan
timeout 60 node index.js --cloud aws --region us-east-1 --plugins iam/rootAccessKeys --format json > /dev/null 2>&1
exit_code=$?
end_time=$(date +%s.%N)
duration=$(echo "$end_time - $start_time" | bc)
if [ $exit_code -eq 0 ]; then
echo "✅ Performance test completed in ${duration}s"
elif [ $exit_code -eq 124 ]; then
echo "⚠️ Performance test timed out (>60s)"
echo "Recommendation: Check network connectivity and API performance"
else
echo "❌ Performance test failed"
echo "Recommendation: Check configuration and credentials"
fi
# Check Node.js performance
node_memory=$(node -e "console.log(process.memoryUsage().heapUsed / 1024 / 1024)")
echo "Node.js memory usage: ${node_memory}MB"
# System load
load_avg=$(uptime | awk -F'load average:' '{print $2}' | awk '{print $1}' | sed 's/,//')
echo "System load average: $load_avg"
# Recommendations
echo ""
echo "Performance Recommendations:"
echo "- Use region filtering for faster scans"
echo "- Disable unnecessary plugins"
echo "- Increase Node.js memory for large environments"
echo "- Use SSD storage for better I/O performance"
echo "- Monitor API rate limits"
}
# Main troubleshooting function
main() {
    troubleshoot_cloudsploit
    echo ""
    fix_common_errors
    echo ""
    diagnose_performance
}

# Run troubleshooting
main
```
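The "JSON parse error" entry above recommends redirecting stderr, but when a results file has already been polluted with log lines, the JSON can often still be recovered. A hedged sketch of that salvage step (it assumes the scanner emitted one valid JSON document somewhere in the stream):

```python
import json

def extract_json(mixed_output):
    """Recover the first valid JSON document embedded in mixed
    scanner/log output. Tries each '[' or '{' position until
    raw_decode succeeds, ignoring any trailing log lines."""
    decoder = json.JSONDecoder()
    for i, ch in enumerate(mixed_output):
        if ch in "[{":
            try:
                obj, _ = decoder.raw_decode(mixed_output[i:])
                return obj
            except ValueError:
                continue  # this bracket was part of a log line, keep scanning
    raise ValueError("no JSON document found in output")

# Hypothetical polluted output for illustration
out = 'INFO: loading plugins...\n[{"status": "OK", "plugin": "rootAccessKeys"}]\nINFO: done\n'
print(extract_json(out))
# → [{'status': 'OK', 'plugin': 'rootAccessKeys'}]
```

This is a recovery aid only; the clean fix remains `--format json` with stderr redirected at scan time.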
Resources and Documentation
Official Resources
- CloudSploit GitHub Repository - source code and documentation
- Aqua Security Documentation - official Aqua Security documentation
- CloudSploit Plugin Documentation - plugin reference
- CloudSploit Wiki - community documentation
Community Resources
- CloudSploit Issues - bug reports and feature requests
- Aqua Security Community - community forums and resources
- Cloud Security Alliance - cloud security best practices
- OWASP Cloud Security - cloud security guidelines
Integration Examples
- CI/CD Integration Examples - official integration examples
- Terraform Integration - Terraform provider for Aqua Security
- Kubernetes Integration - Kubernetes security scanning
- Docker Integration - container security scanning