Prowler Cheat Sheet

Overview

Prowler is an open-source security tool for AWS security assessments, audits, incident response, continuous monitoring, hardening, and forensics readiness. It contains hundreds of controls covering CIS, PCI DSS, ISO 27001, GDPR, HIPAA, FFIEC, SOC 2, AWS FTR, ENS, and custom security frameworks. Prowler is a command-line tool that uses the AWS APIs to check security best practices.

Key features: 300+ security checks, multiple compliance frameworks, multi-account support, custom checks, detailed reporting, CI/CD integration, and rich output formats including JSON, CSV, HTML, and JUnit XML.
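Because the JSON output is simply a list of finding objects with `Status` and `Severity` fields, post-scan triage can be scripted in a few lines. A minimal sketch (the `summarize_findings` helper and the inline sample data are illustrative, not part of Prowler — in practice you would `json.load` a report file):

```python
from collections import Counter

def summarize_findings(findings):
    """Count failed checks per severity in a Prowler JSON export.

    Assumes each finding carries "Status" and "Severity" keys, as in
    Prowler's JSON output format.
    """
    return Counter(
        f.get("Severity", "UNKNOWN")
        for f in findings
        if f.get("Status") == "FAIL"
    )

# Inline sample instead of a real scan file:
sample = [
    {"Status": "FAIL", "Severity": "HIGH"},
    {"Status": "PASS", "Severity": "LOW"},
    {"Status": "FAIL", "Severity": "HIGH"},
    {"Status": "FAIL", "Severity": "CRITICAL"},
]
print(dict(summarize_findings(sample)))  # → {'HIGH': 2, 'CRITICAL': 1}
```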

Installation and Setup

Python Package Installation

```bash
# Install Python 3.9+ (required)
python3 --version

# Install Prowler via pip
pip3 install prowler

# Alternative: install with all dependencies
pip3 install "prowler[all]"

# Verify installation
prowler --version

# Update Prowler
pip3 install --upgrade prowler

# Install a specific version
pip3 install prowler==3.12.0
```

Docker Installation

```bash
# Pull the Prowler Docker image
docker pull toniblyx/prowler:latest

# Run Prowler in Docker
docker run --rm -it \
  -v ~/.aws:/root/.aws \
  toniblyx/prowler:latest \
  aws --help

# Create a Docker alias for easier usage
echo 'alias prowler="docker run --rm -it -v ~/.aws:/root/.aws -v $(pwd):/prowler/output toniblyx/prowler:latest"' >> ~/.bashrc
source ~/.bashrc

# Run with volume mounts for output
docker run --rm -it \
  -v ~/.aws:/root/.aws \
  -v $(pwd)/output:/prowler/output \
  toniblyx/prowler:latest \
  aws --output-directory /prowler/output

# Create a Docker Compose file
cat > docker-compose.yml << 'EOF'
version: '3.8'
services:
  prowler:
    image: toniblyx/prowler:latest
    volumes:
      - ~/.aws:/root/.aws
      - ./output:/prowler/output
    environment:
      - AWS_PROFILE=default
      - AWS_DEFAULT_REGION=us-east-1
EOF

# Run with Docker Compose
docker-compose run prowler aws --output-directory /prowler/output
```

Installation from Source

```bash
# Clone the Prowler repository
git clone https://github.com/prowler-cloud/prowler.git
cd prowler

# Install dependencies
pip3 install -r requirements.txt

# Make the entry point executable
chmod +x prowler

# Add to PATH
echo 'export PATH=$PATH:'$(pwd) >> ~/.bashrc
source ~/.bashrc

# Verify installation
./prowler --version

# Alternative: create a symbolic link
sudo ln -sf $(pwd)/prowler /usr/local/bin/prowler
```

AWS Configuration

```bash
# Install the AWS CLI
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

# Configure AWS credentials
aws configure
# Enter: Access Key ID, Secret Access Key, Region, Output format

# Alternative: use environment variables
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_DEFAULT_REGION="us-east-1"

# Alternative: use IAM roles (recommended for EC2)
# Attach an IAM role with the required permissions to the EC2 instance

# Test the AWS configuration
aws sts get-caller-identity

# Configure multiple profiles
aws configure --profile production
aws configure --profile development
aws configure --profile staging

# List configured profiles
aws configure list-profiles

# Set the default profile
export AWS_PROFILE=production

# Required AWS permissions for Prowler (read-only; note the "*" wildcard
# suffixes on action prefixes and "Resource": "*", which IAM requires)
cat > prowler-policy.json << 'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "access-analyzer:List*", "account:Get*", "acm:Describe*", "acm:List*",
        "apigateway:GET", "application-autoscaling:Describe*",
        "appstream:Describe*", "appstream:List*", "autoscaling:Describe*",
        "backup:List*", "cloudformation:Describe*", "cloudformation:Get*",
        "cloudformation:List*", "cloudfront:Get*", "cloudfront:List*",
        "cloudtrail:Describe*", "cloudtrail:Get*", "cloudtrail:List*",
        "cloudwatch:Describe*", "cloudwatch:Get*", "cloudwatch:List*",
        "codebuild:List*", "config:Describe*", "config:Get*", "config:List*",
        "dax:Describe*", "dax:List*", "directconnect:Describe*",
        "dms:Describe*", "dms:List*", "ds:Describe*", "ds:Get*", "ds:List*",
        "dynamodb:Describe*", "dynamodb:List*", "ec2:Describe*", "ec2:Get*",
        "ecr:Describe*", "ecr:Get*", "ecr:List*", "ecs:Describe*", "ecs:List*",
        "efs:Describe*", "eks:Describe*", "eks:List*", "elasticache:Describe*",
        "elasticbeanstalk:Describe*", "elasticfilesystem:Describe*",
        "elasticloadbalancing:Describe*", "elasticmapreduce:Describe*",
        "elasticmapreduce:List*", "es:Describe*", "es:List*",
        "events:Describe*", "events:List*", "firehose:Describe*",
        "firehose:List*", "fsx:Describe*", "fsx:List*", "glue:Get*",
        "glue:List*", "guardduty:Get*", "guardduty:List*", "iam:Generate*",
        "iam:Get*", "iam:List*", "iam:Simulate*", "inspector:Describe*",
        "inspector:Get*", "inspector:List*", "kinesis:Describe*",
        "kinesis:List*", "kms:Describe*", "kms:Get*", "kms:List*",
        "lambda:Get*", "lambda:List*", "logs:Describe*", "logs:Get*",
        "logs:List*", "macie2:Get*", "macie2:List*", "organizations:Describe*",
        "organizations:List*", "rds:Describe*", "rds:List*",
        "redshift:Describe*", "route53:Get*", "route53:List*",
        "route53domains:Get*", "route53domains:List*", "s3:Get*", "s3:List*",
        "sagemaker:Describe*", "sagemaker:List*", "secretsmanager:Describe*",
        "secretsmanager:Get*", "secretsmanager:List*", "securityhub:Describe*",
        "securityhub:Get*", "securityhub:List*", "ses:Get*", "ses:List*",
        "shield:Describe*", "shield:Get*", "shield:List*", "sns:Get*",
        "sns:List*", "sqs:Get*", "sqs:List*", "ssm:Describe*", "ssm:Get*",
        "ssm:List*", "support:Describe*", "trustedadvisor:Describe*",
        "waf:Get*", "waf:List*", "wafv2:Get*", "wafv2:List*",
        "workspaces:Describe*"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Create the IAM policy
aws iam create-policy \
  --policy-name ProwlerPolicy \
  --policy-document file://prowler-policy.json

# Attach the policy to a user
aws iam attach-user-policy \
  --user-name your-username \
  --policy-arn arn:aws:iam::ACCOUNT-ID:policy/ProwlerPolicy

# Create an IAM role for cross-account access
aws iam create-role \
  --role-name ProwlerRole \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": { "AWS": "arn:aws:iam::TRUSTED-ACCOUNT-ID:root" },
        "Action": "sts:AssumeRole"
      }
    ]
  }'

# Attach the policy to the role
aws iam attach-role-policy \
  --role-name ProwlerRole \
  --policy-arn arn:aws:iam::ACCOUNT-ID:policy/ProwlerPolicy
```

Basic Usage and Scanning

Simple Scans

```bash
# Basic AWS scan (all checks)
prowler aws

# Scan a specific region
prowler aws --region us-east-1

# Scan multiple regions (space-separated)
prowler aws --region us-east-1 us-west-2 eu-west-1

# Scan with a specific profile
prowler aws --profile production

# Quick scan (critical findings only)
prowler aws --severity critical

# Scan specific services
prowler aws --services s3
prowler aws --services ec2
prowler aws --services iam

# Scan multiple services
prowler aws --services s3 ec2 iam vpc

# List available services
prowler aws --list-services
```

Advanced Scan Options

```bash
# Run specific checks
prowler aws --checks s3_bucket_public_access_block
prowler aws --checks ec2_instance_public_ip
prowler aws --checks iam_root_access_key_check

# Run checks matching a pattern
prowler aws --checks "s3*"
prowler aws --checks "*encryption*"
prowler aws --checks "*public*"

# Exclude specific checks
prowler aws --excluded-checks s3_bucket_public_access_block
prowler aws --excluded-checks "*logging*"

# Scan against a compliance framework
prowler aws --compliance cis_1.5_aws
prowler aws --compliance pci_3.2.1_aws
prowler aws --compliance iso27001_2013_aws
prowler aws --compliance gdpr_aws
prowler aws --compliance hipaa_aws
prowler aws --compliance soc2_aws

# List available compliance frameworks
prowler aws --list-compliance

# Filter by severity
prowler aws --severity critical
prowler aws --severity high critical
prowler aws --severity medium high critical

# Scan with a custom configuration
prowler aws --config-file custom_config.yaml

# Filter resources by tag
prowler aws --resource-tags Environment=Production
prowler aws --resource-tags Owner=SecurityTeam Environment=Production

# Output options
prowler aws --output-formats json
prowler aws --output-formats csv html json
prowler aws --output-directory ./reports
prowler aws --output-filename custom_report
```

Multi-Account Scanning

```bash
# Scan using an assumed role
prowler aws --role arn:aws:iam::123456789012:role/ProwlerRole

# Define the accounts to scan
cat > accounts.txt << 'EOF'
123456789012:arn:aws:iam::123456789012:role/ProwlerRole
234567890123:arn:aws:iam::234567890123:role/ProwlerRole
345678901234:arn:aws:iam::345678901234:role/ProwlerRole
EOF

# Use an AWS Organizations role during the scan
prowler aws --organizations-role ProwlerRole

# Create a multi-account scanning script
cat > multi_account_scan.sh << 'EOF'
#!/bin/bash
# Multi-account Prowler scanning

ACCOUNTS=(
  "123456789012:arn:aws:iam::123456789012:role/ProwlerRole"
  "234567890123:arn:aws:iam::234567890123:role/ProwlerRole"
  "345678901234:arn:aws:iam::345678901234:role/ProwlerRole"
)

TIMESTAMP=$(date +%Y%m%d_%H%M%S)
OUTPUT_DIR="multi_account_scan_$TIMESTAMP"

mkdir -p "$OUTPUT_DIR"

echo "Starting multi-account Prowler scan..."

for account_info in "${ACCOUNTS[@]}"; do
    account_id=$(echo "$account_info" | cut -d':' -f1)
    role_arn=$(echo "$account_info" | cut -d':' -f2-)

    echo "Scanning account: $account_id"

    prowler aws \
        --role "$role_arn" \
        --output-directory "$OUTPUT_DIR" \
        --output-filename "prowler_${account_id}" \
        --output-formats json csv html

    echo "Completed scan for account: $account_id"
done

# Generate a summary report (unquoted heredoc delimiter so that
# $TIMESTAMP and $OUTPUT_DIR expand into the Python code)
python3 << PYTHON
import glob
import json
import os

def generate_summary(output_dir):
    summary = {
        "timestamp": "$TIMESTAMP",
        "accounts": {},
        "total_findings": 0,
        "severity_breakdown": {"CRITICAL": 0, "HIGH": 0, "MEDIUM": 0, "LOW": 0, "INFO": 0}
    }

    for json_file in glob.glob(f"{output_dir}/prowler_*.json"):
        account_id = os.path.basename(json_file).replace("prowler_", "").replace(".json", "")

        try:
            with open(json_file, 'r') as f:
                data = json.load(f)

            findings = 0
            for result in data:
                if result.get("Status") == "FAIL":
                    findings += 1
                    severity = result.get("Severity", "UNKNOWN")
                    if severity in summary["severity_breakdown"]:
                        summary["severity_breakdown"][severity] += 1

            summary["accounts"][account_id] = {
                "findings": findings,
                "total_checks": len(data)
            }
            summary["total_findings"] += findings

        except Exception as e:
            print(f"Error processing {json_file}: {e}")

    with open(f"{output_dir}/multi_account_summary.json", 'w') as f:
        json.dump(summary, f, indent=2)

    print(f"Summary report generated: {output_dir}/multi_account_summary.json")

generate_summary("$OUTPUT_DIR")
PYTHON

echo "Multi-account scan completed. Results in: $OUTPUT_DIR"
EOF

chmod +x multi_account_scan.sh
./multi_account_scan.sh
```

Compliance Framework Scanning

CIS Benchmarks

```bash
# CIS AWS Foundations Benchmark v1.5
prowler aws --compliance cis_1.5_aws

# CIS AWS Foundations Benchmark v1.4
prowler aws --compliance cis_1.4_aws

# CIS AWS Foundations Benchmark v1.3
prowler aws --compliance cis_1.3_aws

# CIS Controls v8
prowler aws --compliance cis_controls_v8_aws

# Run specific CIS sections
prowler aws --checks "cis_1.*"  # Identity and Access Management
prowler aws --checks "cis_2.*"  # Storage
prowler aws --checks "cis_3.*"  # Logging
prowler aws --checks "cis_4.*"  # Monitoring
prowler aws --checks "cis_5.*"  # Networking

# Generate a CIS compliance report
prowler aws \
  --compliance cis_1.5_aws \
  --output-formats json html csv \
  --output-directory ./cis_reports \
  --output-filename cis_1.5_compliance_report
```

Industry Standards

```bash
# PCI DSS 3.2.1
prowler aws --compliance pci_3.2.1_aws

# ISO 27001:2013
prowler aws --compliance iso27001_2013_aws

# NIST 800-53 Rev 5
prowler aws --compliance nist_800_53_rev5_aws

# NIST 800-171 Rev 2
prowler aws --compliance nist_800_171_rev2_aws

# NIST Cybersecurity Framework
prowler aws --compliance nist_csf_1.1_aws

# SOC 2
prowler aws --compliance soc2_aws

# GDPR
prowler aws --compliance gdpr_aws

# HIPAA
prowler aws --compliance hipaa_aws

# FFIEC
prowler aws --compliance ffiec_aws

# ENS (Esquema Nacional de Seguridad)
prowler aws --compliance ens_rd2022_aws

# AWS Foundational Technical Review (FTR)
prowler aws --compliance aws_foundational_technical_review

# Generate a comprehensive compliance report
cat > compliance_scan.sh << 'EOF'
#!/bin/bash
# Comprehensive compliance scanning

FRAMEWORKS=(
  "cis_1.5_aws"
  "pci_3.2.1_aws"
  "iso27001_2013_aws"
  "nist_800_53_rev5_aws"
  "soc2_aws"
  "gdpr_aws"
  "hipaa_aws"
)

TIMESTAMP=$(date +%Y%m%d_%H%M%S)
COMPLIANCE_DIR="compliance_reports_$TIMESTAMP"

mkdir -p "$COMPLIANCE_DIR"

echo "Starting comprehensive compliance scanning..."

for framework in "${FRAMEWORKS[@]}"; do
    echo "Running $framework compliance scan..."

    prowler aws \
        --compliance "$framework" \
        --output-formats json html csv \
        --output-directory "$COMPLIANCE_DIR" \
        --output-filename "${framework}_report"

    echo "Completed $framework scan"
done

# Generate a compliance summary (unquoted heredoc delimiter so that
# $TIMESTAMP and $COMPLIANCE_DIR expand into the Python code)
python3 << PYTHON
import json
import os

def generate_compliance_summary(compliance_dir):
    frameworks = [
        "cis_1.5_aws", "pci_3.2.1_aws", "iso27001_2013_aws",
        "nist_800_53_rev5_aws", "soc2_aws", "gdpr_aws", "hipaa_aws"
    ]

    summary = {
        "timestamp": "$TIMESTAMP",
        "frameworks": {},
        "overall_score": 0
    }

    total_score = 0
    framework_count = 0

    for framework in frameworks:
        report_file = f"{compliance_dir}/{framework}_report.json"

        if os.path.exists(report_file):
            with open(report_file, 'r') as f:
                data = json.load(f)

            total_checks = len(data)
            passed_checks = sum(1 for result in data if result.get("Status") == "PASS")
            failed_checks = total_checks - passed_checks
            score = (passed_checks / total_checks * 100) if total_checks > 0 else 0

            summary["frameworks"][framework] = {
                "total_checks": total_checks,
                "passed_checks": passed_checks,
                "failed_checks": failed_checks,
                "compliance_score": round(score, 2)
            }

            total_score += score
            framework_count += 1

    if framework_count > 0:
        summary["overall_score"] = round(total_score / framework_count, 2)

    with open(f"{compliance_dir}/compliance_summary.json", 'w') as f:
        json.dump(summary, f, indent=2)

    print(f"Compliance summary generated: {compliance_dir}/compliance_summary.json")

generate_compliance_summary("$COMPLIANCE_DIR")
PYTHON

echo "Comprehensive compliance scanning completed. Reports in: $COMPLIANCE_DIR"
EOF

chmod +x compliance_scan.sh
./compliance_scan.sh
```

Custom Compliance Frameworks

```yaml
# Create a custom compliance framework: custom_framework.yaml
metadata:
  name: "Custom Security Framework"
  version: "1.0"
  description: "Organization-specific security compliance framework"
  author: "Security Team"

requirements:
  - requirement_id: "CSF-001"
    requirement_description: "Data Encryption Requirements"
    requirement_attributes:
      - name: "Data at Rest Encryption"
        description: "All data must be encrypted at rest"
        checks:
          - "s3_bucket_default_encryption"
          - "rds_instance_storage_encrypted"
          - "ebs_volume_encryption"
          - "efs_encryption_at_rest_enabled"

  - requirement_id: "CSF-002"
    requirement_description: "Network Security Controls"
    requirement_attributes:
      - name: "Network Access Control"
        description: "Network access must be properly controlled"
        checks:
          - "ec2_securitygroup_default_restrict_traffic"
          - "ec2_securitygroup_not_used"
          - "vpc_flow_logs_enabled"
          - "ec2_instance_public_ip"

  - requirement_id: "CSF-003"
    requirement_description: "Identity and Access Management"
    requirement_attributes:
      - name: "Privileged Access Management"
        description: "Privileged access must be controlled and monitored"
        checks:
          - "iam_root_access_key_check"
          - "iam_mfa_enabled_for_root"
          - "iam_user_mfa_enabled_console_access"
          - "iam_policy_attached_only_to_group_or_roles"

  - requirement_id: "CSF-004"
    requirement_description: "Logging and Monitoring"
    requirement_attributes:
      - name: "Security Monitoring"
        description: "Security events must be logged and monitored"
        checks:
          - "cloudtrail_multi_region_enabled"
          - "cloudtrail_log_file_validation_enabled"
          - "cloudwatch_log_group_retention_policy_specific_days_enabled"
          - "guardduty_is_enabled"

  - requirement_id: "CSF-005"
    requirement_description: "Backup and Recovery"
    requirement_attributes:
      - name: "Data Protection"
        description: "Critical data must be backed up and recoverable"
        checks:
          - "rds_instance_backup_enabled"
          - "dynamodb_point_in_time_recovery_enabled"
          - "s3_bucket_versioning_enabled"
          - "ec2_ebs_snapshot_encrypted"

# Run the custom compliance framework:
#   prowler aws --compliance-file custom_framework.yaml
```
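The mapping above (requirement ID → check IDs) also makes it easy to score requirements outside Prowler, e.g. in a CI gate. A minimal sketch, assuming the scan results have already been reduced to a check-ID → status map (`score_requirements` and the sample data below are hypothetical, not Prowler APIs):

```python
def score_requirements(requirements, results):
    """Mark a requirement FAIL if any of its mapped checks failed.

    requirements: {requirement_id: [check_id, ...]}
    results:      {check_id: "PASS" | "FAIL"}
    """
    scores = {}
    for req_id, checks in requirements.items():
        failed = [c for c in checks if results.get(c) == "FAIL"]
        scores[req_id] = "FAIL" if failed else "PASS"
    return scores

# Sample data mirroring the framework above:
requirements = {
    "CSF-001": ["s3_bucket_default_encryption", "rds_instance_storage_encrypted"],
    "CSF-003": ["iam_root_access_key_check", "iam_mfa_enabled_for_root"],
}
results = {
    "s3_bucket_default_encryption": "PASS",
    "rds_instance_storage_encrypted": "FAIL",
    "iam_root_access_key_check": "PASS",
    "iam_mfa_enabled_for_root": "PASS",
}
print(score_requirements(requirements, results))
# → {'CSF-001': 'FAIL', 'CSF-003': 'PASS'}
```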

Custom Checks and Extensions

Creating Custom Checks

```python
#!/usr/bin/env python3
# Custom Prowler check example
"""
Custom check: Ensure EC2 instances have required tags
"""

from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


class ec2_instance_required_tags(Check):
    def execute(self):
        findings = []

        # Required tags for compliance
        required_tags = ["Environment", "Owner", "Project", "CostCenter"]

        for region in ec2_client.regional_clients:
            for instance in ec2_client.instances[region]:
                report = Check_Report_AWS(self.metadata())
                report.region = region
                report.resource_id = instance.id
                report.resource_arn = instance.arn
                report.resource_tags = instance.tags

                # Check whether all required tags are present
                instance_tag_keys = [tag["Key"] for tag in instance.tags] if instance.tags else []
                missing_tags = [tag for tag in required_tags if tag not in instance_tag_keys]

                if missing_tags:
                    report.status = "FAIL"
                    report.status_extended = f"EC2 instance {instance.id} is missing required tags: {', '.join(missing_tags)}"
                else:
                    report.status = "PASS"
                    report.status_extended = f"EC2 instance {instance.id} has all required tags"

                findings.append(report)

        return findings


# Metadata for the check
def metadata():
    return {
        "CheckID": "ec2_instance_required_tags",
        "CheckTitle": "Ensure EC2 instances have required tags",
        "CheckType": ["Software and Configuration Checks"],
        "ServiceName": "ec2",
        "SubServiceName": "",
        "ResourceIdTemplate": "arn:aws:ec2:region:account-id:instance/instance-id",
        "Severity": "medium",
        "ResourceType": "AwsEc2Instance",
        "Description": "Ensure EC2 instances have all required tags for compliance and cost management",
        "Risk": "Without proper tagging, instances cannot be properly managed, tracked, or billed",
        "RelatedUrl": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html",
        "Remediation": {
            "Code": {
                "CLI": "aws ec2 create-tags --resources instance-id --tags Key=TagName,Value=TagValue",
                "NativeIaC": "",
                "Other": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html",
                "Terraform": ""
            },
            "Recommendation": {
                "Text": "Add the required tags to all EC2 instances",
                "Url": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html"
            }
        },
        "Categories": ["tagging"],
        "DependsOn": [],
        "RelatedTo": [],
        "Notes": "This check ensures instances have required tags for compliance"
    }
```

Custom Check Configuration

```yaml
# custom_checks_config.yaml
custom_checks:
  ec2_instance_required_tags:
    enabled: true
    severity: "medium"
    required_tags:
      - "Environment"
      - "Owner"
      - "Project"
      - "CostCenter"
      - "Application"

  s3_bucket_naming_convention:
    enabled: true
    severity: "low"
    naming_pattern: "^(dev|staging|prod)-[a-z0-9-]+$"

  rds_instance_naming_convention:
    enabled: true
    severity: "low"
    naming_pattern: "^(dev|staging|prod)-[a-z0-9-]+-db$"

  security_group_description_required:
    enabled: true
    severity: "medium"
    min_description_length: 10

  cloudtrail_custom_requirements:
    enabled: true
    severity: "high"
    required_event_selectors: true
    required_insight_selectors: true
    required_kms_encryption: true

# Run with the custom configuration:
#   prowler aws --config-file custom_checks_config.yaml
```

Bulk Custom Checks

```bash
# Create directories for custom checks
mkdir -p custom_checks/aws/ec2
mkdir -p custom_checks/aws/s3
mkdir -p custom_checks/aws/iam

# Create multiple custom checks
cat > custom_checks/aws/ec2/ec2_instance_approved_amis.py << 'EOF'
#!/usr/bin/env python3
"""
Custom check: Ensure EC2 instances use approved AMIs
"""

from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


class ec2_instance_approved_amis(Check):
    def execute(self):
        findings = []

        # Approved AMI IDs
        approved_ami_patterns = [
            "ami-0abcdef1234567890",  # Approved base AMI
            "ami-0123456789abcdef0",  # Approved hardened AMI
        ]

        for region in ec2_client.regional_clients:
            for instance in ec2_client.instances[region]:
                report = Check_Report_AWS(self.metadata())
                report.region = region
                report.resource_id = instance.id
                report.resource_arn = instance.arn

                if instance.image_id in approved_ami_patterns:
                    report.status = "PASS"
                    report.status_extended = f"EC2 instance {instance.id} uses approved AMI {instance.image_id}"
                else:
                    report.status = "FAIL"
                    report.status_extended = f"EC2 instance {instance.id} uses unapproved AMI {instance.image_id}"

                findings.append(report)

        return findings
EOF

cat > custom_checks/aws/s3/s3_bucket_lifecycle_policy.py << 'EOF'
#!/usr/bin/env python3
"""
Custom check: Ensure S3 buckets have lifecycle policies
"""

from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.s3.s3_client import s3_client


class s3_bucket_lifecycle_policy(Check):
    def execute(self):
        findings = []

        for bucket in s3_client.buckets:
            report = Check_Report_AWS(self.metadata())
            report.region = bucket.region
            report.resource_id = bucket.name
            report.resource_arn = bucket.arn

            if bucket.lifecycle_configuration:
                report.status = "PASS"
                report.status_extended = f"S3 bucket {bucket.name} has a lifecycle policy configured"
            else:
                report.status = "FAIL"
                report.status_extended = f"S3 bucket {bucket.name} does not have a lifecycle policy configured"

            findings.append(report)

        return findings
EOF

# Make the custom checks importable
export PYTHONPATH="${PYTHONPATH}:$(pwd)/custom_checks"

# Run with the custom checks
prowler aws --checks-folder custom_checks
```

Advanced Reporting and Analysis

Comprehensive Reports

```python

!/usr/bin/env python3

Advanced Prowler reporting and analysis

import json import csv import pandas as pd import matplotlib.pyplot as plt import seaborn as sns from datetime import datetime import argparse from jinja2 import Template

class ProwlerReporter: """Advanced reporting for Prowler scan results"""

def __init__(self):
    self.timestamp = datetime.now().strftime('%Y-%m-%d_%H-%M-%S')

def load_prowler_results(self, file_path):
    """Load Prowler scan results from JSON file"""

    try:
        with open(file_path, 'r') as f:
            return json.load(f)
    except Exception as e:
        print(f"Error loading Prowler results: {e}")
        return None

def analyze_results(self, results):
    """Analyze Prowler scan results and generate statistics"""

    analysis = {
        'total_checks': len(results),
        'passed_checks': 0,
        'failed_checks': 0,
        'manual_checks': 0,
        'severity_breakdown': {'CRITICAL': 0, 'HIGH': 0, 'MEDIUM': 0, 'LOW': 0, 'INFO': 0},
        'service_breakdown': {},
        'region_breakdown': {},
        'compliance_breakdown': {},
        'top_failures': []
    }

    for result in results:
        status = result.get('Status', 'UNKNOWN')
        severity = result.get('Severity', 'UNKNOWN')
        service = result.get('ServiceName', 'UNKNOWN')
        region = result.get('Region', 'UNKNOWN')
        compliance = result.get('Compliance', {})

        # Count by status
        if status == 'PASS':
            analysis['passed_checks'] += 1
        elif status == 'FAIL':
            analysis['failed_checks'] += 1
        elif status == 'MANUAL':
            analysis['manual_checks'] += 1

        # Count by severity
        if severity in analysis['severity_breakdown']:
            analysis['severity_breakdown'][severity] += 1

        # Count by service
        if service not in analysis['service_breakdown']:
            analysis['service_breakdown'][service] = {'total': 0, 'failed': 0}
        analysis['service_breakdown'][service]['total'] += 1
        if status == 'FAIL':
            analysis['service_breakdown'][service]['failed'] += 1

        # Count by region
        if region not in analysis['region_breakdown']:
            analysis['region_breakdown'][region] = {'total': 0, 'failed': 0}
        analysis['region_breakdown'][region]['total'] += 1
        if status == 'FAIL':
            analysis['region_breakdown'][region]['failed'] += 1

        # Count by compliance framework
        for framework, requirements in compliance.items():
            if framework not in analysis['compliance_breakdown']:
                analysis['compliance_breakdown'][framework] = {'total': 0, 'failed': 0}
            analysis['compliance_breakdown'][framework]['total'] += 1
            if status == 'FAIL':
                analysis['compliance_breakdown'][framework]['failed'] += 1

        # Collect failed checks for top failures
        if status == 'FAIL':
            analysis['top_failures'].append({
                'check_id': result.get('CheckID', 'Unknown'),
                'check_title': result.get('CheckTitle', 'Unknown'),
                'severity': severity,
                'service': service,
                'region': region,
                'resource_id': result.get('ResourceId', 'Unknown')
            })

    # Sort top failures by severity
    severity_order = {'CRITICAL': 4, 'HIGH': 3, 'MEDIUM': 2, 'LOW': 1, 'INFO': 0}
    analysis['top_failures'].sort(key=lambda x: severity_order.get(x['severity'], 0), reverse=True)

    return analysis

def generate_executive_summary(self, analysis):
    """Generate executive summary"""

    total_checks = analysis['total_checks']
    failed_checks = analysis['failed_checks']
    compliance_score = ((total_checks - failed_checks) / total_checks * 100) if total_checks > 0 else 0

    critical_issues = analysis['severity_breakdown']['CRITICAL']
    high_issues = analysis['severity_breakdown']['HIGH']

    summary = {
        'compliance_score': round(compliance_score, 1),
        'total_checks': total_checks,
        'failed_checks': failed_checks,
        'critical_issues': critical_issues,
        'high_issues': high_issues,
        'risk_level': self._calculate_risk_level(compliance_score, critical_issues, high_issues),
        'top_services_at_risk': self._get_top_services_at_risk(analysis['service_breakdown']),
        'recommendations': self._generate_recommendations(analysis)
    }

    return summary

def _calculate_risk_level(self, compliance_score, critical_issues, high_issues):
    """Calculate overall risk level"""

    if critical_issues > 0 or compliance_score < 70:
        return "HIGH"
    elif high_issues > 5 or compliance_score < 85:
        return "MEDIUM"
    else:
        return "LOW"

def _get_top_services_at_risk(self, service_breakdown):
    """Get services with most failures"""

    services_at_risk = []
    for service, stats in service_breakdown.items():
        if stats['failed'] > 0:
            failure_rate = (stats['failed'] / stats['total'] * 100) if stats['total'] > 0 else 0
            services_at_risk.append({
                'service': service,
                'failed_checks': stats['failed'],
                'total_checks': stats['total'],
                'failure_rate': round(failure_rate, 1)
            })

    return sorted(services_at_risk, key=lambda x: x['failure_rate'], reverse=True)[:5]

def _generate_recommendations(self, analysis):
    """Generate security recommendations"""

    recommendations = []

    # Critical issues
    if analysis['severity_breakdown']['CRITICAL'] > 0:
        recommendations.append({
            'priority': 'IMMEDIATE',
            'title': 'Address Critical Security Issues',
            'description': f"Immediately remediate {analysis['severity_breakdown']['CRITICAL']} critical security issues"
        })

    # High issues
    if analysis['severity_breakdown']['HIGH'] > 5:
        recommendations.append({
            'priority': 'HIGH',
            'title': 'Reduce High-Severity Findings',
            'description': f"Address {analysis['severity_breakdown']['HIGH']} high-severity security findings"
        })

    # Service-specific recommendations
    for service, stats in analysis['service_breakdown'].items():
        if stats['failed'] > 10:
            recommendations.append({
                'priority': 'MEDIUM',
                'title': f'Improve {service.upper()} Security',
                'description': f"Focus on {service} service with {stats['failed']} failed checks"
            })

    return recommendations[:5]  # Top 5 recommendations

def generate_detailed_report(self, analysis, results, output_file):
    """Generate detailed HTML report"""

    executive_summary = self.generate_executive_summary(analysis)

    html_template = """
Prowler Security Assessment Report - {{ timestamp }}

🔒 Prowler Security Assessment Report

Generated on: {{ timestamp }}

Account: {{ account_id }}

Executive Summary

Compliance Score

{{ executive_summary.compliance_score }}%

Total Checks

{{ executive_summary.total_checks }}

Failed Checks

{{ executive_summary.failed_checks }}

Critical Issues

{{ executive_summary.critical_issues }}

Risk Level

{{ executive_summary.risk_level }}

Key Recommendations

{% for recommendation in executive_summary.recommendations %}

{{ recommendation.priority }}: {{ recommendation.title }}

{{ recommendation.description }}

{% endfor %}

Severity Breakdown

{% for severity, count in analysis.severity_breakdown.items() %} {% endfor %}
SeverityCountPercentage
{{ severity }} {{ count }} {{ "%.1f"|format((count / analysis.total_checks * 100) if analysis.total_checks > 0 else 0) }}%

Services at Risk

{% for service in executive_summary.top_services_at_risk %} {% endfor %}
ServiceFailed ChecksTotal ChecksFailure Rate
{{ service.service }} {{ service.failed_checks }} {{ service.total_checks }} {{ service.failure_rate }}%

Top Security Findings

{% for finding in analysis.top_failures[:20] %} {% endfor %}
CheckSeverityServiceRegionResource
{{ finding.check_title }} {{ finding.severity }} {{ finding.service }} {{ finding.region }} {{ finding.resource_id }}

Regional Analysis

{% for region, stats in analysis.region_breakdown.items() %} {% endfor %}
RegionTotal ChecksFailed ChecksSuccess Rate
{{ region }} {{ stats.total }} {{ stats.failed }} {{ "%.1f"|format(((stats.total - stats.failed) / stats.total * 100) if stats.total > 0 else 0) }}%

"""

    template = Template(html_template)
    html_content = template.render(
        timestamp=self.timestamp,
        account_id="123456789012",  # Replace with actual account ID
        analysis=analysis,
        executive_summary=executive_summary
    )

    with open(output_file, 'w') as f:
        f.write(html_content)

    print(f"Detailed HTML report generated: {output_file}")
    return output_file

def generate_charts(self, analysis, output_dir):
    """Generate charts and visualizations"""

    import os
    os.makedirs(output_dir, exist_ok=True)

    # Set style
    plt.style.use('seaborn-v0_8')

    # 1. Severity breakdown pie chart
    plt.figure(figsize=(15, 10))

    # Filter out zero values
    severity_data = {k: v for k, v in analysis['severity_breakdown'].items() if v > 0}

    if severity_data:
        plt.subplot(2, 3, 1)
        colors = ['#c0392b', '#e67e22', '#f1c40f', '#2ecc71', '#3498db']
        plt.pie(severity_data.values(), labels=severity_data.keys(), autopct='%1.1f%%', colors=colors)
        plt.title('Findings by Severity')

    # 2. Pass/Fail breakdown
    plt.subplot(2, 3, 2)
    status_data = [analysis['passed_checks'], analysis['failed_checks']]
    status_labels = ['Passed', 'Failed']
    colors = ['#27ae60', '#e74c3c']
    plt.pie(status_data, labels=status_labels, colors=colors, autopct='%1.1f%%')
    plt.title('Overall Compliance Status')

    # 3. Top services by failure count
    plt.subplot(2, 3, 3)
    services = list(analysis['service_breakdown'].keys())[:10]
    failures = [analysis['service_breakdown'][s]['failed'] for s in services]
    plt.barh(services, failures, color='#e74c3c')
    plt.title('Top Services by Failed Checks')
    plt.xlabel('Failed Checks')

    # 4. Regional analysis
    plt.subplot(2, 3, 4)
    regions = list(analysis['region_breakdown'].keys())[:10]
    region_failures = [analysis['region_breakdown'][r]['failed'] for r in regions]
    plt.bar(regions, region_failures, color='#f39c12')
    plt.title('Failed Checks by Region')
    plt.ylabel('Failed Checks')
    plt.xticks(rotation=45)

    # 5. Compliance score gauge
    ax = plt.subplot(2, 3, 5, projection='polar')
    total_checks = analysis['total_checks']
    passed_checks = analysis['passed_checks']
    compliance_score = (passed_checks / total_checks * 100) if total_checks > 0 else 0

    # Draw the gauge on a polar axis inside the existing figure;
    # creating a separate figure here would break the 2x3 layout
    theta = compliance_score * 2 * 3.14159 / 100
    ax.barh(0, theta, height=0.5, color='#27ae60' if compliance_score > 80 else '#f39c12' if compliance_score > 60 else '#e74c3c')
    ax.set_ylim(-0.5, 0.5)
    ax.set_title(f'Compliance Score: {compliance_score:.1f}%')

    # 6. Trend analysis (if historical data available)
    plt.subplot(2, 3, 6)
    # Placeholder for trend data
    months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun']
    scores = [75, 78, 82, 85, 87, 90]  # Example data
    plt.plot(months, scores, marker='o', color='#3498db')
    plt.title('Compliance Score Trend')
    plt.ylabel('Compliance Score (%)')
    plt.ylim(0, 100)

    plt.tight_layout()
    chart_file = f"{output_dir}/prowler_security_analysis.png"
    plt.savefig(chart_file, dpi=300, bbox_inches='tight')
    plt.close()

    print(f"Charts generated: {chart_file}")
    return chart_file

def main():
    parser = argparse.ArgumentParser(description='Prowler Advanced Reporter')
    parser.add_argument('prowler_file', help='Prowler scan results JSON file')
    parser.add_argument('--output-dir', default='reports', help='Output directory for reports')
    parser.add_argument('--charts', action='store_true', help='Generate charts and visualizations')

    args = parser.parse_args()

    reporter = ProwlerReporter()

    # Load Prowler results
    prowler_results = reporter.load_prowler_results(args.prowler_file)
    if not prowler_results:
        return

    # Analyze results
    analysis = reporter.analyze_results(prowler_results)

    # Create output directory
    import os
    os.makedirs(args.output_dir, exist_ok=True)

    # Generate reports
    report_files = []

    # HTML report
    html_file = f"{args.output_dir}/prowler_security_report_{reporter.timestamp}.html"
    reporter.generate_detailed_report(analysis, prowler_results, html_file)
    report_files.append(html_file)

    # Charts
    if args.charts:
        chart_file = reporter.generate_charts(analysis, args.output_dir)
        report_files.append(chart_file)

    print(f"Report generation completed. Files: {report_files}")

if __name__ == "__main__":
    main()
```
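The percentage columns in the report template divide each count by the total number of checks and guard against division by zero, which matters when a scan scoped to a single service returns no results. The same arithmetic extracted as a standalone helper (the function name `summarize` is illustrative, not part of the reporter class):

```python
def summarize(results):
    """Compute compliance score and per-severity failure counts
    from a list of Prowler-style result dicts."""
    total = len(results)
    failed = sum(1 for r in results if r.get("Status") == "FAIL")
    passed = total - failed
    # Guard against division by zero for empty result sets
    score = (passed / total * 100) if total > 0 else 0.0

    severity_counts = {}
    for r in results:
        if r.get("Status") == "FAIL":
            sev = r.get("Severity", "UNKNOWN")
            severity_counts[sev] = severity_counts.get(sev, 0) + 1

    severity_pct = {
        sev: (count / total * 100) if total > 0 else 0.0
        for sev, count in severity_counts.items()
    }
    return score, severity_counts, severity_pct

results = [
    {"Status": "PASS", "Severity": "LOW"},
    {"Status": "FAIL", "Severity": "CRITICAL"},
    {"Status": "FAIL", "Severity": "HIGH"},
    {"Status": "PASS", "Severity": "MEDIUM"},
]
score, counts, pct = summarize(results)
print(score)   # 50.0
print(counts)  # {'CRITICAL': 1, 'HIGH': 1}
```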

CI/CD Integration and Automation

GitHub Actions Integration

```yaml

# .github/workflows/prowler-security-scan.yml

name: Prowler Security Scan

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  schedule:
    # Run daily at 2 AM UTC
    - cron: '0 2 * * *'
  workflow_dispatch:
    inputs:
      compliance_framework:
        description: 'Compliance framework to scan'
        required: false
        default: 'cis_1.5_aws'
        type: choice
        options:
          - cis_1.5_aws
          - pci_3.2.1_aws
          - iso27001_2013_aws
          - nist_800_53_rev5_aws

jobs:
  prowler-scan:
    runs-on: ubuntu-latest

strategy:
  matrix:
    region: [us-east-1, us-west-2, eu-west-1]

steps:
- name: Checkout code
  uses: actions/checkout@v3

- name: Setup Python
  uses: actions/setup-python@v4
  with:
    python-version: '3.9'

- name: Install Prowler
  run: |
    pip install prowler
    prowler --version

- name: Configure AWS credentials
  uses: aws-actions/configure-aws-credentials@v2
  with:
    aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws-region: ${{ matrix.region }}

- name: Run Prowler scan
  run: |
    mkdir -p reports

    # Determine compliance framework

    COMPLIANCE_FRAMEWORK="${{ github.event.inputs.compliance_framework || 'cis_1.5_aws' }}"

    # Run Prowler scan
    prowler aws \
      --region ${{ matrix.region }} \
      --compliance $COMPLIANCE_FRAMEWORK \
      --output-formats json,csv,html \
      --output-directory reports \
      --output-filename prowler_${{ matrix.region }}_scan

- name: Generate security report
  run: |
    pip install jinja2 matplotlib seaborn pandas
    python scripts/prowler_reporter.py reports/prowler_${{ matrix.region }}_scan.json \
      --output-dir reports \
      --charts

- name: Upload scan results
  uses: actions/upload-artifact@v3
  with:
    name: prowler-scan-results-${{ matrix.region }}
    path: reports/

- name: Security gate check
  run: |
    python << 'EOF'
    import json
    import sys

    with open('reports/prowler_${{ matrix.region }}_scan.json', 'r') as f:
        results = json.load(f)

    critical_issues = sum(1 for r in results if r.get('Status') == 'FAIL' and r.get('Severity') == 'CRITICAL')
    high_issues = sum(1 for r in results if r.get('Status') == 'FAIL' and r.get('Severity') == 'HIGH')

    print(f"Critical issues: {critical_issues}")
    print(f"High issues: {high_issues}")

    # Fail build if critical issues found
    if critical_issues > 0:
        print("❌ CRITICAL SECURITY ISSUES FOUND!")
        sys.exit(1)

    # Warn if too many high issues
    if high_issues > 10:
        print("⚠️ WARNING: High number of high-severity issues found!")
        sys.exit(1)

    print("✅ Security gate passed")
    EOF

- name: Comment PR with results
  if: github.event_name == 'pull_request'
  uses: actions/github-script@v6
  with:
    script: |
      const fs = require('fs');
      const results = JSON.parse(fs.readFileSync('reports/prowler_${{ matrix.region }}_scan.json', 'utf8'));

      const totalChecks = results.length;
      const failedChecks = results.filter(r => r.Status === 'FAIL').length;
      const passedChecks = totalChecks - failedChecks;
      const complianceScore = ((passedChecks / totalChecks) * 100).toFixed(1);

      const criticalIssues = results.filter(r => r.Status === 'FAIL' && r.Severity === 'CRITICAL').length;
      const highIssues = results.filter(r => r.Status === 'FAIL' && r.Severity === 'HIGH').length;

      const comment = `## 🔒 Prowler Security Scan Results (${{ matrix.region }})

      **Compliance Score:** ${complianceScore}%

      **Summary:**
      - ✅ Passed: ${passedChecks}
      - ❌ Failed: ${failedChecks}
      - 🔴 Critical: ${criticalIssues}
      - 🟠 High: ${highIssues}

      ${criticalIssues > 0 ? '⚠️ **Critical security issues found! Please review and remediate.**' : '✅ No critical security issues found.'}

      [View detailed report](https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }})`;

      github.rest.issues.createComment({
        issue_number: context.issue.number,
        owner: context.repo.owner,
        repo: context.repo.repo,
        body: comment
      });

  consolidate-results:
    needs: prowler-scan
    runs-on: ubuntu-latest
    if: always()

steps:
- name: Download all artifacts
  uses: actions/download-artifact@v3

- name: Consolidate results
  run: |
    mkdir -p consolidated_reports

    # Combine all regional results
    python << 'EOF'
    import json
    import glob
    import os

    all_results = []

    for result_file in glob.glob('prowler-scan-results-*/prowler_*_scan.json'):
        try:
            with open(result_file, 'r') as f:
                data = json.load(f)
            all_results.extend(data)
        except Exception as e:
            print(f"Error processing {result_file}: {e}")

    # Save consolidated results
    with open('consolidated_reports/prowler_consolidated_scan.json', 'w') as f:
        json.dump(all_results, f, indent=2)

    print(f"Consolidated {len(all_results)} results from all regions")
    EOF

- name: Generate consolidated report
  run: |
    pip install jinja2 matplotlib seaborn pandas
    python scripts/prowler_reporter.py consolidated_reports/prowler_consolidated_scan.json \
      --output-dir consolidated_reports \
      --charts

- name: Upload consolidated results
  uses: actions/upload-artifact@v3
  with:
    name: prowler-consolidated-results
    path: consolidated_reports/

```
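The inline security-gate step in the workflow above exits nonzero when any critical finding exists, or when high-severity findings exceed a threshold. The same rules extracted as a reusable function (the name `security_gate` and the `max_high` parameter are illustrative; the thresholds mirror the workflow):

```python
def security_gate(results, max_high=10):
    """Return 1 (block the build) when the gate rules trip, else 0:
    any CRITICAL failure blocks; more than max_high HIGH failures block."""
    critical = sum(1 for r in results
                   if r.get("Status") == "FAIL" and r.get("Severity") == "CRITICAL")
    high = sum(1 for r in results
               if r.get("Status") == "FAIL" and r.get("Severity") == "HIGH")

    if critical > 0:
        print("CRITICAL SECURITY ISSUES FOUND!")
        return 1
    if high > max_high:
        print("WARNING: High number of high-severity issues found!")
        return 1
    print("Security gate passed")
    return 0

# Three HIGH findings stay under the default threshold of 10
gate = security_gate([{"Status": "FAIL", "Severity": "HIGH"}] * 3)
print(gate)  # 0
```

In CI the return value would feed `sys.exit()` so the job fails on a tripped gate.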

Jenkins Pipeline

```groovy
// Jenkinsfile for Prowler security scanning

pipeline {
    agent any

parameters {
    choice(
        name: 'COMPLIANCE_FRAMEWORK',
        choices: ['cis_1.5_aws', 'pci_3.2.1_aws', 'iso27001_2013_aws', 'nist_800_53_rev5_aws'],
        description: 'Compliance framework to scan'
    )
    string(
        name: 'AWS_REGIONS',
        defaultValue: 'us-east-1,us-west-2,eu-west-1',
        description: 'AWS regions to scan (comma-separated)'
    )
    booleanParam(
        name: 'GENERATE_CHARTS',
        defaultValue: true,
        description: 'Generate charts and visualizations'
    )
}

environment {
    PROWLER_VERSION = '3.12.0'
    REPORTS_DIR = 'prowler_reports'
    TIMESTAMP = sh(script: 'date +%Y%m%d_%H%M%S', returnStdout: true).trim()
}

stages {
    stage('Setup') {
        steps {
            script {
                // Clean workspace
                deleteDir()

                // Install Prowler
                sh '''
                    python3 -m pip install --upgrade pip
                    pip3 install prowler==${PROWLER_VERSION}
                    prowler --version
                '''
            }
        }
    }

    stage('Security Scan') {
        parallel {
            stage('Multi-Region Scan') {
                steps {
                    script {
                        def regions = params.AWS_REGIONS.split(',')
                        def scanJobs = [:]

                        regions.each { region ->
                            scanJobs["Scan ${region}"] = {
                                withCredentials([
                                    [$class: 'AmazonWebServicesCredentialsBinding', 
                                     credentialsId: 'aws-credentials']
                                ]) {
                                    sh """
                                        mkdir -p ${REPORTS_DIR}

                                        prowler aws \\
                                            --region ${region.trim()} \\
                                            --compliance ${params.COMPLIANCE_FRAMEWORK} \\
                                            --output-formats json,csv,html \\
                                            --output-directory ${REPORTS_DIR} \\
                                            --output-filename prowler_${region.trim()}_${TIMESTAMP}
                                    """
                                }
                            }
                        }

                        parallel scanJobs
                    }
                }
            }
        }
    }

    stage('Generate Reports') {
        steps {
            script {
                sh '''
                    # Install reporting dependencies
                    pip3 install jinja2 matplotlib seaborn pandas

                    # Generate reports for each region
                    for scan_file in ${REPORTS_DIR}/prowler_*_${TIMESTAMP}.json; do
                        if [ -f "$scan_file" ]; then
                            echo "Generating report for $scan_file"
                            python3 scripts/prowler_reporter.py "$scan_file" \\
                                --output-dir "${REPORTS_DIR}" \\
                                ${params.GENERATE_CHARTS ? '--charts' : ''}
                        fi
                    done

                    # Consolidate all results
                    python3 << 'EOF'

import json
import glob
import os

# REPORTS_DIR and TIMESTAMP are exported by the Jenkins environment block;
# the quoted heredoc delimiter means shell variables are NOT expanded here
reports_dir = os.environ['REPORTS_DIR']
timestamp = os.environ['TIMESTAMP']

all_results = []
for result_file in glob.glob(f"{reports_dir}/prowler_*_{timestamp}.json"):
    try:
        with open(result_file, 'r') as f:
            data = json.load(f)
        all_results.extend(data)
    except Exception as e:
        print(f"Error processing {result_file}: {e}")

# Save consolidated results
with open(f"{reports_dir}/prowler_consolidated_{timestamp}.json", 'w') as f:
    json.dump(all_results, f, indent=2)

print(f"Consolidated {len(all_results)} results")
EOF

                    # Generate consolidated report
                    python3 scripts/prowler_reporter.py "${REPORTS_DIR}/prowler_consolidated_${TIMESTAMP}.json" \\
                        --output-dir "${REPORTS_DIR}" \\
                        ${params.GENERATE_CHARTS ? '--charts' : ''}
                '''
            }
        }
    }

    stage('Security Gate') {
        steps {
            script {
                sh '''
                    python3 << 'EOF'

import json
import sys
import os

# REPORTS_DIR and TIMESTAMP are exported by the Jenkins environment block
reports_dir = os.environ['REPORTS_DIR']
timestamp = os.environ['TIMESTAMP']

# Load consolidated results
with open(f"{reports_dir}/prowler_consolidated_{timestamp}.json", 'r') as f:
    results = json.load(f)

# Count issues by severity
critical_issues = sum(1 for r in results if r.get("Status") == "FAIL" and r.get("Severity") == "CRITICAL")
high_issues = sum(1 for r in results if r.get("Status") == "FAIL" and r.get("Severity") == "HIGH")
medium_issues = sum(1 for r in results if r.get("Status") == "FAIL" and r.get("Severity") == "MEDIUM")

print("Security Assessment Results:")
print(f"Critical issues: {critical_issues}")
print(f"High issues: {high_issues}")
print(f"Medium issues: {medium_issues}")

# Security gate logic
if critical_issues > 0:
    print("❌ CRITICAL SECURITY ISSUES FOUND!")
    print("Build failed due to critical security issues.")
    sys.exit(1)

if high_issues > 15:
    print("⚠️ WARNING: High number of high-severity issues found!")
    print("Consider addressing high-severity issues.")
    sys.exit(1)

print("✅ Security gate passed")
EOF
                '''
            }
        }
    }

    stage('Archive Results') {
        steps {
            archiveArtifacts artifacts: "${REPORTS_DIR}/**/*", fingerprint: true

            publishHTML([
                allowMissing: false,
                alwaysLinkToLastBuild: true,
                keepAll: true,
                reportDir: env.REPORTS_DIR,
                reportFiles: '*.html',
                reportName: 'Prowler Security Report'
            ])
        }
    }

    stage('Notify') {
        steps {
            script {
                // Send Slack notification
                def consolidatedResults = readJSON file: "${env.REPORTS_DIR}/prowler_consolidated_${env.TIMESTAMP}.json"
                def totalChecks = consolidatedResults.size()
                def failedChecks = consolidatedResults.count { it.Status == 'FAIL' }
                def complianceScore = ((totalChecks - failedChecks) / totalChecks * 100).round(1)

                def criticalIssues = consolidatedResults.count { it.Status == 'FAIL' && it.Severity == 'CRITICAL' }
                def highIssues = consolidatedResults.count { it.Status == 'FAIL' && it.Severity == 'HIGH' }

                slackSend(
                    channel: '#security',
                    color: complianceScore > 90 ? 'good' : complianceScore > 70 ? 'warning' : 'danger',
                    message: """
🔒 Prowler Security Scan Completed
Project: ${env.JOB_NAME}
Build: ${env.BUILD_NUMBER}
Framework: ${params.COMPLIANCE_FRAMEWORK}
Compliance Score: ${complianceScore}%
Critical Issues: ${criticalIssues}
High Issues: ${highIssues}
Failed Checks: ${failedChecks}/${totalChecks}
Report: ${env.BUILD_URL}Prowler_Security_Report/
                    """.trim()
                )
            }
        }
    }
}

post {
    always {
        // Clean up large files but keep reports
        sh "find . -name '*.json' -size +50M -delete"
    }
    failure {
        emailext(
            subject: "Prowler Security Scan Failed - ${env.JOB_NAME} #${env.BUILD_NUMBER}",
            body: """
The Prowler security scan has failed.

Project: ${env.JOB_NAME}
Build Number: ${env.BUILD_NUMBER}
Framework: ${params.COMPLIANCE_FRAMEWORK}
Build URL: ${env.BUILD_URL}

Please check the build logs and security report for more details.
            """,
            to: "${env.CHANGE_AUTHOR_EMAIL ?: 'security-team@company.com'}"
        )
    }
}
```

Performance Optimization and Troubleshooting

Performance Optimization

```bash

#!/bin/bash

# Prowler performance optimization

optimize_prowler_performance() {
    echo "Optimizing Prowler performance..."

# 1. Create performance configuration
cat > prowler_performance_config.yaml << 'EOF'

# Prowler performance configuration

# Global settings
global:
  # Increase timeout for slow checks
  timeout: 300

  # Parallel execution settings
  max_workers: 10

  # Memory optimization
  max_memory_usage: "4GB"

  # Cache settings
  enable_cache: true
  cache_timeout: 3600

# Service-specific optimizations
services:
  # EC2 optimizations
  ec2:
    enabled: true
    timeout: 120
    max_parallel_requests: 5

  # S3 optimizations
  s3:
    enabled: true
    timeout: 180
    max_parallel_requests: 3

  # IAM optimizations
  iam:
    enabled: true
    timeout: 60
    max_parallel_requests: 2

# Check-specific optimizations
checks:
  # Disable slow or problematic checks
  disabled_checks:
    - "cloudtrail_cloudwatch_logging_enabled"  # Slow check
    - "ec2_elastic_ip_shodan"                  # External dependency
    - "s3_bucket_object_versioning"            # Can be slow for large buckets

  # Check timeouts
  check_timeouts:
    "iam_user_hardware_mfa_enabled": 30
    "ec2_instance_older_than_specific_days": 60
    "s3_bucket_public_access_block": 45

# Region optimization
regions:
  # Limit to essential regions for faster scanning
  enabled_regions:
    - "us-east-1"
    - "us-west-2"
    - "eu-west-1"
    - "ap-southeast-1"

  # Skip regions with no resources
  skip_empty_regions: true

# Output optimization
output:
  # Reduce output verbosity for performance
  verbose: false

  # Compress large outputs
  compress_outputs: true

  # Limit output formats for faster processing
  formats: ["json"]
EOF

# 2. Create optimized scanning script
cat > optimized_prowler_scan.sh << 'EOF'

#!/bin/bash

# Optimized Prowler scanning script

# Performance environment variables
export PYTHONUNBUFFERED=1
export PYTHONDONTWRITEBYTECODE=1

# Memory optimization
export PROWLER_MAX_MEMORY=4096
export PROWLER_CACHE_ENABLED=true

# Parallel processing
export PROWLER_MAX_WORKERS=10

# Function to run optimized scan
run_optimized_scan() {
    local region="$1"
    local service="$2"
    local output_dir="$3"

echo "Running optimized scan for $service in $region..."

# Use timeout to prevent hanging
timeout 1800 prowler aws \
    --region "$region" \
    --service "$service" \
    --config-file prowler_performance_config.yaml \
    --output-formats json \
    --output-directory "$output_dir" \
    --output-filename "prowler_${region}_${service}" \
    --quiet

local exit_code=$?

if [ $exit_code -eq 124 ]; then
    echo "Warning: Scan for $service in $region timed out"
elif [ $exit_code -ne 0 ]; then
    echo "Error: Scan for $service in $region failed with exit code $exit_code"
else
    echo "Completed scan for $service in $region"
fi

return $exit_code

}

# Main optimization function
main() {
    local timestamp=$(date +%Y%m%d_%H%M%S)
    local output_dir="optimized_scan_$timestamp"

mkdir -p "$output_dir"

# Essential services for quick scan
local services=("iam" "s3" "ec2" "vpc" "cloudtrail")
local regions=("us-east-1" "us-west-2" "eu-west-1")

echo "Starting optimized Prowler scan..."

# Parallel scanning by service and region
for service in "${services[@]}"; do
    for region in "${regions[@]}"; do
        run_optimized_scan "$region" "$service" "$output_dir" &

        # Limit concurrent jobs
        while [ $(jobs -r | wc -l) -ge 5 ]; do
            sleep 5
        done
    done
done

# Wait for all background jobs to complete
wait

echo "Optimized scan completed. Results in: $output_dir"

# Combine results
    python3 << PYTHON
import json
import glob

# $output_dir is expanded by the shell (the heredoc delimiter is unquoted,
# otherwise the value of the local variable would never reach Python)
output_dir = "$output_dir"
combined_results = []

for result_file in glob.glob(f"{output_dir}/prowler_*.json"):
    try:
        with open(result_file, 'r') as f:
            data = json.load(f)
        combined_results.extend(data)
        print(f"Processed {len(data)} results from {result_file}")
    except Exception as e:
        print(f"Error processing {result_file}: {e}")

# Save combined results
with open(f"{output_dir}/prowler_combined_optimized.json", 'w') as f:
    json.dump(combined_results, f, indent=2)

print(f"Combined {len(combined_results)} total results")
PYTHON
}

# Run optimization
main "$@"
EOF

chmod +x optimized_prowler_scan.sh

echo "Performance optimization setup complete"

}

# Memory optimization
optimize_memory_usage() {
    echo "Optimizing Prowler memory usage..."

# Create memory monitoring script
cat > monitor_prowler_memory.sh << 'EOF'

#!/bin/bash

# Monitor Prowler memory usage

PROWLER_PID=""
MEMORY_LOG="prowler_memory.log"
CPU_LOG="prowler_cpu.log"

# Function to monitor resources
monitor_resources() {
    echo "Timestamp,Memory_MB" > "$MEMORY_LOG"
    echo "Timestamp,CPU_Percent" > "$CPU_LOG"

while true; do
    if [ -n "$PROWLER_PID" ] && kill -0 "$PROWLER_PID" 2>/dev/null; then
        timestamp=$(date '+%Y-%m-%d %H:%M:%S')

        # Memory usage in MB
        memory_kb=$(ps -p "$PROWLER_PID" -o rss --no-headers 2>/dev/null)
        if [ -n "$memory_kb" ]; then
            memory_mb=$((memory_kb / 1024))
            echo "$timestamp,$memory_mb" >> "$MEMORY_LOG"
        fi

        # CPU usage
        cpu_percent=$(ps -p "$PROWLER_PID" -o %cpu --no-headers 2>/dev/null)
        if [ -n "$cpu_percent" ]; then
            echo "$timestamp,$cpu_percent" >> "$CPU_LOG"
        fi
    else
        break
    fi

    sleep 5
done

}

# Start monitoring in background
monitor_resources &
MONITOR_PID=$!

# Run Prowler with memory optimization
export PYTHONUNBUFFERED=1
export PYTHONDONTWRITEBYTECODE=1

# Start Prowler and capture PID
prowler aws --region us-east-1 --service iam --output-formats json &
PROWLER_PID=$!

# Wait for Prowler to complete
wait $PROWLER_PID

# Stop monitoring
kill $MONITOR_PID 2>/dev/null

# Analyze memory usage
python3 << 'PYTHON'
import pandas as pd
import matplotlib.pyplot as plt

try:
    # Read memory data
    memory_df = pd.read_csv('prowler_memory.log')
    memory_df['Timestamp'] = pd.to_datetime(memory_df['Timestamp'])

    # Create memory usage chart
    plt.figure(figsize=(12, 6))
    plt.plot(memory_df['Timestamp'], memory_df['Memory_MB'])
    plt.title('Prowler Memory Usage Over Time')
    plt.xlabel('Time')
    plt.ylabel('Memory Usage (MB)')
    plt.xticks(rotation=45)
    plt.grid(True)
    plt.tight_layout()
    plt.savefig('prowler_memory_usage.png')

    # Print statistics
    print(f"Average Memory Usage: {memory_df['Memory_MB'].mean():.1f} MB")
    print(f"Peak Memory Usage: {memory_df['Memory_MB'].max():.1f} MB")
    print(f"Memory Usage Range: {memory_df['Memory_MB'].min():.1f} - {memory_df['Memory_MB'].max():.1f} MB")

except Exception as e:
    print(f"Error analyzing memory usage: {e}")
PYTHON

echo "Memory monitoring completed"
EOF

chmod +x monitor_prowler_memory.sh

echo "Memory optimization complete"

}

# Run optimizations
optimize_prowler_performance
optimize_memory_usage
```
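The shell script above throttles parallel scans by polling `jobs -r | wc -l` in a loop. The same bounded parallelism can be expressed with a fixed-size worker pool; a sketch under the assumption that Prowler's CLI accepts the flags shown earlier (`scan_targets`, `run_scan`, and `scan_all` are illustrative names):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor, as_completed

def scan_targets(regions, services):
    """Enumerate every (region, service) pair to scan."""
    return [(r, s) for r in regions for s in services]

def run_scan(region, service, output_dir):
    """Run one prowler scan as a subprocess; return its exit code."""
    cmd = [
        "prowler", "aws",
        "--region", region,
        "--service", service,
        "--output-formats", "json",
        "--output-directory", output_dir,
        "--output-filename", f"prowler_{region}_{service}",
    ]
    # timeout mirrors the `timeout 1800` guard in the shell version
    return subprocess.run(cmd, timeout=1800).returncode

def scan_all(regions, services, output_dir, max_workers=5):
    """Run all scans with at most max_workers in flight, like the
    jobs-limit loop in the shell script above."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(run_scan, r, s, output_dir): (r, s)
                   for r, s in scan_targets(regions, services)}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```

The pool enforces the concurrency cap for you, so there is no need for the sleep-and-poll loop.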

Troubleshooting Guide

```bash

#!/bin/bash

# Prowler troubleshooting guide

troubleshoot_prowler() {
    echo "Prowler Troubleshooting Guide"
    echo "============================="

# Check Python installation
if ! command -v python3 &> /dev/null; then
    echo "❌ Python 3 not found"
    echo "Solution: Install Python 3.9 or later"
    echo "  sudo apt update && sudo apt install python3 python3-pip"
    return 1
fi

python_version=$(python3 --version | cut -d' ' -f2)
echo "✅ Python found: $python_version"

# Check Prowler installation
if ! command -v prowler &> /dev/null; then
    echo "❌ Prowler not found"
    echo "Solution: Install Prowler"
    echo "  pip3 install prowler"
    return 1
fi

prowler_version=$(prowler --version 2>&1 | head -n1)
echo "✅ Prowler found: $prowler_version"

# Check AWS CLI installation
if ! command -v aws &> /dev/null; then
    echo "⚠️  AWS CLI not found"
    echo "Solution: Install AWS CLI"
    echo "  curl 'https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip' -o 'awscliv2.zip'"
    echo "  unzip awscliv2.zip && sudo ./aws/install"
else
    aws_version=$(aws --version 2>&1)
    echo "✅ AWS CLI found: $aws_version"
fi

# Check AWS credentials
echo ""
echo "Checking AWS credentials..."

if aws sts get-caller-identity > /dev/null 2>&1; then
    account_id=$(aws sts get-caller-identity --query Account --output text)
    user_arn=$(aws sts get-caller-identity --query Arn --output text)
    echo "✅ AWS credentials configured"
    echo "   Account ID: $account_id"
    echo "   User/Role: $user_arn"
else
    echo "❌ AWS credentials not configured or invalid"
    echo "Solution: Configure AWS credentials"
    echo "  aws configure"
    echo "  or set environment variables:"
    echo "  export AWS_ACCESS_KEY_ID=your-key"
    echo "  export AWS_SECRET_ACCESS_KEY=your-secret"
    echo "  export AWS_DEFAULT_REGION=us-east-1"
fi

# Test basic functionality
echo ""
echo "Testing basic functionality..."

if prowler --help > /dev/null 2>&1; then
    echo "✅ Prowler help command works"
else
    echo "❌ Prowler help command failed"
    echo "Solution: Reinstall Prowler"
    echo "  pip3 uninstall prowler && pip3 install prowler"
fi

# Check permissions
echo ""
echo "Checking AWS permissions..."

# Test basic permissions
if aws iam get-account-summary > /dev/null 2>&1; then
    echo "✅ IAM read permissions available"
else
    echo "⚠️  IAM read permissions not available"
    echo "Solution: Ensure IAM permissions are configured"
fi

if aws s3 ls > /dev/null 2>&1; then
    echo "✅ S3 read permissions available"
else
    echo "⚠️  S3 read permissions not available"
    echo "Solution: Ensure S3 permissions are configured"
fi

if aws ec2 describe-regions > /dev/null 2>&1; then
    echo "✅ EC2 read permissions available"
else
    echo "⚠️  EC2 read permissions not available"
    echo "Solution: Ensure EC2 permissions are configured"
fi

# Check system resources
echo ""
echo "Checking system resources..."

available_memory=$(free -m | awk 'NR==2{printf "%.1f", $7/1024}')
if (( $(echo "$available_memory < 2.0" | bc -l) )); then
    echo "⚠️  Low available memory: ${available_memory}GB"
    echo "Recommendation: Ensure at least 4GB available memory for large scans"
else
    echo "✅ Available memory: ${available_memory}GB"
fi

# Check disk space

    disk_usage=$(df . | tail -1 | awk '{print $5}' | sed 's/%//')
    if [ "$disk_usage" -gt 90 ]; then
        echo "⚠️  High disk usage: ${disk_usage}%"
        echo "Solution: Free up disk space"
    else
        echo "✅ Disk usage: ${disk_usage}%"
    fi

echo ""
echo "Troubleshooting completed"

}

# Common error solutions
fix_common_errors() {
    echo "Common Prowler Errors and Solutions"
    echo "==================================="

cat << 'EOF'
  1. "ModuleNotFoundError: No module named 'prowler'" Solution:

    • Install Prowler: pip3 install prowler
    • Check Python path: python3 -c "import sys; print(sys.path)"
  2. "AWS credentials not configured" Solution:

    • Run: aws configure
    • Or set environment variables: export AWS_ACCESS_KEY_ID=your-key export AWS_SECRET_ACCESS_KEY=your-secret
  3. "AccessDenied" or "UnauthorizedOperation" Solution:

    • Check IAM permissions
    • Ensure user/role has required policies attached
    • Use Prowler IAM policy from documentation
  4. "Timeout" or "Connection timeout" Solution:

    • Increase timeout: --timeout 300
    • Check internet connectivity
    • Reduce parallel requests
  5. "Memory allocation failed" or "Out of memory" Solution:

    • Scan fewer regions: --region us-east-1
    • Scan specific services: --service s3
    • Increase system memory
  6. "Check failed" or "Plugin error" Solution:

    • Update Prowler: pip3 install --upgrade prowler
    • Skip problematic checks: --excluded-checks check-name
    • Check Prowler logs for details
  7. "JSON decode error" in output Solution:

    • Use single output format: --output-formats json
    • Check for mixed output in logs
    • Ensure clean JSON output
  8. "Rate limiting" or "API throttling" Solution:

    • Reduce parallel requests
    • Add delays between API calls
    • Use multiple AWS accounts/regions
  9. "SSL/TLS certificate errors" Solution:

    • Update certificates: sudo apt update && sudo apt install ca-certificates
    • Check system time: timedatectl status
    • Update Python requests: pip3 install --upgrade requests
  10. "Scan takes too long" or "Hangs" Solution:

    • Use region filtering: --region us-east-1
    • Scan specific services: --service iam,s3
    • Use quick scan: --quick
    • Monitor with timeout: timeout 3600 prowler aws
EOF
}

# Performance diagnostics
diagnose_performance() {
    echo "Diagnosing Prowler Performance"
    echo "=============================="

# Test scan performance
echo "Running performance test..."

start_time=$(date +%s.%N)

# Run a simple scan
timeout 120 prowler aws --region us-east-1 --service iam --check iam_root_access_key_check --output-formats json > /dev/null 2>&1
exit_code=$?

end_time=$(date +%s.%N)
duration=$(echo "$end_time - $start_time" | bc)

if [ $exit_code -eq 0 ]; then
    echo "✅ Performance test completed in ${duration}s"
elif [ $exit_code -eq 124 ]; then
    echo "⚠️  Performance test timed out (>120s)"
    echo "Recommendation: Check network connectivity and AWS API performance"
else
    echo "❌ Performance test failed"
    echo "Recommendation: Check configuration and credentials"
fi

# Check Python performance

    python_startup_time=$(python3 -c "import time; start=time.time(); import prowler; print(f'{time.time()-start:.2f}')" 2>/dev/null || echo "N/A")
    echo "Python import time: ${python_startup_time}s"

# System load

    load_avg=$(uptime | awk -F'load average:' '{print $2}' | awk '{print $1}' | sed 's/,//')
    echo "System load average: $load_avg"

# Network connectivity test
echo "Testing AWS API connectivity..."

    # Braces are needed so the shell's `time` output is redirected and captured
    aws_api_time=$( { time aws sts get-caller-identity > /dev/null 2>&1; } 2>&1 | grep real | awk '{print $2}')
    echo "AWS API response time: $aws_api_time"

# Recommendations
echo ""
echo "Performance Recommendations:"
echo "- Use region filtering for faster scans"
echo "- Scan specific services instead of all services"
echo "- Use --quick for essential checks only"
echo "- Increase system memory for large environments"
echo "- Use SSD storage for better I/O performance"
echo "- Monitor AWS API rate limits"

}

# Main troubleshooting function
main() {
    troubleshoot_prowler
    echo ""
    fix_common_errors
    echo ""
    diagnose_performance
}

# Run troubleshooting
main
```
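Item 8 in the error list above recommends adding delays between API calls when AWS throttles requests. A generic exponential-backoff-with-jitter sketch of that advice (the helper name `with_backoff` is illustrative; in practice boto3's built-in retry configuration covers most throttling cases):

```python
import random
import time

def with_backoff(call, max_attempts=5, base_delay=1.0):
    """Retry a throttled call with exponential backoff plus jitter.
    Re-raises the last exception once max_attempts is exhausted."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:  # narrow to throttling exceptions in real code
            if attempt == max_attempts - 1:
                raise
            # Double the delay each attempt, with random jitter to
            # avoid synchronized retries from parallel workers
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)

# Demo: a call that succeeds on the third attempt
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("Throttled")
    return "ok"

print(with_backoff(flaky, base_delay=0.01))  # ok
```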

Resources and Documentation

Official Resources

Compliance Resources

Integration Examples