
Prowler Cheat Sheet

Overview

Prowler is an open-source security tool for performing AWS, Azure, and GCP security best-practice assessments, audits, incident response, continuous monitoring, hardening, and forensics readiness. It contains hundreds of controls covering CIS, PCI-DSS, ISO 27001, GDPR, HIPAA, FFIEC, SOX, AWS FTR, ENS, and custom security frameworks. Prowler is designed to run in a CI/CD pipeline or as a standalone tool for security assessments.

⚠️ Warning: Only use Prowler against cloud environments you own or have explicit permission to audit. Unauthorized cloud security scanning may violate terms of service or local laws.

Installation

Python Package Installation

# Install via pip (recommended)
pip3 install prowler

# Install with all cloud providers
pip3 install prowler[aws,azure,gcp]

# Install specific cloud provider
pip3 install prowler[aws]
pip3 install prowler[azure]
pip3 install prowler[gcp]

# Verify installation
prowler --version

Docker Installation

# Pull Prowler Docker image
docker pull toniblyx/prowler:latest

# Run Prowler in Docker for AWS
docker run -it --rm \
    -v ~/.aws:/root/.aws \
    -v $(pwd):/prowler/output \
    toniblyx/prowler:latest aws

# Create alias for easier use
echo 'alias prowler="docker run -it --rm -v ~/.aws:/root/.aws -v $(pwd):/prowler/output toniblyx/prowler:latest"' >> ~/.bashrc
source ~/.bashrc
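
With the alias loaded, the containerized Prowler can be invoked like a native binary. A minimal sketch, assuming the image writes its reports to /prowler/output as the volume mount above implies (the profile name is illustrative):

# Scan a specific AWS profile through the Docker alias
prowler aws --profile production

# Reports should appear in the current working directory via the mounted output volume
ls ./*.json ./*.csv 2>/dev/null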

Manual Installation

# Clone repository
git clone https://github.com/prowler-cloud/prowler.git
cd prowler

# Install dependencies
pip3 install -r requirements.txt

# Install Prowler
pip3 install .

# Or run directly
python3 prowler.py --help

Homebrew Installation (macOS)

# Install via Homebrew
brew install prowler

# Verify installation
prowler --version

AWS Configuration

AWS Credentials Setup

# Install AWS CLI
pip3 install awscli

# Configure AWS credentials
aws configure
# Enter Access Key ID, Secret Access Key, Region, Output format

# Use environment variables
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_DEFAULT_REGION="us-east-1"

# Use AWS profiles
aws configure --profile production
aws configure --profile development

# Use IAM roles (recommended)
aws sts assume-role --role-arn arn:aws:iam::123456789012:role/ProwlerRole --role-session-name prowler-session

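If the role is assumed manually, the temporary credentials returned by sts assume-role can be exported so Prowler picks them up from the environment. A minimal sketch, assuming jq is installed (role ARN and session name are illustrative):

# Capture the temporary credentials from the assume-role call
CREDS=$(aws sts assume-role \
    --role-arn arn:aws:iam::123456789012:role/ProwlerRole \
    --role-session-name prowler-session \
    --output json)

# Export them so the AWS SDK used by Prowler authenticates as the assumed role
export AWS_ACCESS_KEY_ID=$(echo "$CREDS" | jq -r '.Credentials.AccessKeyId')
export AWS_SECRET_ACCESS_KEY=$(echo "$CREDS" | jq -r '.Credentials.SecretAccessKey')
export AWS_SESSION_TOKEN=$(echo "$CREDS" | jq -r '.Credentials.SessionToken')

prowler aws
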
IAM Permissions for Prowler

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "access-analyzer:List*",
                "account:Get*",
                "acm:Describe*",
                "acm:List*",
                "apigateway:GET",
                "application-autoscaling:Describe*",
                "appstream:Describe*",
                "appstream:List*",
                "autoscaling:Describe*",
                "backup:List*",
                "cloudformation:Describe*",
                "cloudformation:Get*",
                "cloudformation:List*",
                "cloudfront:Get*",
                "cloudfront:List*",
                "cloudtrail:Describe*",
                "cloudtrail:Get*",
                "cloudtrail:List*",
                "cloudwatch:Describe*",
                "cloudwatch:Get*",
                "cloudwatch:List*",
                "codebuild:List*",
                "config:Describe*",
                "config:Get*",
                "config:List*",
                "dax:Describe*",
                "dax:List*",
                "directconnect:Describe*",
                "dms:Describe*",
                "dms:List*",
                "ds:Describe*",
                "ds:Get*",
                "ds:List*",
                "dynamodb:Describe*",
                "dynamodb:List*",
                "ec2:Describe*",
                "ec2:Get*",
                "ecr:Describe*",
                "ecr:Get*",
                "ecr:List*",
                "ecs:Describe*",
                "ecs:List*",
                "efs:Describe*",
                "eks:Describe*",
                "eks:List*",
                "elasticache:Describe*",
                "elasticbeanstalk:Describe*",
                "elasticfilesystem:Describe*",
                "elasticloadbalancing:Describe*",
                "elasticmapreduce:Describe*",
                "elasticmapreduce:List*",
                "es:Describe*",
                "es:List*",
                "events:Describe*",
                "events:List*",
                "firehose:Describe*",
                "firehose:List*",
                "fsx:Describe*",
                "fsx:List*",
                "glue:Get*",
                "glue:List*",
                "guardduty:Get*",
                "guardduty:List*",
                "iam:Generate*",
                "iam:Get*",
                "iam:List*",
                "iam:Simulate*",
                "inspector:Describe*",
                "inspector:Get*",
                "inspector:List*",
                "kinesis:Describe*",
                "kinesis:List*",
                "kms:Describe*",
                "kms:Get*",
                "kms:List*",
                "lambda:Get*",
                "lambda:List*",
                "logs:Describe*",
                "logs:Get*",
                "logs:List*",
                "macie2:Get*",
                "macie2:List*",
                "organizations:Describe*",
                "organizations:List*",
                "rds:Describe*",
                "rds:List*",
                "redshift:Describe*",
                "route53:Get*",
                "route53:List*",
                "route53domains:Get*",
                "route53domains:List*",
                "s3:Get*",
                "s3:List*",
                "sagemaker:Describe*",
                "sagemaker:List*",
                "secretsmanager:Describe*",
                "secretsmanager:Get*",
                "secretsmanager:List*",
                "securityhub:Describe*",
                "securityhub:Get*",
                "securityhub:List*",
                "ses:Get*",
                "ses:List*",
                "shield:Describe*",
                "shield:Get*",
                "shield:List*",
                "sns:Get*",
                "sns:List*",
                "sqs:Get*",
                "sqs:List*",
                "ssm:Describe*",
                "ssm:Get*",
                "ssm:List*",
                "sts:Get*",
                "suppuerto:Describe*",
                "trustedadvisor:Describe*",
                "waf:Get*",
                "waf:List*",
                "wafv2:Get*",
                "wafv2:List*",
                "workspaces:Describe*"
            ],
            "Resource": "*"
        }
    ]
}

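To use this policy, save the JSON to a file and attach it to the identity Prowler runs as. A hedged sketch with the AWS CLI (policy, role, and file names are illustrative); the AWS-managed SecurityAudit and ViewOnlyAccess policies are a common alternative baseline:

# Create the custom read-only policy from the JSON above
aws iam create-policy \
    --policy-name ProwlerScanPolicy \
    --policy-document file://prowler-policy.json

# Attach it to the role or user Prowler uses
aws iam attach-role-policy \
    --role-name ProwlerRole \
    --policy-arn arn:aws:iam::123456789012:policy/ProwlerScanPolicy

# Alternative: AWS-managed policies that cover most read-only checks
aws iam attach-role-policy --role-name ProwlerRole \
    --policy-arn arn:aws:iam::aws:policy/SecurityAudit
aws iam attach-role-policy --role-name ProwlerRole \
    --policy-arn arn:aws:iam::aws:policy/job-function/ViewOnlyAccess
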
Basic AWS Scanning

# Basic AWS scan
prowler aws

# Scan with specific profile
prowler aws --profile production

# Scan specific regions
prowler aws --region us-east-1,us-west-2

# Scan all regions
prowler aws --region all

# Exclude specific regions
prowler aws --excluded-regions us-gov-east-1,us-gov-west-1

# Scan specific services
prowler aws --services s3,iam,ec2

# Exclude specific services
prowler aws --excluded-services cloudformation,organizations

Azure Configuration

Azure Credentials Setup

# Install Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb|sudo bash

# Login to Azure
az login

# List subscriptions
az account list --output table

# Set default subscription
az account set --subscription "subscription-id"

# Create service principal for Prowler
az ad sp create-for-rbac --name "Prowler" --role "Reader" --scopes "/subscriptions/subscription-id"

# Use service principal
export AZURE_CLIENT_ID="client-id"
export AZURE_CLIENT_SECRET="client-secret"
export AZURE_TENANT_ID="tenant-id"
export AZURE_SUBSCRIPTION_ID="subscription-id"

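The service principal credentials can also be captured directly from the creation output instead of copying them by hand. A minimal sketch, assuming jq is available (the subscription ID is a placeholder):

# Create the service principal and capture its credentials in one step
SP=$(az ad sp create-for-rbac --name "Prowler" --role "Reader" \
    --scopes "/subscriptions/subscription-id" --output json)

export AZURE_CLIENT_ID=$(echo "$SP" | jq -r '.appId')
export AZURE_CLIENT_SECRET=$(echo "$SP" | jq -r '.password')
export AZURE_TENANT_ID=$(echo "$SP" | jq -r '.tenant')
export AZURE_SUBSCRIPTION_ID="subscription-id"
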
Basic Azure Scanning

# Basic Azure scan
prowler azure

# Scan with service principal
prowler azure --sp-env-auth

# Scan specific subscription
prowler azure --subscription-id subscription-id

# Scan all subscriptions
prowler azure --subscription-id all

# Scan specific services
prowler azure --services keyvault,storage,compute

# Exclude specific services
prowler azure --excluded-services monitor,network

Google Cloud Platform Configuration

GCP Credentials Setup

# Install Google Cloud SDK
curl https://sdk.cloud.google.com|bash
exec -l $SHELL

# Initialize gcloud
gcloud init

# Authenticate
gcloud auth login

# Set default project
gcloud config set project PROJECT_ID

# Create service account for Prowler
gcloud iam service-accounts create prowler \
    --display-name="Prowler Service Account"

# Grant necessary roles
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:prowler@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/viewer"

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:prowler@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/iam.securityReviewer"

# Create and download key
gcloud iam service-accounts keys create prowler-key.json \
    --iam-account=prowler@PROJECT_ID.iam.gserviceaccount.com

# Set environment variable
export GOOGLE_APPLICATION_CREDENTIALS="prowler-key.json"

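Before running Prowler, the key can be sanity-checked by activating it with gcloud; a short sketch:

# Verify the service account key works
gcloud auth activate-service-account --key-file=prowler-key.json
gcloud projects list --limit=5
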
Basic GCP Scanning

# Basic GCP scan
prowler gcp

# Scan with service account key
prowler gcp --credentials-file prowler-key.json

# Scan specific project
prowler gcp --project-id PROJECT_ID

# Scan all projects
prowler gcp --project-id all

# Scan specific services
prowler gcp --services compute,storage,iam

# Exclude specific services
prowler gcp --excluded-services logging,monitoring

Advanced Scanning Options

Compliance Frameworks

# CIS Benchmark
prowler aws --compliance cis_1.5_aws

# PCI-DSS
prowler aws --compliance pci_3.2.1_aws

# ISO 27001
prowler aws --compliance iso27001_2013_aws

# GDPR
prowler aws --compliance gdpr_aws

# HIPAA
prowler aws --compliance hipaa_aws

# SOC 2
prowler aws --compliance soc2_aws

# Multiple compliance frameworks
prowler aws --compliance cis_1.5_aws,pci_3.2.1_aws

# List available compliance frameworks
prowler aws --list-compliance

Custom Checks and Filters

# Run specific checks
prowler aws --check s3_bucket_public_access_block,iam_root_access_key_check

# Exclude specific checks
prowler aws --excluded-checks cloudtrail_encryption_enabled

# Run checks by severity
prowler aws --severity critical,high

# Run checks by category
prowler aws --categories secrets,encryption

# Custom check file
prowler aws --checks-file custom_checks.txt

# List all available checks
prowler aws --list-checks
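
Check IDs for --check and a custom checks file can be discovered by filtering the listing; a quick sketch:

# Find all S3-related checks
prowler aws --list-checks | grep -i s3

# Count the available checks for the AWS provider
prowler aws --list-checks | wc -l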

Output and Reporting

# Specify output directory
prowler aws --output-directory /tmp/prowler-results

# Custom output formats
prowler aws --output-formats json,csv,html

# Specific output filename
prowler aws --output-filename aws-security-audit

# Include compliance mapping
prowler aws --compliance cis_1.5_aws --output-formats json,html

# Quiet mode
prowler aws --quiet

# Verbose mode
prowler aws --verbose

# No banner
prowler aws --no-banner
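
The JSON output can be post-processed with jq for quick triage. A minimal sketch that assumes the same findings layout used by the automation scripts below (the report path is illustrative):

# Summarize failed checks by severity
jq -r '.findings[] | select(.status == "FAIL") | .severity' /tmp/prowler-results/aws-security-audit.json \
    | sort | uniq -c | sort -rn

# Count critical failures only
jq '[.findings[] | select(.severity == "critical" and .status == "FAIL")] | length' \
    /tmp/prowler-results/aws-security-audit.json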

Automation Scripts

Multi-Account AWS Security Assessment

#!/bin/bash
# Comprehensive multi-account AWS security assessment with Prowler

ACCOUNTS_FILE="aws_accounts.txt"
OUTPUT_BASE_DIR="prowler_assessments_$(date +%Y%m%d_%H%M%S)"
COMPLIANCE_FRAMEWORKS="cis_1.5_aws,pci_3.2.1_aws,iso27001_2013_aws"
PARALLEL_JOBS=3

# Create accounts file if it doesn't exist
if [ ! -f "$ACCOUNTS_FILE" ]; then
    cat > "$ACCOUNTS_FILE" << 'EOF'
# AWS Accounts Configuration
# Format: PROFILE_NAME|ACCOUNT_ID|ENVIRONMENT|DESCRIPTION
production|123456789012|prod|Production Environment
staging|123456789013|staging|Staging Environment
development|123456789014|dev|Development Environment
security|123456789015|security|Security Tools Account
EOF
    echo "Created $ACCOUNTS_FILE - please configure with your AWS accounts"
    exit 1
fi

mkdir -p "$OUTPUT_BASE_DIR"

# Function to assess single account
assess_account() {
    local profile="$1"
    local account_id="$2"
    local environment="$3"
    local description="$4"
    local output_dir="$OUTPUT_BASE_DIR/$profile"

    echo "[+] Assessing account: $profile ($account_id) - $environment"

    # Create account-specific output directory
    mkdir -p "$output_dir"

    # Run Prowler assessment
    prowler aws \
        --profile "$profile" \
        --compliance "$COMPLIANCE_FRAMEWORKS" \
        --output-directory "$output_dir" \
        --output-filename "$profile-assessment" \
        --output-formats json,csv,html \
        --severity critical,high,medium \
        --quiet \
        2>&1|tee "$output_dir/prowler.log"

    local exit_code=$?

    if [ $exit_code -eq 0 ]; then
        echo "  ✓ Assessment completed: $profile"

        # Generate account summary
        generate_account_summary "$profile" "$account_id" "$environment" "$output_dir"

        return 0
    else
        echo "  ✗ Assessment failed: $profile (exit code: $exit_code)"
        return 1
    fi
}

# Function to generate account summary
generate_account_summary() {
    local profile="$1"
    local account_id="$2"
    local environment="$3"
    local output_dir="$4"

    echo "[+] Generating summary for $profile"

    local json_file="$output_dir/${profile}-assessment.json"
    local summary_file="$output_dir/account_summary.txt"

    if [ -f "$json_file" ]; then
        python3 << EOF
import json
import sys
from collections import defaultdict

try:
    with open('$json_file', 'r') as f:
        data = json.load(f)

    # Count findings by status and severity
    status_counts = defaultdict(int)
    severity_counts = defaultdict(int)
    service_counts = defaultdict(int)
    compliance_counts = defaultdict(lambda: defaultdict(int))

    for finding in data.get('findings', []):
        status = finding.get('status', 'unknown')
        severity = finding.get('severity', 'unknown')
        service = finding.get('service_name', 'unknown')

        status_counts[status] += 1
        severity_counts[severity] += 1
        service_counts[service] += 1

        # Count compliance framework results
        for framework, requirements in finding.get('compliance', {}).items():
            for requirement in requirements:
                compliance_counts[framework][status] += 1

    # Generate summary
    summary = f"""
Account Security Assessment Summary
==================================
Profile: $profile
Account ID: $account_id
Environment: $environment
Assessment Date: $(date)

Finding Status Summary:
"""

    for status, count in sorted(status_counts.items()):
        summary += f"  {status.upper()}: {count}\n"

    summary += "\nSeverity Breakdown:\n"
    for severity, count in sorted(severity_counts.items()):
        summary += f"  {severity.upper()}: {count}\n"

    summary += "\nTop 10 Services by Findings:\n"
    sorted_services = sorted(service_counts.items(), key=lambda x: x[1], reverse=True)[:10]
    for service, count in sorted_services:
        summary += f"  {service}: {count}\n"

    summary += "\nCompliance Framework Results:\n"
    for framework, results in compliance_counts.items():
        summary += f"  {framework.upper()}:\n"
        for status, count in sorted(results.items()):
            summary += f"    {status.upper()}: {count}\n"

    with open('$summary_file', 'w') as f:
        f.write(summary)

    print(f"Summary generated: $summary_file")

except Exception as e:
    print(f"Error generating summary: {e}")
    sys.exit(1)
EOF
    else
        echo "  ⚠ JSON file not found: $json_file"
    fi
}

# Function to generate consolidated report
generate_consolidated_report() {
    echo "[+] Generating consolidated assessment report"

    local consolidated_dir="$OUTPUT_BASE_DIR/consolidated"
    local report_file="$consolidated_dir/multi_account_security_report.html"

    mkdir -p "$consolidated_dir"

    cat > "$report_file" << EOF
<!DOCTYPE html>
<html>
<head>
    <title>Multi-Account AWS Security Assessment</title>
    <style>
        body { font-family: Arial, sans-serif; margin: 20px; }
        .header { background-color: #f0f0f0; padding: 20px; border-radius: 5px; margin-bottom: 20px; }
        .account { margin: 20px 0; padding: 15px; border: 1px solid #ddd; border-radius: 5px; }
        .critical { border-color: #f44336; background-color: #ffebee; }
        .high { border-color: #ff9800; background-color: #fff3e0; }
        .medium { border-color: #2196f3; background-color: #e3f2fd; }
        .low { border-color: #4caf50; background-color: #e8f5e8; }
        table { border-collapse: collapse; width: 100%; margin: 10px 0; }
        th, td { border: 1px solid #ddd; padding: 8px; text-align: left; }
        th { background-color: #f2f2f2; }
        .fail { color: #d32f2f; font-weight: bold; }
        .pass { color: #388e3c; font-weight: bold; }
        .manual { color: #f57c00; font-weight: bold; }
        .chart { margin: 20px 0; }
    </style>
    <script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
</head>
<body>
    <div class="header">
        <h1>Multi-Account AWS Security Assessment Report</h1>
        <p>Generated: $(date)</p>
        <p>Compliance Frameworks: $COMPLIANCE_FRAMEWORKS</p>
    </div>
EOF

    # Process each account
    local total_critical=0
    local total_high=0
    local total_medium=0
    local total_low=0

    while IFS='|' read -r profile account_id environment description; do
        # Skip comments and empty lines
        [[ "$profile" =~ ^#.*$ || -z "$profile" ]] && continue

        local account_dir="$OUTPUT_BASE_DIR/$profile"
        local json_file="$account_dir/${profile}-assessment.json"
        local summary_file="$account_dir/account_summary.txt"

        echo "    <div class=\"account\">" >> "$report_file"
        echo "        <h2>$profile - $description</h2>" >> "$report_file"
        echo "        <p><strong>Account ID:</strong> $account_id</p>" >> "$report_file"
        echo "        <p><strong>Environment:</strong> $environment</p>" >> "$report_file"

        if [ -f "$json_file" ]; then
            # Extract summary statistics
            local critical=$(jq '[.findings[] | select(.severity == "critical" and .status == "FAIL")] | length' "$json_file" 2>/dev/null || echo "0")
            local high=$(jq '[.findings[] | select(.severity == "high" and .status == "FAIL")] | length' "$json_file" 2>/dev/null || echo "0")
            local medium=$(jq '[.findings[] | select(.severity == "medium" and .status == "FAIL")] | length' "$json_file" 2>/dev/null || echo "0")
            local low=$(jq '[.findings[] | select(.severity == "low" and .status == "FAIL")] | length' "$json_file" 2>/dev/null || echo "0")

            total_critical=$((total_critical + critical))
            total_high=$((total_high + high))
            total_medium=$((total_medium + medium))
            total_low=$((total_low + low))

            echo "        <table>" >> "$report_file"
            echo "            <tr><th>Severity</th><th>Failed Checks</th></tr>" >> "$report_file"
            echo "            <tr><td class=\"fail\">Critical</td><td>$critical</td></tr>" >> "$report_file"
            echo "            <tr><td class=\"fail\">High</td><td>$high</td></tr>" >> "$report_file"
            echo "            <tr><td class=\"manual\">Medium</td><td>$medium</td></tr>" >> "$report_file"
            echo "            <tr><td class=\"pass\">Low</td><td>$low</td></tr>" >> "$report_file"
            echo "        </table>" >> "$report_file"

            # Link to detailed reports
            echo "        <p>" >> "$report_file"
            echo "            <a href=\"../$profile/${profile}-assessment.html\" target=\"_blank\">View Detailed HTML Report</a> |" >> "$report_file"
            echo "            <a href=\"../$profile/${profile}-assessment.json\" target=\"_blank\">Download JSON Report</a> |" >> "$report_file"
            echo "            <a href=\"../$profile/${profile}-assessment.csv\" target=\"_blank\">Download CSV Report</a>" >> "$report_file"
            echo "        </p>" >> "$report_file"
        else
            echo "        <p style=\"color: red;\">❌ Assessment failed or report not found</p>" >> "$report_file"
        fi

        echo "    </div>" >> "$report_file"

    done < "$ACCOUNTS_FILE"

    # Add summary statistics
    cat >> "$report_file" << EOF
    <div class="account">
        <h2>Overall Summary</h2>
        <div class="chart">
            <canvas id="severityChart" width="400" height="200"></canvas>
        </div>
        <table>
            <tr><th>Total Critical</th><td>$total_critical</td></tr>
            <tr><th>Total High</th><td>$total_high</td></tr>
            <tr><th>Total Medium</th><td>$total_medium</td></tr>
            <tr><th>Total Low</th><td>$total_low</td></tr>
        </table>
    </div>

    <script>
        const ctx = document.getElementById('severityChart').getContext('2d');
        const chart = new Chart(ctx, {
            type: 'doughnut',
            data: {
                labels: ['Critical', 'High', 'Medium', 'Low'],
                datasets: [{
                    data: [$total_critical, $total_high, $total_medium, $total_low],
                    backgroundColor: ['#f44336', '#ff9800', '#2196f3', '#4caf50']
                }]
            },
            options: {
                responsive: true,
                plugins: {
                    title: {
                        display: true,
                        text: 'Security Findings by Severity'
                    }
                }
            }
        });
    </script>
</body>
</html>
EOF

    echo "[+] Consolidated report generated: $report_file"
}

# Function to generate executive summary
generate_executive_summary() {
    echo "[+] Generating executive summary"

    local exec_summary="$OUTPUT_BASE_DIR/executive_summary.md"

    cat > "$exec_summary" << EOF
# AWS Multi-Account Security Assessment - Executive Summary

**Assessment Date:** $(date +"%B %d, %Y")
**Compliance Frameworks:** $COMPLIANCE_FRAMEWORKS
**Accounts Assessed:** $(grep -v '^#' "$ACCOUNTS_FILE" | grep -v '^$' | wc -l)

Continuous Security Monitoring

#!/bin/bash
# Continuous cloud security monitoring with Prowler

CONFIG_FILE="prowler_monitoring.conf"
LOG_DIR="prowler_monitoring_logs"
ALERT_EMAIL="security@company.com"
SCAN_INTERVAL=86400  # 24 hours

mkdir -p "$LOG_DIR"

# Create default configuration
if [ ! -f "$CONFIG_FILE" ]; then
    cat > "$CONFIG_FILE" << 'EOF'
# Prowler Continuous Monitoring Configuration

# AWS Configuration
AWS_ENABLED=true
AWS_PROFILES="production,staging,development"
AWS_COMPLIANCE="cis_1.5_aws,pci_3.2.1_aws"
AWS_SEVERITY="critical,high"

# Azure Configuration
AZURE_ENABLED=false
AZURE_SUBSCRIPTIONS="sub1,sub2"
AZURE_COMPLIANCE="cis_1.3.0_azure"

# GCP Configuration
GCP_ENABLED=false
GCP_PROJECTS="project1,project2"
GCP_COMPLIANCE="cis_1.2.0_gcp"

# Alerting Configuration
ALERT_ON_NEW_FINDINGS=true
ALERT_ON_REGRESSION=true
CRITICAL_THRESHOLD=5
HIGH_THRESHOLD=20

# Reporting
GENERATE_TRENDS=true
KEEP_HISTORICAL_DATA=true
RETENTION_DAYS=90
EOF
    echo "Created $CONFIG_FILE - please configure monitoring settings"
    exit 1
fi

source "$CONFIG_FILE"

# Function to run AWS monitoring
monitor_aws() {
    if [ "$AWS_ENABLED" != "true" ]; then
        return 0
    fi

    echo "[+] Running AWS security monitoring"

    IFS=',' read -ra PROFILES <<< "$AWS_PROFILES"
    for profile in "${PROFILES[@]}"; do
        timestamp=$(date +%Y%m%d_%H%M%S)
        output_dir="$LOG_DIR/aws_${profile}_$timestamp"

        echo "[+] Scanning AWS profile: $profile"

        prowler aws \
            --profile "$profile" \
            --compliance "$AWS_COMPLIANCE" \
            --severity "$AWS_SEVERITY" \
            --output-directory "$output_dir" \
            --output-filename "aws-$profile-$timestamp" \
            --output-formats json,csv \
            --quiet

        if [ $? -eq 0 ]; then
            echo "  ✓ AWS scan completed: $profile"
            analyze_aws_findings "$output_dir" "$profile" "$timestamp"
        else
            echo "  ✗ AWS scan failed: $profile"
        fi
    done
}

# Function to analyze AWS findings
analyze_aws_findings() {
    local output_dir="$1"
    local profile="$2"
    local timestamp="$3"

    local json_file="$output_dir/aws-$profile-$timestamp.json"

    if [ ! -f "$json_file" ]; then
        echo "[-] JSON file not found: $json_file"
        return 1
    fi

    echo "[+] Analyzing findings for AWS:$profile"

    # Count findings by severity
    local critical_count=$(jq '[.findings[] | select(.severity == "critical" and .status == "FAIL")] | length' "$json_file")
    local high_count=$(jq '[.findings[] | select(.severity == "high" and .status == "FAIL")] | length' "$json_file")

    echo "  Critical findings: $critical_count"
    echo "  High findings: $high_count"

    # Check thresholds
    if [ "$critical_count" -ge "$CRITICAL_THRESHOLD" ]; then
        send_alert "CRITICAL" "AWS" "$profile" "$critical_count" "critical findings detected"
    fi

    if [ "$high_count" -ge "$HIGH_THRESHOLD" ]; then
        send_alert "HIGH" "AWS" "$profile" "$high_count" "high-severity findings detected"
    fi

    # Compare with previous scan
    if [ "$ALERT_ON_NEW_FINDINGS" = "true" ]; then
        compare_with_previous "aws" "$profile" "$json_file"
    fi

    # Update trends
    if [ "$GENERATE_TRENDS" = "true" ]; then
        update_trends "aws" "$profile" "$critical_count" "$high_count"
    fi
}

# Function to send alerts
send_alert() {
    local severity="$1"
    local cloud_provider="$2"
    local account="$3"
    local count="$4"
    local message="$5"

    local subject="[$severity] Prowler Security Alert: $cloud_provider:$account"
    local body="Security alert for $cloud_provider account '$account': $count $message at $(date)"

    echo "$body" | mail -s "$subject" "$ALERT_EMAIL" 2>/dev/null || \
        echo "Alert: $subject - $body (email failed)"
}

# Function to compare with previous scan
compare_with_previous() {
    local cloud_provider="$1"
    local account="$2"
    local current_file="$3"

    # Find previous scan file
    local previous_file=$(find "$LOG_DIR" -name "*${cloud_provider}_${account}_*.json" -type f | sort | tail -2 | head -1)

    if [ -f "$previous_file" ] && [ "$previous_file" != "$current_file" ]; then
        echo "[+] Comparing with previous scan"

        # Extract check IDs from current and previous scans
        local current_fails=$(jq -r '.findings[] | select(.status == "FAIL") | .check_id' "$current_file" | sort)
        local previous_fails=$(jq -r '.findings[] | select(.status == "FAIL") | .check_id' "$previous_file" | sort)

        # Find new failures
        local new_fails=$(comm -23 <(echo "$current_fails") <(echo "$previous_fails"))

        if [ -n "$new_fails" ]; then
            local new_count=$(echo "$new_fails"|wc -l)
            send_alert "NEW FINDINGS" "$cloud_provider" "$account" "$new_count" "new security findings since last scan"
        fi

        # Find regressions (previously passing, now failing)
        local current_passes=$(jq -r '.findings[] | select(.status == "PASS") | .check_id' "$current_file" | sort)
        local previous_passes=$(jq -r '.findings[] | select(.status == "PASS") | .check_id' "$previous_file" | sort)

        local regressions=$(comm -23 <(echo "$previous_passes") <(echo "$current_passes"))

        if [ -n "$regressions" ] && [ "$ALERT_ON_REGRESSION" = "true" ]; then
            local regression_count=$(echo "$regressions"|wc -l)
            send_alert "REGRESSION" "$cloud_provider" "$account" "$regression_count" "security regressions detected"
        fi
    fi
}

# Function to update trends
update_trends() {
    local cloud_provider="$1"
    local account="$2"
    local critical_count="$3"
    local high_count="$4"

    local trends_file="$LOG_DIR/security_trends.csv"

    # Create header if file doesn't exist
    if [ ! -f "$trends_file" ]; then
        echo "Date,Cloud,Account,Critical,High" > "$trends_file"
    fi

    # Add current data
    echo "$(date +%Y-%m-%d),$cloud_provider,$account,$critical_count,$high_count" >> "$trends_file"
}

# Function to generate trend report
generate_trend_report() {
    if [ "$GENERATE_TRENDS" != "true" ]; then
        return 0
    fi

    echo "[+] Generating security trend report"

    local trends_file="$LOG_DIR/security_trends.csv"
    local html_report="$LOG_DIR/security_trends.html"

    if [ ! -f "$trends_file" ]; then
        echo "[-] No trends data available"
        return 1
    fi

    python3 << EOF
import csv
import matplotlib.pyplot as plt
import pandas as pd
from datetime import datetime, timedelta
import os

try:
    # Read trends data
    df = pd.read_csv('$trends_file')
    df['Date'] = pd.to_datetime(df['Date'])

    # Create trend charts
    fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(12, 10))

    # Critical findings trend
    for account in df['Account'].unique():
        account_data = df[df['Account'] == account]
        ax1.plot(account_data['Date'], account_data['Critical'],
                marker='o', label=f"{account_data['Cloud'].iloc[0]}:{account}")

    ax1.set_title('Critical Security Findings Trend')
    ax1.set_xlabel('Date')
    ax1.set_ylabel('Critical Findings')
    ax1.legend()
    ax1.grid(True, alpha=0.3)

    # High findings trend
    for account in df['Account'].unique():
        account_data = df[df['Account'] == account]
        ax2.plot(account_data['Date'], account_data['High'],
                marker='s', label=f"{account_data['Cloud'].iloc[0]}:{account}")

    ax2.set_title('High Severity Findings Trend')
    ax2.set_xlabel('Date')
    ax2.set_ylabel('High Findings')
    ax2.legend()
    ax2.grid(True, alpha=0.3)

    plt.tight_layout()
    plt.savefig('$LOG_DIR/security_trends.png', dpi=150, bbox_inches='tight')
    plt.close()

    # Generate HTML report
    html_content = f"""
<!DOCTYPE html>
<html>
<head>
    <title>Security Trends Report</title>

</head>
<body>
    <h1>Security Trends Report</h1>
    <p>Generated: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}</p>

    <div class="chart">
        <img src="security_trends.png" alt="Security Trends Chart" style="max-width: 100%;">
    </div>

    <h2>Recent Data</h2>
    <table>
        <tr><th>Date</th><th>Cloud</th><th>Account</th><th>Critical</th><th>High</th></tr>
"""

    # Add recent data (last 30 days)
    recent_data = df[df['Date'] >= (datetime.now() - timedelta(days=30))]
    for _, row in recent_data.iterrows():
        html_content += f"""
        <tr>
            <td>{row['Date'].strftime('%Y-%m-%d')}</td>
            <td>{row['Cloud']}</td>
            <td>{row['Account']}</td>
            <td>{row['Critical']}</td>
            <td>{row['High']}</td>
        </tr>
"""

    html_content += """
    </table>
</body>
</html>
"""

    with open('$html_report', 'w') as f:
        f.write(html_content)

    print(f"Trend report generated: $html_report")

except Exception as e:
    print(f"Error generating trend report: {e}")
EOF
}

# Function to cleanup old data
cleanup_old_data() {
    if [ "$KEEP_HISTORICAL_DATA" = "true" ]; then
        echo "[+] Cleaning up data older than $RETENTION_DAYS days"
        find "$LOG_DIR" -type d -mtime +$RETENTION_DAYS -exec rm -rf {} + 2>/dev/null || true

        # Keep only recent trends data
        if [ -f "$LOG_DIR/security_trends.csv" ]; then
            python3 << EOF
impuerto pandas as pd
from datetime impuerto datetime, timedelta

try:
    df = pd.read_csv('$LOG_DIR/security_trends.csv')
    df['Date'] = pd.to_datetime(df['Date'])

    # Keep only last $RETENTION_DAYS days
    cutoff_date = datetime.now() - timedelta(days=$RETENTION_DAYS)
    recent_df = df[df['Date'] >= cutoff_date]

    recent_df.to_csv('$LOG_DIR/security_trends.csv', index=False)
    print(f"Cleaned trends data, kept {len(recent_df)} records")

except Exception as e:
    print(f"Error cleaning trends data: {e}")
EOF
        fi
    fi
}

# Main monitoring loop
echo "[+] Starting continuous security monitoring with Prowler"
echo "[+] Scan interval: $((SCAN_INTERVAL / 3600)) hours"

while true; do
    echo "[+] Starting monitoring cycle at $(date)"

    # Run monitoring for enabled cloud providers
    monitor_aws
    # monitor_azure  # Implement similar to monitor_aws
    # monitor_gcp    # Implement similar to monitor_aws

    # Generate reports and cleanup
    generate_trend_report
    cleanup_old_data

    echo "[+] Monitoring cycle completed at $(date)"
    echo "[+] Next scan in $((SCAN_INTERVAL / 3600)) hours"

    sleep "$SCAN_INTERVAL"
done
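
Rather than leaving the loop running in a shell, the same cycle can be scheduled with cron; a hedged sketch assuming the script is saved as /opt/prowler/prowler_monitor.sh and adapted to run a single cycle per invocation:

# crontab -e: run one monitoring cycle nightly at 02:00
0 2 * * * /opt/prowler/prowler_monitor.sh >> /var/log/prowler_monitor.log 2>&1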

Prowler CI/CD Integration

# GitHub Actions example
name: Cloud Security Assessment

on:
  schedule:
    - cron: '0 2 * * *'  # Daily at 2 AM
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  aws-security-assessment:
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3

    - name: Set up Python
      uses: actions/setup-python@v4
      with:
        python-version: '3.9'

    - name: Install Prowler
      run: |
        pip install prowler

    - name: Configure AWS credentials
      uses: aws-actions/configure-aws-credentials@v2
      with:
        aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
        aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        aws-region: us-east-1

    - name: Run Prowler Assessment
      run: |
        prowler aws \
          --compliance cis_1.5_aws \
          --severity critical,high \
          --output-formats json,html \
          --output-directory ./prowler-results \
          --quiet

    - name: Upload Results
      uses: actions/upload-artifact@v3
      with:
        name: prowler-security-assessment
        path: ./prowler-results/

    - name: Check for Critical Findings
      run: |
        CRITICAL_COUNT=$(jq '[.findings[] | select(.severity == "critical" and .status == "FAIL")] | length' ./prowler-results/*.json)
        if [ "$CRITICAL_COUNT" -gt 0 ]; then
          echo "❌ Found $CRITICAL_COUNT critical security findings"
          exit 1
        else
          echo "✅ No critical security findings detected"
        fi

Troubleshooting

Common Issues

Authentication Problems

# Check AWS credentials
aws sts get-caller-identity

# Check Azure authentication
az account show

# Check GCP authentication
gcloud auth list

# Test Prowler authentication
prowler aws --list-checks
prowler azure --list-checks
prowler gcp --list-checks

Installation Issues

# Update pip
pip3 install --upgrade pip

# Install with verbose output
pip3 install -v prowler

# Install from source
git clone https://github.com/prowler-cloud/prowler.git
cd prowler
pip3 install -e .

# Check dependencies
pip3 check prowler

Permission Issues

# Check IAM permissions
aws iam simulate-principal-policy \
    --policy-source-arn arn:aws:iam::123456789012:user/prowler \
    --action-names s3:GetBucketAcl \
    --resource-arns arn:aws:s3:::example-bucket

# Test specific service access
aws s3 ls
aws iam list-users
aws ec2 describe-instances

Performance Issues

# Limit regions
prowler aws --region us-east-1

# Skip large services
prowler aws --excluded-services organizations,support

# Use specific checks only
prowler aws --check s3_bucket_public_access_block

# Increase timeout
export PROWLER_TIMEOUT=300

Debugging and Logging

# Enable verbose output
prowler aws --verbose

# Enable debug mode
prowler aws --log-level DEBUG

# Custom log file
prowler aws --log-file /tmp/prowler.log

# Check Prowler version
prowler --version


Multi-Account AWS Security Assessment (continued)

Key Findings

Security Posture Overview

EOF

# Calculate overall statistics
local total_accounts=0
local successful_assessments=0
local total_findings=0

while IFS='|' read -r profile account_id environment description; do
    [[ "$profile" =~ ^#.*$ || -z "$profile" ]] && continue

    total_accounts=$((total_accounts + 1))

    local json_file="$OUTPUT_BASE_DIR/$profile/${profile}-assessment.json"
    if [ -f "$json_file" ]; then
        successful_assessments=$((successful_assessments + 1))

        local account_findings=$(jq '[.findings[] | select(.status == "FAIL")] | length' "$json_file" 2>/dev/null || echo "0")
        total_findings=$((total_findings + account_findings))
    fi
done < "$ACCOUNTS_FILE"

cat >> "$exec_summary" << EOF
  • Total Accounts: $total_accounts
  • Successfully Assessed: $successful_assessments
  • Total Security Findings: $total_findings
  • Assessment Coverage: $((successful_assessments * 100 / total_accounts))%

Recommendations

  1. Immediate Actions Required:

    • Address all CRITICAL severity findings
    • Review and remediate HIGH severity findings
    • Implement missing security controls
  2. Short-term Improvements:

    • Establish continuous monitoring
    • Implement automated remediation where possible
    • Regular security assessments
  3. Long-term Strategy:

    • Adopt Infrastructure as Code (IaC) with security scanning
    • Implement security training programs
    • Establish security metrics and KPIs

Next Steps

  1. Review detailed findings in individual account repuertos
  2. Prioritize remediation based on risk and business impact
  3. Establish regular assessment schedule
  4. Implement continuous monitoring solutions

This assessment was conducted using Prowler v$(prowler --version 2>/dev/null | head -1) with industry-standard compliance frameworks.
EOF

echo "[+] Executive summary generated: $exec_summary"
}

# Main execution
echo "[+] Starting multi-account AWS security assessment"
echo "[+] Output directory: $OUTPUT_BASE_DIR"
echo "[+] Compliance frameworks: $COMPLIANCE_FRAMEWORKS"

# Check dependencies
if ! command -v prowler &> /dev/null; then
    echo "[-] Prowler not found. Please install Prowler first."
    exit 1
fi

if ! command -v aws &> /dev/null; then
    echo "[-] AWS CLI not found. Please install AWS CLI first."
    exit 1
fi

if ! command -v jq &> /dev/null; then
    echo "[-] jq not found. Please install jq for JSON processing."
    exit 1
fi

# Create job control for parallel execution
job_count=0
max_jobs=$PARALLEL_JOBS

# Assess each account
while IFS='|' read -r profile account_id environment description; do
    # Skip comments and empty lines
    [[ "$profile" =~ ^#.*$ || -z "$profile" ]] && continue

    # Wait if we've reached max parallel jobs
    while [ $job_count -ge $max_jobs ]; do
        wait -n  # Wait for any job to complete
        job_count=$((job_count - 1))
    done

    # Start assessment in background
    assess_account "$profile" "$account_id" "$environment" "$description" &
    job_count=$((job_count + 1))

done < "$ACCOUNTS_FILE"

# Wait for all remaining jobs to complete

wait

echo "[+] All assessments completed"

# Generate consolidated reports
generate_consolidated_report
generate_executive_summary

echo "[+] Multi-account assessment completed successfully"
echo "[+] Results available in: $OUTPUT_BASE_DIR"
echo "[+] Open $OUTPUT_BASE_DIR/consolidated/multi_account_security_report.html for overview"


Resources


This cheat sheet provides a comprehensive reference for using Prowler for cloud security assessments and compliance auditing. Always ensure you have proper authorization before using this tool in any environment.