
Wapiti Cheat Sheet


Overview

Wapiti is a web application security scanner that performs black-box testing of web applications. It crawls the pages of a site and looks for scripts and forms where it can inject data. Once it has the list of URLs, forms, and their inputs, Wapiti acts like a fuzzer, injecting payloads to check whether a script is vulnerable. Wapiti can detect a range of vulnerabilities such as SQL injection, XSS, file inclusion, command execution, and more.

Warning: Only use Wapiti against applications that you own or have explicit permission to test. Unauthorized testing may violate terms of service or local laws.
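
In practice, that crawl-then-fuzz workflow maps onto a single command: point Wapiti at a base URL, pick the attack modules, and write a report. A minimal sketch (the target URL and output path are placeholders; the individual flags are covered in detail in the sections below):

```bash
# Crawl the target up to depth 2, fuzz discovered inputs with the SQL injection
# and XSS modules, and write an HTML report (placeholder target and output path)
wapiti -u http://target.com --depth 2 -m "sql,xss" -f html -o /tmp/target_scan.html
```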

Installation

Python Package Installation

```bash
# Install via pip
pip install wapiti3

# Install with all dependencies
pip install wapiti3[complete]

# Install development version
pip install git+https://github.com/wapiti-scanner/wapiti.git

# Verify installation
wapiti --version
```

System Package Installation

```bash
# Ubuntu/Debian
sudo apt update
sudo apt install wapiti

# CentOS/RHEL/Fedora
sudo yum install wapiti
# or
sudo dnf install wapiti

# Arch Linux
sudo pacman -S wapiti

# macOS with Homebrew
brew install wapiti
```

Docker Installation

```bash
# Pull Docker image
docker pull wapiti/wapiti:latest

# Run with Docker
docker run --rm -it wapiti/wapiti:latest --help

# Create alias for easier usage
echo 'alias wapiti="docker run --rm -it -v $(pwd):/data wapiti/wapiti:latest"' >> ~/.bashrc
source ~/.bashrc
```
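
With the alias above, the current host directory is mounted at /data inside the container, so report paths should point there. A hedged usage example, assuming (as the --help invocation above does) that the image's entrypoint is the wapiti binary:

```bash
# The report lands in the current host directory via the /data bind mount
wapiti -u http://target.com -f html -o /data/wapiti_report.html
```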

Manual Installation

```bash
# Clone repository
git clone https://github.com/wapiti-scanner/wapiti.git
cd wapiti

# Install dependencies
pip install -r requirements.txt

# Install
python setup.py install

# Or run directly
python wapiti.py --help
```

Basic Usage

Basic Vulnerability Scan

```bash
# Basic scan
wapiti -u http://target.com

# Scan with specific modules
wapiti -u http://target.com -m sql,xss,file

# Scan with all modules
wapiti -u http://target.com -m all

# Verbose scan
wapiti -u http://target.com -v 2

# Quiet scan
wapiti -u http://target.com -q
```

Crawling Options

```bash
# Set crawling depth
wapiti -u http://target.com --depth 3

# Set maximum pages to crawl
wapiti -u http://target.com --max-pages 100

# Set crawling scope
wapiti -u http://target.com --scope domain

# Exclude specific paths from crawling
wapiti -u http://target.com --skip-crawl "/admin,/test"

# Restrict the scan to the exact URL given
wapiti -u http://target.com --scope url
```

Authentication

```bash
# Basic authentication
wapiti -u http://target.com --auth-user admin --auth-password secret

# Cookie-based authentication
wapiti -u http://target.com --cookie "PHPSESSID=abc123; auth=true"

# Custom headers
wapiti -u http://target.com --headers "Authorization: Bearer token123"

# Login form authentication
wapiti -u http://target.com --auth-method form --auth-url http://target.com/login --auth-user admin --auth-password secret
```

Vulnerability Modules

Available Modules

```bash
# List all available modules
wapiti --list-modules

# SQL injection detection
wapiti -u http://target.com -m sql

# Cross-site scripting (XSS)
wapiti -u http://target.com -m xss

# File inclusion vulnerabilities
wapiti -u http://target.com -m file

# Command execution
wapiti -u http://target.com -m exec

# Cross-site request forgery (CSRF)
wapiti -u http://target.com -m csrf

# Server-side request forgery (SSRF)
wapiti -u http://target.com -m ssrf

# XML external entity (XXE)
wapiti -u http://target.com -m xxe

# Backup file detection
wapiti -u http://target.com -m backup

# Directory traversal
wapiti -u http://target.com -m traversal

# HTTP security headers
wapiti -u http://target.com -m headers
```

Module-Specific Scans

```bash
# Comprehensive SQL injection scan
wapiti -u http://target.com -m sql --level 2

# XSS with custom payloads
wapiti -u http://target.com -m xss --payload-file xss_payloads.txt

# File inclusion with time delay
wapiti -u http://target.com -m file --timeout 10

# Command execution with specific OS
wapiti -u http://target.com -m exec --os linux

# Multiple modules
wapiti -u http://target.com -m "sql,xss,file,exec"
```

Advanced Configuration

Proxy and Network Settings

```bash
# Use HTTP proxy
wapiti -u http://target.com --proxy http://127.0.0.1:8080

# Use SOCKS proxy
wapiti -u http://target.com --proxy socks5://127.0.0.1:9050

# Set timeout
wapiti -u http://target.com --timeout 30

# Set delay between requests
wapiti -u http://target.com --delay 2

# Set user agent
wapiti -u http://target.com --user-agent "Custom Scanner 1.0"

# Ignore SSL certificate errors
wapiti -u https://target.com --verify-ssl 0
```

Crawling Configuration

```bash
# Set maximum crawling time
wapiti -u http://target.com --max-scan-time 3600

# Set maximum parameters per page
wapiti -u http://target.com --max-parameters 20

# Set maximum attack time per URL
wapiti -u http://target.com --max-attack-time 300

# Exclude specific file types
wapiti -u http://target.com --exclude "*.pdf,*.jpg,*.png"

# Include only specific file types
wapiti -u http://target.com --include "*.php,*.asp,*.jsp"

# Set crawling rules
wapiti -u http://target.com --crawl-rules "follow_redirects,parse_robots"
```

Output and Reporting

```bash
# Generate HTML report
wapiti -u http://target.com -f html -o /tmp/wapiti_report.html

# Generate XML report
wapiti -u http://target.com -f xml -o /tmp/wapiti_report.xml

# Generate JSON report
wapiti -u http://target.com -f json -o /tmp/wapiti_report.json

# Generate TXT report
wapiti -u http://target.com -f txt -o /tmp/wapiti_report.txt

# Multiple output formats
wapiti -u http://target.com -f html,xml,json -o /tmp/wapiti_report
```
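
The JSON report is convenient for post-processing. A minimal sketch that counts findings, assuming the report exposes a top-level "vulnerabilities" collection, as the automation scripts later in this sheet do:

```bash
# Count findings in a JSON report (schema assumption: top-level "vulnerabilities" key)
python3 -c "
import json
with open('/tmp/wapiti_report.json') as f:
    data = json.load(f)
print(len(data.get('vulnerabilities', [])), 'findings')
"
```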

Specialized Scanning

API Testing

```bash
# Scan REST API
wapiti -u http://api.target.com/v1 --scope domain -m "sql,xss,xxe"

# Scan with API authentication
wapiti -u http://api.target.com/v1 --headers "Authorization: Bearer token123" -m all

# Scan GraphQL endpoints
wapiti -u http://target.com/graphql -m "sql,xss" --level 2

# Scan with custom content type
wapiti -u http://api.target.com/v1 --headers "Content-Type: application/json" -m all
```

Form-Based Testing

```bash
# Focus on forms only
wapiti -u http://target.com --attack-forms-only

# Skip GET parameters
wapiti -u http://target.com --skip-get-params

# Test specific form fields
wapiti -u http://target.com --form-data "username=admin&password=test"

# Upload file testing
wapiti -u http://target.com --upload-dir /tmp/uploads -m file
```

Session Management

```bash
# Use session file
wapiti -u http://target.com --session-file session.json

# Save session for later use
wapiti -u http://target.com --save-session session.json

# Resume previous scan
wapiti -u http://target.com --resume-session session.json

# Clear session data
wapiti -u http://target.com --clear-session
```

Custom Payloads and Rules

Custom Payload Files

```bash
# Create custom SQL injection payloads
cat > sql_payloads.txt << 'EOF'
' OR '1'='1
' UNION SELECT NULL--
'; DROP TABLE users--
' AND SLEEP(5)--
' OR 1=1#
EOF

# Use custom payloads
wapiti -u http://target.com -m sql --payload-file sql_payloads.txt

# Create custom XSS payloads
cat > xss_payloads.txt << 'EOF'
<script>alert('XSS')</script>
javascript:alert('XSS')
EOF

wapiti -u http://target.com -m xss --payload-file xss_payloads.txt
```

Configuration Files

```bash
# Create configuration file
cat > wapiti.conf << 'EOF'
[general]
timeout = 30
delay = 1
max_pages = 200
max_scan_time = 7200

[crawling]
depth = 3
scope = domain
follow_redirects = true

[modules]
sql = true
xss = true
file = true
exec = false
csrf = true

[output]
format = html,json
output_dir = /tmp/wapiti_reports
EOF

# Use configuration file
wapiti -u http://target.com --config wapiti.conf
```

Custom Attack Modules

```python
#!/usr/bin/env python3
# Custom Wapiti module example

from wapitiCore.attack.attack import Attack
from wapitiCore.language.vulnerability import Vulnerability


class CustomAttack(Attack):
    """Custom attack module for Wapiti"""

    name = "custom"
    description = "Custom vulnerability detection"

    def __init__(self, crawler, persister, logger, attack_options):
        super().__init__(crawler, persister, logger, attack_options)
        self.payloads = [
            "custom_payload_1",
            "custom_payload_2",
            "custom_payload_3"
        ]

    def attack(self, http_res):
        """Main attack method"""
        url = http_res.url

        for payload in self.payloads:
            # Inject payload and test response
            test_url = f"{url}?test={payload}"

            try:
                response = self.crawler.get(test_url)

                if self.is_vulnerable(response):
                    vuln = Vulnerability(
                        category="Custom Vulnerability",
                        level=Vulnerability.HIGH_LEVEL,
                        request=response.http_request,
                        info="Custom vulnerability detected"
                    )
                    self.add_vuln(vuln)

            except Exception as e:
                self.logger.error(f"Error testing {test_url}: {e}")

    def is_vulnerable(self, response):
        """Check if response indicates vulnerability"""
        indicators = ["error", "exception", "debug"]
        return any(indicator in response.content.lower() for indicator in indicators)
```

Automation Scripts

Comprehensive Scan Script

```bash
#!/bin/bash
# Comprehensive web application security scan

TARGET="$1"
OUTPUT_DIR="wapiti_scan_$(date +%Y%m%d_%H%M%S)"

if [ -z "$TARGET" ]; then
    echo "Usage: $0 <target_url>"
    exit 1
fi

mkdir -p "$OUTPUT_DIR"

echo "[+] Starting comprehensive scan for: $TARGET"

# Basic vulnerability scan
echo "[+] Running basic vulnerability scan..."
wapiti -u "$TARGET" \
    -m "sql,xss,file,exec,csrf,ssrf" \
    -f html,json \
    -o "$OUTPUT_DIR/basic_scan" \
    --level 2 \
    --timeout 30 \
    --max-pages 500

# Deep SQL injection scan
echo "[+] Running deep SQL injection scan..."
wapiti -u "$TARGET" \
    -m sql \
    -f json \
    -o "$OUTPUT_DIR/sql_scan.json" \
    --level 3 \
    --timeout 60

# XSS focused scan
echo "[+] Running XSS focused scan..."
wapiti -u "$TARGET" \
    -m xss \
    -f json \
    -o "$OUTPUT_DIR/xss_scan.json" \
    --level 2 \
    --timeout 30

# File inclusion scan
echo "[+] Running file inclusion scan..."
wapiti -u "$TARGET" \
    -m file \
    -f json \
    -o "$OUTPUT_DIR/file_scan.json" \
    --level 2

# Backup file detection
echo "[+] Running backup file detection..."
wapiti -u "$TARGET" \
    -m backup \
    -f json \
    -o "$OUTPUT_DIR/backup_scan.json"

# Security headers check
echo "[+] Checking security headers..."
wapiti -u "$TARGET" \
    -m headers \
    -f json \
    -o "$OUTPUT_DIR/headers_scan.json"

echo "[+] Scan completed. Results saved to: $OUTPUT_DIR"

# Generate summary
python3 << EOF
import json
import os
from collections import defaultdict

results_dir = "$OUTPUT_DIR"
vulnerabilities = defaultdict(list)

for filename in os.listdir(results_dir):
    if filename.endswith('.json'):
        filepath = os.path.join(results_dir, filename)
        try:
            with open(filepath, 'r') as f:
                data = json.load(f)
            if 'vulnerabilities' in data:
                for vuln in data['vulnerabilities']:
                    vuln_type = vuln.get('type', 'Unknown')
                    vulnerabilities[vuln_type].append(vuln)
        except Exception as e:
            print(f"Error processing {filename}: {e}")

print("\n=== VULNERABILITY SUMMARY ===")
total_vulns = 0
for vuln_type, vulns in vulnerabilities.items():
    count = len(vulns)
    total_vulns += count
    print(f"{vuln_type}: {count}")

print(f"\nTotal vulnerabilities found: {total_vulns}")

# Save summary
summary = {
    'total_vulnerabilities': total_vulns,
    'by_type': {k: len(v) for k, v in vulnerabilities.items()}
}

with open(os.path.join(results_dir, 'summary.json'), 'w') as f:
    json.dump(summary, f, indent=2)
EOF
```

Continuous Monitoring Script

```bash
#!/bin/bash
# Continuous web application monitoring

CONFIG_FILE="monitor_config.conf"
LOG_FILE="wapiti_monitor.log"

# Configuration
TARGETS=(
    "https://app1.example.com"
    "https://app2.example.com"
    "https://api.example.com"
)

SCAN_INTERVAL=86400  # 24 hours
ALERT_EMAIL="security@example.com"

log_message() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

send_alert() {
    local target="$1"
    local vuln_count="$2"
    local report_file="$3"

    if [ "$vuln_count" -gt 0 ]; then
        log_message "ALERT: $vuln_count vulnerabilities found in $target"

        # Send email alert (requires mail command)
        if command -v mail >/dev/null 2>&1; then
            echo "Wapiti scan found $vuln_count vulnerabilities in $target. See attached report." | \
                mail -s "Security Alert: Vulnerabilities Detected" -A "$report_file" "$ALERT_EMAIL"
        fi
    fi
}

scan_target() {
    local target="$1"
    local timestamp=$(date +%Y%m%d_%H%M%S)
    local output_dir="monitor_${timestamp}"
    local report_file="${output_dir}/scan_report.json"

    log_message "Starting scan for: $target"

    mkdir -p "$output_dir"

    # Run Wapiti scan
    wapiti -u "$target" \
        -m "sql,xss,file,exec,csrf" \
        -f json \
        -o "$report_file" \
        --timeout 30 \
        --max-pages 200 \
        --level 1 \
        2>>"$LOG_FILE"

    if [ -f "$report_file" ]; then
        # Count vulnerabilities
        vuln_count=$(python3 -c "
import json
try:
    with open('$report_file', 'r') as f:
        data = json.load(f)
    print(len(data.get('vulnerabilities', [])))
except Exception:
    print(0)
")

        log_message "Scan completed for $target. Found $vuln_count vulnerabilities."
        send_alert "$target" "$vuln_count" "$report_file"

        # Cleanup old reports (keep last 10)
        ls -t monitor_*/scan_report.json 2>/dev/null | tail -n +11 | xargs rm -f 2>/dev/null
    else
        log_message "ERROR: Scan failed for $target"
    fi
}

# Main monitoring loop
while true; do
    log_message "Starting monitoring cycle"

    for target in "${TARGETS[@]}"; do
        scan_target "$target"
        sleep 60  # Wait between targets
    done

    log_message "Monitoring cycle completed. Sleeping for $SCAN_INTERVAL seconds."
    sleep "$SCAN_INTERVAL"
done
```

CI/CD Integration Script

```bash
#!/bin/bash
# CI/CD pipeline integration script

set -e

TARGET_URL="$1"
FAIL_ON_VULN="${2:-true}"
OUTPUT_DIR="wapiti_ci_$(date +%Y%m%d_%H%M%S)"

if [ -z "$TARGET_URL" ]; then
    echo "Usage: $0 <target_url> [fail_on_vulnerabilities]"
    exit 1
fi

mkdir -p "$OUTPUT_DIR"

echo "Starting security scan for: $TARGET_URL"

# Run Wapiti scan
wapiti -u "$TARGET_URL" \
    -m "sql,xss,file,exec,csrf" \
    -f json,html \
    -o "$OUTPUT_DIR/security_scan" \
    --timeout 30 \
    --max-pages 100 \
    --level 1

# Process results
REPORT_FILE="$OUTPUT_DIR/security_scan.json"

if [ -f "$REPORT_FILE" ]; then
    # Count vulnerabilities by severity
    python3 << EOF
import json
import sys

with open('$REPORT_FILE', 'r') as f:
    data = json.load(f)

vulnerabilities = data.get('vulnerabilities', [])
total = len(vulnerabilities)

severity_counts = {'high': 0, 'medium': 0, 'low': 0}
for vuln in vulnerabilities:
    severity = vuln.get('level', 'low').lower()
    if severity in severity_counts:
        severity_counts[severity] += 1

print("Security Scan Results:")
print(f"Total vulnerabilities: {total}")
print(f"High severity: {severity_counts['high']}")
print(f"Medium severity: {severity_counts['medium']}")
print(f"Low severity: {severity_counts['low']}")

# Exit with error code if vulnerabilities found and fail_on_vuln is true
if '$FAIL_ON_VULN' == 'true' and total > 0:
    print("\nSecurity vulnerabilities detected. Failing build.")
    sys.exit(1)
else:
    print("\nSecurity scan completed successfully.")
    sys.exit(0)
EOF
else
    echo "ERROR: Scan report not found"
    exit 1
fi
```

Integration with Other Tools

Burp Suite Integration

```bash
# Use Burp as proxy for Wapiti
wapiti -u http://target.com --proxy http://127.0.0.1:8080

# Export discovered URLs to Burp
wapiti -u http://target.com --crawl-only -f txt -o burp_targets.txt
```

OWASP ZAP Integration

```bash
# Use ZAP as proxy
wapiti -u http://target.com --proxy http://127.0.0.1:8080

# Generate ZAP-compatible report
wapiti -u http://target.com -f xml -o zap_import.xml
```

Nuclei Integration

```bash
# Extract URLs for Nuclei
wapiti -u http://target.com --crawl-only --format txt | grep -E '^http' > nuclei_targets.txt

# Run Nuclei on discovered URLs
nuclei -l nuclei_targets.txt -t /path/to/nuclei-templates/
```

Troubleshooting

Common Issues

Crawling Problems

```bash
# Increase crawling timeout
wapiti -u http://target.com --timeout 60

# Reduce crawling depth
wapiti -u http://target.com --depth 1

# Skip problematic URLs
wapiti -u http://target.com --skip-crawl "/problematic-path"

# Use different user agent
wapiti -u http://target.com --user-agent "Mozilla/5.0 (compatible; Scanner)"
```

Authentication Issues

```bash
# Debug authentication
wapiti -u http://target.com --auth-user admin --auth-password secret -v 2

# Use cookie authentication instead
wapiti -u http://target.com --cookie "session=valid_session_id"

# Test authentication manually first
curl -u admin:secret http://target.com/protected
```

Performance Issues

```bash
# Reduce scan scope
wapiti -u http://target.com --max-pages 50

# Increase delays
wapiti -u http://target.com --delay 3

# Use fewer modules
wapiti -u http://target.com -m "sql,xss"

# Set scan time limit
wapiti -u http://target.com --max-scan-time 1800
```

SSL/TLS Issues

```bash
# Disable SSL verification
wapiti -u https://target.com --verify-ssl 0

# Use specific SSL version
wapiti -u https://target.com --ssl-version TLSv1.2

# Debug SSL issues
wapiti -u https://target.com -v 2 --verify-ssl 0
```

Debugging and Logging

```bash
# Enable verbose logging
wapiti -u http://target.com -v 2

# Save debug information
wapiti -u http://target.com -v 2 2>&1 | tee wapiti_debug.log

# Test specific module
wapiti -u http://target.com -m sql -v 2

# Dry run (crawl only)
wapiti -u http://target.com --crawl-only -v 1
```

Resources

---

*This cheat sheet provides a comprehensive reference for using Wapiti to scan web applications. Always make sure you have proper authorization before using this tool in any environment.*