WordlistRaider
Overview
WordlistRaider is a sophisticated wordlist optimization and generation tool designed to create highly targeted, efficient wordlists for penetration testing. Rather than using generic wordlists, WordlistRaider intelligently generates custom wordlists based on reconnaissance data, significantly improving the success rate of password and directory brute-force attacks.
Key capabilities include:
- Intelligent wordlist generation from target metadata
- Multiple wordlist combination and deduplication
- Custom rule-based wordlist manipulation
- Wordlist filtering and optimization
- Integration with reconnaissance data
Installation
From GitHub Source
git clone https://github.com/cakinney/wordlistraider.git
cd wordlistraider
pip install -r requirements.txt
python wordlistraider.py --help
Using pip
pip install wordlistraider
wordlistraider --help
Manual Installation
# Clone repository
git clone https://github.com/cakinney/wordlistraider.git
# Install Python 3.7+
python3 --version
# Install dependencies
cd wordlistraider
pip3 install -r requirements.txt
# Run directly
python3 wordlistraider.py
Docker
docker pull cakinney/wordlistraider
docker run -it cakinney/wordlistraider --help
Basic Usage
| Command | Description |
|---|---|
| wordlistraider -u <url> | Generate wordlist from target URL |
| wordlistraider -f <file> | Process file for wordlist generation |
| wordlistraider -c <file> | Combine multiple wordlists |
| wordlistraider -m <wordlist> | Mutate wordlist with rules |
| wordlistraider -o <output> | Specify output file |
| wordlistraider --help | Show all available options |
Common Examples
Generate Wordlist from Target Website
wordlistraider -u https://example.com -o custom_wordlist.txt
Analyzes the target website’s content and extracts relevant words, company names, and technical terms to create a highly targeted wordlist for password attacks.
Process Multiple Wordlists
wordlistraider -c wordlist1.txt wordlist2.txt wordlist3.txt -o combined_wordlist.txt
Combines multiple wordlist sources, removes duplicates, and creates a consolidated wordlist optimized for efficiency.
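The merge-and-dedupe effect can be sketched in plain shell. This is illustrative only (not WordlistRaider's actual implementation), using throwaway example files:

```shell
# Two small example lists with overlapping entries.
printf 'admin\nroot\nadmin\n' > a.txt
printf 'root\nguest\n' > b.txt

# Merge, drop blank lines, and keep one unique word per line.
cat a.txt b.txt | grep -v '^[[:space:]]*$' | sort -u > combined.txt
cat combined.txt   # prints: admin, guest, root (sorted, unique)
```

The key property is that duplicates across sources collapse to a single entry, so the brute-force tool never wastes an attempt on a repeated candidate.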
Apply Mutation Rules to Wordlist
wordlistraider -f base_wordlist.txt -m rules.txt -o mutated_wordlist.txt
Applies sophisticated mutation rules (capitalizations, number appending, special characters) to expand the wordlist intelligently.
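Those mutation classes can be sketched in plain shell for a single seed word. This is illustrative only, not WordlistRaider's rule engine:

```shell
# Apply common mutation classes to one seed word.
word="acme"
upper=$(printf '%s\n' "$word" | tr 'a-z' 'A-Z')            # UPPERCASE -> ACME
first=$(printf '%s\n' "$word" | cut -c1 | tr 'a-z' 'A-Z')
rest=$(printf '%s\n' "$word" | cut -c2-)
cap="${first}${rest}"                                       # Capitalize -> Acme
printf '%s\n' "$word" "$upper" "$cap"
for suffix in 1 123 '!' '@123'; do                          # common appends
  printf '%s%s\n' "$word" "$suffix"                         # acme1, acme123, acme!, acme@123
done
```

A real rule engine applies every rule to every input word, so output size grows multiplicatively with the number of rules.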
Extract Words from Website Content
wordlistraider -u https://target.company.com --extract-text -o company_wordlist.txt
Scrapes and analyzes all text content from the website, extracting domain-relevant vocabulary for targeted attacks.
Advanced Usage
Intelligent Word Extraction
# Extract from target and apply mutations
wordlistraider -u https://example.com \
--extract-text \
--apply-mutations \
--min-length 4 \
--max-length 20 \
-o example_advanced.txt
Extracts words from the target, applies intelligent mutations, and filters by length for better results.
Combining with Reconnaissance Data
# Extract company information
# Then generate wordlist from company name, domain, products, etc.
# For company: ACME Corp, domain: acme.com, products: SecureVault
cat > acme_terms.txt << 'EOF'
acme
corp
vault
secure
password
admin
test
ACME
SecureVault
EOF
# Generate mutations
wordlistraider -f acme_terms.txt -m mutations.txt -o acme_passwords.txt
# View results
head -20 acme_passwords.txt
Creating Organization-Specific Wordlists
#!/bin/bash
# Target organization: TechCorp
ORG="techcorp"
# Create base wordlist from reconnaissance
cat > ${ORG}_base.txt << 'EOF'
techcorp
technology
corporation
secure
cloud
server
database
password
admin
user
EOF
# Apply mutations
wordlistraider -f ${ORG}_base.txt -m standard_mutations.txt -o ${ORG}_mutations.txt
# Combine with common wordlists
wordlistraider -c ${ORG}_mutations.txt common_passwords.txt -o ${ORG}_final.txt
# Statistics
echo "Final wordlist size:"
wc -l ${ORG}_final.txt
Multi-Source Wordlist Generation
# Combine multiple intelligence sources
wordlistraider \
-c \
industry_wordlist.txt \
company_wordlist.txt \
extracted_website_words.txt \
common_tech_terms.txt \
domain_names.txt \
-o comprehensive_wordlist.txt
# Remove duplicates and sort
sort -u comprehensive_wordlist.txt > final_wordlist.txt
Mutation Rules
Standard Mutations
# Create mutations.txt with transformation rules
cat > mutations.txt << 'EOF'
# Capitalization
$U # UPPERCASE
$L # lowercase
$C # Capitalize
$T # Title
# Appending
$1, $2, $3, $4, $5
$!, $@, $#, $%
# Prefixing
^admin, ^test, ^user
# Common patterns
password123, admin123, letmein, welcome
EOF
# Apply mutations
wordlistraider -f base_wordlist.txt -m mutations.txt -o mutated.txt
Example Output with Mutations
# Input word: password
# After mutations generates:
password
PASSWORD
Password
password1
password123
password!
admin_password
test_password
password@123
# ... and hundreds more depending on rules
Wordlist Filtering and Optimization
Remove Short/Long Words
# Generate wordlist, filter by length
wordlistraider -u https://example.com \
--min-length 5 \
--max-length 25 \
-o filtered_wordlist.txt
Removes very short words (common false positives) and very long words (unlikely passwords).
Deduplicate and Sort
# Clean wordlist
cat multiple_wordlists.txt | sort -u > deduplicated.txt
# Remove whitespace-only lines
grep -v '^[[:space:]]*$' deduplicated.txt > cleaned.txt
# Count unique entries
wc -l cleaned.txt
Frequency-Based Filtering
# Create frequency analysis
cat passwords.txt | sort | uniq -c | sort -rn > frequency.txt
# Extract top N most common
head -100 frequency.txt | awk '{print $NF}' > top_passwords.txt
Real-World Reconnaissance Integration
From Website to Wordlist
#!/bin/bash
TARGET="https://example.com"
# Step 1: Extract all text from website
curl -s "$TARGET" | \
sed 's/<[^>]*>//g' | \
tr ' ' '\n' | \
grep -E '^[a-zA-Z]{4,}$' | \
sort -u > website_words.txt
# Step 2: Extract company info
# Manually add company name, products, domain terms
cat >> website_words.txt << 'EOF'
example
company
product1
product2
service
technology
EOF
# Step 3: Generate wordlist with mutations
wordlistraider -f website_words.txt -m mutations.txt -o final_wordlist.txt
# Step 4: Optimize final list
sort -u final_wordlist.txt | \
awk 'length($0) >= 5 && length($0) <= 20' > optimized_wordlist.txt
LinkedIn OSINT Integration
#!/bin/bash
# Extract employee names from company LinkedIn
# Save to names.txt
# Common password patterns with names
while read -r name; do
  echo "${name}"
  echo "${name}!"
  echo "${name}123"
  echo "${name}@123"
  echo "Welcome${name}"
done < names.txt > linkedin_passwords.txt
# Combine with other wordlists
wordlistraider -c linkedin_passwords.txt standard_wordlist.txt -o combined.txt
GitHub Repository Analysis
#!/bin/bash
# Clone target repository
git clone https://github.com/target/repo.git
cd repo
# Extract all identifiers, function names, variables
find . -type f \( -name "*.py" -o -name "*.js" -o -name "*.java" \) | \
xargs grep -oh '[a-zA-Z_][a-zA-Z0-9_]*' | \
sort -u > code_identifiers.txt
# Generate wordlist
wordlistraider -f code_identifiers.txt -m mutations.txt -o code_wordlist.txt
Performance Optimization
Wordlist Size Management
# Check wordlist size
ls -lh wordlist.txt
# If too large, filter by frequency or length
# Top 50,000 most likely passwords
head -50000 wordlist.txt > reduced_wordlist.txt
# Optimize for specific tool (Hashcat, John, etc.)
wc -l wordlist.txt
Parallel Processing
# Split large wordlist for parallel brute-forcing
split -l 10000 large_wordlist.txt chunk_
# Process each chunk in parallel (best suited to CPU hash modes;
# multiple instances will contend for a single GPU)
for chunk in chunk_*; do
hashcat -m 1000 hashes.txt $chunk &
done
wait
Memory-Efficient Processing
# Stream processing for very large wordlists
cat huge_wordlist.txt | \
awk 'length($0) >= 8 && length($0) <= 20 { print }' | \
sort -u | \
head -100000 > filtered_wordlist.txt
Integration with Brute-Force Tools
Hashcat Integration
# Generate custom wordlist
wordlistraider -u https://target.com -o target_wordlist.txt
# Use with Hashcat
hashcat -m 1000 hashes.txt target_wordlist.txt
# With mutations in Hashcat
hashcat -m 1000 hashes.txt target_wordlist.txt -r rules/best64.rule
John the Ripper Integration
# Generate wordlist
wordlistraider -f base_words.txt -o wordlist.txt
# Wordlist mode with the Single-crack ruleset applied
john --wordlist=wordlist.txt --rules=Single hashes.txt
# Plain wordlist mode
john --wordlist=wordlist.txt hashes.txt
Hydra Integration
# Generate HTTP service wordlist
wordlistraider -u https://target.com/login -o login_wordlist.txt
# HTTP form brute-force
hydra -L users.txt -P login_wordlist.txt target.com http-post-form \
"/login.php:user=^USER^&pass=^PASS^:F=incorrect"
Advanced Techniques
Keyboard Walk Patterns
# Generate common keyboard patterns
cat > keyboard_patterns.txt << 'EOF'
qwerty
asdfgh
zxcvbn
qweasd
12345
123456
EOF
# Combine with other wordlists
wordlistraider -c keyboard_patterns.txt other_wordlists.txt -o combined.txt
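Longer walks than the hand-written list above can be generated by sliding a window across each keyboard row. A sketch in plain shell (the row strings and window lengths are assumptions you can adjust; shifted-character rows like "!@#$%" could be added the same way):

```shell
# Emit every contiguous substring of length 4-6 from each keyboard row.
for row in qwertyuiop asdfghjkl zxcvbnm; do
  len=${#row}
  for size in 4 5 6; do
    i=1
    while [ $((i + size - 1)) -le "$len" ]; do
      printf '%s\n' "$row" | cut -c"$i"-"$((i + size - 1))"
      i=$((i + 1))
    done
  done
done > keyboard_walks.txt
```

This produces walks such as qwer, wert, asdfg, and zxcvbn, which can then be fed into the -c combination step.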
L33t Speak Mutations
# Create l33t mutations
cat > leet_mutations.txt << 'EOF'
# Replace common letters
s -> $
a -> @
e -> 3
i -> 1
o -> 0
l -> 1
EOF
# Apply mutations
wordlistraider -f base_wordlist.txt -m leet_mutations.txt -o leet_wordlist.txt
Time-Based Word Generation
# Generate time-based passwords (common default patterns)
for year in 2020 2021 2022 2023 2024 2025; do
echo "$year"
for month in 01 02 03 04 05 06 07 08 09 10 11 12; do
echo "${year}${month}"
done
done >> time_passwords.txt
# Combine with other wordlists
wordlistraider -c time_passwords.txt main_wordlist.txt -o final.txt
Best Practices
- Target-Specific: Always customize wordlists based on reconnaissance
- Organization: Keep separate wordlists for different organizations
- Mutation Limits: Don’t create excessively large wordlists (>100MB)
- Deduplication: Always remove duplicates before use
- Length Filtering: Filter words by realistic password length
- Combination Strategy: Combine multiple sources strategically
- Documentation: Track wordlist provenance and creation date
- Authorization: Only use for authorized security testing
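For the documentation point above, a lightweight convention is to keep a small metadata file next to each generated list. The file name and fields here are an illustrative convention, not WordlistRaider output:

```shell
# Record where a generated wordlist came from and when it was built.
cat > target_wordlist.meta << EOF
source: https://example.com (website text extraction)
inputs: website_words.txt, mutations.txt
tool: wordlistraider
created: $(date -u +%Y-%m-%d)
EOF
cat target_wordlist.meta
```

Tracking provenance this way makes it easy to retire stale lists and to show, during an authorized engagement, exactly what data each wordlist was derived from.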
Troubleshooting
Large Wordlist Performance Issues
# If wordlist is too large
wc -l huge_wordlist.txt
# Reduce by filtering
awk 'length($0) >= 8 && length($0) <= 16' huge_wordlist.txt > optimized.txt
# Or take most likely subset
head -50000 huge_wordlist.txt > reduced.txt
Memory Usage
# Monitor memory during generation
time wordlistraider -u https://example.com -o wordlist.txt
# If running out of memory, process in chunks
# First extract words, then apply mutations separately
Encoding Issues
# Ensure UTF-8 encoding
file wordlist.txt
# Convert if needed
iconv -f ISO-8859-1 -t UTF-8 wordlist.txt > wordlist_utf8.txt
Comparative Analysis
Wordlist Size Strategies
| Strategy | Size | Speed | Success Rate |
|---|---|---|---|
| Generic (rockyou.txt) | Large (14M+ entries) | Slow | Medium |
| Organization-Specific | Medium (10K-50K) | Fast | High |
| Fully Customized | Small (1K-10K) | Very Fast | Very High |
Comparison with Other Tools
vs. Crunch
# Crunch - generates all permutations (huge wordlists)
crunch 8 12 -o permutations.txt # Impractically large with the default charset (well beyond terabytes)
# WordlistRaider - intelligently filters (targeted wordlists)
wordlistraider -u https://target.com -o wordlist.txt # Likely <10MB
Conclusion
WordlistRaider is essential for creating highly targeted, efficient wordlists that significantly improve brute-force success rates. By intelligently combining reconnaissance data with sophisticated mutations, it enables authorized security professionals to conduct more effective password and directory enumeration testing with minimal resource overhead.