Advanced OSINT framework that scans IP addresses, emails, websites, and organizations, gathering intelligence from many sources. Aggregates data from Shodan, Censys, social media, DNS, and more.
| Requirement | Details |
|---|---|
| Python 3.7+ | Core language requirement |
| pip | Python package manager |
| Virtual environment | Recommended for isolation |
| Internet connection | Required for API calls |
```shell
# Install from source
git clone https://github.com/bhavsec/reconspider.git
cd reconspider
pip install -r requirements.txt

# Or install from PyPI
pip install reconspider

# Verify the installation
python reconspider.py --version
```
```shell
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
| Service | API Key Required | Purpose |
|---|---|---|
| Shodan | Yes | Search internet-connected devices |
| Censys | Yes | Certificate and host data |
| VirusTotal | Yes | File and URL reputation |
| Hunter.io | Yes | Email discovery and verification |
| Clearbit | Yes | Company and person data |
| EmailHunter | Yes | Business email finder |
| GitHub | Optional | Repository and user data |
| Twitter/X | Optional | Social media reconnaissance |
```shell
# Set environment variables
export SHODAN_API_KEY="your_shodan_key"
export CENSYS_API_ID="your_censys_id"
export CENSYS_API_SECRET="your_censys_secret"
export VIRUSTOTAL_API_KEY="your_vt_key"
export HUNTER_API_KEY="your_hunter_key"
```
```yaml
# ~/.config/reconspider/config.yaml
api_keys:
  shodan: "your_shodan_key"
  censys_id: "your_censys_id"
  censys_secret: "your_censys_secret"
  virustotal: "your_vt_key"
  hunter: "your_hunter_key"
  clearbit: "your_clearbit_key"
timeout: 30
retry_attempts: 3
output_format: json  # json, csv, txt
```
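Wrapper scripts often let environment variables override values from the config file. A minimal sketch of that lookup order; the `resolve_api_key` helper is illustrative, not part of ReconSpider itself:

```python
import os
from typing import Optional

# Hypothetical helper: prefer an environment variable such as
# SHODAN_API_KEY, then fall back to the parsed api_keys section
# of config.yaml (passed in here as a plain dict).
def resolve_api_key(name: str, config: dict) -> Optional[str]:
    env_value = os.environ.get(name.upper() + "_API_KEY")
    return env_value or config.get(name)

os.environ["SHODAN_API_KEY"] = "key_from_env"
file_config = {"shodan": "key_from_file", "hunter": "hunter_file_key"}

print(resolve_api_key("shodan", file_config))  # environment variable wins
print(resolve_api_key("hunter", file_config))  # falls back to config file
```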
| Command | Description |
|---|---|
| `python reconspider.py -i <ip>` | Scan single IP address |
| `python reconspider.py -i <ip> -s shodan` | Query Shodan for IP |
| `python reconspider.py -i <ip> -s censys` | Query Censys for IP |
| `python reconspider.py -i <ip> -s all` | Aggregate all sources |
```shell
# Basic IP scan
python reconspider.py -i 8.8.8.8

# Shodan scan only
python reconspider.py -i 93.184.216.34 -s shodan

# Censys scan with output file
python reconspider.py -i 93.184.216.34 -s censys -o ip_report.json

# Multiple IPs
python reconspider.py -i 8.8.8.8,1.1.1.1 -o dns_servers.json

# IP range scanning
python reconspider.py -i "192.168.1.0/24" -s shodan
```
IP reconnaissance returns:

- Hostname and reverse DNS
- ISP and organization
- Geographic location (country, city)
- Hosting provider
- Open ports and services
- Running software versions
- SSL certificate details
- Domain history
- Blacklist status
| Command | Description |
|---|---|
| `python reconspider.py -d <domain>` | Scan domain |
| `python reconspider.py -d <domain> --dns` | DNS enumeration |
| `python reconspider.py -d <domain> --certs` | Certificate transparency scan |
| `python reconspider.py -d <domain> --subdomains` | Subdomain enumeration |
```shell
# Full domain scan
python reconspider.py -d example.com

# DNS records only
python reconspider.py -d example.com --dns

# Certificate transparency logs
python reconspider.py -d example.com --certs

# Subdomain enumeration
python reconspider.py -d example.com --subdomains -o subdomains.txt

# Scan with Censys
python reconspider.py -d example.com -s censys

# Check for zone transfer vulnerability
python reconspider.py -d example.com --axfr
```
| Record Type | Information |
|---|---|
| A | IPv4 addresses |
| AAAA | IPv6 addresses |
| CNAME | Canonical names |
| MX | Mail exchange servers |
| TXT | Text records (SPF, DKIM, DMARC) |
| NS | Nameservers |
| SOA | Start of authority |
| SRV | Service records |
Certificate transparency returns:

- Issuer details
- Issue and expiration dates
- Subject and subject alternative names (SANs)
- Signature algorithm
- Public key information
- Revocation status
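Expiration dates from these results are easy to act on programmatically. A sketch that computes days until expiry, assuming the `notAfter` string format used by Python's `ssl` module (the sample date is invented):

```python
from datetime import datetime

# Parse a certificate notAfter field in the ssl-module string format,
# e.g. "Jun  1 00:00:00 2026 GMT", and compute days remaining.
def days_until_expiry(not_after: str, now: datetime) -> int:
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires - now).days

now = datetime(2026, 5, 1)
print(days_until_expiry("Jun  1 00:00:00 2026 GMT", now))  # 31
```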
| Command | Description |
|---|---|
| `python reconspider.py -e <email>` | Scan email address |
| `python reconspider.py -d <domain> --emails` | Harvest emails from domain |
| `python reconspider.py -o <organization> --emails` | Find org emails |
```shell
# Email verification
python reconspider.py -e john@example.com

# Harvest emails from domain
python reconspider.py -d example.com --emails

# Company email discovery
python reconspider.py -o "Acme Corp" --emails -o results.csv

# Email pattern detection
python reconspider.py -d example.com --email-pattern firstname.lastname

# Multi-domain email search
python reconspider.py -d example.com,example.net --emails --hunter
```
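Email-pattern detection amounts to template expansion over a name and a domain. A rough sketch; the pattern set below is illustrative, not the tool's actual list:

```python
# Candidate-email generation from a name and domain. The template
# names here are assumptions for illustration.
PATTERNS = {
    "firstname.lastname": "{first}.{last}",
    "firstnamelastname": "{first}{last}",
    "f.lastname": "{fi}.{last}",
    "firstname": "{first}",
}

def candidate_emails(first: str, last: str, domain: str) -> list:
    ctx = {"first": first.lower(), "last": last.lower(), "fi": first[0].lower()}
    return [tmpl.format(**ctx) + "@" + domain for tmpl in PATTERNS.values()]

print(candidate_emails("John", "Doe", "example.com"))
```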
| Source | Details |
|---|---|
| Hunter.io | Business email finder |
| RocketReach | Professional profiles |
| Clearbit | Company data and emails |
| LinkedIn | Profile emails (requires auth) |
| GitHub | Developer emails |
| Public WHOIS | Domain registrant emails |
| Certificate logs | Email addresses in certs |
| Command | Description |
|---|---|
| `python reconspider.py --phone <number>` | Lookup phone number |
| `python reconspider.py --phone <number> --reverse` | Reverse phone lookup |
| `python reconspider.py -d <domain> --phones` | Extract phone numbers from domain |
```shell
# Direct phone lookup
python reconspider.py --phone "+1-555-0123"

# Reverse lookup
python reconspider.py --phone "5550123" --reverse

# Extract from website
python reconspider.py -d example.com --phones

# Phone pattern search
python reconspider.py --phone-pattern "555-\d{4}" -d example.com
```
Phone reconnaissance returns:

- Carrier information
- Location (city, state)
- Phone type (mobile, landline, VOIP)
- Associated names
- Associated email addresses
- Call history patterns
- Linked social media accounts
| Command | Description |
|---|---|
| `python reconspider.py --social <username>` | Search across platforms |
| `python reconspider.py --twitter <handle>` | Twitter/X profile search |
| `python reconspider.py --linkedin <profile>` | LinkedIn data extraction |
| `python reconspider.py --github <username>` | GitHub profile analysis |
```shell
# Cross-platform username search
python reconspider.py --social johndoe

# Twitter profile reconnaissance
python reconspider.py --twitter @johndoe

# GitHub profile analysis
python reconspider.py --github johndoe

# LinkedIn profile scraping
python reconspider.py --linkedin johndoe

# Instagram profile search
python reconspider.py --instagram johndoe --followers
```
| Platform | Information Retrieved |
|---|---|
| Twitter/X | Followers, tweets, location, bio, links |
| GitHub | Repos, commits, followers, organizations |
| LinkedIn | Profile, connections, employment, skills |
| Instagram | Bio, followers, posts, locations, links |
| Facebook | Profile info, friends, pages, activity |
| Reddit | Post history, karma, subreddits joined |
| Command | Description |
|---|---|
| `python reconspider.py -s shodan -i <ip>` | IP lookup via Shodan |
| `python reconspider.py --shodan-query "webcam"` | Search for webcams |
| `python reconspider.py --shodan-query "apache" --country US` | Geo-filtered search |
```shell
# Find exposed webcams
python reconspider.py --shodan-query "webcam" --limit 50

# Search by country
python reconspider.py --shodan-query "mongodb" --country CN

# Port-specific search
python reconspider.py --shodan-query "port:3389" --limit 100

# Service version search
python reconspider.py --shodan-query "apache/2.4"

# Vulnerable software search
python reconspider.py --shodan-query "OpenSSL/1.0.1" --country US
```
| Filter | Description |
|---|---|
| `port:8080` | Specific port |
| `country:US` | Specific country |
| `city:Boston` | Specific city |
| `org:Google` | Organization name |
| `os:Linux` | Operating system |
| `product:Apache` | Software product |
| `before:2023-01-01` | Before date |
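These filters compose into a single space-separated query string. A small sketch of building one programmatically; the quoting rule for multi-word values is an assumption based on Shodan's query syntax:

```python
# Compose a Shodan-style query from a free-text term plus filters.
# Filter names follow the table above.
def build_query(term: str, **filters: str) -> str:
    parts = [term]
    for key, value in filters.items():
        if " " in value:
            value = f'"{value}"'  # quote multi-word values (assumption)
        parts.append(f"{key}:{value}")
    return " ".join(parts)

print(build_query("apache", country="US", port="443"))
```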
| Command | Description |
|---|---|
| `python reconspider.py -s censys -i <ip>` | IP lookup via Censys |
| `python reconspider.py --censys-query "ip:8.8.8.8"` | Direct IP query |
| `python reconspider.py --censys-cert <domain>` | Certificate search |
```shell
# Certificate search
python reconspider.py --censys-cert example.com

# IP with specific service
python reconspider.py -s censys -i 93.184.216.34

# Host lookup with details
python reconspider.py --censys-host 8.8.8.8

# Certificate issuer search
python reconspider.py --censys-query "issuer:\"Let's Encrypt\""
```
| Command | Description |
|---|---|
| `python reconspider.py -i <ip> -o report.json` | JSON output |
| `python reconspider.py -i <ip> -o report.csv` | CSV output |
| `python reconspider.py -i <ip> -o report.txt` | Text output |
| `python reconspider.py -i <ip> --html` | HTML report |
```shell
# JSON format (structured)
python reconspider.py -d example.com -o domain_report.json

# CSV format (spreadsheet)
python reconspider.py --emails -d example.com -o emails.csv

# HTML report (visual)
python reconspider.py -i 8.8.8.8 --html -o scan_report.html

# Pretty-printed output
python reconspider.py -d example.com -o results.txt --verbose
```
```json
{
  "target": "example.com",
  "scan_date": "2026-05-01",
  "ip_info": {
    "addresses": ["93.184.216.34"],
    "hostname": "example.com",
    "organization": "EDGECAST",
    "location": {
      "country": "US",
      "city": "Los Angeles"
    }
  },
  "dns_records": {
    "A": ["93.184.216.34"],
    "MX": ["mail.example.com"]
  },
  "certificates": [],
  "emails": ["admin@example.com"],
  "vulnerabilities": []
}
```
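Reports in this shape are easy to post-process with the standard `json` module. A sketch that pulls out the fields most useful for follow-up scans (the inline sample mirrors the structure above):

```python
import json

# A trimmed-down report in the JSON shape shown above.
sample = """{"target": "example.com",
             "ip_info": {"addresses": ["93.184.216.34"]},
             "dns_records": {"A": ["93.184.216.34"], "MX": ["mail.example.com"]},
             "emails": ["admin@example.com"]}"""

report = json.loads(sample)

# Collect every IP seen in the report, deduplicated.
targets = set(report["ip_info"]["addresses"]) | set(report["dns_records"]["A"])
print(report["target"], sorted(targets), report["emails"])
```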
| Command | Description |
|---|---|
| `python reconspider.py -i <ip> --honeypot` | Check if IP is honeypot |
| `python reconspider.py --check-reputation <ip>` | Reputation check |
| `python reconspider.py -d <domain> --verify` | Domain verification |
```shell
# Check single IP
python reconspider.py -i 192.168.1.1 --honeypot

# Batch check IPs
python reconspider.py --honeypot -f ip_list.txt

# Domain verification
python reconspider.py -d example.com --verify

# Reputation scoring
python reconspider.py -i 8.8.8.8 --reputation

# Verify email authenticity
python reconspider.py -e user@example.com --verify
```
High-risk indicators:

- Response from unusual geolocation
- Multiple open ports without services
- Suspicious SSL certificate
- Rapid response patterns
- Mismatched DNS records
- Known honeypot signatures
- No indicators of legitimate traffic
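Indicators like these are typically combined into a weighted score. A toy sketch of that idea; the weights and indicator names are invented for illustration, not the tool's actual heuristic:

```python
# Invented weights over the indicator list above; higher = more
# honeypot-like. A real detector would tune these empirically.
WEIGHTS = {
    "unusual_geolocation": 1,
    "ports_without_services": 2,
    "suspicious_certificate": 2,
    "rapid_responses": 1,
    "mismatched_dns": 1,
    "known_signature": 5,
}

def honeypot_score(indicators) -> int:
    return sum(WEIGHTS.get(i, 0) for i in indicators)

print(honeypot_score(["known_signature", "mismatched_dns"]))  # 6
```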
```shell
# Scan multiple IPs from file
python reconspider.py -f ip_list.txt -o batch_results.json

# Harvest emails from multiple domains
python reconspider.py -f domains.txt --emails -o emails.csv

# Scan IP ranges
python reconspider.py -i "10.0.0.0/24" -o range_scan.json
```
```shell
# Full org scan workflow
python reconspider.py -o "Acme Corp" \
  --emails \
  --phones \
  --social \
  --subdomains \
  -o org_report.json \
  --html
```
```shell
# Automated threat hunt
python reconspider.py -f suspicious_ips.txt \
  --shodan \
  --honeypot \
  --check-reputation \
  -o threat_report.html \
  --alerts
```
```shell
# Monitor domain for changes
python reconspider.py -d example.com \
  --certs \
  --subdomains \
  --schedule "daily" \
  -o monitoring.json

# Weekly organization scan
python reconspider.py -o "Target Org" \
  --emails \
  --phones \
  --schedule "weekly" \
  --notify email@example.com
```
```shell
# Route through proxy
python reconspider.py -i 8.8.8.8 --proxy "http://proxy:8080"

# Use SOCKS proxy
python reconspider.py -d example.com --socks-proxy "socks5://localhost:1080"

# User-Agent rotation
python reconspider.py --social johndoe --rotate-ua
```
```shell
# Exclude certain results
python reconspider.py -d example.com --exclude "*.internal"

# Filter by date range
python reconspider.py --certs example.com --from 2025-01-01 --to 2026-05-01

# Limit results
python reconspider.py --emails example.com --limit 100
```
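Wildcard exclusion like `--exclude "*.internal"` maps directly onto shell-style glob matching. A minimal sketch using the standard `fnmatch` module:

```python
from fnmatch import fnmatch

# Drop any result whose name matches the glob pattern.
def exclude(results, pattern):
    return [r for r in results if not fnmatch(r, pattern)]

hosts = ["www.example.com", "db.internal", "vpn.internal"]
print(exclude(hosts, "*.internal"))  # ['www.example.com']
```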
```shell
# Find overlapping targets
python reconspider.py -f domains.txt --correlate

# Link data sources
python reconspider.py -e user@example.com --link-accounts

# Timeline analysis
python reconspider.py -d example.com --timeline
```
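One common correlation technique is grouping domains that resolve to the same IP, which surfaces shared infrastructure. A sketch with hard-coded resolution data for illustration (a real run would resolve the domains first):

```python
from collections import defaultdict

# Pretend resolution results: domain -> set of IPs (made up for the sketch).
resolved = {
    "example.com": {"93.184.216.34"},
    "example.net": {"93.184.216.34"},
    "other.org": {"203.0.113.9"},
}

# Invert the mapping: IP -> set of domains hosted there.
by_ip = defaultdict(set)
for domain, ips in resolved.items():
    for ip in ips:
        by_ip[ip].add(domain)

# Keep only IPs shared by more than one domain.
overlaps = {ip: doms for ip, doms in by_ip.items() if len(doms) > 1}
print(overlaps)
```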
- Always obtain authorization: only conduct reconnaissance on systems and organizations you own or have explicit written permission to test.
- Respect rate limits: use delays between requests and observe API rate limits to avoid account suspension.
- Use a VPN or proxy: route traffic through privacy-conscious services to protect your identity during reconnaissance.
- Validate API keys: test API credentials before running large scans to catch configuration issues early.
- Document findings: keep detailed records of scan dates, targets, findings, and methodology for reporting.
- Verify information: cross-reference data from multiple sources (Shodan, Censys, DNS) to ensure accuracy.
- Filter false positives: use honeypot detection and reputation checking to reduce noise in results.
- Secure configuration files: keep config.yaml and API keys in restricted directories (mode 600) to prevent exposure.
- Start narrow, go broad: begin with a specific IP or domain, then expand to subdomains, emails, and social media.
- Use structured output: export to JSON or CSV for further analysis, correlation, and reporting in tools like Elasticsearch or Tableau.
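Restricting the config file to mode 600 can be scripted as part of setup. A small sketch using the standard library; on Windows, `os.chmod` only toggles the read-only flag, so this applies fully on POSIX systems:

```python
import os
import stat
import tempfile

# Restrict a file to owner read/write (mode 600) and return its new mode.
def secure(path: str) -> int:
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # 0o600
    return stat.S_IMODE(os.stat(path).st_mode)

# Demonstrate on a throwaway temp file standing in for config.yaml.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
print(oct(secure(path)))  # 0o600 on POSIX systems
```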