# SIFT Workstation Cheat Sheet

SIFT (SANS Investigative Forensic Toolkit) Workstation is a complete digital forensics and incident response distribution built on Ubuntu. Developed by SANS, SIFT bundles a comprehensive collection of free and open-source digital forensics tools, making it an essential platform for digital investigators, incident responders, and cybersecurity professionals.
## Installation and Configuration

### SIFT Installation Methods

```bash
# Method 1: Download Pre-built VM
# Download SIFT Workstation OVA from SANS
# Import into VMware/VirtualBox
# Default credentials: sansforensics/forensics
# Method 2: Install on Existing Ubuntu
curl -Lo sift-cli-linux https://github.com/teamdfir/sift-cli/releases/download/v1.10.0/sift-cli-linux
chmod +x sift-cli-linux
sudo ./sift-cli-linux install
# Method 3: Docker Installation
docker pull teamdfir/sift-workstation
docker run -it --rm teamdfir/sift-workstation
# Method 4: Manual Installation (build from source)
git clone https://github.com/teamdfir/sift-cli.git
cd sift-cli
# sift-cli is written in Go; build it per the repository README,
# then run the installer:
sudo sift install
```
### System Requirements

```
# Minimum Requirements
CPU: 2 cores
RAM: 4 GB
Storage: 60 GB
Network: Internet connection for updates
# Recommended Specifications
CPU: 4+ cores
RAM: 8+ GB
Storage: 200+ GB SSD
Network: High-speed internet
Additional: USB 3.0 ports for evidence drives
```
### Initial Configuration
```bash
# Update SIFT tools
sift update
# Upgrade SIFT installation
sift upgrade
# Check SIFT version
sift version
# List installed tools
sift list
# Configure timezone
sudo timedatectl set-timezone America/New_York
# Mount evidence drives
sudo mkdir /mnt/evidence
sudo mount -o ro,noexec,nodev /dev/sdb1 /mnt/evidence
```
## Core Forensic Tools

### File System Analysis
```bash
# The Sleuth Kit (TSK)
fls -r image.dd # List files recursively
icat image.dd 12345 > file.txt # Extract file by inode
ils image.dd # List inodes
fsstat image.dd # File system statistics
mmls image.dd # Partition table analysis
# Autopsy (Web-based interface)
autopsy & # Start Autopsy web interface
# Access via http://localhost:9999/autopsy
# AFFLIB (Advanced Forensic Format)
affcat image.aff > image.dd # Convert AFF to raw
affinfo image.aff # Display AFF metadata
affcompare image1.aff image2.aff # Compare AFF images
# ewf-tools (Expert Witness Format)
ewfinfo image.E01 # Display EWF metadata
ewfmount image.E01 /mnt/ewf # Mount EWF image
ewfverify image.E01 # Verify EWF integrity
```
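`mmls` reports partition starts in 512-byte sectors, and mounting a single partition from a raw image requires that start converted to a byte offset for `mount -o loop,offset=...`. A minimal helper sketch (the function names are illustrative, not a SIFT tool):

```python
# Compute the byte offset of a partition inside a raw disk image,
# as needed for: mount -o ro,loop,offset=<bytes> image.dd /mnt/evidence
# start_sector comes from the "Start" column of `mmls image.dd`.

def partition_byte_offset(start_sector: int, sector_size: int = 512) -> int:
    """Convert an mmls start sector into a mount(8) byte offset."""
    if start_sector < 0 or sector_size <= 0:
        raise ValueError("invalid sector values")
    return start_sector * sector_size

def mount_command(image: str, start_sector: int, mountpoint: str) -> str:
    """Build a read-only loop-mount command for one partition."""
    offset = partition_byte_offset(start_sector)
    return (f"sudo mount -o ro,noexec,loop,offset={offset} "
            f"{image} {mountpoint}")
```

A typical first NTFS partition starting at sector 2048 sits at byte offset 1048576.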
### Memory Analysis

```bash
# Volatility Framework
volatility -f memory.dump imageinfo
volatility -f memory.dump --profile=Win7SP1x64 pslist
volatility -f memory.dump --profile=Win7SP1x64 netscan
volatility -f memory.dump --profile=Win7SP1x64 malfind
volatility -f memory.dump --profile=Win7SP1x64 hivelist
volatility -f memory.dump --profile=Win7SP1x64 hashdump
# Volatility 3
vol.py -f memory.dump windows.info
vol.py -f memory.dump windows.pslist
vol.py -f memory.dump windows.netscan
vol.py -f memory.dump windows.malfind
vol.py -f memory.dump windows.registry.hivelist
# LiME (Linux Memory Extractor)
sudo insmod lime.ko "path=/tmp/memory.lime format=lime"
sudo insmod lime.ko "path=/tmp/memory.raw format=raw"
# AVML (Acquire Volatile Memory for Linux)
sudo avml memory.lime
sudo avml --compress memory.lime.gz
```
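Scanning plugins such as `psscan` locate structures by searching raw memory for signatures rather than walking OS lists. A toy sketch of that scanning idea (the `Proc` tag and buffer are fabricated, not the real Windows pool layout):

```python
# Toy signature scan over a raw memory buffer, illustrating the
# pool-scanning idea behind plugins such as psscan.

def scan_for_tag(buf: bytes, tag: bytes):
    """Yield every offset at which `tag` occurs in `buf`."""
    offset = buf.find(tag)
    while offset != -1:
        yield offset
        offset = buf.find(tag, offset + 1)

demo = b"\x00" * 16 + b"Proc" + b"\x00" * 8 + b"Proc"
hits = list(scan_for_tag(demo, b"Proc"))
```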
### Network Analysis

```bash
# Wireshark
wireshark & # GUI interface
tshark -r capture.pcap # Command line analysis
tshark -r capture.pcap -Y "http" # Filter HTTP traffic
tshark -r capture.pcap -T fields -e ip.src -e ip.dst
# tcpdump
tcpdump -r capture.pcap # Read capture file
tcpdump -i eth0 -w capture.pcap # Capture to file
tcpdump -r capture.pcap host 192.168.1.1
# NetworkMiner
mono /usr/share/networkminer/NetworkMiner.exe
# Xplico
sudo service xplico start
# Access via http://localhost:9876
```
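The tab-separated field output above pipes naturally into a conversation count. A sketch of post-processing `tshark -T fields -e ip.src -e ip.dst` output in Python (sample lines fabricated):

```python
from collections import Counter

def top_conversations(tshark_lines, n=5):
    """Count src->dst pairs from `tshark -T fields -e ip.src -e ip.dst`."""
    pairs = Counter()
    for line in tshark_lines:
        parts = line.strip().split("\t")
        if len(parts) == 2 and all(parts):
            pairs[(parts[0], parts[1])] += 1
    return pairs.most_common(n)

sample = [
    "10.0.0.5\t8.8.8.8",
    "10.0.0.5\t8.8.8.8",
    "10.0.0.9\t1.1.1.1",
]
top = top_conversations(sample)
```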
### Registry Analysis (Windows)

```bash
# RegRipper
rip.pl -r SYSTEM -f system # Analyze SYSTEM hive
rip.pl -r SOFTWARE -f software # Analyze SOFTWARE hive
rip.pl -r NTUSER.DAT -f ntuser # Analyze user hive
rip.pl -l # List available plugins
# Registry Decoder
# GUI tool for registry analysis
python /usr/share/regdecoder/regdecoder.py
# hivex tools
hivexsh registry_hive # Interactive shell
hivexget registry_hive '\Root\Key' # Get registry value
hivexregedit --export registry_hive output.reg
```
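`hivexregedit --export` emits the familiar `.reg` text format, which is easy to post-process. A minimal parsing sketch (string values only; the sample data is fabricated):

```python
def parse_reg_export(text):
    """Parse a minimal .reg export into
    {key_path: {value_name: raw_value_string}}. Binary/dword syntax
    is left as the raw right-hand side."""
    keys, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]
            keys[current] = {}
        elif "=" in line and current is not None:
            name, _, value = line.partition("=")
            keys[current][name.strip('"')] = value.strip('"')
    return keys

sample = '[\\Root\\Run]\n"Updater"="C:\\\\tools\\\\up.exe"'
parsed = parse_reg_export(sample)
```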
### Timeline Analysis

```bash
# log2timeline (plaso)
log2timeline.py timeline.plaso image.dd
psort.py -o l2tcsv timeline.plaso > timeline.csv
psort.py -o xlsx timeline.plaso -w timeline.xlsx
# mactime (TSK)
fls -r -m / image.dd > bodyfile
mactime -b bodyfile -d > timeline.csv
# Timesketch
timesketch_importer --file timeline.csv --sketch_id 1
# Super Timeline
log2timeline.py --parsers="win7,webhist" timeline.plaso image.dd
psort.py -o dynamic timeline.plaso
```
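`mactime` consumes the pipe-delimited TSK 3.x bodyfile (MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime). A sketch of the same parse-and-sort step in Python, assuming that field order:

```python
import datetime

def bodyfile_to_timeline(lines):
    """Parse TSK 3.x bodyfile lines and return (iso_mtime, name, size)
    tuples sorted by modification time - roughly what
    `mactime -b bodyfile -d` emits."""
    events = []
    for line in lines:
        fields = line.strip().split("|")
        if len(fields) != 11:
            continue
        name, size, mtime = fields[1], int(fields[6]), int(fields[8])
        iso = datetime.datetime.fromtimestamp(
            mtime, datetime.timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
        events.append((iso, name, size))
    return sorted(events)

sample = ["0|/etc/passwd|12|r/rrw-r--r--|0|0|1042|1700000000|1600000000|1600000000|0"]
timeline = bodyfile_to_timeline(sample)
```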
### Malware Analysis

```bash
# YARA
yara rules.yar /path/to/files # Scan files with rules
yara -r rules.yar /path/to/dir # Recursive scanning
yara -s rules.yar file.exe # Show matching strings
# ClamAV
clamscan file.exe # Scan single file
clamscan -r /path/to/dir # Recursive scan
freshclam # Update virus definitions
# Radare2
r2 malware.exe # Open binary
aaa # Auto-analyze
pdf @main # Disassemble main function
VV # Visual mode
# Ghidra
ghidra & # Start Ghidra GUI
# Import binary and analyze
# strings
strings malware.exe # Extract strings
strings -n 10 malware.exe # Minimum length 10
strings -e l malware.exe # Little-endian Unicode
```
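At its core, YARA string matching asks "does every string of the rule occur in the sample". A deliberately minimal sketch of that idea (real YARA adds conditions, wildcards, and regexes; the rule strings here are illustrative):

```python
# Minimal illustration of YARA-style matching: a "rule" is a set of
# byte strings, and it fires when all of them occur in the sample.

def rule_matches(data: bytes, strings: list) -> bool:
    """Return True when every string in the rule occurs in `data`."""
    return all(s in data for s in strings)

suspicious = [b"CreateRemoteThread", b"VirtualAllocEx"]
sample = b"...VirtualAllocEx...WriteProcessMemory...CreateRemoteThread..."
hit = rule_matches(sample, suspicious)
```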
## Disk and File Analysis

### Disk Imaging

```bash
# dd (Data Dump)
dd if=/dev/sdb of=image.dd bs=4096 conv=noerror,sync
dd if=/dev/sdb of=image.dd bs=4096 status=progress
# dcfldd (Enhanced dd)
dcfldd if=/dev/sdb of=image.dd bs=4096 hash=md5,sha1
dcfldd if=/dev/sdb of=image.dd sizeprobe=if
# dc3dd (DoD version)
dc3dd if=/dev/sdb of=image.dd hash=md5 log=imaging.log
dc3dd if=/dev/sdb of=image.dd hofs=image.md5
# ddrescue (GNU ddrescue)
ddrescue /dev/sdb image.dd rescue.log
ddrescue --force /dev/sdb image.dd rescue.log
# FTK Imager (Wine)
wine /opt/ftk-imager/FTK\ Imager.exe
```
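`dcfldd hash=md5,sha1` hashes the evidence while it copies, so the image and its digests come from a single read. A sketch of that hash-while-copy loop over in-memory streams:

```python
import hashlib, io

def image_with_hashes(src, dst, block_size=4096):
    """Copy a source stream to an image stream in fixed-size blocks,
    hashing on the fly - the idea behind `dcfldd hash=md5,sha1`."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    while True:
        block = src.read(block_size)
        if not block:
            break
        dst.write(block)
        md5.update(block)
        sha1.update(block)
    return md5.hexdigest(), sha1.hexdigest()

src = io.BytesIO(b"A" * 10000)
dst = io.BytesIO()
md5_hex, sha1_hex = image_with_hashes(src, dst)
```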
### File Carving

```bash
# Foremost
foremost -i image.dd -o carved_files/
foremost -t jpg,png,pdf -i image.dd -o output/
foremost -c /etc/foremost.conf -i image.dd -o output/
# Scalpel
scalpel -b -o carved_files/ image.dd
scalpel -c scalpel.conf -o output/ image.dd
# PhotoRec
photorec image.dd # Interactive mode
photorec /log /d output/ image.dd # Command line
# Bulk Extractor
bulk_extractor -o output/ image.dd
bulk_extractor -x all -o output/ image.dd
bulk_extractor -e email -o output/ image.dd
```
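Carvers like foremost recover files from unallocated space by signature: find a header magic, then the matching footer. A sketch for JPEG (`FF D8 FF` start, `FF D9` end), ignoring the nested-marker cases real carvers handle:

```python
# File carving by signature, as foremost/scalpel do for JPEG.

JPEG_SOI = b"\xff\xd8\xff"
JPEG_EOI = b"\xff\xd9"

def carve_jpegs(data: bytes):
    """Return (start, end) byte ranges of candidate JPEGs in `data`."""
    found, pos = [], 0
    while True:
        start = data.find(JPEG_SOI, pos)
        if start == -1:
            break
        end = data.find(JPEG_EOI, start + len(JPEG_SOI))
        if end == -1:
            break
        found.append((start, end + len(JPEG_EOI)))
        pos = end + len(JPEG_EOI)
    return found

blob = b"\x00" * 5 + JPEG_SOI + b"\xaa" * 10 + JPEG_EOI + b"\x00" * 3
ranges = carve_jpegs(blob)
```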
### Hash Analysis

```bash
# md5sum/sha1sum/sha256sum
md5sum file.exe # Calculate MD5
sha1sum file.exe # Calculate SHA1
sha256sum file.exe # Calculate SHA256
md5sum -c checksums.md5 # Verify checksums
# hashdeep
hashdeep -r /path/to/files # Recursive hashing
hashdeep -c md5,sha1 -r /path/ # Multiple algorithms
hashdeep -a -k known_hashes.txt unknown_files/
# ssdeep (Fuzzy hashing)
ssdeep file.exe # Calculate fuzzy hash
ssdeep -r /path/to/files # Recursive fuzzy hashing
ssdeep -m known_hashes.txt unknown_files/
# NSRL (National Software Reference Library)
# Compare hashes against NSRL database
nsrlsvr -f /path/to/nsrl/NSRLFile.txt
```
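`hashdeep -a -k known_hashes.txt` audits a file set against a known-hash list. A sketch of that matched/unmatched split (the in-memory `files` mapping stands in for a directory walk):

```python
import hashlib

def audit_against_known(files: dict, known_md5s: set):
    """Mimic `hashdeep -a -k known.txt`: hash each file's contents and
    split paths into matched / unmatched lists. `files` maps path -> bytes."""
    matched, unmatched = [], []
    for path, data in files.items():
        digest = hashlib.md5(data).hexdigest()
        (matched if digest in known_md5s else unmatched).append(path)
    return matched, unmatched

known = {hashlib.md5(b"known good").hexdigest()}
files = {"good.bin": b"known good", "odd.bin": b"something else"}
ok, suspect = audit_against_known(files, known)
```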
## Log Analysis

### System Log Analysis

```bash
# Log file locations
/var/log/syslog # System messages
/var/log/auth.log # Authentication logs
/var/log/apache2/access.log # Apache access logs
/var/log/apache2/error.log # Apache error logs
/var/log/mail.log # Mail server logs
# Log analysis tools
grep "Failed password" /var/log/auth.log
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c
tail -f /var/log/syslog # Real-time monitoring
# Logstash
/opt/logstash/bin/logstash -f config.conf
# Splunk Universal Forwarder
sudo /opt/splunkforwarder/bin/splunk start
```
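The `grep "Failed password"` triage above is often scripted. A sketch that counts failed SSH logins per source IP from auth.log-style lines (sample lines fabricated; the regex assumes the common OpenSSH message format):

```python
import re
from collections import Counter

FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def failed_logins(log_lines):
    """Count failed SSH logins per source IP - the scripted version of
    grep "Failed password" /var/log/auth.log | awk ... | sort | uniq -c."""
    per_ip = Counter()
    for line in log_lines:
        m = FAILED.search(line)
        if m:
            per_ip[m.group(2)] += 1
    return per_ip

sample = [
    "Oct  3 10:01:02 host sshd[911]: Failed password for root from 203.0.113.7 port 4022 ssh2",
    "Oct  3 10:01:05 host sshd[911]: Failed password for invalid user admin from 203.0.113.7 port 4023 ssh2",
    "Oct  3 10:02:00 host sshd[912]: Accepted password for alice from 198.51.100.2 port 4100 ssh2",
]
counts = failed_logins(sample)
```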
### Windows Event Log Analysis

```bash
# python-evtx
evtx_dump.py System.evtx > system_events.xml
evtx_dump.py --json Security.evtx > security_events.json
# Log Parser (Wine)
wine LogParser.exe -i:EVT -o:CSV "SELECT * FROM Security.evtx"
# Event Log Explorer
# GUI tool for Windows event log analysis
# Windows Event Log analysis with Volatility
volatility -f memory.dump --profile=Win7SP1x64 evtlogs
volatility -f memory.dump --profile=Win7SP1x64 iehistory
```
### Web Log Analysis

```bash
# Apache/Nginx log analysis
awk '{print $1}' access.log | sort | uniq -c | sort -nr
grep "404" access.log | awk '{print $7}' | sort | uniq -c
grep "POST" access.log|grep -v "200"
# GoAccess
goaccess access.log -o report.html --log-format=COMBINED
goaccess access.log -c --log-format=COMBINED
# AWStats
perl awstats.pl -config=website -update
perl awstats.pl -config=website -output -staticlinks > awstats.html
```
## Mobile Forensics

### Android Analysis

```bash
# ADB (Android Debug Bridge)
adb devices # List connected devices
adb shell # Access device shell
adb pull /data/data/com.app/ ./ # Extract app data
adb backup -all -f backup.ab # Create full backup
# Android backup extraction
dd if=backup.ab bs=24 skip=1 | python3 -c "import zlib,sys;sys.stdout.buffer.write(zlib.decompress(sys.stdin.buffer.read()))" | tar -xvf -
# ALEAPP (Android Logs Events And Protobuf Parser)
python aleapp.py -t tar -i android_backup.tar -o output/
# Autopsy Mobile Forensics
# Import Android image into Autopsy
# Analyze with mobile forensics modules
```
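The `dd | zlib` one-liner above works because an unencrypted `.ab` file is a short header followed by a zlib stream. A sketch of the same unpacking (the 24-byte header length matches the one-liner's `skip=24`; real headers vary with backup version, and the sample bytes are fabricated):

```python
import zlib

def unpack_ab(ab_bytes: bytes, header_len: int = 24):
    """Strip the android backup header and inflate the zlib stream,
    mirroring the dd | zlib one-liner. Assumes an unencrypted,
    compressed backup whose header is `header_len` bytes."""
    return zlib.decompress(ab_bytes[header_len:])

# Fabricated stand-in: a 24-byte header followed by a zlib stream.
fake = b"ANDROID BACKUP\n5\n1\nnone\n"[:24] + zlib.compress(b"tar-archive-bytes")
payload = unpack_ab(fake)
```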
### iOS Analysis

```bash
# libimobiledevice
ideviceinfo # Device information
idevicebackup2 backup ./backup/ # Create backup
ideviceinstaller -l # List installed apps
# iOS backup analysis
python ios_backup_analyzer.py backup_folder/
# ILEAPP (iOS Logs Events And Protobuf Parser)
python ileapp.py -t tar -i ios_backup.tar -o output/
# 3uTools (Wine)
wine 3uTools.exe
```
## Database Analysis

### SQLite Analysis

```bash
# SQLite command line
sqlite3 database.db # Open database
.tables # List tables
.schema table_name # Show table schema
SELECT * FROM table_name; # Query data
.dump # Export database
# SQLite Browser
sqlitebrowser database.db # GUI interface
# SQLite recovery
sqlite3_analyzer database.db # Analyze database
undark database.db --type=freelist # Recover deleted data
```
### Other Database Formats

```bash
# ESE Database (Windows)
esedbexport -t tables database.edb output/
esedbinfo database.edb
# Registry as database
python registry_parser.py NTUSER.DAT
# Browser databases
sqlite3 places.sqlite "SELECT * FROM moz_places;" # Firefox history
sqlite3 History "SELECT * FROM urls;" # Chrome history
```
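Chrome's `History` database stores times as microseconds since 1601-01-01 UTC, so raw `last_visit_time` values need converting before reporting. A sketch of that conversion:

```python
import datetime

CHROME_EPOCH_OFFSET = 11644473600  # seconds between 1601-01-01 and 1970-01-01

def chrome_time_to_utc(chrome_us: int) -> str:
    """Convert a Chrome History timestamp (microseconds since
    1601-01-01 UTC) to an ISO-style UTC string for reporting."""
    unix_seconds = chrome_us / 1_000_000 - CHROME_EPOCH_OFFSET
    dt = datetime.datetime.fromtimestamp(unix_seconds, datetime.timezone.utc)
    return dt.strftime("%Y-%m-%d %H:%M:%S")

# 13244473600000000 us is 1600000000 s after the Unix epoch
ts = chrome_time_to_utc(13244473600000000)
```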
## Network Forensics

### Packet Analysis

```bash
# Wireshark command line
tshark -r capture.pcap -Y "http.request" -T fields -e http.host -e http.request.uri
tshark -r capture.pcap -Y "dns" -T fields -e dns.qry.name
tshark -r capture.pcap -z conv,ip # IP conversations
tshark -r capture.pcap -z http,tree # HTTP statistics
# tcpflow
tcpflow -r capture.pcap -o output/ # Extract TCP flows
tcpflow -r capture.pcap -e scanner # Scan for content
# Chaosreader
chaosreader capture.pcap # Extract sessions
chaosreader -D output/ capture.pcap # Extract to directory
# Network timeline
tshark -r capture.pcap -T fields -e frame.time -e ip.src -e ip.dst -e frame.protocols
```
### Protocol Analysis

```bash
# HTTP analysis
tshark -r capture.pcap -Y "http" -T fields -e http.request.method -e http.request.uri
grep -a "GET\|POST" capture.pcap
# DNS analysis
tshark -r capture.pcap -Y "dns.flags.response == 0" -T fields -e dns.qry.name
tshark -r capture.pcap -Y "dns.flags.response == 1" -T fields -e dns.a
# Email analysis
tshark -r capture.pcap -Y "smtp" -T fields -e smtp.req.command -e smtp.req.parameter
tshark -r capture.pcap -Y "pop" -T fields -e pop.request.command
# FTP analysis
tshark -r capture.pcap -Y "ftp" -T fields -e ftp.request.command -e ftp.request.arg
```
## Automation and Scripting

### SIFT Scripts and Tools

```bash
# SIFT CLI commands
sift install --mode=server # Server installation
sift install --mode=desktop # Desktop installation
sift debug # Debug information
sift list-upgrades # List available upgrades
# Custom scripts location
/usr/local/bin/ # Custom tools
/home/sansforensics/Desktop/ # Desktop shortcuts
/opt/ # Third-party tools
# Environment variables
export VOLATILITY_LOCATION=/usr/bin/vol.py
export VOLATILITY_PLUGINS=/usr/lib/python2.7/dist-packages/volatility/plugins
```
### Automation Examples

```bash
# Automated imaging script
#!/bin/bash
DEVICE="$1"
OUTPUT="$2"
echo "Imaging $DEVICE to $OUTPUT"
dcfldd if="$DEVICE" of="$OUTPUT.dd" hash=md5,sha1 bs=4096
md5sum "$OUTPUT.dd" > "$OUTPUT.md5"
sha1sum "$OUTPUT.dd" > "$OUTPUT.sha1"

# Automated analysis script
#!/bin/bash
IMAGE="$1"
OUTPUT_DIR="$2"
mkdir -p "$OUTPUT_DIR"
# File system timeline (fls -m emits the bodyfile mactime expects)
fls -r -m / "$IMAGE" > "$OUTPUT_DIR/bodyfile"
mactime -b "$OUTPUT_DIR/bodyfile" -d > "$OUTPUT_DIR/timeline.csv"
# Hash analysis
hashdeep -r "$IMAGE" > "$OUTPUT_DIR/hashes.txt"
# String extraction
strings "$IMAGE" > "$OUTPUT_DIR/strings.txt"
```
### Python Forensics Scripts

```python
# Basic file analysis
import hashlib

def analyze_file(filepath):
    with open(filepath, 'rb') as f:
        data = f.read()
    return {
        'path': filepath,
        'size': len(data),
        'md5': hashlib.md5(data).hexdigest(),
        'sha1': hashlib.sha1(data).hexdigest(),
    }

# Registry analysis (requires the python-registry package)
from Registry import Registry

def analyze_registry(hive_path):
    reg = Registry.Registry(hive_path)
    for key in reg.root().subkeys():
        print(f"Key: {key.name()}")
        for value in key.values():
            print(f"  Value: {value.name()} = {value.value()}")
```
## Case Management

### Evidence Handling

```bash
# Evidence mounting (read-only)
sudo mount -o ro,noexec,nodev,loop image.dd /mnt/evidence
sudo mount -o ro,noexec,nodev /dev/sdb1 /mnt/evidence
# Evidence verification
md5sum evidence.dd
sha1sum evidence.dd
sha256sum evidence.dd
# Chain of custody
echo "$(date): Evidence mounted by $(whoami)" >> chain_of_custody.log
echo "$(date): Analysis started" >> chain_of_custody.log
# Evidence documentation
exiftool evidence.dd # Metadata extraction
file evidence.dd # File type identification
hexdump -C evidence.dd | head # Hex dump preview
```
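The `echo >> chain_of_custody.log` entries above can be made structured and easier to verify by logging the evidence hash with each action. A sketch (the JSON field names are my own, not a standard):

```python
import hashlib, json, datetime

def custody_entry(evidence_bytes: bytes, action: str,
                  actor: str = "examiner") -> str:
    """One JSON chain-of-custody record: UTC timestamp, actor, action,
    and the evidence SHA-256 - a structured version of the echo >> log idiom."""
    record = {
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "sha256": hashlib.sha256(evidence_bytes).hexdigest(),
    }
    return json.dumps(record, sort_keys=True)

line = custody_entry(b"evidence image bytes", "mounted read-only")
```

Appending one such line per action gives a log that can be re-checked against the evidence hash later.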
### Report Generation

```bash
# Automated reporting
log2timeline.py --parsers="win7" timeline.plaso image.dd
psort.py -o xlsx timeline.plaso -w timeline.xlsx
# Timeline visualization
python /usr/share/plaso/tools/psort.py -o l2tcsv timeline.plaso > timeline.csv
# Hash comparison reports
hashdeep -c md5,sha1 -r /evidence/ > evidence_hashes.txt
hashdeep -a -k known_good.txt evidence_hashes.txt > comparison_report.txt
# Custom reporting scripts
python generate_report.py --case "Case001" --evidence image.dd --output report.html
```
### Documentation

```bash
# Case notes
mkdir case_001
cd case_001
echo "Case: 001" > case_notes.txt
echo "Date: $(date)" >> case_notes.txt
echo "Investigator: $(whoami)" >> case_notes.txt
# Screenshot documentation
gnome-screenshot -f screenshot_$(date +%Y%m%d_%H%M%S).png
# Command history
history > command_history_$(date +%Y%m%d).txt
# System information
uname -a > system_info.txt
lsb_release -a >> system_info.txt
sift version >> system_info.txt
```
## Advanced Techniques

### Advanced Memory Analysis

```bash
# Volatility plugins
volatility --info # List available plugins
volatility -f memory.dump --profile=Win7SP1x64 psscan
volatility -f memory.dump --profile=Win7SP1x64 dlllist -p 1234
volatility -f memory.dump --profile=Win7SP1x64 handles -p 1234
volatility -f memory.dump --profile=Win7SP1x64 getsids
volatility -f memory.dump --profile=Win7SP1x64 printkey -K "Software\Microsoft\Windows\CurrentVersion\Run"
# Custom Volatility plugins
cp custom_plugin.py /usr/lib/python2.7/dist-packages/volatility/plugins/
volatility -f memory.dump --profile=Win7SP1x64 custom_plugin
```
### Anti-Forensics Detection

```bash
# Timestamp analysis
mactime -b bodyfile -d|grep "1970\|2099" # Suspicious timestamps
stat file.txt # File timestamps
# Hidden data detection
binwalk firmware.bin # Embedded files
steghide info image.jpg # Steganography detection
outguess -r image.jpg output.txt # Outguess extraction
# Encryption detection
ent file.bin # Entropy analysis (Fourmilab ent)
file file.bin # File type detection
hexdump -C file.bin | head # Manual inspection
```
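Entropy is the usual encryption/packing tell. A sketch of the Shannon entropy computation such tools perform (bits per byte, 0 to 8):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte (0-8). Compressed or encrypted data
    sits near 8; plain text is usually well below 6."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

low = shannon_entropy(b"\x00" * 4096)           # constant data
high = shannon_entropy(bytes(range(256)) * 16)  # uniform data
```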
### Cloud Forensics

```bash
# Cloud artifact analysis
sqlite3 cloud_sync.db "SELECT * FROM files;"
grep -r "dropbox\|google\|onedrive" /home/user/
# Browser cloud artifacts
sqlite3 ~/.config/google-chrome/Default/History "SELECT * FROM downloads;"
sqlite3 ~/.mozilla/firefox/profile/places.sqlite "SELECT * FROM moz_downloads;"
# Cloud storage analysis
rclone ls remote: # List cloud files
rclone copy remote: ./cloud_backup/ # Download cloud data
```
## Troubleshooting

### Common Issues

```bash
# Permission issues
sudo chown -R sansforensics:sansforensics /cases/
sudo chmod 755 /mnt/evidence
# Tool not found
which volatility # Check tool location
echo $PATH # Check PATH variable
sift list # List installed tools
# Memory issues
free -h # Check available memory
top # Monitor processes
kill -9 PID # Kill problematic process
# Disk space issues
df -h # Check disk usage
du -sh /cases/* # Check case sizes
ncdu /cases/ # Interactive disk usage
```
### Performance Optimization

```bash
# Increase swap space
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# Optimize for SSD
echo 'deadline' | sudo tee /sys/block/sda/queue/scheduler
sudo mount -o remount,noatime /
# Parallel processing
parallel -j 4 "strings {} > {}.strings" ::: *.exe
find /evidence -name "*.exe" | parallel -j 4 yara rules.yar {}
```
## Resources
- SIFT Workstation documentation
- [SANS Digital Forensics](LINK_5)
- DFIR community
- [Volatility documentation](LINK_5)
- [The Sleuth Kit documentation](LINK_5)