PUT is less commonly used for creating resources in RESTful services
Read
GET
Update
POST, PUT, PATCH
In RESTful services, PUT is more often used for replacing resources
Delete
DELETE
Vulnerabilities per available HTTP methods
Knowing the available HTTP methods:
Identifies potential vulnerabilities
Gives clues about how well the system is maintained
May reveal additional functionality to test
A few examples of how attackers may use specific HTTP methods:
GET: parameters in the URI may disclose sensitive information
POST: parameters can be used to exploit vulnerabilities
PUT: may be able to upload files to the webroot
TRACE: identify load balancers or other devices in front of the webserver
DELETE: remove files from the webroot
User-Agent
Used to identify the web client. Some tools (sqlmap, nmap) or scripts that use Python's urllib3 library may identify themselves via the HTTP User-Agent header. Make sure to customize the user-agent so the tool is not easily detected.
Referer
Identifies for the target server which page the user-agent was viewing when a link was clicked. This can leak sensitive information included in the URL.
Cookie and Set-Cookie
The server sets cookies with the Set-Cookie response header; the client then returns them in the Cookie request header on subsequent requests. Stealing authenticated cookies can lead to session hijacking. The Secure flag ensures transmission occurs only over an HTTPS-encrypted channel, and the HttpOnly flag blocks access from client-side scripts.
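A typical exchange for a session cookie with both protective flags set might look like this (cookie name and value are illustrative):

```http
HTTP/1.1 200 OK
Set-Cookie: SESSIONID=abc123; Secure; HttpOnly

GET /account HTTP/1.1
Cookie: SESSIONID=abc123
```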
Authorization
The Authorization request header is associated with HTTP-based authentication methods, often carrying Base64-encoded values.
Cacheable and Cache-Control
GET and HEAD requests are typically cacheable, for status codes 200, 203, 204, 206, 300, 301, 404, 405, 410, 414, and 501.
A 304 Not Modified response indicates the data was previously cached.
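A 304 arises from a revalidation exchange like the following (URL and timestamp are illustrative): the client replays a previously cached Last-Modified value in If-Modified-Since, and the server answers 304 if the resource has not changed:

```http
GET /index.html HTTP/1.1
Host: test.com
If-Modified-Since: Fri, 21 Jun 2019 11:11:11 GMT

HTTP/1.1 304 Not Modified
```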
Server
The Server response header is the server-side equivalent of the User-Agent request header. It identifies the software of the web server closest to the end user (which may be a reverse proxy rather than the origin).
WWW-Authenticate
When the server uses the built-in HTTP Basic or Digest authentication methods, it challenges the user to authenticate by responding with the WWW-Authenticate header.
Structure: WWW-Authenticate: <type> realm=<realm>
<type> identifies the particular authentication method being employed and <realm> describes the "protection space" and indicates the scope of what is protected by this authentication.
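For example, a server protecting an area with Basic authentication would challenge an unauthenticated request like this (the realm string is illustrative):

```http
HTTP/1.1 401 Unauthorized
WWW-Authenticate: Basic realm="Restricted Area"
```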
Authentication
Web Server Based Authentication
HTTP Basic Authentication
Basic is the simplest built-in authentication scheme, defined in RFC 2617. The server sends a parameter called Realm, which describes the resource being protected. Credentials are stored on the server (e.g. via .htaccess/.htpasswd on Apache). Upon submitting credentials, the browser creates an Authorization header: it concatenates the credentials with a : between them, then Base64-encodes the result, padding with = characters so the encoded length is a multiple of 4.
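The header can be reproduced from the command line (the credentials below are made up):

```shell
# Build a Basic Authorization header: base64("username:password")
creds=$(printf 'admin:password123' | base64)
echo "Authorization: Basic $creds"

# Decoding an intercepted header is just as trivial
printf 'YWRtaW46cGFzc3dvcmQxMjM=' | base64 -d
```

Because this is plain encoding rather than encryption, anyone who captures the header recovers the credentials directly.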
Issues:
Without SSL, the easily decoded Base64 credentials make this completely insecure.
After initial authentication, the Authorization header is sent with every subsequent request (increases the attack window)
There is no log out; the only way to log out is to close the browser.
HTTP Digest Authentication
Designed as a fix to the Basic authentication scheme in RFC 2617, Digest authentication does not send the encoded password over the network; instead it adds additional parameters in a challenge-response process. It sets a Realm, uses salts (nonces and cnonces), and calculates the response with MD5.
Issues:
Without SSL, the password can still be cracked offline if the nonce and cnonce are captured.
After initial authentication, the Authorization header is sent with every subsequent request (increases the attack window)
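The response value a Digest client sends can be reproduced offline, which is why captured traffic enables password cracking; a sketch of the RFC 2617 qop=auth calculation (all values below are hypothetical):

```shell
# response = MD5( MD5(user:realm:pass) : nonce : nc : cnonce : qop : MD5(method:uri) )
user=user; realm=test; pass=secret
method=GET; uri=/digest-test/
nonce=abc123; nc=00000001; cnonce=def456; qop=auth

HA1=$(printf '%s:%s:%s' "$user" "$realm" "$pass" | md5sum | cut -d' ' -f1)
HA2=$(printf '%s:%s' "$method" "$uri" | md5sum | cut -d' ' -f1)
response=$(printf '%s:%s:%s:%s:%s:%s' "$HA1" "$nonce" "$nc" "$cnonce" "$qop" "$HA2" | md5sum | cut -d' ' -f1)
echo "$response"
```

An attacker who sniffs the nonce, cnonce, and response simply runs this calculation over a wordlist until the output matches, which is exactly what tools like digestive automate.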
Integrated Windows Authentication
Also known as IWA, a proprietary authentication scheme added by Microsoft. The scheme carries NTLM (a challenge-response protocol) and Kerberos (handled by the client with tickets) over HTTP. Mostly seen on intranets (SharePoint uses it); provides SSO; requires both client and server to be on the same Windows domain.
NTLM over HTTP is a two-round authentication protocol. It can be identified by the string TlRMTVNTUAAB (the Base64-encoded string NTLMSSP) at the beginning of the Negotiate/NTLM Authorization header. Kerberos has only one round and uses the WWW-Authenticate: Negotiate scheme.
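The signature is easy to verify by hand: decoding the Base64 prefix of a captured NTLM blob reveals the NTLMSSP magic bytes:

```shell
# Decode the start of an NTLM negotiate blob; the first 7 bytes spell NTLMSSP
printf 'TlRMTVNTUAAB' | base64 -d | head -c 7
echo
```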
Issues:
Without SSL, the password can still be cracked if network traffic is captured
After initial authentication, the Authorization header is sent with every subsequent request (increases the attack window)
There is no log out. As long as the user is logged into the domain, the browser will automatically authenticate, which opens the door to CSRF attacks.
Form-Based Authentication
This is the most common way of authenticating users. Developers create an HTML form that performs authentication. Submitted credentials are sent in a POST HTTP request (GET is also possible) and must travel over SSL, otherwise the credentials are sent in plain text. Back-end authentication is normally used (e.g. SQL databases).
Form based authentication can be broken into 3 important components:
Authentication Form (page that accepts credentials and submits to server-side processing code)
Processing Code (Server-side code that verifies credentials, upon success redirects user to target page)
Protected Resources (Any resources that can be accessed only by authenticated users, needed to be configured correctly by devs)
Issues:
Secured by the developer (without proper security controls, the back-end database could be open to injection attacks: SQL, LDAP, etc.)
Session handling is done by the developer or framework (typically via cookies) and must ensure log out after inactivity or when the logout button is pressed
Token-Based Authentication
Allows authentication of users without asking for credentials; authentication is handled by an identity provider (a third-party server). Based on tokens, uses APIs, and can enable SSO.
Shodan is a search engine that enables users to find specific types of devices connected to the internet.
Unlike traditional search engines such as Google or Bing, which are designed to search the Web, Shodan scans and indexes IP addresses, non-HTTP/HTTPS ports, servers, routers, IoT devices, printers, and any physical system connected to the internet.
Use tools to extract the following metadata from files (e.g. PDFs, images). This data can later be used to create wordlists or gather additional information about a target:
Geolocation data
Software (+version)
Usernames
Full File paths
Code comments
Tools: Tesseract, exiftool, FOCA
Virtual Host Discovery
Certificate Transparency reports offer a penetration tester an additional method for discovering hosts and subdomains within a domain. This includes "hidden" virtual hosts that are not linked publicly but have been issued an X.509 cert.
When typing a username, does the response reveal whether it is valid, or does the response echo the username value back?
We can use a Bash script or Burp Intruder to submit a wordlist of usernames with a random password; differences in response length identify which users are valid.
Timing techniques without HTML differences
Knowing the username format (e.g. first initial plus full last name), create a dictionary of the top 100 last names and run a fuzzer against the login, sorting responses by RTT
Versioning Port Scan
Look at robots.txt
Finding Hidden parameters (Arjun)
AJAX Applications
--
Automated Tools
OSINT
Data mine tools:
Maltego (Commercial use)
SpiderFoot (OpenSource)
Warning: may violate the privacy terms of the pentest engagement
Searching domain names and email addresses:
theHarvester
AutOSINT (Uses theHarvester and pyFOCA to get OSINT data)
Spidering
Burp Spider (Commercial)
Zap Spider
gospider
Alternative methods to spidering for Single Page Application (SPA) or AJAX applications
Use an AJAX spider (not reliable)
Manual clicks + Proxy
Forced Browsing (Directory Brute Forcing)
dirbuster (Old)
dirb
gobuster
curl
Nikto
w3af
Metasploit's WMAP and msfcrawler auxiliary module
Force Browsing dictionaries
SecLists
Dirb
DirBuster
FuzzDB
JBroFuzz
WMAP
Get Subject Alternative Names (SAN) Certificate
Download and use client SSL certificate
Show SSL certification
Fuzzing
Fuzzing involves replacing normal values with attempted exploits in all available inputs and reviewing the responses
What needs to be fuzzed
Request headers
POST parameters
GET parameters
PUT payloads
Any input to client-side and server-side code
Input Examples
SQL Injection ('+or+1=1;%23)
Password Spraying(Different usernames same password)
XSS (<script>alert("42")</script>)
Directory Traversal / Local File Inclusion (../../../../../etc/passwd)
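The SQL injection example above is URL-encoded (+ for spaces, %23 for #); when fuzzing GET parameters, raw payloads need the same treatment. A pure-Bash percent-encoder sketch (the function name is our own):

```shell
# Percent-encode every character outside the URL-safe set
urlencode() {
  local s=$1 out='' c i
  for ((i = 0; i < ${#s}; i++)); do
    c=${s:i:1}
    case $c in
      [a-zA-Z0-9.~_-]) out+=$c ;;                 # unreserved: keep as-is
      *) printf -v c '%%%02X' "'$c"; out+=$c ;;   # everything else: %XX
    esac
  done
  printf '%s\n' "$out"
}

urlencode "' or 1=1;#"    # prints %27%20or%201%3D1%3B%23
```

Note that %20 and + are interchangeable encodings for a space in query strings, so this output is equivalent to the '+or+1=1;%23 form above.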
ameenmaali's qsfuzz (Query String Fuzz) allows you to build your own rules to fuzz query strings and easily identify vulnerabilities.
https://github.com/ameenmaali/qsfuzz
SAST
Strengths: Identifies security deficiencies not readily apparent in deployed applications
Weaknesses: Requires access to source code; might overlook APIs or libraries leveraged by the application; overlooks the ops side of apps
DAST
Strengths: Doesn't require source code access; inherently accounts for the ops side of apps; can be automated/integrated into a pipeline; scales well for large applications
Weaknesses: Only as good as the tool, its configuration, and the user wielding it; significant incidence of false positives; difficulty with certain types of flaws
Tools
Free/Open Source:
ZAP Active Scan
SQLmap (SQL Injection specific)
W3AF
Metasploit WMAP
WPScan (WordPress specific)
Commercial:
Acunetix Vulnerability Scanner
Burp Scanner
Fortify WebInspect
IBM AppScan
Qualys WAS
Rapid7 AppSpider
Veracode Dynamic Analysis
Whitehat Sentinel
ZAP Scan
LinkFinder
LinkFinder is an open-source Python script used to discover API endpoints, URL resources, and query parameters within minified JavaScript files. The script uses a combination of regular expressions to gather hidden URLs and application routes, which security researchers can then use to further test for vulnerabilities.
HTTP/1.1 200 OK
Date: Fri, 21 Jun 2000 11:11:11 GMT
Server: Apache/2.4.29 (Ubuntu)
Vary: Accept-Encoding
Content-Length: 1061
Connection: close
Content-Type: text/html; charset=UTF-8
digestive --wordlist /opt/john/run/password.lst --username user --realm test --uri /digest-test/ --qop auth --nc 00000001 --method GET --nonce NONCE --response RESPONSE --cnonce CNONCE
# Fill in NONCE, RESPONSE, and CNONCE per the captured GET request
# Due to GDPR might not get as much data from whois
dig test.com
whois 10.10.10.10 # IP collected from dig
# Will possibly output CIDR
dig test.com -t any
dig test.com -t axfr # Attempt a zone transfer
nmap --script=dns-zone-transfer test.com # Attempt a zone transfer
dnsrecon.py -a -d test.com # Attempt a Zone transfer
dig test.com -t axfr
# Capture the SOA serial number (e.g. 2020011201, format YYYYMMDDNN)
# Decrease the serial number by 1 and the entire zone will be downloaded
dig test.com -t ixfr=2020011200
# Nmap
nmap test.com --script=dns-brute
nmap test.com --script=dns-brute --script-args=dns-brute.hostlist=/opt/dnsrecon/namelist.txt
## By default nmap dictionary has 120+ lines, dnsrecon has 1900+.
# DNSrecon
dnsrecon.py -d test.com -t brt -n 10.10.10.10 -D /opt/dnsrecon/namelist.txt
## -n gives a name server IP; if none is given, the domain's SOA server is used.
## Larger dictionaries can be found:
## /opt/dnsrecon/subdomains-top1mil-5000.txt (5000 entries)
## /opt/dnsrecon/subdomains-top1mil-20000.txt (20000 entries)
dig -x $IP
dig @$IP version.bind chaos txt
# Use target's CIDR
# Nmap
nmap -sL 10.10.10.0/24 | grep \)
# DNSrecon
dnsrecon.py -r 10.10.10.0/24
# Metasploit
use auxiliary/gather/enum_dns
set DOMAIN test.com
set ENUM_RVL true
set IPRANGE 10.0.0.0/24
run
dig axfr @n1.example.com internal
intitle:"index of" "last modified" site.test.com
# Curl
curl -s --head test.com
# Netcat
nc -C test.com 80
nc -v test.com 80
## Get Server info for 1.0
HEAD / HTTP/1.0
## Get Server info for 1.1 (Virtual Host)
HEAD / HTTP/1.1
Host: test.com
# OpenSSL HTTPS
openssl s_client -crlf -connect www.test.com:443
HEAD / HTTP/1.1 # or GET / HTTP/1.0
Host: test.com
# Nmap
nmap -p 80,443,8080 --script=http-headers test.com
nmap -p 80,443,8080 --script=http-title $IP #Gather page titles from HTTP services
nmap -p 80,443,8080 --script=http-headers $IP #Get HTTP headers of web services
nmap -p 80,443,8080 --script=http-enum $IP #Find web apps from known paths
#Single line Force Browsing with curl and dirb wordlist
while read -r dir; do curl -Is "http://domain.test.com/$dir" | grep -qE '^HTTP/[0-9.]+ (200|302)' && echo "$dir"; done < /opt/dirb/wordlists/big.txt