reconFTW automates the entire reconnaissance process for you. It performs thorough subdomain enumeration along with various vulnerability checks, obtaining the maximum information about your target.
reconFTW uses a lot of techniques (passive, bruteforce, permutations, certificate transparency, source code scraping, analytics, DNS records...) for subdomain enumeration, which helps you get the maximum and most interesting subdomains so that you stay ahead of the competition.
It also performs various vulnerability checks like XSS, Open Redirects, SSRF, CRLF, LFI, SQLi, SSL tests, SSTI, DNS zone transfers, and much more. Along with these, it performs OSINT techniques, directory fuzzing, dorking, port scanning, screenshots, and nuclei scans on your target.
So, what are you waiting for? Go! Go! Go! 💥
- ⚙️ Config file
- Usage
- TARGET OPTIONS
- MODE OPTIONS
- GENERAL OPTIONS
- Example Usage
- To perform a full recon on single target
- To perform a full recon on a list of targets
- Perform full recon with more time intense tasks (VPS intended only)
- Perform recon in a multi domain target
- Perform recon with axiom integration
- Perform all steps (whole recon + all attacks) a.k.a. YOLO mode
- Show help section
- Axiom Support ☁️
- Sample video
- 🔥 Features 🔥
You can check out our wiki for the Installation Guide 📖
- Requires Golang 1.15.0 or higher installed and paths correctly set ($GOPATH, $GOROOT)
Important: if you are not running reconftw as root, run sudo echo "${USERNAME} ALL=(ALL:ALL) NOPASSWD: ALL" | sudo tee -a /etc/sudoers.d/reconFTW to make sure no sudo prompts are required to run the tool and to avoid any permission issues.
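To double-check that passwordless sudo is actually in effect before a long run, a quick sanity check (standard sudo flags, nothing reconFTW-specific) is:

```bash
sudo -k            # drop any cached sudo credentials
sudo -n true && echo "passwordless sudo OK" || echo "sudo still prompts - recheck /etc/sudoers.d/reconFTW"
```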
git clone https://github.com/six2dez/reconftw
cd reconftw/
./install.sh
./reconftw.sh -d target.com -r
- Pull the image
docker pull six2dez/reconftw:main
- Run the container
docker run -it --rm \
-v "${PWD}/OutputFolder/":'/reconftw/Recon/' \
six2dez/reconftw:main -d example.com -r
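If you also want the container to pick up your own config, the same -v mechanism should work; note that the in-container path /reconftw/reconftw.cfg used below is an assumption based on the Recon mount above, so check the Docker documentation for the exact path:

```bash
# second -v: mount your local reconftw.cfg (in-container config path is an assumption)
docker run -it --rm \
    -v "${PWD}/OutputFolder/":'/reconftw/Recon/' \
    -v "${PWD}/reconftw.cfg":'/reconftw/reconftw.cfg' \
    six2dez/reconftw:main -d example.com -r
```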
- View results (they're NOT kept inside the Docker container). The -v option above mounts ${PWD}/OutputFolder/ to /reconftw/Recon/ inside the container, so go to the OutputFolder directory on your host to view the results.
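For example, once the container exits the results can be inspected directly on the host:

```bash
# the Recon output mounted from the container lives on the host
ls -R "${PWD}/OutputFolder/"
```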
If you wish to:
- Dynamically modify the behaviour & function of the image
- Build your own container
- Build an Axiom Controller on top of the official image
Please refer to the Docker documentation.
Yes! reconFTW can also be easily deployed with Terraform and Ansible to AWS. If you want to know how to do it, you can check the guide here.
You can find a detailed explanation of the configuration file here 📖
- Through the reconftw.cfg file the whole execution of the tool can be controlled.
- Hunters can set various scanning modes, execution preferences, tools, config files, APIs/TOKENS, personalized wordlists and much more (see the example below).
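For example, a minimal override config can be passed with the -f flag (the filename is just an example, and the variables below match the defaults shown further down; how a custom file merges with the default reconftw.cfg is covered in the configuration page linked above):

```bash
# my_target.cfg - override only what this run needs
OSINT=false                               # skip the whole OSINT module
SUBBRUTE=false                            # no DNS bruteforcing this time
NUCLEI_SEVERITY="medium,high,critical"    # drop info/low templates
```

Then launch it with ./reconftw.sh -d target.com -r -f my_target.cfg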
👉 Click here to view default config file 👈
#################################################################
# reconFTW config file #
#################################################################
# General values
tools=~/Tools # Path to installed tools
SCRIPTPATH="$( cd "$(dirname "$0")" >/dev/null 2>&1 ; pwd -P )" # Get current script's path
profile_shell=".$(basename $(echo $SHELL))rc" # Get current shell profile
reconftw_version=$(git rev-parse --abbrev-ref HEAD)-$(git describe --tags) # Fetch current reconftw version
generate_resolvers=false # Generate custom resolvers with dnsvalidator
update_resolvers=true # Fetch and rewrite resolvers from trickest/resolvers before DNS resolution
resolvers_url="https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt"
resolvers_trusted_url="https://raw.githubusercontent.com/six2dez/resolvers_reconftw/main/resolvers_trusted.txt"
fuzzing_remote_list="https://raw.githubusercontent.com/six2dez/OneListForAll/main/onelistforallmicro.txt" # Used to send to axiom(if used) on fuzzing
proxy_url="http://127.0.0.1:8080/" # Proxy url
install_golang=true # Set it to false if you already have Golang configured and ready
upgrade_tools=true
#dir_output=/custom/output/path
# Golang Vars (Comment or change on your own)
export GOROOT=/usr/local/go
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
# Tools config files
#NOTIFY_CONFIG=~/.config/notify/provider-config.yaml # No need to define
AMASS_CONFIG=~/.config/amass/config.ini
GITHUB_TOKENS=${tools}/.github_tokens
GITLAB_TOKENS=${tools}/.gitlab_tokens
SUBGPT_COOKIE=${tools}/subgpt_cookies.json
#CUSTOM_CONFIG=custom_config_path.txt # In case you use a custom config file, uncomment this line and set your files path
# APIs/TOKENS - Uncomment the lines you want removing the '#' at the beginning of the line
#SHODAN_API_KEY="XXXXXXXXXXXXX"
#WHOISXML_API="XXXXXXXXXX"
#XSS_SERVER="XXXXXXXXXXXXXXXXX"
#COLLAB_SERVER="XXXXXXXXXXXXXXXXX"
#slack_channel="XXXXXXXX"
#slack_auth="xoXX-XXX-XXX-XXX"
# File descriptors
DEBUG_STD="&>/dev/null" # Skips STD output on installer
DEBUG_ERROR="2>/dev/null" # Skips ERR output on installer
# Osint
OSINT=true # Enable or disable the whole OSINT module
GOOGLE_DORKS=true
GITHUB_DORKS=true
GITHUB_REPOS=true
METADATA=true # Fetch metadata from indexed office documents
EMAILS=true # Fetch emails from different sites
DOMAIN_INFO=true # whois info
REVERSE_WHOIS=true # amass intel reverse whois info, takes some time
IP_INFO=true # Reverse IP search, geolocation and whois
METAFINDER_LIMIT=20 # Max 250
# Subdomains
RUNAMASS=true
RUNSUBFINDER=true
SUBDOMAINS_GENERAL=true # Enable or disable the whole Subdomains module
SUBPASSIVE=true # Passive subdomains search
SUBCRT=true # crtsh search
SUBNOERROR=true # Check DNS NOERROR response and BF on them
SUBANALYTICS=true # Google Analytics search
SUBBRUTE=true # DNS bruteforcing
SUBSCRAPING=true # Subdomains extraction from web crawling
SUBPERMUTE=true # DNS permutations
SUBREGEXPERMUTE=true # Permutations by regex analysis
SUBGPT=true # Permutations by BingGPT prediction
PERMUTATIONS_OPTION=gotator # The alternative is "ripgen" (faster, not deeper)
GOTATOR_FLAGS=" -depth 1 -numbers 3 -mindup -adv -md" # Flags for gotator
SUBTAKEOVER=false # Check subdomain takeovers, false by default because nuclei already checks this
SUB_RECURSIVE_PASSIVE=false # Uses a lot of API keys queries
DEEP_RECURSIVE_PASSIVE=10 # Number of top subdomains for recursion
SUB_RECURSIVE_BRUTE=false # Needs big disk space and time to resolve
ZONETRANSFER=true # Check zone transfer
S3BUCKETS=true # Check S3 buckets misconfigs
REVERSE_IP=false # Check reverse IP subdomain search (set True if your target is CIDR/IP)
TLS_PORTS="21,22,25,80,110,135,143,261,271,324,443,448,465,563,614,631,636,664,684,695,832,853,854,990,993,989,992,994,995,1129,1131,1184,2083,2087,2089,2096,2221,2252,2376,2381,2478,2479,2482,2484,2679,2762,3077,3078,3183,3191,3220,3269,3306,3410,3424,3471,3496,3509,3529,3539,3535,3660,36611,3713,3747,3766,3864,3885,3995,3896,4031,4036,4062,4064,4081,4083,4116,4335,4336,4536,4590,4740,4843,4849,5443,5007,5061,5321,5349,5671,5783,5868,5986,5989,5990,6209,6251,6443,6513,6514,6619,6697,6771,7202,7443,7673,7674,7677,7775,8243,8443,8991,8989,9089,9295,9318,9443,9444,9614,9802,10161,10162,11751,12013,12109,14143,15002,16995,41230,16993,20003"
INSCOPE=false # Uses inscope tool to filter the scope, requires .scope file in reconftw folder
# Web detection
WEBPROBESIMPLE=true # Web probing on 80/443
WEBPROBEFULL=true # Web probing in a large port list
WEBSCREENSHOT=true # Web screenshotting
VIRTUALHOSTS=false # Check virtualhosts by fuzzing HOST header
UNCOMMON_PORTS_WEB="81,300,591,593,832,981,1010,1311,1099,2082,2095,2096,2480,3000,3001,3002,3003,3128,3333,4243,4567,4711,4712,4993,5000,5104,5108,5280,5281,5601,5800,6543,7000,7001,7396,7474,8000,8001,8008,8014,8042,8060,8069,8080,8081,8083,8088,8090,8091,8095,8118,8123,8172,8181,8222,8243,8280,8281,8333,8337,8443,8500,8834,8880,8888,8983,9000,9001,9043,9060,9080,9090,9091,9092,9200,9443,9502,9800,9981,10000,10250,11371,12443,15672,16080,17778,18091,18092,20720,32000,55440,55672"
# Host
FAVICON=true # Check Favicon domain discovery
PORTSCANNER=true # Enable or disable the whole Port scanner module
PORTSCAN_PASSIVE=true # Port scanner with Shodan
PORTSCAN_ACTIVE=true # Port scanner with nmap
CDN_IP=true # Check which IPs belong to a CDN
# Web analysis
WAF_DETECTION=true # Detect WAFs
NUCLEICHECK=true # Enable or disable nuclei
NUCLEI_SEVERITY="info,low,medium,high,critical" # Set template severity
NUCLEI_FLAGS=" -silent -t $HOME/nuclei-templates/ -retries 2" # Additional nuclei flags; don't set the severity here, only exclusions like " -etags openssh"
NUCLEI_FLAGS_JS=" -silent -tags exposure,token -severity info,low,medium,high,critical" # Additional nuclei extra flags for js secrets
URL_CHECK=true # Enable or disable URL collection
URL_CHECK_PASSIVE=true # Search for URLs with passive methods from Archive, OTX, CommonCrawl, etc.
URL_CHECK_ACTIVE=true # Search for urls by crawling the websites
URL_GF=true # Url patterns classification
URL_EXT=true # Returns a list of files divided by extension
JSCHECKS=true # JS analysis
FUZZ=true # Web fuzzing
CMS_SCANNER=true # CMS scanner
WORDLIST=true # Wordlist generation
ROBOTSWORDLIST=true # Check historic disallow entries on waybackMachine
PASSWORD_DICT=true # Generate password dictionary
PASSWORD_MIN_LENGTH=5 # Min password length
PASSWORD_MAX_LENGTH=14 # Max password length
# Vulns
VULNS_GENERAL=false # Enable or disable the vulnerability module (very intrusive and slow)
XSS=true # Check for xss with dalfox
CORS=true # CORS misconfigs
TEST_SSL=true # SSL misconfigs
OPEN_REDIRECT=true # Check open redirects
SSRF_CHECKS=true # SSRF checks
CRLF_CHECKS=true # CRLF checks
LFI=true # LFI by fuzzing
SSTI=true # SSTI by fuzzing
SQLI=true # Check SQLI
SQLMAP=true # Check SQLI with sqlmap
GHAURI=false # Check SQLI with ghauri
BROKENLINKS=true # Check for broken links
SPRAY=true # Performs password spraying
COMM_INJ=true # Check for command injections with commix
PROTO_POLLUTION=true # Check for prototype pollution flaws
SMUGGLING=true # Check for HTTP request smuggling flaws
WEBCACHE=true # Check for Web Cache issues
BYPASSER4XX=true # Check for 4XX bypasses
# Extra features
NOTIFICATION=false # Notification for every function
SOFT_NOTIFICATION=false # Only for start/end
DEEP=false # DEEP mode, really slow and doesn't care about the number of results
DEEP_LIMIT=500 # First limit to not run unless you run DEEP
DEEP_LIMIT2=1500 # Second limit to not run unless you run DEEP
DIFF=false # Diff function, run every module over an already scanned target, printing only new findings (but save everything)
REMOVETMP=false # Delete temporary files after execution (to free up space)
REMOVELOG=false # Delete logs after execution
PROXY=false # Send to proxy the websites found
SENDZIPNOTIFY=false # Send to zip the results (over notify)
PRESERVE=true # set to true to avoid deleting the .called_fn files on really large scans
FFUF_FLAGS=" -mc all -fc 404 -ach -sf -of json" # Ffuf flags
HTTPX_FLAGS=" -follow-redirects -random-agent -status-code -silent -title -web-server -tech-detect -location -content-length" # Httpx flags for simple web probing
GOWITNESS_FLAGS=" --disable-logging --timeout 5"
# HTTP options
HEADER="User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0" # Default header
# Threads
FFUF_THREADS=40
HTTPX_THREADS=50
HTTPX_UNCOMMONPORTS_THREADS=100
KATANA_THREADS=20
BRUTESPRAY_THREADS=20
BRUTESPRAY_CONCURRENCE=10
GAU_THREADS=10
DNSTAKE_THREADS=100
DALFOX_THREADS=200
PUREDNS_PUBLIC_LIMIT=0 # Set between 2000 - 10000 if your router blows up, 0 means unlimited
PUREDNS_TRUSTED_LIMIT=400
PUREDNS_WILDCARDTEST_LIMIT=30
PUREDNS_WILDCARDBATCH_LIMIT=1500000
GOWITNESS_THREADS=20
RESOLVE_DOMAINS_THREADS=150
PPFUZZ_THREADS=30
DNSVALIDATOR_THREADS=200
INTERLACE_THREADS=10
TLSX_THREADS=1000
XNLINKFINDER_DEPTH=3
BYP4XX_THREADS=20
# Rate limits
HTTPX_RATELIMIT=150
NUCLEI_RATELIMIT=150
FFUF_RATELIMIT=0
# Timeouts
AMASS_INTEL_TIMEOUT=15 # Minutes
AMASS_ENUM_TIMEOUT=180 # Minutes
CMSSCAN_TIMEOUT=3600 # Seconds
FFUF_MAXTIME=900 # Seconds
HTTPX_TIMEOUT=10 # Seconds
HTTPX_UNCOMMONPORTS_TIMEOUT=10 # Seconds
PERMUTATIONS_LIMIT=21474836480 # Bytes, default is 20 GB
# lists
fuzz_wordlist=${tools}/fuzz_wordlist.txt
lfi_wordlist=${tools}/lfi_wordlist.txt
ssti_wordlist=${tools}/ssti_wordlist.txt
subs_wordlist=${tools}/subdomains.txt
subs_wordlist_big=${tools}/subdomains_n0kovo_big.txt
resolvers=${tools}/resolvers.txt
resolvers_trusted=${tools}/resolvers_trusted.txt
# Axiom Fleet
# Will not start a new fleet if one exists with the same name and size (or larger)
# AXIOM=false Uncomment only to overwrite command line flags
AXIOM_FLEET_LAUNCH=true # Enable or disable spin up a new fleet, if false it will use the current fleet with the AXIOM_FLEET_NAME prefix
AXIOM_FLEET_NAME="reconFTW" # Fleet's prefix name
AXIOM_FLEET_COUNT=10 # Number of instances in the fleet
AXIOM_FLEET_REGIONS="eu-central" # Fleet's region
AXIOM_FLEET_SHUTDOWN=true # Enable or disable deleting the fleet after execution
# This is a script on your reconftw host that might prep things your way...
#AXIOM_POST_START="~/Tools/axiom_config.sh" # Useful to send your config files to the fleet
AXIOM_EXTRA_ARGS="" # Leave empty if you don't want to add extra arguments
#AXIOM_EXTRA_ARGS=" --rm-logs" # Example
# TERM COLORS
bred='\033[1;31m'
bblue='\033[1;34m'
bgreen='\033[1;32m'
byellow='\033[1;33m'
red='\033[0;31m'
blue='\033[0;34m'
green='\033[0;32m'
yellow='\033[0;33m'
reset='\033[0m'
Check out the Usage Guide 📖 in the wiki to see which steps/attacks each flag performs.
Flag | Description |
---|---|
-d | Single Target domain (example.com) |
-l | List of targets (one per line) |
-m | Multiple domain target (companyName) |
-x | Exclude subdomains list (Out Of Scope) |
-i | Include subdomains list (In Scope) |
Flag | Description |
---|---|
-r | Recon - Full recon process (without attacks like SQLi, SSRF, XSS, SSTI, LFI, etc.) |
-s | Subdomains - Perform only subdomain enumeration, web probing, subdomain takeovers |
-p | Passive - Perform only passive steps |
-a | All - Perform whole recon and all active attacks |
-w | Web - Perform only vulnerability checks/attacks on particular target |
-n | OSINT - Performs an OSINT scan (no subdomain enumeration and attacks) |
-c | Custom - Launches specific function against target |
-h | Help - Show this help menu |
Flag | Description |
---|---|
--deep | Deep scan (Enable some slow options for deeper scan, vps intended mode) |
-f | Custom config file path |
-o | Output directory |
-v | Axiom distributed VPS |
-q | Rate limit in requests per second |
NOTE: this is applicable when you've installed reconFTW on the host (e.g. VM/VPS/cloud) and not in a Docker container.
./reconftw.sh -d target.com -r
./reconftw.sh -l sites.txt -r -o /output/directory/
./reconftw.sh -d target.com -r --deep -o /output/directory/
./reconftw.sh -m company -l domains_list.txt -r
./reconftw.sh -d target.com -r -v
./reconftw.sh -d target.com -a
./reconftw.sh -h
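The general options combine freely with the mode flags; for instance (file names and values are illustrative), a full recon with a custom config file, an out-of-scope list and a rate limit of 100 requests per second:

```bash
./reconftw.sh -d target.com -r -f custom_reconftw.cfg -x out_of_scope.txt -q 100
```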
Check out the Axiom Support section in the wiki for more info.
- As reconFTW actively hits the target with a lot of web traffic, there was a need to move to Axiom, distributing the workload among various instances and reducing execution time.
- During the configuration of Axiom you need to select reconftw as the provisioner.
- You can create your own Axiom fleet before running reconFTW, or let reconFTW create and destroy it automatically by just modifying the reconftw.cfg file (see the sketch below).
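A minimal sketch of how that looks in practice, using the Axiom variables from the config file above (fleet size and name are illustrative):

```bash
# reconftw.cfg (axiom section) - values are illustrative
AXIOM_FLEET_LAUNCH=true        # spin up a fleet automatically
AXIOM_FLEET_NAME="reconFTW"    # instance name prefix
AXIOM_FLEET_COUNT=5            # number of instances
AXIOM_FLEET_SHUTDOWN=true      # delete the fleet once the scan ends

# any mode can then be distributed over the fleet with -v
./reconftw.sh -d target.com -r -v
```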
- Domain information (whois and amass)
- Emails addresses and users (emailfinder)
- Metadata finder (MetaFinder)
- Google Dorks (dorks_hunter)
- Github Dorks (gitdorks_go)
- GitHub org analysis (enumerepo, trufflehog and gitleaks)
- Passive (amass, subfinder and github-subdomains)
- Certificate transparency (crt)
- NOERROR subdomain discovery (dnsx, more info here)
- Bruteforce (puredns)
- Permutations (Gotator, ripgen and regulator)
- JS files & Source Code Scraping (katana)
- DNS Records (dnsx)
- Google Analytics ID (AnalyticsRelationships)
- TLS handshake (tlsx)
- Recursive search (dsieve).
- Subdomain takeover (nuclei)
- DNS takeover (dnstake)
- DNS Zone Transfer (dig)
- Cloud checkers (S3Scanner and cloud_enum)
- IP info (whoisxmlapi API)
- CDN checker (ipcdn)
- WAF checker (wafw00f)
- Port Scanner (Active with nmap and passive with smap)
- Port services vulnerability checks (vulners)
- Password spraying (brutespray)
- Web Prober (httpx)
- Web screenshotting (webscreenshot or gowitness)
- Web templates scanner (nuclei and nuclei geeknik)
- CMS Scanner (CMSeeK)
- URL extraction (gau, waymore, katana, github-endpoints and JSA)
- URL patterns Search and filtering (urless, gf and gf-patterns)
- Favicon Real IP (fav-up)
- Javascript analysis (subjs, JSA, xnLinkFinder, getjswords, Mantra)
- Fuzzing (ffuf)
- URL sorting by extension
- Wordlist generation
- Passwords dictionary creation (pydictor)
- XSS (dalfox)
- Open redirect (Oralyzer)
- SSRF (headers interactsh and param values with ffuf)
- CRLF (crlfuzz)
- Cors (Corsy)
- LFI Checks (ffuf)
- SQLi Check (SQLMap and ghauri)
- SSTI (ffuf)
- SSL tests (testssl)
- Broken Links Checker (katana)
- Prototype Pollution (ppfuzz)
- Web Cache Vulnerabilities (Web-Cache-Vulnerability-Scanner)
- 4XX Bypasser (byp4xx)
- Multithreading (Interlace)
- Custom resolvers generated list (dnsvalidator)
- Docker container included and DockerHub integration
- Ansible + Terraform deployment over AWS
- Allows IP/CIDR as target
- Resume the scan from last performed step
- Custom output folder option
- All in one installer/updater script compatible with most distros
- Diff support for continuous running (cron mode; see the sketch after this list)
- Support for targets with multiple domains
- Raspberry Pi/ARM support
- 6 modes (recon, passive, subdomains, web, osint and all)
- Out of Scope Support + optional inscope support
- Notification system with Slack, Discord and Telegram (notify) and sending zipped results support
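For the cron-style continuous mode mentioned above, a sketch could look like the following (schedule and paths are assumptions; combine it with DIFF=true in reconftw.cfg so only new findings are highlighted):

```bash
# crontab entry (illustrative): full recon every Monday at 03:00, appending output to a log
0 3 * * 1 cd /opt/reconftw && ./reconftw.sh -d target.com -r >> /var/log/reconftw_target.log 2>&1
```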
Follow these simple steps to end up with a private repository with your API Keys and /Recon data.
A Makefile is provided to quickly bootstrap a private repo. To use it, you'll need the GitHub CLI installed.
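A quick way to verify the prerequisites before bootstrapping (standard GitHub CLI commands, nothing reconFTW-specific):

```bash
gh --version     # GitHub CLI must be installed
gh auth status   # ...and authenticated (run 'gh auth login' once if not)
make --version
```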
Once done, just run:
# below line is optional, the default is ~/reconftw-data
export PRIV_REPO="$HOME/reconftw-data"
make bootstrap
To sync your private repo with upstream:
make sync
To upload juicy recon data:
make upload
- Create a private blank repository on Git(Hub|Lab) (take into account size limits regarding Recon data upload)
- Clone your project: git clone https://gitlab.com/example/reconftw-data
- Get inside the cloned repository: cd reconftw-data
- Create a new branch with an empty commit: git commit --allow-empty -m "Empty commit"
- Add the official repo as a new remote: git remote add upstream https://github.com/six2dez/reconftw (upstream is an example)
- Update upstream's repo: git fetch upstream
- Rebase current branch with the official one: git rebase upstream/main master
- Upload changes to your personal repo: git add . && git commit -m "Data upload" && git push origin master
- Update tool anytime: git fetch upstream && git rebase upstream/main master
If you want to contribute to this project, you can do it in multiple ways:
- Submitting an issue because you have found a bug or you have any suggestion or request.
- Making a Pull Request from the dev branch because you want to improve the code or add something to the script.
- Take a look at the wiki section.
- Check FAQ for commonly asked questions.
- Join our Discord server
- Ask for help in the Telegram group
Thank you for lending a helping hand towards the development of the project!
Usage of this program for attacking targets without consent is illegal. It is the user's responsibility to obey all applicable laws. The developer assumes no liability and is not responsible for any misuse or damage caused by this program. Please use responsibly.
The material contained in this repository is licensed under MIT.