Description
Is your feature request related to a problem? Please describe.
Hello.
What about unique results, deduplicated by status code + word count?
It's tedious to scroll through hundreds of identical rows in the results.
Calibrating against the 404 response is not an effective technique: depending on the wordlist, a WAF can kick in at any moment and produce hundreds of lines of output. If I filter to status 200 only, then applications with wildcard routing still yield many identical resources.
Status code + word count identifies a resource quite accurately.
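For illustration, the deduplication described here can be sketched as a small shell filter that keeps only the first line per (status code, word count) pair. The column positions are an assumption for this sketch (field 1 = status code, field 4 = word count, as in a line like "200 GET 10l 30w 456c http://site.com/a"); adjust them to the real output format:

```shell
# Keep only the first result per (status code, word count) pair.
# Assumed layout: field 1 = status code, field 4 = word count.
dedup_by_code_words() {
  awk '!seen[$1, $4]++'
}

# Example: three hits, two of which share (200, 30w) - the second is dropped.
printf '%s\n' \
  '200 GET 10l 30w 456c http://site.com/a' \
  '200 GET 12l 30w 470c http://site.com/b' \
  '404 GET 1l 3w 12c http://site.com/c' \
  | dedup_by_code_words
```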
Thank you!
Describe the solution you'd like
feroxbuster -u https://target.com/ -w dic.txt --uniq
and nothing more - no feroxbuster | sort | uniq -c | sort -n -r pipelines.
Describe alternatives you've considered
dirsearch - no
ffuf - no
gobuster - no
cat dic.txt | wfuzz -c --filter 'w|u()' -z stdin http://site.com/FUZZ
and a very "elegant" solution:
ffuf -w dic.txt -ac -u http://site.com/FUZZ -o results.csv -of csv
cat results.csv | grep -v 'FUZZ,url' | cut -d ',' -f 5,6,7 | tr ',' ' ' | sort | uniq -c | sort -n | while read count code len words; do
  cat results.csv | grep ",$code," | grep ",$words," | head -n 1 | cut -d ',' -f 1 | while read url; do
    echo "[+] $url"
  done
done
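The loop above rescans the CSV once per (code, words) group. Assuming the column layout implied by the cut calls in it (column 1 = fuzzed word, column 5 = status code, column 7 = word count), the same "first hit per group" result can be produced in a single awk pass:

```shell
# Sample rows in the assumed ffuf CSV layout (header line, then data).
cat > results.csv <<'EOF'
FUZZ,url,redirectlocation,position,status_code,content_length,content_words
admin,http://site.com/admin,,1,200,456,30
login,http://site.com/login,,2,200,470,30
api,http://site.com/api,,3,404,12,3
EOF

# One-pass equivalent of the loop: skip the header (NR > 1) and print
# the first entry per (status_code, content_words) pair.
awk -F',' 'NR > 1 && !seen[$5, $7]++ { print "[+] " $1 }' results.csv
# prints:
# [+] admin
# [+] api
```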
easy peasy
Additional context
I don't know why no tool has this option - maybe no one has run into WAFs or wildcard routing?