B. Vulnerability testing strategy
Introduction
Now that we know what nuclei is and how it works, we can start building an automated strategy that scales up our passive bug bounty scanning while we move on to the aquatone chapter and execute our manual testing strategy.
I would highly recommend having at least two VPS servers, as this makes it easier to keep the two functionalities separated. Later on we will need to grab the result files, so configuration is easier if we can dedicate one IP address to one function. All of this is just a suggestion, though; I know some of you are wizards with bash, so by all means write a much better script than the one we are about to outline.
Finding subdomains
Some of you have already told me that you wrote your own scripts to chain all the possible subdomain finder tools. That is exactly the right approach here; the only thing we might add is parallelising the requests instead of running them sequentially, for some extra speed.
amass enum -d google.com >> domains-google.com.txt &
python3 dnsrecon.py -d google.com -D 5000.txt -t brt >> domains-google.com.txt &
...
We can then call this script on the VM using a command like
chmod +x chainScript.sh
sh chainScript.sh
If we then parameterise the domain, we can execute this script from a cronjob that checks for new subdomains every day, for example.
amass enum -d $1 >> domains-$1.txt &
python3 dnsrecon.py -d $1 -D 5000.txt -t brt >> domains-$1.txt &
...
The $1 refers to the first parameter, and the & behind each command sends it to the background so that all the tools keep running at the same time. If we want to run the script manually and keep it running even after closing our session, we can add nohup in front of the command that runs the sh script. A complete sketch of chainScript.sh follows the list below.
nohup sh chainScript.sh google.com &
- nohup will keep the command running, even if we end the session
- google.com is the first parameter, so it will be filled in wherever we put $1
- & will send the command to the background
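Putting those pieces together, a chainScript.sh could look roughly like the sketch below. The two tools shown are just the examples from above and 5000.txt is whatever wordlist you feed dnsrecon; add your other subdomain finders in the same way. The wait at the end makes the script finish only once all the background jobs are done, which matters once we hook it up to cron and to the deduplication step.
#!/bin/bash
# chainScript.sh - run all subdomain finders in parallel for one domain ($1)
amass enum -d $1 >> domains-$1.txt &
python3 dnsrecon.py -d $1 -D 5000.txt -t brt >> domains-$1.txt &
# ... add your other tools here, each with a trailing &
wait   # block until every background job has finished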
Removing duplicates from the list of subdomains
sort -u domains-google.com.txt -o domains-google.com.txt
Since we are running a lot of tools, they will find a lot of duplicate subdomains, and we don't want to run useless scans, so we have to remove them.
- -u flag, removes duplicate entries
- -o flag, writes the sorted result back to the same file instead of printing it to the screen
Checking if new domains exist
To round out our workflow we need to create a worker that checks the list of subdomains every now and again to see whether there are new ones, because certain actions need to be started when there are, and I love automation when it is implemented properly.
To do this I will write a small shell script that makes an MD5 hash of the list of subdomains and stores it in a file.
newSum=`cat domains-google.com.txt | md5sum`
oldSum=`cat oldSum.txt`
echo "$newSum"
if newSum != oldSum {
    StartNuclei(Templates);
    WriteNewHashToFile(oldSum.txt, newSum);
}
Please note this is pseudocode to make things a bit easier to explain; it's the design that counts here, not pretty code, which is easy enough to find on Google. A runnable bash sketch follows the list below.
- We are reading the value of the MD5 hash of the content of the subdomain list
- We are comparing that value to the previous MD5 hash
- If the hashes do not match, there's a new domain and nuclei springs into action to scan it with all its existing templates
- Finally, we write the new hash to the file so the next run compares against it
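As a concrete example, here is a minimal bash sketch of that worker. The file names and the nuclei flags I use (-l for the target list, -t for the template directory, -o for the output file) are assumptions on my part, so adapt them to your own setup.
#!/bin/bash
# Hypothetical worker: re-run nuclei whenever the subdomain list changes.
newSum=$(md5sum domains-google.com.txt)
oldSum=$(cat oldSum.txt 2>/dev/null)
if [ "$newSum" != "$oldSum" ]; then
    # scan every subdomain with every template we currently have
    nuclei -l domains-google.com.txt -t templates/ -o nuclei-results.txt
    # remember this version of the list for the next run
    echo "$newSum" > oldSum.txt
fi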
Checking if new templates exist
We also need a similar worker that checks whether new nuclei templates have been created. If a new template exists, nuclei needs to be started again.
oldSum=`cat oldSum.txt`
newSum=`ls templates | md5sum`
lastScanDate=`cat lastScanDate.txt`
echo "$newSum"
if newSum != oldSum {
    StartNuclei(getTemplateDate(), lastScanDate);
    WriteNewHashToFile(oldSum.txt, newSum);
    updateLastScanDate();
}
The biggest difference here is that we hash the output of an 'ls' command and not a 'cat' command, because nuclei does not keep everything in one file; there are a ton of tiny template files instead.
We also only run the NEW templates in this instance, not all of them, since there is not per se a new subdomain to scan. A runnable sketch of this worker follows below.
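Here is a minimal bash sketch of that second worker, under the assumption that your templates live in a local templates/ directory and that your nuclei version accepts a comma-separated list for -t. The state file names (templatesSum.txt, lastScanDate.txt) are made up for the example.
#!/bin/bash
# Hypothetical worker: only scan with templates added or changed since the last run.
[ -f lastScanDate.txt ] || touch -d "1970-01-01" lastScanDate.txt
newSum=$(ls templates | md5sum)
oldSum=$(cat templatesSum.txt 2>/dev/null)
if [ "$newSum" != "$oldSum" ]; then
    # collect the templates that are newer than the last scan marker
    newTemplates=$(find templates -name "*.yaml" -newer lastScanDate.txt | paste -sd, -)
    if [ -n "$newTemplates" ]; then
        nuclei -l domains-google.com.txt -t "$newTemplates" -o nuclei-new-templates.txt
    fi
    # store the new state for the next run
    echo "$newSum" > templatesSum.txt
    touch lastScanDate.txt
fi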
Uncle Rat's Settings
I always put these scripts in a cronjob that runs every hour, but depending on the complexity of the task or the size of the list of subdomains you have gathered, you may want to run the scripts less frequently. I gave you mostly pseudocode for these scripts because I want you to build them yourselves. How you scan with nuclei will depend on what you want to do. Will you use all templates? Some? Will you use tags? Will you use severity? It is all up to you how you do this. You have the building blocks, now go play with the lego.
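If you go the cron route too, the crontab could look something like the lines below. The paths, times and the worker script names (checkNewSubdomains.sh and checkNewTemplates.sh) are placeholders for the scripts you will write yourself.
# open the crontab editor with: crontab -e
# run the subdomain chain once a day at 03:00
0 3 * * * /bin/sh /root/chainScript.sh google.com >> /root/chain.log 2>&1
# run the two workers every hour
0 * * * * /bin/sh /root/checkNewSubdomains.sh >> /root/workers.log 2>&1
30 * * * * /bin/sh /root/checkNewTemplates.sh >> /root/workers.log 2>&1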