This article provides an in-depth tutorial on extracting server names from a website, a technique useful for information gathering during penetration testing. The environment required for this exercise is:
BackTrack operating system
The following steps find the server names and IP addresses exposed by a website:
Step 1:- Open a terminal in BackTrack and type wget http://www.victimpage.com (this downloads the target's index page)
Step 2:- Extract the link information from the downloaded index.html.2 file by typing grep "href=" index.html.2 (this prints every line of index.html.2 that contains href=)
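As a quick sanity check, here is what the filter does on a minimal, made-up HTML fragment (the file contents below are an assumption for illustration, not taken from any real site):

```shell
# Create a tiny stand-in for index.html.2 (contents are illustrative only)
printf '<a href="http://mail.example.com/">Mail</a>\n<p>plain text, no link</p>\n' > index.html.2

# Only the line containing href= survives the filter
grep "href=" index.html.2
# prints: <a href="http://mail.example.com/">Mail</a>
```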
Step 3:- The output needs further filtering, which can be done by typing the following command: grep "href=" index.html.2 | cut -d "/" -f3 | sort -u (this extracts just the hostnames the website links to, sorted and deduplicated)
cut -d "/" -f3 –> This splits each line on "/" and prints only the third field; for a URL such as http://host/path, that third field is the hostname
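The field numbering is easiest to see on a sample URL. Splitting on "/" leaves an empty second field between the two slashes of http://, so the hostname lands in field 3 (the URL below is a placeholder):

```shell
# Fields when splitting on "/":  f1="http:"  f2="" (empty)  f3="www.example.com"  f4="page.html"
echo "http://www.example.com/page.html" | cut -d "/" -f3
# prints: www.example.com
```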
Step 4:- Save the hostnames to a text file by typing grep "href=" index.html.2 | cut -d "/" -f3 | sort -u > sanket.txt (this writes the list extracted in Step 3 to sanket.txt)
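sort -u sorts the hostnames and drops duplicates, which matters because the same host is usually linked many times on a single page. A small made-up example:

```shell
# Duplicate hostnames collapse to a single sorted entry each
printf "b.example.com\na.example.com\nb.example.com\n" | sort -u
# prints:
# a.example.com
# b.example.com
```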
Step 5:- Create a bash script (sanket.sh) that resolves each hostname extracted from the website to an IP address:
#!/bin/bash
for hostname in $(cat sanket.txt); do
host $hostname | grep "has address"
done
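The loop can be dry-run without DNS by substituting a stub function for the host command; the hostnames and the 192.0.2.1 address (from the RFC 5737 documentation-only range) below are placeholders:

```shell
# Stub that mimics host's "has address" output (stand-in for: host $hostname)
lookup() { echo "$1 has address 192.0.2.1"; }

# A throwaway hostname list, same shape as the one built in Step 4
printf "a.example.com\nb.example.com\n" > sanket.txt

# Same loop shape as sanket.sh, but using the stub instead of real DNS
for hostname in $(cat sanket.txt); do
  lookup $hostname | grep "has address"
done
# prints:
# a.example.com has address 192.0.2.1
# b.example.com has address 192.0.2.1
```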
Step 6:- Make the script executable before running it: chmod 775 sanket.sh
Step 7:- Execute the bash script, saving its output to a text file:
./sanket.sh > sanket_ips.txt
Step 8:- The final step is to extract the IP addresses by typing cat sanket_ips.txt | cut -d " " -f4 | sort -u
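To see why field 4 is the IP, look at the shape of a single host output line (the address below is again a documentation-range stand-in, not a real result):

```shell
# Fields split on spaces: f1=hostname  f2="has"  f3="address"  f4=the IP
echo "www.example.com has address 192.0.2.10" | cut -d " " -f4
# prints: 192.0.2.10
```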