Script http-robots.txt
Script types:
portrule
Categories:
default, discovery, safe
Download: https://svn.nmap.org/nmap/scripts/http-robots.txt.nse
Script Summary
Checks for disallowed entries in /robots.txt on a web server.
The higher the verbosity or debug level, the more disallowed entries are shown.
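The script's core job is to fetch /robots.txt and collect the unique paths named in "Disallow" rules. The following is a minimal sketch of that parsing step in Python (the script itself is written in Lua for the NSE); the sample robots.txt body is illustrative, not output from a real server.

```python
def disallowed_entries(body):
    """Return the unique Disallow paths found in a robots.txt body, in order."""
    seen, entries = set(), []
    for line in body.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path and path not in seen:         # skip blank and duplicate rules
                seen.add(path)
                entries.append(path)
    return entries

# Illustrative robots.txt body (hypothetical, for demonstration only)
sample = """\
User-agent: *
Disallow: /search
Disallow: /groups
Disallow: /images   # trailing comment
Disallow: /search
Disallow:
"""

print(disallowed_entries(sample))  # ['/search', '/groups', '/images']
```

Like the script's own output, a caller would report the total count of entries and show only a prefix of the list unless verbosity is raised.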
Script Arguments
- slaxml.debug
See the documentation for the slaxml library.
- http.host, http.max-body-size, http.max-cache-size, http.max-pipeline, http.pipeline, http.truncated-ok, http.useragent
See the documentation for the http library.
- smbdomain, smbhash, smbnoguest, smbpassword, smbtype, smbusername
See the documentation for the smbauth library.
Example Usage
nmap -sV -sC <target>
Script Output
80/tcp  open  http  syn-ack
| http-robots.txt: 156 disallowed entries (40 shown)
| /news?output=xhtml& /search /groups /images /catalogs
| /catalogues /news /nwshp /news?btcid=*& /news?btaid=*&
| /setnewsprefs? /index.html? /? /addurl/image? /pagead/ /relpage/
| /relcontent /sorry/ /imgres /keyword/ /u/ /univ/ /cobrand /custom
| /advanced_group_search /googlesite /preferences /setprefs /swr /url /default
| /m? /m/? /m/lcb /m/news? /m/setnewsprefs? /m/search? /wml?
|_ /wml/? /wml/search?
Requires:
Author:
License: Same as Nmap--See https://nmap.org/book/man-legal.html