Script http-backup-finder
Script types:
portrule
Categories:
discovery, safe
Download: https://svn.nmap.org/nmap/scripts/http-backup-finder.nse
Script Summary
Spiders a website and attempts to identify backup copies of discovered files. It does so by requesting a number of different permutations of each filename (e.g. index.bak, index.html~, Copy of index.html).
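The exact set of permutations is internal to the script; the Lua sketch below only illustrates the general idea. The suffixes, prefixes, and the backup_candidates helper are assumptions for illustration, not the script's actual list or code.

-- Illustrative only: derive candidate backup names from a discovered filename.
-- The transformations below are assumed examples, not the script's exact set.
local function backup_candidates(filename)
  local candidates = {
    filename .. ".bak",       -- index.html.bak
    filename .. "~",          -- index.html~ (editor backup)
    "Copy of " .. filename,   -- Copy of index.html
  }
  -- also try replacing the extension, e.g. index.html -> index.bak
  local base = filename:match("^(.*)%.[^.]+$")
  if base then
    table.insert(candidates, base .. ".bak")
  end
  return candidates
end

for _, name in ipairs(backup_candidates("index.html")) do
  print(name)
end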
Script Arguments
- http-backup-finder.maxpagecount
the maximum number of pages to visit. A negative value disables the limit. (default: 20)
- http-backup-finder.withindomain
only spider URLs within the same domain. This widens the scope from withinhost and cannot be used in combination with it. (default: false)
- http-backup-finder.maxdepth
the maximum number of directories beneath the initial URL to spider. A negative value disables the limit. (default: 3)
- http-backup-finder.url
the URL to start spidering. This is a URL relative to the scanned host, e.g. /default.html (default: /)
- http-backup-finder.withinhost
only spider URLs within the same host. (default: true)
- slaxml.debug
See the documentation for the slaxml library.
- httpspider.doscraping, httpspider.maxdepth, httpspider.maxpagecount, httpspider.noblacklist, httpspider.url, httpspider.useheadfornonwebfiles, httpspider.withindomain, httpspider.withinhost
See the documentation for the httpspider library.
- smbdomain, smbhash, smbnoguest, smbpassword, smbtype, smbusername
See the documentation for the smbauth library.
- http.host, http.max-body-size, http.max-cache-size, http.max-pipeline, http.pipeline, http.truncated-ok, http.useragent
See the documentation for the http library.
Example Usage
nmap --script=http-backup-finder <target>
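Script arguments can be supplied with --script-args; the argument names come from the list above, while the values shown here are only illustrative:

nmap -p 80 --script http-backup-finder --script-args http-backup-finder.maxpagecount=40,http-backup-finder.url=/app/ <target>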
Script Output
PORT   STATE SERVICE REASON
80/tcp open  http    syn-ack
| http-backup-finder:
| Spidering limited to: maxdepth=3; maxpagecount=20; withindomain=example.com
|   http://example.com/index.bak
|   http://example.com/login.php~
|   http://example.com/index.php~
|_  http://example.com/help.bak
Requires
Author:
License: Same as Nmap--See https://nmap.org/book/man-legal.html