Audit Your Web Applications for Better Security

Free tools help you identify Web application vulnerabilities

You build your Web server, installing all the latest patches and using available guidelines to lock down the system. But is your Web server really secure? To find out, regularly inspect your Web applications for potential vulnerabilities.

One way intruders might penetrate your defenses is through Web application misconfiguration. Another way is through Web application vulnerabilities. I'll discuss how you can prevent some vulnerabilities from the start and how you can use several free tools to collect vulnerability information about your Web site.

Guidelines for Web App Installation and Configuration
Anytime you obtain a software package—whether it's a component, application, or underlying subsystem—following some simple guidelines can make your job much easier over time. Begin by scouring the provider's Web site for all the information you can find about installation and configuration. Read the configuration information especially closely because configuration oversights often create vulnerabilities. Keep in mind that configuration settings and associated potential exploits for many components vary by Web server platform (e.g., Microsoft IIS, Apache).

In addition, search the Internet for relevant security information. For example, many people use PHP as an underlying script engine for Web application development. But not all PHP users are aware of the Hardened PHP Project (http://www.hardened-php.net), which provides a set of patches for PHP that makes the engine far more secure. The Internet offers many similar security information gems. To find hardening information for a product, use search phrases that include the product name with the words hardening, hardened, and secure.
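
If PHP is part of your environment, configuration hardening goes hand in hand with such patches. The php.ini directives below are only an illustration of the kind of settings to review; they're general PHP hardening options, not part of the Hardened PHP patch set, and the directives available depend on your PHP version:

; Illustrative php.ini hardening directives -- verify against your PHP version
display_errors = Off      ; don't leak error details to visitors
expose_php = Off          ; omit the PHP version from HTTP headers
allow_url_fopen = Off     ; block opening remote URLs as files, a common inclusion vector
register_globals = Off    ; don't auto-create variables from request data
disable_functions = exec,passthru,shell_exec,system
open_basedir = /var/www/html   ; example path; restricts file access to the Web root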

You should also inspect your Web server and Web site directory mappings and look at what those directories contain. You might have installed software packages or components that aren't yet in use on your Web site but that someone could still access by entering the right URL. You might also want to perform a back-end audit to ensure the security of directories that can provide privileged access to the entire file system and registry. (The tools I discuss in this article might discover such packages or components, but performing a direct inventory and audit ensures that nothing is overlooked.)
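
One simple way to start such an inventory is with a short script that walks the Web root and lists every directory and file, so you can compare the output against what your site actually links to. The following Python sketch does just that; the Web root path is only an example, and the script reports what it finds rather than judging whether anything belongs there:

import os

# Example Web root -- adjust for your server
# (e.g., C:\Inetpub\wwwroot on IIS or /var/www/html on Apache)
WEB_ROOT = r"C:\Inetpub\wwwroot"

for dirpath, dirnames, filenames in os.walk(WEB_ROOT):
    print(dirpath)                                        # each directory under the Web root
    for name in filenames:
        print("    " + os.path.join(dirpath, name))       # every file it contains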

Scan Your Web Server
You should scan your Web server and its applications regularly. The interval between scans will depend on your server and its use, but I recommend that you perform a scan at least quarterly.

To perform the scan, you can use tools that probe your Web server and applications directly, including NT OBJECTives' NTOinsight and NTOweb (http://www.ntobjectives.com/freeware) and screamingCobra (http://samy.pl/scobra). You can also use search engines to discover potential problem areas. Tools that use search engines to discover Web site vulnerabilities include Massive Enumeration Tool (MET—http://www.gnucitizen.org) and Foundstone's SiteDigger (http://foundstone.com/resources/s3i_tools.htm).

One reason to use search engines regularly for security scanning is that they index the linked content on your Web site. Some search engines, such as Google, also maintain a history and cache of your Web pages, which can be valuable. For example, you might have a script in use today that you unlink from your Web pages tomorrow. Search engines can reveal whether you deleted both the script and its directory path. (Note: Keep in mind that it takes some time for a search engine's spider to find your Web site and index its content. So if you're launching a new site, you can expedite that process by submitting your URL directly to search engines. You can find a link for submission on each search engine's site.)

Search engines are also a good way to locate vulnerabilities because they can look for URLs that vulnerable applications are known to use. For example, search engines can look at the URL suffix (the part of the URL that follows the Fully Qualified Domain Name—FQDN) to find sites with exposed administration screens. Intruders routinely use search engines to find such information; you should use them to discover in advance what intruders might find on your Web server and address any potential problems.
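
For example, combining a search engine's site: operator with operators such as inurl: or intitle: restricts results to your own domain while looking for URL patterns that often indicate administration pages or exposed directory listings. The queries below are only illustrations; substitute your own domain:

site:mysite.dom inurl:admin
site:mysite.dom inurl:login
site:mysite.dom intitle:"index of"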

SensePost's Wikto (http://www.sensepost.com/research/wikto) is a third kind of tool—a hybrid that performs both direct and search engine scans.

All of these tools are valuable, but keep in mind that their effectiveness varies depending on your particular OS, Web server platform, and associated software packages and components. In general, however, you can probably use any of these tools to scan your platform because so many current Web applications are written in scripting languages (e.g., Perl, PHP, VBScript) that can run on nearly any Web platform in use today. That said, let's look at these different kinds of tools.

Direct Scanning
NT OBJECTives, a company that has long provided excellent security tools to the Internet community, offers the free tools NTOinsight and NTOweb. NTOinsight is a Web server analyzer capable of crawling any number of servers to identify all linked resources. NTOweb is a plug-in for NTOinsight that works in conjunction with the Nikto database, which is a publicly available database of more than 3000 URL suffixes known to be associated with vulnerable applications. Together, these two tools provide an intense audit of your Web servers, whether the servers are based on Microsoft IIS, Apache, or another platform.

NTOinsight and NTOweb are relatively simple-to-use command-line tools. After you install them, run NTOweb first without any command-line options to download the latest Nikto database. You're then ready to use NTOinsight.

NTOinsight provides a long list of possible command-line options, including the ability to scan multiple servers, to use NT LAN Manager (NTLM) authentication, to scan proxy servers, to define maximum page requests as well as the intervals between page requests, and to examine domain and server traversal behavior. The command below is a reasonable one to use for your first test of the tool. The -h parameter specifies the host to scan; -sn is the name of the scan and of the directory for the resulting report; -ntoweb tells NTOinsight to perform checks against the Nikto database by using the NTOweb plug-in; and -delay is the delay between requests (in milliseconds).

ntoinsight -h www.myhost.nul 
  -sn myhost -ntoweb -delay 1000 

The scan process for NTOinsight (as well as for all the tools I discuss in this article) can take from a few minutes to several hours (to a day or more!) depending on how many Web pages your scanned servers contain. You can monitor progress as the scan takes place, as Figure 1 shows.

After the scan is finished, you can view a graphical report. By default, reports are stored in the \Reports\Insight\date\ directory, where date is the date of the scan. From the main report, you can link to more-detailed reports about discovered resources and potential vulnerabilities.

If you like NTOinsight, you might also like NT OBJECTives' NTOSpider, a commercial Web application vulnerability scanner. NTOSpider goes well beyond the abilities of NTOinsight to perform scans for a wide range of potential problems related to Secure Sockets Layer (SSL), Java, proxy servers, user sessions, and more. In addition, the tool can test your applications against SQL injection and cross-site scripting attacks.

Originating years ago at DEFCON 5, screamingCobra is a simple Perl script developed as part of a challenge to create an algorithm that detects script vulnerabilities. Despite its age, screamingCobra offers reasonable value. The script has four basic command-line options: show a status bar (-s), don't ignore any files (-i), use extra techniques to discover bugs (-e), and verbose mode (-v). Of course, you also need to specify the Web site address to scan. I recommend that you run the script with the -i, -e, and -v options enabled, as the following command shows:

perl screamingcobra.pl -e -i -v 
  http://www.mysite.nul 

When it runs, the script moves from page to page, gathering URLs from the Web site. When it encounters a URL that might be vulnerable, screamingCobra prints a message prefaced by "BUG FOUND". You can use the following command to gather screamingCobra's output and redirect it to a report.txt file:

perl screamingcobra.pl -e -i -v 
  http://www.mysite.nul > 
  report.txt 

If you have the grep utility installed, you can pipe screamingCobra's output through it to save only the reported bugs to a text file:

perl screamingcobra.pl -e -i -v 
  http://www.mysite.nul | grep 
  "BUG FOUND" > report.txt 

Overall, screamingCobra is an effective way to test application security, particularly if you use Linux as a Web platform. (Most of the checks screamingCobra performs relate to weaknesses commonly found in applications that run on Linux.) However, you should use it in addition to the other tools discussed in this article.

Using Search Engines
MET works in conjunction with Google's API to query Google for links to your site that might represent possible vulnerabilities. You can use MET to construct specialized queries, or you can work with the Google Hacking Database (GHDB—http://johnny.ihackstuff.com), another publicly available database of URL suffixes known to be related to vulnerable Web applications. For simplicity, I'll show you how to use MET with GHDB.

Note: Google's API (http://www.google.com/apis/index.html) lets you use tools such as MET that are designed to perform a large number of queries in Google's databases. To use the API, you must obtain a license from Google, a simple process that involves filling out a form and waiting for your API key to arrive by email. The API key lets you perform up to 1000 automated queries per day. (If you try to perform a large number of successive queries to Google directly without the API key, Google will eventually detect that activity and block your access for some period of time.)
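
If you end up scripting your own batches of queries against the API, it pays to build in pacing from the start. The Python sketch below shows the general idea—capping a run at the daily limit and pausing between requests; run_query is a hypothetical placeholder for whatever call your tool actually makes:

import time

DAILY_LIMIT = 1000    # the Google API ceiling described above
DELAY_SECONDS = 2     # pause between queries to avoid bursts

def run_query(query):
    # Hypothetical placeholder: call your API client or search tool here
    print("querying:", query)

queries = ['site:mysite.dom inurl:admin',
           'site:mysite.dom intitle:"index of"']

for query in queries[:DAILY_LIMIT]:
    run_query(query)
    time.sleep(DELAY_SECONDS)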

Google hacking permits querying for specifics about a Web site without actually sending traffic to that Web site. This capability makes Google hacking a double-edged sword: Although attackers can probe your Web site by stealth, you can do the same to find potential vulnerabilities and unwanted exposure of information. Because Google does all the Web crawling, you must query Google to discover what its databases contain about your sites.

MET is written in the Python scripting language; to use MET, you'll need a copy of Python, which you can download at http://python.org. Install Python, then MET, then download a copy of GHDB and put it in a subdirectory of your Python directory. For example, extract GHDB to the folder python\etc (a folder that exists after you install Python). With that done, you're ready to go. If you used the MSI Installer version of MET, you'll find its script installed in the python\scripts directory, and you can use the following sample command to try it:

python scripts\google ghdb -v 
  --database=etc\ghdb.xml 
  --key=XYZ --output=mysite.txt 
 'site:mysite.dom' 

The -v option instructs MET to run in verbose mode, which lets you monitor its progress. Replace XYZ with your Google API key. The last parameter, 'site:mysite.dom', is the query filter. Be sure to leave that item as the last parameter on the command line, and change mysite.dom to your domain name. The command causes MET to try to find each URL listed in GHDB on your Web site, as Figure 2 shows, and write the results to the mysite.txt output file.

Note that because the Google API limits you to 1000 queries per day, you'll need to pace your queries. Here's why: at the time of this writing, the version of GHDB at http://johnny.ihackstuff.com contains more than 1100 queries. If you use this version, you'll need to split the file in two and run the resulting files on different days. If you do split the file, pay special attention to the XML format: each file needs a correct set of headers for the process to work.
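
If you'd rather not split the file by hand, a short script can do it while keeping each half well formed. The Python sketch below assumes only that the database is a flat XML file whose root element contains one child element per query (the actual tag names in GHDB may differ), and it writes the halves to the hypothetical file names ghdb1.xml and ghdb2.xml:

import copy
import xml.etree.ElementTree as ET

SOURCE = r"etc\ghdb.xml"    # the path used in the MET example above

tree = ET.parse(SOURCE)
root = tree.getroot()
entries = list(root)        # assumes one child element per query under the root
midpoint = len(entries) // 2

for filename, chunk in (("ghdb1.xml", entries[:midpoint]),
                        ("ghdb2.xml", entries[midpoint:])):
    half_root = ET.Element(root.tag, root.attrib)            # keep the original root element
    half_root.extend(copy.deepcopy(entry) for entry in chunk)
    ET.ElementTree(half_root).write(filename, xml_declaration=True,
                                    encoding="utf-8")        # each half gets its own XML header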

SiteDigger 2.0 is another tool that queries Google to discover possible vulnerable applications on your Web site. The tool comes with 175 custom URL query strings, and it can also use GHDB. To get started, you'll need your Google API key, which you must enter into the license key box in SiteDigger's GUI.

Before you start your queries, use the Update menu to update the Foundstone and GHDB databases. Then, click the Options tab and select the Foundstone database or GHDB. In either case, you select and clear check boxes to choose which queries you want to perform. Select the Search tab to enter your Web site address and click Search.

After the search process is finished, you can click Export Results on the Search tab to export the results. SiteDigger produces a clear HTML-based report, as Figure 3 shows. Although the SiteDigger report isn't as detailed as the report that NTOinsight produces, it presents a useful summary of possible problems, explanations, and links to suspect URLs on your site. (Foundstone also provides vulnerability assessment services based on term agreements or on demand.)

A Hybrid Tool
Because it offers more features, Wikto is a bit more complicated to use. Wikto crawls your Web sites, makes copies of them on the local disk, queries Google (using the Google API) to find other possible URLs linked in your sites, and works with the Nikto database as well as with GHDB. The Nikto database queries your Web servers directly, whereas GHDB queries Google for URL patterns related to your Web servers.

Although Wikto is essentially an all-in-one scanner and analyzer, reporting requires exporting the data to comma-separated value (CSV) format and then importing it into a spreadsheet or database for report generation and further analysis.

Before you can download Wikto, you must sign up for a free account at the SensePost Web site. A link to register is near the top of the Wikto Web page. After you register, your ID and password are emailed to the address you provide.

To use Wikto, you also need two third-party tools—HTTrack (http://www.httrack.com) and httprint (http://net-square.com/httprint)—both of which are free. HTTrack creates complete local copies of Web sites, and httprint identifies the Web server software by fingerprinting its HTTP responses. Wikto uses these two tools to refine its scanning process.

After you install all three tools, configure Wikto and update its databases. Begin by clicking the SystemConfig tab to ensure that the path names for HTTrack, httprint, and the Nikto and GHDB databases are correct. You also need to enter your Google API key in the appropriate field. Click Save to save your configuration to disk. Next, click Update NiktoDB and follow the prompts. After that download is complete, click Update GHDB and again follow the prompts. At this point, you're ready to start an audit of your Web site.

The first step in the Web site audit process is to create a mirror of your Web site. Wikto uses the mirror to determine embedded links and directory paths that might not show up otherwise. Click the Mirror & Fingerprint tab, enter your Web site's home page address in the Target box, and click Start. Wikto will use HTTrack to make a complete copy of your Web site on the local system. When the mirror process is complete, you'll see a list of discovered directories in the Directories Mined box on the right-hand side of the display.

The second step is to go to the Googler tab, make sure the Site/Domain and Google Keyword fields contain the URL for your site's home page, and click Start. Doing so sends a series of queries to Google that look for various file types identified as existing on your Web site. This process helps reveal additional directory paths on your Web site; when it finishes, you might find that it has revealed far more directories than the first step did.

Next, go to the BackEnd tab. Click Import from Google and Import from Mirror to import the data gathered in step 1 and step 2. Below Update from SensePost, select Full from the dropdown menu, then click Update from SensePost. Doing so updates the full list of directories by importing that data from the SensePost Web site. Make sure you have your Web site home page in the IP/DNS name field, then click Start Mining. This step could take a long time to finish, depending on the number of directories and files, so be patient. It will reveal any directories and files that users can access. If this step exposes any administrative interfaces or other sensitive areas of the site, make sure you secure them with some authentication method.
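
For example, on an Apache server, a handful of directives like the following can put HTTP Basic authentication in front of such a directory (the paths and realm name are placeholders, and you should pair Basic authentication with SSL so credentials aren't sent in the clear); IIS offers comparable directory-level authentication settings:

<Directory "/var/www/html/admin">
    AuthType Basic
    AuthName "Restricted Area"
    AuthUserFile /etc/httpd/.htpasswd
    Require valid-user
</Directory>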

After the process is finished, you can move on to the Wikto tab and the GoogleHacks tab, both of which are relatively self-explanatory. Make sure you have your Web site URL entered correctly, click Load to load each respective database, then click Start. Results are shown in the window on each tab.

After you finish the scanning, you can review the results and export them to CSV format, which you can then import into a spreadsheet or database for further analysis and report generation, as I mentioned previously. (In addition to Wikto, SensePost also provides a subscription-based online scanning service.)
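
As a simple example of that follow-up step, the Python sketch below loads an exported CSV file, prints the column names, and previews the first few rows; the file name and the assumption that the first row contains headers are placeholders you'd adjust to match Wikto's actual export:

import csv

EXPORT_FILE = "wikto_results.csv"   # hypothetical name for the exported CSV file

with open(EXPORT_FILE, newline="") as f:
    rows = list(csv.reader(f))

header, data = rows[0], rows[1:]    # assumes the first row holds column names
print("Columns:", ", ".join(header))
print("Result rows:", len(data))
for row in data[:5]:                # preview the first few findings
    print(row)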

Scanning Your Web Apps for Security
I've discussed several useful tools and described how to use them to assess the security of your Web-based applications. By using tools that scan your servers directly and tools that scan Internet search engines, you can find areas of vulnerability and remediate them to stay a step ahead of the average intruder. Best of all, because the tools I've discussed are free, you can start using them immediately and continue using them regularly!
