
Google's Ratproxy Web Security Auditing Tool

Last week I wrote about three tools you can use to help find security problems in your Web sites. Those tools include Microsoft's UrlScan (which has been available for quite some time), as well as two relatively new tools: the Microsoft Source Code Analyzer for SQL Injection tool and HP's Scrawlr. If you missed that article, you can read it on our Web site at the URL below:
http://windowsitpro.com/article/articleid/99683

Shortly after Microsoft and HP announced their new tool offerings, Google coincidentally also announced a new security tool called Ratproxy. According to Google, the tool is "a semi-automated, largely passive web application security audit tool." The company added that the tool is "meant to complement active crawlers and manual proxies more commonly used for this task, and is optimized specifically for an accurate and sensitive detection, and automatic annotation, of potential problems and security-relevant design patterns based on the observation of existing, user-initiated traffic in complex web 2.0 environments."

Ratproxy is a Web proxy server that you run your Web browser traffic through. The tool inspects Web traffic, gathers information, and logs its findings. Ratproxy can also test for various detrimental conditions, as well as replay GET and POST requests (with or without altered request parameters)--all of which can lead to the discovery of potential security problems of varying levels of risk. Ratproxy isn't an automated scanner that you point at the top-level URL of a domain and set loose to crawl the site. It's a manually operated tool that requires you to interact with a site as a regular user would. Although that approach is time-consuming and can become tedious, it more closely reflects real-world use and gives you more control over what sort of activity takes place during a scan.

The tool records a variety of tracking information, including Web session headers and complete content traces. After you're done interacting with a site through Ratproxy, you can generate a report in HTML format. While reviewing the report, you can use functionality built into it to take further action, as I'll explain in a moment.

Google made Ratproxy available for free to anyone who wants a copy--complete with source code. The code is written in C, but the download package does not come with a pre-compiled executable, which means that you have to compile it yourself. Fortunately that's pretty easy to do if you've got a Linux system available with a GNU C Compiler (GCC) environment installed. Just unpack the source code into a directory, navigate to that directory, and issue the "make" command at the command prompt. After the code compiles, you'll find an executable called "ratproxy" in the directory where you unpacked the code. You'll also find a script, ratproxy-report.sh, that can be used to generate an HTML-based report from the Ratproxy log after scanning a site.
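
For example, assuming the download is a .tar.gz source archive (the exact filename and the name of the directory it unpacks into will vary by release), the build process boils down to something like this:

   tar xzf ratproxy-<version>.tar.gz
   cd ratproxy
   make

That's just a sketch; the README that comes with the package has the authoritative build instructions.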

I took the tool for a test drive and found that it's pretty easy to use. You can start the tool in passive mode or in either of two active testing modes. In passive mode the tool logs various trace information, but it doesn't do any active "disruptive" testing, meaning that it won't try to alter content and replay it back to the site being tested. If you start the proxy with the -X switch, it'll perform various tests for cross-site scripting and cross-site request forgery. When you use the -C switch, the proxy will automatically replay Web requests using modified parameters where appropriate. You can also use both switches at the same time.
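
As a rough sketch (check the README for the exact switches and defaults), a passive run that writes its log to a file, saves traces to a directory, and limits testing to a single domain might look like this:

   ./ratproxy -v /tmp/ratproxy -w ratproxy.log -d www.example.com

An active run simply adds the disruptive-testing switches described above:

   ./ratproxy -v /tmp/ratproxy -w ratproxy.log -d www.example.com -XC

In both cases the proxy listens on the local machine (port 8080 by default, adjustable with a switch), so you point your browser's HTTP proxy setting at 127.0.0.1:8080 and browse the site under test as you normally would.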

After visiting a site via Ratproxy, I generated a report using the ratproxy-report.sh script and found that the report is pretty easy to read and understand. Each URL is listed as a separate item in the report, complete with related information, including a possible risk level, the type of risk, the actual URL used by the browser for that item, payload information, the Web site's response, and so on. Comprehending the data does require some knowledge of the HTML output a Web server delivers, along with an understanding of how GET and POST requests are structured and used. If you don't understand at least that much about Web development, along with the potential security implications, the report isn't going to be of much use to you. However, you could pass the report along to someone who can analyze it for you.
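
For reference, generating the report is a one-liner. Assuming you logged results to ratproxy.log as in the earlier example, something along these lines writes an HTML file you can open in any browser:

   ./ratproxy-report.sh ratproxy.log > report.html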

While reviewing the items in the report, you can manually replay requests to a particular URL. If a URL involves GET or POST parameters, you'll see a yellow button labeled "edit values." When you click that button, a form is revealed where you can change the parameters as you see fit. To replay the altered parameters back to the Web site, simply click the corresponding URL at the beginning of that report item. You'll then need to generate a new report to see the results of the altered requests.

Google says that it uses Ratproxy internally to test its own applications. From my perspective, the tool looks like a good choice for testing Web application security. It's not the only tool you'll need to adequately test Web application security, but it's definitely a good addition to your suite of testing tools.

You can download a copy of Ratproxy at the first URL below. Be sure to review the online documentation at the second URL, which explains far more about the tool than what I've covered here. Also, be certain to review the README file that comes with the package, as it contains a long list of possible command-line switches you can use, in addition to other important information you need to know before using the tool.
http://code.google.com/p/ratproxy
http://code.google.com/p/ratproxy/wiki/RatproxyDoc
