{{{#!rst

=====================================================
Hacking Web 2.0 Applications with Firefox
=====================================================

:Original article: `Security Focus `_
:Excerpt translated by: `惑者 `_

Introduction
==============================

AJAX and interactive web services form the backbone of "web 2.0" applications. This technological transformation brings new challenges for security professionals. This article looks at some of the methods, tools and tricks used to dissect web 2.0 applications (including Ajax) and discover security holes using Firefox and its plugins. The key learning objectives of this article are to understand the:

* web 2.0 application architecture and its security concerns.
* hacking challenges such as discovering hidden calls, crawling issues, and Ajax-side logic discovery.
* discovery of XHR calls with the Firebug tool.
* simulation of browser event automation with the Chickenfoot plugin.
* debugging of applications from a security standpoint, using the Firebug debugger.
* methodical approach to vulnerability detection.

Web 2.0 application overview
=================================

The newly coined term "web 2.0" refers to the next generation of web applications that have logically evolved with the adoption of new technological vectors. XML-driven web services running on SOAP, XML-RPC and REST empower server-side components. New applications offer powerful end-user interfaces by utilizing Ajax and rich internet application (Flash) components.

This technological shift has an impact on the overall architecture of web applications and on the communication mechanism between client and server. At the same time, this shift has opened up new security concerns [ref1]_ and challenges. New worms such as Yamanner, Samy and Spaceflash are exploiting "client-side" AJAX frameworks, providing new avenues of attack and compromising confidential information.

.. image:: 1.jpg

**Figure 1. Web 2.0 architecture layout.**

As shown in Figure 1, the browser processes on the left can be divided into the following layers:

* **Presentation layer** - HTML/CSS provides the overall appearance of the application in the browser window.
* **Logic & Process** - JavaScript running in the browser empowers applications to execute business and communication logic. AJAX-driven components reside in this layer.
* **Transport** - XMLHttpRequest (XHR) [ref2]_. This object provides asynchronous communication capabilities and an XML exchange mechanism between client and server over HTTP(S).

The **server-side components** on the right of Figure 1, which typically reside in the corporate infrastructure behind a firewall, may include deployed web services along with traditional web application resources. An Ajax resource running in the browser can talk directly to XML-based web services and exchange information without refreshing the page. This entire communication is hidden from the end user; in other words, the end user would not "feel" any redirects. "Refreshes" and "redirects" were an integral part of the first generation of web application logic. In the web 2.0 framework they are reduced substantially by implementing Ajax.

Web 2.0 assessment challenges
=====================================================

In this asynchronous framework, the application does not have many *refreshes* and *redirects*. As a result, many interesting server-side resources that can be exploited by an attacker are hidden. The following are three important challenges for security people trying to understand web 2.0 applications:

1. **Discovering hidden calls** - It is imperative to identify the XHR-driven calls generated by the loaded page in the browser. The page uses JavaScript over HTTP(S) to make these calls to the backend servers.

2. **Crawling challenges** - Traditional crawler applications fail on two key fronts: one, replicating browser behavior, and two, identifying key server-side resources in the process. If a resource is accessed by an XHR object via JavaScript, it is more than likely that the crawling application will not pick it up at all.

3. **Logic discovery** - Web applications today are loaded with JavaScript, and it is difficult to isolate the logic for a particular event. Each HTML page may load three or four JavaScript resources from the server. Each of these files may have many functions, but the event may be using only a very small part of all these files for its execution logic.

We need to investigate and identify the methodology and tools to overcome these hurdles during a web application assessment. For the purpose of this article, we will use Firefox as our browser and leverage some of its plugins to combat the above challenges.

Discovering hidden calls
===============================================

Web 2.0 applications may load a single page from the server but may make several XHR object calls when constructing the final page. These calls may pull content or JavaScript from the server asynchronously. In such a scenario, the challenge is to determine all XHR calls and resources pulled from the server. This information could help in identifying all possible resources and associated vulnerabilities.

Let's start with a simple example. Suppose we can get today's business news by visiting a simple news portal located at ``http://example.com/news.aspx``. The page in the browser would resemble the screenshot illustrated below in Figure 2.

.. image:: 2.jpg

Being a web 2.0 application, Ajax calls are made to the server using an XHR object. We can determine these calls by using a tool known as Firebug [ref3]_. Firebug is a plug-in for the Firefox browser and has the ability to identify XHR object calls.
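Firebug observes these calls from inside the browser, but the underlying interception idea can be reproduced by hand from any JavaScript console. The following sketch is not from the original article; ``traceXhrCalls`` is a hypothetical helper that wraps ``XMLHttpRequest.prototype.open`` so every endpoint the page touches is recorded before the request is sent:

```javascript
// Hypothetical sketch: wrap the open() method of an XHR prototype so that
// every (method, URL) pair the page requests is appended to a log array
// before the original open() runs. Firebug does something conceptually
// similar when its "Show XMLHttpRequests" option is enabled.
function traceXhrCalls(xhrPrototype, log) {
  var originalOpen = xhrPrototype.open;
  xhrPrototype.open = function (method, url) {
    log.push(method + " " + url);              // record the hidden call
    return originalOpen.apply(this, arguments); // then behave as before
  };
}

// In a browser you would call, as early as possible (e.g. from a user script):
//   var calls = [];
//   traceXhrCalls(XMLHttpRequest.prototype, calls);
```

Running such a wrapper early matters: calls made while the page is still constructing itself are exactly the "hidden" ones an assessor wants to see.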
Prior to browsing a page with the plugin, ensure the option to intercept XHR calls is selected, as shown in Figure 3.

.. image:: 3.jpg

With the Firebug option to intercept XMLHttpRequest calls enabled, we browse the same page to discover all XHR object calls made by this particular page to the server. This exchange is shown in Figure 4.

.. image:: 4.jpg

**Figure 4. Capturing Ajax calls.**

We can see several requests made by the browser using XHR. It has loaded the dojo AJAX framework from the server while simultaneously making a call to a resource on the server to fetch news articles::

    http://example.com/getnews.aspx?date=09262006

If we look closely at the code, we can see the following function in JavaScript::

    function getNews()
    {
        var http;
        http = new XMLHttpRequest();
        http.open("GET", "getnews.aspx?date=09262006", true);
        http.onreadystatechange = function()
        {
            if (http.readyState == 4) {
                var response = http.responseText;
                document.getElementById('result').innerHTML = response;
            }
        }
        http.send(null);
    }

The preceding code makes an asynchronous call to the backend web server and asks for the resource getnews.aspx?date=09262006. The content of this page is placed at the 'result' id location in the resulting HTML page. This is clearly an Ajax call using the XHR object.

By analyzing the application in this fashion, we can identify vulnerable internal URLs, querystrings and POST requests as well. For example, again using the above case, the parameter "date" is vulnerable to an SQL injection attack.

Crawling challenges and browser simulation
===========================================================

An important reconnaissance tool when performing a web application assessment is a web crawler. A web crawler crawls every single page and collects all HREFs (links). But what if these HREFs point to a JavaScript function that makes Ajax calls using the XHR object? The web crawler may miss this information altogether.
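Before resorting to browser automation, one crude static fallback is to grep the JavaScript files a crawler *did* download for URL string literals passed to ``open()``. The helper below is hypothetical (it is not part of the article's toolset, and a regex will of course miss dynamically assembled URLs), but it illustrates how XHR endpoints invisible to an HREF-only crawler can still be recovered from source:

```javascript
// Hypothetical static scan: extract URL literals from XMLHttpRequest
// open() calls found in downloaded JavaScript source. This only catches
// endpoints written as plain string literals; URLs built at runtime
// (string concatenation, variables) still require dynamic analysis.
function findXhrEndpoints(jsSource) {
  var endpoints = [];
  // matches e.g.:  http.open("GET", "getnews.aspx?date=09262006", true)
  var re = /\.open\(\s*["'](?:GET|POST)["']\s*,\s*["']([^"']+)["']/g;
  var m;
  while ((m = re.exec(jsSource)) !== null) {
    endpoints.push(m[1]); // the captured URL argument
  }
  return endpoints;
}
```

Each recovered endpoint (such as ``getnews.aspx?date=...`` above) then becomes a candidate for parameter testing, exactly like a URL found by a conventional crawler.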
In many cases it becomes very difficult to simulate this environment. For example, here is a set of simple links::

    <a href="#" onClick="getMe();">go1</a>
    <a href="#" onClick="...">go2</a>
    <a href="#" onClick="...">go3</a>
The "go1" link, when clicked, will execute the getMe() function. The code for the getMe() function is shown below; note that this function may be implemented in a completely separate file. ::

    function getMe()
    {
        var http;
        http = new XMLHttpRequest();
        http.open("GET", "hi.html", true);
        http.onreadystatechange = function()
        {
            if (http.readyState == 4) {
                var response = http.responseText;
                document.getElementById('result').innerHTML = response;
            }
        }
        http.send(null);
    }

The preceding code makes a simple Ajax call to the hi.html resource on the server.

Is it possible to simulate this click using automation? Yes! Here is one approach using the Firefox plug-in Chickenfoot [ref4]_, which provides JavaScript-based APIs and extends the programmable interface to the browser.

By using the Chickenfoot plugin, you can write simple JavaScript to automate browser behavior. With this methodology, simple tasks such as crawling web pages can be automated with ease. For example, the following simple script will "click" all anchors with onClick events. The advantage of this plug-in over traditional web crawlers is distinct: each of these onClick events makes backend XHR-based AJAX calls which may be missed by crawlers, because crawlers try to parse JavaScript and collect possible links but cannot trigger actual onClick events. ::

    l=find('link')
    for(i=0;i<l.count;i++) {
        click(l[i])
    }

References
=====================================================

.. [ref1] `Web 2.0 Security `_
.. [ref2] `XHR Object specification `_
.. [ref3] `Firebug download `_ ; `Firebug usage `_
.. [ref4] `Chickenfoot quick start `_
.. [ref5] `Chickenfoot API reference `_
.. [ref6] `Venkman walkthrough `_
.. [ref7] `wsChess `_

}}}