
Tuesday, February 2, 2016

Burp Suite Extension: Burp Importer

Burp Importer is a Burp Suite extension, written in Python, which allows users to connect to a list of web servers and populate the sitemap with successful connections. Burp Importer can also parse Nessus (.nessus), Nmap (.gnmap), or text files for potential web connections. Have you ever wished you could use Burp’s Intruder to hit multiple targets at once for discovery purposes? Now you can with the Burp Importer extension. Use cases for this extension include web server discovery, authorization testing, and more!

Click here to download source code

Installing

  1. Download Jython standalone Jar: http://www.jython.org/downloads.html
  2. In the Extender>Options tab point your Python Environment to the Jython file.
  3. Add Burp Importer in the Extender>Extensions tab.

General Use

Burp Importer is easy to use as it’s fairly similar to Burp’s Intruder tool. The first section of the extension is the file load option, which is optional and used to parse Nessus (.nessus), Nmap (.gnmap), or newline-separated URL list (.txt) files. Nessus files are parsed for the HTTP Information plugin (ID 24260). Nmap files are parsed for open common web ports from a predefined dictionary. Text files should contain a list of URLs, one per line, each conforming to the java.net.URL format. After the files are parsed, the generated URLs are added to the URL List box.
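
As a rough illustration of the Nmap parsing step (a sketch, not the extension's actual code; the port-to-protocol mapping shown here is assumed, not the extension's real dictionary), open web ports on each 'Host:' line of a .gnmap file can be turned into candidate URLs like this:

# Sketch: build candidate URLs from open web ports in a .gnmap file.
import re

WEB_PORTS = {80: 'http', 8000: 'http', 8080: 'http', 443: 'https', 8443: 'https'}

def parse_gnmap(path):
    urls = []
    for line in open(path):
        if 'Ports:' not in line:
            continue
        host = line.split()[1]                      # 'Host: 10.0.0.5 (name) Ports: ...'
        for match in re.finditer(r'(\d+)/open/tcp', line):
            port = int(match.group(1))
            if port in WEB_PORTS:
                urls.append('%s://%s:%d/' % (WEB_PORTS[port], host, port))
    return urls

print('\n'.join(parse_gnmap('scan.gnmap')))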


The URL List section is almost identical to Burp Intruder’s Payload Options. Users have the ability to paste a list of URLs, copy the current list, remove a URL from the list, clear the entire list, or add an individual URL. A connection to each item in the list will be attempted using the java.net.URL class and Burp makeHttpRequest method.
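
In Burp's legacy Extender API, the connection attempt for a single list entry boils down to something like the following Jython sketch (variable names such as self._helpers and self._callbacks are illustrative, not the extension's exact code):

# Sketch: attempt a connection to one entry from the URL List.
from java.net import URL

url = URL("http://192.168.1.10:8080/")                 # one list entry
port = url.getPort() if url.getPort() != -1 else url.getDefaultPort()
https = url.getProtocol() == "https"
request = self._helpers.buildHttpRequest(url)          # byte[] GET request for the URL
service = self._helpers.buildHttpService(url.getHost(), port, https)
reqresp = self._callbacks.makeHttpRequest(service, request)   # IHttpRequestResponse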

Gnmap file parsed example:


Nessus file parsed example:


The last section of the extension gives the user a checkbox to follow redirects, the ability to run the list of URLs, and a run log. Redirects are determined by 301 and 302 status codes and the ‘Location’ header in the response. The run log displays the same output shown in the Extender>Extensions>Output tab: each time you run the URL list it records successful connections, the number of redirects (if enabled), and any URLs that are malformed or have connectivity issues.
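
The redirect handling described above might look roughly like this in Jython (again a sketch with illustrative names, continuing from the connection snippet earlier):

# Sketch: detect a redirect from the response and pull out the Location header.
resp_info = self._helpers.analyzeResponse(reqresp.getResponse())
if resp_info.getStatusCode() in (301, 302):
    for header in resp_info.getHeaders():
        if header.lower().startswith("location:"):
            redirect_url = header.split(":", 1)[1].strip()
            # if 'Enable: Follow Redirects' is checked, this URL is requested next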

Running a list of hosts:


Items imported into the sitemap:


Use Case – Discovery

One of the main motivations for creating this extension was to help with the discovery phase of an application or network penetration test. Parsing through network or vulnerability scan results can be tedious and inefficient, which is why automation is a vital part of penetration testing. This extension can be used purely as a file parser that generates valid URLs for use with other tools, or to gain quick insight into the web application scope of a network. There are many ways to utilize this tool from a discovery perspective, including:

  • Determine the web scope of an environment via successful connections added to the sitemap.
  • Search or scrape for certain information from multiple sites. An example of this would be searching multiple sites for e-mail addresses or other specific information.
  • Determine the low-level vulnerability posture of multiple sites or pages via spidering then passive or active scanning.

Use Case – Authorization Testing

Another way to use this extension is to check an application for insecure direct object references, which occur when objects or pages are not properly restricted to authorized users. Doing this requires at least one set of valid credentials, or potentially more depending on how many user roles are being tested. Also, session handling rules must be set to use cookies from Burp’s cookie jar with the Extender tool.

The following steps can then be performed:

  1. Authenticate with the highest-privileged account and spider/discover as many objects and pages as possible. Don’t forget to use Intruder to search for hidden directories, and to convert POST requests to GET, which can also be used to access additional resources (if the server allows it, of course).
  2. In the sitemap right click on the host at the correct path and select ‘Copy URLs in this branch.’ This will give you a list of resources which were accessed by the high privileged account.
  3. Logout and clear any saved session information.
  4. Log in with a lower-privileged user (this could also be a user with no credentials or user role at all). Be sure you have an active session for this user.
  5. Open the Burp Importer tab and paste the list of resources retrieved from the higher privileged account into the URL List box. Select the ‘Enable: Follow Redirects’ box as it helps you know if you are being redirected to a login or error page.
  6. Analyze the results! A list of ‘failed’ connections and the number of redirects will automatically be added to the Log box. These are a good indicator of whether the lower-privileged session was able to access the resources or was simply redirected to a login/error page. The sitemap should also be searched to manually verify whether any unauthorized resources were indeed successfully accessed. Entire branches of responses can be searched using regex and a negative match for the ‘Location’ header to find valid connections.

Here we can see that the resources originally requested from the DVWA application while logged in as 'admin' could no longer be reached once that administrative session was logged out and killed; the requests were redirected to the login page instead. In this use case the DVWA application was not vulnerable to insecure direct object references.
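
As a simple illustration of the negative Location match from step 6, the same idea can be expressed in plain Python (the URLs and responses below are hypothetical):

# Sketch: keep only responses with no Location header, i.e. candidates for
# unauthorized access that should be verified manually.
import re

responses = {
    "http://target/admin/users.php":  "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n...",
    "http://target/admin/config.php": "HTTP/1.1 302 Found\r\nLocation: /login.php\r\n\r\n",
}

for url, raw in responses.items():
    if not re.search(r"(?mi)^Location:", raw):
        print("No redirect - verify manually: " + url)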

There are many other uses for this extension; just use your imagination! If you come up with any cool ideas or have any comments, please reach out to me.

Tuesday, July 2, 2013

Burp Extension: Directory and File Listing Parser - Burp Site Map Importer

Click Here to Download Source Code

Penetration testers, rejoice! While conducting application penetration tests it’s sometimes necessary to request specific information from the application owner or client. As a pen tester it can be extremely beneficial to perform a test with a full directory and file listing of the application, which sometimes can be difficult to acquire.

So let’s assume all clients are perfect and provide a full directory and file listing of their application (funny, I know), but what do we do with it? My process usually involves manually looking over everything, trying to find keywords that jump out… I just might want to take a look at adminpassword.txt. Depending on the size of the application I may attempt to reach every file, but usually this is not an efficient use of time. I wanted a quick and easy process for dealing with directory and file listings, so I created a Burp Suite extension which does a lot of the work for me.

My Burp extension contains two main features. The first feature is the ability to parse the listing file and generate a list of valid URLs to request each resource. The second feature is generating a request for each URL and importing the valid request/response pairs into Burp’s Target Site Map. Why is having a full site map helpful? We now have the ability to see the entire structure of the application, search within all valid responses, conduct manual testing or an active scan on ALL accessible resources, and much more. The process flow looks like this:
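Roughly, in Burp's legacy Extender API the request-and-import step comes down to something like this Jython sketch (illustrative names and URL, not the extension's exact code):

# Sketch: request one generated URL and, if it answers, add the
# request/response pair to Burp's Target Site Map.
from java.net import URL

url = URL("http://127.0.0.1:80/dvwa/login.php")        # one parsed URL
service = self._helpers.buildHttpService(url.getHost(), url.getPort(), False)
request = self._helpers.buildHttpRequest(url)
reqresp = self._callbacks.makeHttpRequest(service, request)
if reqresp.getResponse() is not None:
    self._callbacks.addToSiteMap(reqresp)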

The Burp extension is written in Python, so a standalone Jython jar is needed to run it. Currently the extension is only tested and working with jython-standalone 2.5.3.


After loading the extension you will have an option in the context menu to “Import Directory Listing”:

A GUI will appear for the extension. Fields such as hostname, SSL, and port will automatically populate depending on the request or response the menu option was originally invoked from. Cookies will also be displayed and used in any requests the extension makes. This feature makes it easy to compare site maps of two application user roles (based on varying session information such as cookies) to determine if each role has the correct access.

In this example I have selected the “Import Directory Listing” menu option on the DVWA web application, which is running on my local machine. Now we must fill out all of the options on the left side of the GUI: hostname; full directory path (Windows only, but it CAN be used to modify URLs from a Linux listing type), which specifies where the root of the web application sits; SSL; port; file listing type; and path to the listing file.

On a Windows XP machine I used the ‘dir /s’ command in cmd.exe, which displays all files from the current directory and all subdirectories. If the application is sitting on a Windows platform, this is a very common command for generating directory and file listings. A partial directory and file listing for the DVWA web application (selected in the above image as C:\dvwa-listing.txt) looks like this:

dir /s:
 Volume in drive C has no label.
 Volume Serial Number is 5033-AA99

 Directory of C:\xampp\htdocs\dvwa

09/08/2010  09:50 PM    <DIR>          .
09/08/2010  09:50 PM    <DIR>          ..
09/08/2010  09:49 PM               497 .htaccess
08/26/2010  11:15 AM             2,792 about.php
06/06/2010  07:55 PM             5,066 CHANGELOG.txt
09/08/2010  09:50 PM    <DIR>          config
03/16/2010  12:56 AM            33,107 COPYING.txt
09/08/2010  09:50 PM    <DIR>          docs
09/08/2010  09:50 PM    <DIR>          dvwa
03/16/2010  12:56 AM               883 ids_log.php
06/06/2010  07:52 PM             1,878 index.php
03/16/2010  12:56 AM             1,761 instructions.php
08/26/2010  11:18 AM             2,580 login.php
03/16/2010  12:56 AM             2,738 security.php
06/06/2010  10:58 PM             1,350 setup.php
09/08/2010  09:50 PM    <DIR>          vulnerabilities
              16 File(s)         59,772 bytes

 Directory of C:\xampp\htdocs\dvwa\config

09/08/2010  09:50 PM    <DIR>          .
09/08/2010  09:50 PM    <DIR>          ..
08/26/2010  10:32 AM               576 config.inc.php
08/26/2010  11:06 AM               576 config.inc.php~
               2 File(s)          1,152 bytes

 Directory of C:\xampp\htdocs\dvwa\docs

09/08/2010  09:50 PM    <DIR>          .
09/08/2010  09:50 PM    <DIR>          ..
08/26/2010  10:32 AM           526,043 DVWA-Documentation.pdf
               1 File(s)        526,043 bytes
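
To show roughly how this format maps to URLs, here is a sketch (not the extension's actual parser): WEB_ROOT and BASE_URL stand in for the GUI's 'full directory path' and hostname/port fields, and file names containing spaces are ignored for brevity.

# Sketch: walk 'dir /s' output, tracking the current directory and emitting
# a URL for every file entry.
import re

WEB_ROOT = r"C:\xampp\htdocs"            # where the web root sits on disk
BASE_URL = "http://127.0.0.1:80"

current_dir = ""
for line in open("dvwa-listing.txt"):
    line = line.rstrip()
    if "Directory of " in line:
        current_dir = line.split("Directory of ", 1)[1]
    elif re.match(r"\d{2}/\d{2}/\d{4}", line) and "<DIR>" not in line:
        filename = line.split()[-1]      # breaks on names with spaces (sketch only)
        rel = (current_dir + "\\" + filename).replace(WEB_ROOT, "", 1)
        print(BASE_URL + rel.replace("\\", "/"))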

The parser is also capable of parsing directory listing files from various Linux commands such as ‘ls -lR’ and ‘ls -R’. Examples of the format are as follows:

ls -lR:
.:
total 124
-rw-rw-r--  1 user user  2792 Aug 26  2010 about.php
-rw-rw-r--  1 user user  5066 Jun  6  2010 CHANGELOG.txt
drwxrwxr-x  2 user user  4096 Jul  1 16:52 config
-rw-rw-r--  1 user user 33107 Mar 16  2010 COPYING.txt
drwxrwxr-x  2 user user  4096 Jul  1 16:52 docs
drwxrwxr-x  6 user user  4096 Jul  1 16:52 dvwa
-rw-rw-r--  1 user user   883 Mar 16  2010 ids_log.php
-rw-rw-r--  1 user user  1878 Jun  6  2010 index.php
-rw-rw-r--  1 user user  1761 Mar 16  2010 instructions.php
-rw-rw-r--  1 user user  2580 Aug 26  2010 login.php
-rw-rw-r--  1 user user   413 Mar 16  2010 logout.php
-rw-rw-r--  1 user user  2738 Mar 16  2010 security.php
-rw-rw-r--  1 user user  1350 Jun  6  2010 setup.php
drwxrwxr-x 11 user user  4096 Jul  1 16:52 vulnerabilities

./config:
total 8
-rw-rw-r-- 1 user user 576 Aug 26  2010 config.inc.php
-rw-rw-r-- 1 user user 576 Aug 26  2010 config.inc.php~

./docs:
total 516
-rw-rw-r-- 1 user user 526043 Aug 26  2010 DVWA-Documentation.pdf

./dvwa:
total 16
drwxrwxr-x 2 user user 4096 Jul  1 16:52 css
drwxrwxr-x 2 user user 4096 Jul  1 16:52 images
drwxrwxr-x 3 user user 4096 Jul  1 16:52 includes
drwxrwxr-x 2 user user 4096 Jul  1 16:52 js

ls -R:
.:
about.php
CHANGELOG.txt
config
COPYING.txt
docs
dvwa
external
favicon.ico
README.txt
robots.txt
security.php
setup.php
vulnerabilities

./config:
config.inc.php
config.inc.php~

./docs:
DVWA-Documentation.pdf

./dvwa:
css
images
includes
js

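A comparable sketch for the ‘ls -lR’ format (again illustrative, not the extension's actual parser; BASE_URL is assumed and file names containing spaces are ignored):

# Sketch: walk 'ls -lR' output; directory headers end with ':' and regular
# file entries start with '-'.
BASE_URL = "http://127.0.0.1:80/dvwa"

current_dir = "."
for line in open("dvwa-ls.txt"):
    line = line.rstrip()
    if line.endswith(":"):                   # '.:' or './config:' directory headers
        current_dir = line[:-1]
    elif line.startswith("-"):               # regular file entry
        filename = line.split()[-1]
        print(BASE_URL + (current_dir + "/" + filename).lstrip("."))
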
Now that we have a directory listing for our application, let’s parse out all of the directories and files and create valid URLs. Click the “Generate URL List” button to start the parsing. Tip: If the generated URLs don’t look correct, you can modify the fields on the left of the GUI and regenerate the list, or copy the list and make changes in your own text editor. (Please e-mail SmeegeSec@gmail.com with any parsing issues or suggestions.)

A text area is populated with the URLs and the total count of directories and files processed.

At this point we have a few options. We can take our list of URLs and use them in Burp’s Intruder. Doing this is easy: simply remove the protocol, hostname, and port from each URL in a text editor, then use the path of each resource as a payload in a GET request in Intruder. We can then see which resources are reachable by analyzing status codes and content lengths. A second option is built into the extension via the “Import URL List to Burp Site Map” button. This button makes a request to each URL in the list (with cookie information, if it was found) and, if a valid response is returned, adds the request and response to Burp’s Site Map. Requests containing keywords such as logout, logoff, etc. are skipped to avoid ending sessions. The import-to-site-map functionality was one of the main features I wanted to implement.
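
For the Intruder route, stripping the URLs down to paths is a one-liner per URL; the snippet below is just an illustration with made-up URLs (any text editor works equally well):

# Sketch: reduce generated URLs to resource paths for use as Intruder payloads.
from urlparse import urlparse        # Jython/Python 2; urllib.parse on Python 3

urls = [
    "http://127.0.0.1:80/dvwa/login.php",
    "http://127.0.0.1:80/dvwa/config/config.inc.php",
]
for u in urls:
    print(urlparse(u).path)          # -> /dvwa/login.php, /dvwa/config/config.inc.php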

Warning: Actual requests are being made. Remove any resources you don't want requested, such as delete_database.php!! Regex filtering to remove resources will be added in future updates.

Done! A message dialog tells the user how many URLs were valid and imported into the site map. In the above image you can see a full site map and proxy history which was built not by spidering or brute forcing directories/files, but from a simple directory and file listing of the application. With a full site map we are now ahead of the game. If you have multiple testers working on an application, you can save the state in Burp and distribute it to save time, almost completely bypassing the discovery phase.

Note: So far the parsing does not consider virtual directories or different URL mappings from different web frameworks. Future updates may include parsing of mapping files such as ASP.NET’s web.config and Java’s web.xml.

Tip: Running a plugin multiple times, or multiple plugins at a time, may require increased PermGen space. For example, to increase the maximum when launching Burp:

java -XX:MaxPermSize=1G -jar burp.jar

Also, please provide feedback if you use this extension. With many different output formats for directory and file listings it can be difficult to write a dynamic parser which works for every format. If you have a listing file which is not being properly parsed please contact me so I can include it in an update. Thanks!

Click Here to Download Source Code

Friday, April 19, 2013

WSDL Wizard: Burp Suite Plugin for Detecting and Discovering WSDL Files


Introduction

WSDL (Web Services Description Language) files often provide a unique and clear insight into web application functionality. They can be made public and accessible to client-side users, or kept private. From an attacker’s point of view, public WSDL files make finding vulnerabilities easier; however, even if the files are private, any vulnerabilities still exist. From the perspective of a penetration tester, depending on the kind of test, it may or may not be appropriate to ask the client for WSDL files beforehand.

So how do we find the WSDL files which are public? What if there are no direct links to find during spidering? The WSDL Wizard plugin! There have been tools in the past, such as OWASP’s WSFuzzer, which do an average job when they work. I wanted to write a Burp Suite extension in Python which would make WSDL file detection and discovery not only easy to use but also efficient and safe in terms of scope.

This plugin searches the current Burp Suite site map of a user-defined host for URLs with the ?wsdl extension, while also building a list of viable URLs to fuzz for 'hiding' WSDL files. Two different methods are available to check for ?wsdl files: urllib2 (the default) or Burp's API. The fuzzing method depends on which function is called in the code, so switching is easy. When comparing efficiency against large site maps, urllib2 was about 30 percent faster. All found WSDL files are added to the existing site map and printed in the Extender tab output section.
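
The urllib2-based check is conceptually simple; a minimal sketch (not the plugin's exact code, and the candidate URL here is hypothetical) looks like this:

# Sketch: append ?wsdl to a candidate URL and look for a WSDL document.
import urllib2

candidate = "http://127.0.0.1/WebGoat/services/WSDLScanning"
try:
    body = urllib2.urlopen(candidate + "?wsdl").read()
    if "definitions" in body.lower():        # crude check for a WSDL document
        print("WSDL found: " + candidate + "?wsdl")
except urllib2.URLError:
    pass                                     # connection error or HTTP error - skip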

After we have our WSDL files it’s time to make use of them. I did a previous post called Automatic Web Services Communication and Attacks which discusses using tools such as soapUI to inject into SOAP requests.

WSDL Wizard Use

WSDL Wizard runs off all of the requests and responses in Burp’s Site Map for a user-selected host. Before running the plugin it is beneficial to have as much information in the site map as possible, so this plugin should be used at the end of the information-gathering stage, when you have near-complete coverage of the application.
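
Pulling the selected host's entries out of the site map is straightforward with Burp's API; here is a sketch with illustrative names (self._callbacks, self._helpers, and the host prefix are assumptions, not the plugin's exact code):

# Sketch: enumerate site map entries for the selected host and print their URLs.
host_prefix = "http://127.0.0.1/WebGoat/"                  # hypothetical selection
for item in self._callbacks.getSiteMap(host_prefix):
    if item.getRequest() is None:
        continue
    url = self._helpers.analyzeRequest(item).getUrl()      # full java.net.URL
    print(url.toString())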


To run this plugin three things are needed:

  1. Download the Jython standalone jar: http://www.jython.org/downloads.html
  2. In the Extender > Options tab, select the location of your Jython standalone file.
  3. In the Extender > Extensions tab, load WSDLWizard.py.

Now we are ready to use the extension. A menu option of ‘Scan for WSDL Files’ will appear if the user right clicks in the message viewer, site map table, or proxy history. In the following case I am running the plugin on OWASP’s WebGoat vulnerable application. In the site map there is already a valid request and response for the WSDL file.

The output from the plugin reports the host which was scanned, the full URL of the detected WSDL file, and how many other viable URLs were fuzzed with the ?wsdl extension. In this case the WSDL file was detected, and 29 other URLs were fuzzed, but no additional WSDL files were found because there is only one in this application.

Using a site map without a WSDL file in it really displays the main feature of this plugin. It will detect /WebGoat/services/WSDLScanning as being a candidate to fuzz with the ?wsdl extension.

After running the plugin, the output reports that one WSDL file was discovered by fuzzing, and that request and response are added to the current site map.

Now we have all the WSDL files and the real fun begins! I have now shown how to detect WSDL files and even discover new ones you never knew were there.

I hope everyone likes it. This is my first Burp plugin, so feedback would be very much appreciated. If you have any comments, suggestions, or errors, please comment on my blog or email me at SmeegeSec@gmail.com.