Dear Readers,
Web application security is a branch of Information
Security that deals specifically with security of
websites, web applications and web services. We
would like to show you the technical side of this area
of expertise. We are sure that after reading this issue you will improve your skills and learn many useful security methods.
The whole issue consists of 7 articles:
Paros Proxy
uPrint.. iHack..
Next-generation SOC monitoring environments
Editorial Section:
Authors:
Renato Augusto, Vikas Kumar, Miroslav Ludvik, Francisco Caballero, Pragati Ogal Rai, Matthew Clapham, Krunoslav Rukavina, Rahul Jamgade, Jon Zeolla, Girish Kumar, Ahmed Rasul, Sameh Sabry.
Copy-editors:
Agata Brzozowska, Manish Chasta, Dhawal Desai, Kevin McIntyre, Robrecht Minten, Zsolt Nemeth, Phil Quinan, Larry Pool, David Sanborn (Axiom), Andy Stern.
DTP:
Anandu Ashokan, Jim Steele
Publisher:
Hack Insight Press Paweł Płocki
www.hackinsight.org
Editor in Chief:
Paweł Płocki
[email protected]
All trademarks presented in the magazine
were used
only for informative purposes.
Table Of Contents
How to Estimate Development Security Maturity
Page 5: Seen the security design or development failures of middleware, antivirus, browsers, and other technologies lately? Worried about inheriting other people's security failures? How can anyone avoid others' design flaws? With a more secure design, of course!
Paros Proxy
Page 15: In the age we are in, technology has infiltrated the fabric of society,
and in turn, our everyday lives. From social media to communication to
business, technology is now part of our lives. Security has always been a
concern for all human beings, and technology is no different. As with any
form of security, any vulnerability that allows an attacker to breach a system
is a serious threat.
uPrint.. iHack..
Page 22: I print and you hack? Is that what you are trying to say, Sameh? Am I in danger having a printer close to me? The answer to the above-mentioned questions is YES.
XSS
Page 37: Most websites nowadays are dynamic websites that provide user-specific information according to the profile and other settings of the user. This differs from static websites, which show the same content to every visitor.
Web App Pentesting Methodology
Page 50: In this article, we will show the phases and the most important tests to perform when carrying out a vulnerability assessment of a website.
In this article, we will show the phases and the most important tests to perform when carrying out a vulnerability assessment of a website.
Introduction
The goal of this article is to explain how to use a variety of tools to carry out a vulnerability analysis of a website, and to introduce various methods to identify and exploit vulnerabilities.
To do this, we will divide the article as follows: information gathering, automatic testing (scanners) and manual testing. Note: in this article we do not discuss source code assessment, which is another phase of the Web Application Pentesting Methodology. To carry out this task and the others described in this article, we recommend that readers consult "The Open Web Application Security Project (OWASP1)".
In order to follow the steps in this article and try the various techniques to identify vulnerabilities it is
recommended that one of the following frameworks be installed: DVWA2, Web Security Dojo3 or Web For
Pentester4.
Note: Remember that using any of these techniques or launching an intrusive tool against a website without authorization is a crime in most countries.
1 https://www.owasp.org/index.php/Main_Page
2 http://www.dvwa.co.uk/
3 http://www.mavensecurity.com/web_security_dojo/
4 https://pentesterlab.com/web_for_pentester.html
Overview of the methodology:
Information Gathering: extract metadata, creation of a dictionary, download the website, online tools, identification of email accounts, identification of virtual hosts.
Automatic Testing (scanners): launch tools (free & commercial), spidering, interesting files, brute force folders and files, fuzzing.
Manual Testing: testing vulnerabilities, surfing the website, identifying components & plugins, headers, HTTP methods, sessions, certificates, etc., manipulation of parameters, analysis of Flash, Java and other files, authentication system.
Information gathering
In this first phase we will try to identify as much information as possible; afterwards, we will use it to carry out more complex and specific attacks against the application analyzed.
A good starting point is to get to know the target Website as much as possible, by downloading its
structure, files, and any other relevant information. To accomplish this task, we use the following tools:
wget, httrack5
./wget -rck http://<WEBSITE>
Note: -r enables recursive mode, -c continues partially downloaded files, and -k converts the links after the download so that the local copy can be browsed correctly.
If the Website analyzed is available online, we can speed up the collection of some of the information that is usually gathered in the manual testing stage, such as HTTP headers and the identification of several CMSs and their versions (the metadata extraction is based on the whatweb software), and assess its level of security based on this data. To accomplish this task, there is an online tool on the Internet, desenmascara.me6:
Another task that we perform in this phase is to identify as many email accounts as possible, in order to have valid user names for different application areas. To accomplish this task, we could use the following tools: theharvester7, maltego8 or msfcli (metasploit9), among others.
./theharvester.py -d <WEBSITE> -l 500 -b google
Note: In this example we limit the Google search to 500 results.
With msfcli (metasploit):
./msfcli auxiliary/gather/search_email_collector DOMAIN=<WEBSITE> E > output_emails.txt
5 http://www.httrack.com/
6 http://desenmascara.me/
7 https://code.google.com/p/theharvester/
8 http://www.paterva.com/web6/products/maltego.php
9 http://www.metasploit.com/
Also, we need to perform a search for documents within the Website, in order to identify those that contain metadata and then extract it to obtain further information such as user names, versions, internal devices, etc. To accomplish this task, we use the following tools: Metagoofil10, FOCA11.
./metagoofil.py -d <WEBSITE> -t doc,pdf -l 200 -n 50 -o domainfolder -f output_files.html
Note: In the above example, we limit the search to 200 results per file type, with a maximum of 50 files to download, and save the files in the directory domainfolder.
This task can also be performed manually through a google search, as indicated in the following example:
site:www.url.com ext:pdf intitle:"Documents and settings"
Besides searching for metadata, collecting email accounts and downloading the site, we also try to identify the different website management interfaces.
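As a minimal sketch of how this could be done by hand, one can probe a handful of common management paths with curl and look at the status codes; the path list below is purely illustrative:
# Probe a few common management paths and print the HTTP status code (illustrative path list)
for path in admin administrator login wp-admin phpmyadmin manager/html; do
  code=$(curl -s -o /dev/null -w "%{http_code}" "http://<WEBSITE>/$path")
  echo "$path -> $code"
done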
One task that would be very interesting to do is to create a custom dictionary from the Website. To carry out this task, we can use the script cewl.rb12
./cewl.rb --depth 2 --min_word_length 5 --write output_word_list.txt http://<WEBSITE>
Note: In the above example, the --depth option specifies the crawl depth, in this case 2, and --min_word_length specifies the minimum word length, in this case 5 characters.
To complete this phase, we will perform a search for other domains that are hosted on the same IP (virtual
host). To accomplish this task, we can use the tool: revhosts13
./revhosts pig vhh <IP_WEBSITE>
Similarly, one could perform this same search through Bing, with the following request:
IP:<IP_Website>
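If the Bing results suggest other domains on the same IP, a quick additional check (the domain names below are placeholders) is to request the IP directly while forcing the Host header and compare the responses:
# Compare responses for different Host headers on the same IP (placeholder domain names)
curl -s -o /dev/null -w "%{http_code} %{size_download}\n" -H "Host: <WEBSITE>" http://<IP_WEBSITE>/
curl -s -o /dev/null -w "%{http_code} %{size_download}\n" -H "Host: <OTHER_DOMAIN>" http://<IP_WEBSITE>/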
10 http://www.edge-security.com/metagoofil.php
11 http://www.informatica64.com/foca.aspx
12 http://www.digininja.org/projects/cewl.php
13 http://securitytnt.com/revhosts/
14 http://www.cirt.net/nikto2
15 http://w3af.org/
16 https://code.google.com/p/skipfish/
17 http://arachni-scanner.com/
18 https://www.owasp.org/index.php/OWASP_Zed_Attack_Proxy_Project
19 http://www.acunetix.com/
20 http://www-01.ibm.com/software/awdtools/appscan/
21 https://download.hpsmartupdate.com/webinspect/
22 http://www.mavitunasecurity.com/netsparker/
23 https://code.google.com/p/golismero/
24 http://www.open-labs.org/
25 https://www.owasp.org/index.php/Category:OWASP_DirBuster_Project/es
26 http://www.powerfuzzer.com/
27 http://portswigger.net/burp/proxy.html
28 https://code.google.com/p/fuzzdb/
Manual Testing
In this last phase, we will bring together and use all the information gathered in the previous phases (information gathering and scanners). To do this, we will perform numerous manual tests to identify potential vulnerabilities that we have not detected at earlier stages. This phase has a number of additional benefits: it is intelligent, it can work around the limitations of the previous phases, and it eliminates false positives. On the other hand, manual testing has limitations of its own: it is time-consuming and it is hard to cover every field.
The first task to perform manually is to browse through the Website to identify other elements that we
have not previously identified. To accomplish this task, we could use the following tools: Burpproxy, ZAP,
sitescope, or firefox etc.
A second task is to try to identify the components and plugins enabled on the Website, such as the following types of CMS (Content Management Systems): Joomla components, Wordpress plugins, Php-Nuke, Drupal, Movable Type, custom CMSs, Blogsmith/Weblogs, Gawker CMS, TypePad, Blogger/Blogspot, Plone, Scoop, ExpressionEngine, LightCMS, GoodBarry, Traffik, Pligg, Concrete5, Typo3, Radiant CMS, Frog CMS, Silverstripe, Cushy CMS, etc.
After identifying the type of CMS or its components on the Website, we proceed to search for known vulnerabilities associated with it and try to find others that may not have been discovered yet. To accomplish this task, we could search the Internet for vulnerabilities associated with the component and/or plugin, or use specific tools such as joomla Scan29 or cms-explorer30.
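As a rough manual alternative to those tools, we could probe a few characteristic CMS paths and look for the generator meta tag; the path list is only an example:
# Rough CMS fingerprinting by probing characteristic paths (example list)
for path in wp-login.php administrator/ user/login CHANGELOG.txt readme.html; do
  printf "%s -> " "$path"
  curl -s -o /dev/null -w "%{http_code}\n" "http://<WEBSITE>/$path"
done
# The generator meta tag often reveals the CMS and its version
curl -s "http://<WEBSITE>/" | grep -i 'name="generator"'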
The next task is the analysis of the server headers, to identify the server type and version, among other information. To accomplish this task, we could use any proxy tool, a simple telnet connection to the Website, or simply submit the target to desenmascara.me, the online tool mentioned in the information gathering stage.
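For instance, a single request with curl is enough to read the Server and X-Powered-By headers; this is just one possible way of doing it:
# Fetch only the response headers and filter the ones that identify the server
curl -sI "http://<WEBSITE>/" | grep -iE "^(Server|X-Powered-By):"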
29 http://www.enye-sec.org/programas/joomla-scan/
30 https://code.google.com/p/cms-explorer/
As part of this third phase, fingerprinting should also be done to identify the architecture and configuration of the site. To perform this task, we could use the tool httprint31.
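A minimal sketch of a typical httprint invocation follows; the exact options may differ between httprint releases, so treat it as an assumption:
# Fingerprint the web server against the httprint signature database
./httprint -h <WEBSITE> -s signatures.txt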
One of the most important tasks at this stage is the modification of parameters, in order to identify errors and/or vulnerabilities. To accomplish this task, we could use any proxy to manipulate the requests sent to the Website.
Note: This task is additional to and/or part of the task of fuzzing the parameters of the Website.
On the other hand, there are many checks that we must carry out to identify specific vulnerabilities through the modification of parameters, such as:
Alteration of the normal operation of the application by means of single quotes, null values (%00), carriage returns, random numbers, among others. This allows us to obtain different types of errors when analyzing the application, which could lead to numerous web vulnerabilities. To accomplish this task, we could use a proxy and manipulate the Website parameters. For example, with PHP technology we can turn a parameter into an array by ending it with []; such a request could cause an unhandled error and provide application information.
31 http://www.net-square.com/httprint.html
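A minimal sketch of that PHP array trick with curl; the parameter name is a placeholder:
# Normal request
curl -s "http://<WEBSITE>/index.php?id=1"
# Same parameter sent as an array; -g disables curl's URL globbing so the [] reaches the server literally
curl -sg "http://<WEBSITE>/index.php?id[]=1"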
Identification and verification of path disclosure through the generation of unhandled errors. To perform this task we could use a proxy or an automatic tool such as inspathx.
Identification and verification of vulnerabilities like cross site scripting, sql injection, XPath, SSI, CSRF,
clickjacking among others.
Identification and verification of iframe injection. To carry out this task we can modify a parameter in the URL, for example changing:
id=folder/file.html to id=http://www.[external-domain]
Manual identification and verification of CSRF (Cross-Site Request Forgery). To accomplish this task, we could test the forms (where this vulnerability is most often found). To check it, copy an original request (GET/POST) made by a form, change the parameters, and re-send the same modified request. If the server does not return an error, it can be considered vulnerable to CSRF. To perform this task, we can use tools such as csrftester or Burp proxy.
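As a hedged sketch, a captured form submission could be replayed outside the browser like this; the URL, form fields and cookie are placeholders taken from the intercepted request:
# Replay a captured POST with modified parameters and no fresh anti-CSRF token
curl -s -b "PHPSESSID=<CAPTURED_SESSION>" \
     -d "email=attacker@example.com&save=1" \
     "http://<WEBSITE>/account/update"
# If the change is accepted without a new token being required, the form is likely vulnerable to CSRF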
Identification and analysis of the different file extensions, which lets us know precisely which technologies are used on the Website.
Identification of the error handling triggered by modifying parameters. This task aims to generate controlled (or uncontrolled) errors that lead the Website to provide us with information such as versions, internal IP addresses and other details of the technology used.
Identification and verification of SSL certificates. To accomplish this task, we can use openssl 32 (as well as the TLSSLed33 tool, which allows us to verify this SSL information automatically). Finally, we need to determine the validity period of the certificates.
Information gathering on the SSL certificates:
./openssl s_client -connect <WEBSITE>:443
Note: shows the certificate and the types allowed by the server.
Testing SSLv2:
./openssl s_client -no_tls1 -no_ssl3 -connect <WEBSITE>:443
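To check the validity period mentioned above, the certificate can be piped into openssl x509; a minimal sketch:
# Extract the certificate and print its validity dates, issuer and subject
echo | ./openssl s_client -connect <WEBSITE>:443 2>/dev/null | ./openssl x509 -noout -dates -issuer -subject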
Identification and verification of encoding supported by the Website. To accomplish this task, we could use
the tool EcoScan34.
Identification and testing of the HTTP methods allowed by the Website. To accomplish this task, we could use a proxy or a client that lets us interact with the Website; once the connection is made, we test the different HTTP methods (HEAD, PUT, OPTIONS, TRACE, PROPFIND, CONNECT).
Note: With the cadaver tool, we could exploit the WebDAV methods (if they are enabled).
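If WebDAV does turn out to be enabled, a quick hedged check with cadaver (an interactive WebDAV client) could look like the sketch below; the /dav/ path is only an example. The telnet session that follows shows how to enumerate the allowed methods manually.
# Open an interactive WebDAV session and try to list and upload a harmless test file
./cadaver http://<WEBSITE>/dav/
# inside the cadaver prompt:
#   ls              list the collection
#   put test.txt    try to upload a file (proves WebDAV write access)
#   delete test.txt clean up afterwards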
./telnet <WEBSITE> 80
Trying 67.X.X.18...
Connected to <WEBSITE>.
Escape character is '^]'.
OPTIONS / HTTP/1.1
Host: 67.X.X.18

HTTP/1.1 200 OK
Date: Mon, 15 Apr 2013 16:13:08 GMT
Server: Apache
Allow: GET,HEAD,POST,OPTIONS

32 http://www.openssl.org/
33 http://blog.taddong.com/2013/02/tlssled-v13.html
34 http://open-labs.org/
Identification and search for comments, variables, debug information, values and other information, which
should not be in the HTML. To accomplish this task, we need to see the source code of the application and
search for keywords, variables or comments that could give us some interesting information.
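Over the copy of the site downloaded earlier with wget or httrack, a simple grep sketch (the keyword list is our own) can surface this kind of information:
# Search the mirrored site for comments, debug flags and other leftovers (illustrative keywords)
grep -rinE "<!--|todo|fixme|debug|password|passwd" ./<WEBSITE>/ | head -n 40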
Identification and analysis of Flash files in the Website. To accomplish this task, the first thing to do is
identify and download all flash files that exist on the Website. To do this, we could use the Google search
engine:
filetype:swf site:domain.com
On the other hand, we could also find the swf files with the wget tool:
./wget -r -l1 -H -t1 -nd -N -A.swf -erobots=off <WEBSITE> -i output_swf_files.txt
Note: In the above example: -r -l1 -> search recursively, but only one level deep in each subdirectory found; -t1 -> make only one connection attempt; -nd -> do not recreate the directory hierarchy, just copy the files; -N -> preserve the timestamp of the original downloaded file; [-np] no parent -> do not follow links to parent directories, only the current level and below; [-A.swf] -> -A tells wget which file type to download, in this case only "swf"; [-erobots=off] -> make wget ignore any 'robots.txt', since such files may tell crawlers (including wget) not to visit certain subdirectories. With this we avoid that restriction and cover the whole level.
Once we have identified and downloaded the *.swf files, we must analyze the code, the functions (such as loadMovie) and the variables, in order to identify those that can be called externally and could allow other types of vulnerabilities such as cross-site scripting. Below are some vulnerable functions:
_root.videourl = _root.videoload + '.swf';
video.loadMovie(_root.videourl);
getURL - payload. javascript:alert('css') getURL (clickTag, '_self')
load* (in this case: loadMovie) - payload: as
function.getURL,javascript:alert('css')
TextField.html - payload: <img src='javascript:alert("css")//.swf'>
To accomplish this task, we could use the tools Deblaze35 and SWFIntruder36 among others. We should also
analyze the parameter AllowScriptAccess, Flash Parameter Pollution or sensitive APIs as:
35 http://deblaze-tool.appspot.com/
36 https://www.owasp.org/index.php/Category:SWFIntruder
Normally we should then proceed with attacks on default accounts and dictionary attacks. Once we have identified an authentication system, we detect its architecture and then try user names and passwords known for that architecture and implementation, as well as other default accounts such as: admin, administrator, root, system, user, default, the application name, among others.
To carry out this attack we can do it manually or with tools like hydra37.
./hydra -L users.txt -P pass.txt <WEBSITE> http-head /private
It is also important to try brute force and default credentials, in order to attempt to bypass the authentication and to check whether the authentication system is configured and protected properly.
On the other hand, we should try to gain access to those areas which initially/supposedly are only accessible to authenticated users, commonly by means of directory traversal. It is also important to verify that the connections and settings that make up the typical credential-recovery system are configured correctly.
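A quick hedged way to verify this is to request a supposedly protected resource once with and once without the session cookie and compare the responses; the path and cookie below are placeholders:
# Compare status codes for a protected path with and without the session cookie
curl -s -o /dev/null -w "with cookie:    %{http_code}\n" -b "PHPSESSID=<VALID_SESSION>" "http://<WEBSITE>/private/report.pdf"
curl -s -o /dev/null -w "without cookie: %{http_code}\n" "http://<WEBSITE>/private/report.pdf"
# Identical 200 responses suggest the access control is not actually enforced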
Finally, we note that all these tests and phases of a Web vulnerability assessment (scanners and manual testing) can be carried out in white-box mode (with credentials for the application) and in black-box mode (without application credentials).
37 http://www.thc.org/thc-hydra/
Conclusion
As we have seen, for a complete analysis of the vulnerabilities of a Website we must take into account many variables and carry out all the steps described in this article (from information gathering to manual analysis, supported by a good automatic analysis with the many tools available).
Finally, it is important to note that each of the tests performed was conducted under certain conditions, which may be subject to change (updates, configuration changes, etc.); therefore, when trying to replicate similar tests on the same websites, the results may differ.
Thank you for reading our magazine from cover to cover. Please share with us your
comment about this issue on Twitter or Facebook:
@Hackinsight
http://www.facebook.com/hackinsight
The techniques described in our articles may only be used in private, local networks. The editors hold
no responsibility for misuse of the presented techniques or consequent data loss.