
In this article I will talk about tools for testing the security of web applications. The main stages and the checklist of tasks were presented in the previous article.
Most of the utilities are contained in popular penetration testing distributions: Kali Linux, BlackArch, BackBox Linux. For those who do not have the opportunity for one reason or another to use these distributions, I publish links to github / utility pages.
Main steps
For completeness testing, you should try to follow the recommendations below, customizing certain steps depending on the web application. Next, I will describe the stages and utilities that fit each of them.
Reconnaissance
Port scanning. At this stage the timeless classic, nmap, will help. For those encountering this utility for the first time, keep in mind that by default nmap scans only ~1000 ports (the most popular ones) and does not scan UDP at all.
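To illustrate why the defaults matter, below is a minimal TCP connect scan sketch in Python; the hostname is a placeholder and should be replaced with a target you are authorized to test. It only shows the idea of choosing the port range yourself, nothing close to what nmap can do.

```python
# Minimal TCP connect scan sketch (illustrative only; target is a placeholder).
# Unlike a default nmap run, the port range here is chosen explicitly.
import socket

TARGET = "scanme.example.com"  # hypothetical host, use only an authorized target

def is_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

open_ports = [p for p in range(1, 1025) if is_open(TARGET, p)]
print("open TCP ports:", open_ports)
```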
Subdomain scanning. At this stage, working with the dig utility and understanding AXFR (zone transfer) queries is useful, as is the subbrute utility.
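The idea behind subbrute can be sketched in a few lines: resolve candidate names and keep the ones that answer. The domain and the wordlist below are made-up examples; real tools use large wordlists and their own resolvers.

```python
# Minimal subdomain brute-force sketch (domain and wordlist are placeholders).
import socket

DOMAIN = "example.com"                      # hypothetical target domain
CANDIDATES = ["www", "mail", "dev", "test", "staging", "vpn"]

for name in CANDIDATES:
    fqdn = f"{name}.{DOMAIN}"
    try:
        addr = socket.gethostbyname(fqdn)   # resolves via the system resolver
        print(f"{fqdn} -> {addr}")
    except socket.gaierror:
        pass                                # name does not resolve, skip it
```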
Examination of visible content. Here, if not strange, you will need your own eyes - in order to visually explore a web application, to understand its logic of work. Small hint: in order to make the initial check anonymous and not attract attention, use the cache of search engines and systems like google.tranlsate.
Search for hidden content (directories, files, information). At this stage the dirb and dirsearch utilities will be useful; you can also use Foca (outdated) and maltego (registration required, a paid version exists).
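The core of what dirb and dirsearch do is simple enough to sketch: request candidate paths and note everything that does not return 404. The base URL and wordlist below are assumptions; the real tools add large wordlists, extensions, recursion and response filtering.

```python
# Minimal hidden-content discovery sketch (base URL and wordlist are placeholders).
import requests

BASE = "https://target.example"             # hypothetical, test only with permission
WORDS = ["admin", "backup", "config", "uploads", ".git", "old"]

for word in WORDS:
    url = f"{BASE}/{word}"
    r = requests.get(url, allow_redirects=False, timeout=5)
    if r.status_code != 404:
        print(r.status_code, url)           # 200 / 301 / 403 etc. deserve a closer look
```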
Identification of the platform and web environment. Here you can use the wappalyzer browser add-on or the whatweb utility.
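A very rough version of such fingerprinting can be done by looking at response headers alone; the URL below is a placeholder. wappalyzer and whatweb also inspect the HTML, cookies and JavaScript, so treat this only as a first pass.

```python
# Minimal fingerprinting sketch based on response headers (URL is a placeholder).
import requests

r = requests.get("https://target.example", timeout=5)
for header in ("Server", "X-Powered-By", "X-AspNet-Version", "Set-Cookie"):
    if header in r.headers:
        print(f"{header}: {r.headers[header]}")
```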
Identification of input forms. At this stage you can limit yourself to a visual inspection of the forms on the pages found during the search for hidden content.
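If there are many pages, a small helper can list the forms for you before the visual pass. The sketch below uses only the standard library; the URL is a placeholder, and it only prints form-related tags, so it complements rather than replaces looking at the pages yourself.

```python
# Minimal form-discovery sketch using only the standard library (URL is a placeholder).
from html.parser import HTMLParser
from urllib.request import urlopen

class FormParser(HTMLParser):
    def handle_starttag(self, tag, attrs):
        # Print every form-related tag together with its attributes.
        if tag in ("form", "input", "textarea", "select"):
            print(tag, dict(attrs))

html = urlopen("https://target.example").read().decode("utf-8", errors="replace")
FormParser().feed(html)
```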
Separately, I would like to mention the all-in-one "harvesters" for information gathering: theharvester and recon-ng. With these tools you can obtain quite a lot of information, from identifying accounts and subdomains to finding sensitive information on the site.
Access control
At this stage, both tool-based and manual verification of the password policy requirements is needed.
To check, conduct a dictionary attack, for example with hydra or patator, using credentials that are already known to be valid: this way you can reveal the protection against such attacks (or its absence).
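The same idea can be sketched for a single web login form. Everything here is an assumption: the URL, the field names and the failure marker; hydra and patator support many protocols and handle threading and failure detection for you. The point of keeping a known-valid password in the list is that if it also stops working after a few attempts, some lockout or rate limiting is in place.

```python
# Minimal dictionary-attack sketch against a hypothetical login form.
import requests

LOGIN_URL = "https://target.example/login"   # hypothetical endpoint
USER = "admin"                               # hypothetical account
PASSWORDS = ["123456", "password", "admin", "KnownGoodPassword!"]  # last one assumed valid

for pwd in PASSWORDS:
    r = requests.post(LOGIN_URL, data={"username": USER, "password": pwd}, timeout=5)
    # "Invalid credentials" is an assumed failure marker for this hypothetical app.
    if "Invalid credentials" not in r.text:
        print("possible hit or lockout response for:", pwd, "->", r.status_code)
```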
Determining password policy requirements. This is a manual check of the policy logic. Allowing digits-only passwords (such as a PIN code) without brute-force protection is a very bad idea.
Testing account recovery. At this stage, request several password reset links or triggers (preferably for different accounts). Here you will often need to identify the hash used in them (a frequent occurrence), for example with hashID. Next, compare the reset triggers (for example, the links) using comparison utilities (for example, Comparer in Burp Suite).
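Both checks can be roughed out in a few lines: guess a hash type from its shape (hashID does this far more thoroughly) and diff two reset tokens (what Burp's Comparer does interactively). The token values below are made-up examples.

```python
# Rough sketch: guess a hash type by its shape and diff two reset tokens.
import difflib
import re

def guess_hash(value: str) -> str:
    if re.fullmatch(r"[0-9a-f]{32}", value, re.I):
        return "looks like MD5 (or another 128-bit hex digest)"
    if re.fullmatch(r"[0-9a-f]{40}", value, re.I):
        return "looks like SHA-1"
    if re.fullmatch(r"[0-9a-f]{64}", value, re.I):
        return "looks like SHA-256"
    return "unknown format"

token_a = "5f4dcc3b5aa765d61d8327deb882cf99"   # hypothetical reset token for account A
token_b = "5f4dcc3b5aa765d61d8327deb882cf98"   # hypothetical reset token for account B

print(guess_hash(token_a))
for line in difflib.ndiff([token_a], [token_b]):
    print(line)   # the '?' lines mark which characters of the token differ
```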
Testing session persistence functions. Testing account identification features. Authorization and access control. Session studies (lifetime, session tokens, flags, concurrent sessions, etc.). CSRF checks. For these tasks mantra is well suited; it is available as both a Firefox and a Chrome build.
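Two of these session checks lend themselves to a quick script: inspecting cookie flags and looking for an anti-CSRF token in a form. The URL and the "csrf" substring heuristic below are assumptions; mantra and Burp let you do this interactively and far more reliably.

```python
# Minimal sketch: cookie flags and a crude anti-CSRF token check (URL is a placeholder).
import requests

r = requests.get("https://target.example/login", timeout=5)

for cookie in r.cookies:
    print(cookie.name,
          "secure:", cookie.secure,
          "httponly:", cookie.has_nonstandard_attr("HttpOnly"))

# Crude CSRF check: does the page contain anything that looks like a token field?
if "csrf" not in r.text.lower():
    print("no obvious CSRF token field found in the login page")
```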
Parameter fuzzing
Testing of a web application can be performed both in fully automated mode (w3af, vega, arachni, sqlmap, Acunetix, Netsparker, etc.) and in semi-automated mode (Burp Suite, OWASP ZAP, etc.).
Using these tools in both automatic and manual (the most accurate) mode, you can identify the following vulnerabilities: injections (SQL, SOAP, LDAP, XPath, etc.), XSS, unvalidated redirects and forwards, and the whole spectrum of web vulnerabilities (OWASP Top 10).
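For a feel of what "parameter fuzzing" means at its simplest, here is a sketch that injects a few payloads into one GET parameter and looks for reflections or database error strings. The URL, parameter name, payloads and error markers are all assumptions; sqlmap, Burp and ZAP do this far more thoroughly.

```python
# Minimal parameter-fuzzing sketch (URL, parameter and payloads are placeholders).
import requests

URL = "https://target.example/search"        # hypothetical endpoint
PARAM = "q"
PAYLOADS = ["'", "\"", "<script>alert(1)</script>", "1 OR 1=1"]
ERROR_MARKERS = ["SQL syntax", "ORA-", "Warning: mysqli", "unterminated string"]

for payload in PAYLOADS:
    r = requests.get(URL, params={PARAM: payload}, timeout=5)
    reflected = payload in r.text            # possible XSS / reflection
    errored = any(marker in r.text for marker in ERROR_MARKERS)  # possible injection
    if reflected or errored:
        print(f"payload {payload!r}: reflected={reflected}, error_marker={errored}")
```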
Checking the web application logic
Testing the application logic on the client side. Testing for the so-called race condition. Testing whether information is accessible according to access rights (or the lack thereof). Checking the possibility of duplicating or splitting data. At this stage you will need to study the application's logic well and work with Burp Suite, OWASP ZAP or the same mantra. Detecting such vulnerabilities automatically is almost impossible (apart from tools that look for formal signs of such vulnerabilities in the source code).
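A race condition, at least, can be probed semi-automatically: fire several identical requests at an action that should only succeed once and count the successes. The endpoint and payload below are assumptions (a hypothetical one-time coupon); more than one success suggests a race condition in the server-side logic.

```python
# Minimal race-condition probe against a hypothetical one-shot endpoint.
from concurrent.futures import ThreadPoolExecutor
import requests

URL = "https://target.example/api/redeem"    # hypothetical endpoint
PAYLOAD = {"coupon": "WELCOME10"}            # hypothetical one-time coupon code

def redeem(_):
    return requests.post(URL, json=PAYLOAD, timeout=5).status_code

with ThreadPoolExecutor(max_workers=10) as pool:
    codes = list(pool.map(redeem, range(10)))  # 10 concurrent redemption attempts

print("status codes:", codes)
print("successful redemptions:", codes.count(200))
```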
Checking server environment
Checking the server architecture. Searching for and identifying publicly known vulnerabilities. Checking server accounts (for services and daemons). Determining server or component settings (SSL/TLS, etc.). Authorization checks. Here you can use both specialized per-service scanners and well-known ones such as OpenVAS and Armitage / Metasploit.
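One small piece of the server-settings check, the TLS configuration, can be looked at with the standard library alone: report the negotiated protocol version and the certificate expiry date. The hostname is a placeholder; dedicated scanners such as OpenVAS also check cipher suites and known CVEs, which this sketch does not attempt.

```python
# Minimal TLS configuration check (hostname is a placeholder).
import socket
import ssl

HOST, PORT = "target.example", 443           # hypothetical host

ctx = ssl.create_default_context()
with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        print("negotiated protocol:", tls.version())
        cert = tls.getpeercert()
        print("certificate valid until:", cert.get("notAfter"))
```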
Summary
With a plan for testing the application in hand, we can step by step examine all of its components for particular vulnerabilities. Depending on the web application, individual items can be supplemented with application-specific checks or tools.
In the comments I will be glad to answer your questions.