Identifying relevant files and directories from crawling results

We have already crawled an application's full directory and have the complete list of files and directories referenced inside it. The next natural step is to identify which of those files and directories are likely to contain useful information or to offer a better chance of finding vulnerabilities.

Rather than a recipe, this is a catalog of common names, prefixes, and suffixes used for files and directories that usually lead to information useful to the penetration tester, or to the exploitation of vulnerabilities that may end in a complete system compromise.

How to do it...

  1. First, we want to look for login and registration pages, the ones that can give us the chance to become legitimate users of the application, or to impersonate one by guessing usernames and passwords. Some examples of names or partial names, which are also used in the filtering sketch shown after this list, are:
    • Account
    • Auth
    • Login
    • Logon
    • Registration
    • Register
    • Signup
    • Signin
  2. Password recovery pages are another common source of usernames, passwords, and design vulnerabilities related to them:
    • Change
    • Forgot
    • lost-password
    • Password
    • Recover
    • Reset
  3. Next, we need to identify whether the application has an administrative section, a set of functions that may allow us to perform high-privileged tasks on it, such as:
    • Admin
    • Config
    • Manager
    • Root
  4. Other interesting directories are those belonging to Content Management System (CMS) administration, database administration, or application server consoles, such as:
    • Admin-console
    • Adminer
    • Administrator
    • Couch
    • Manager
    • Mylittleadmin
    • PhpMyAdmin
    • SqlWebAdmin
    • Wp-admin
  5. Testing and development versions of applications are usually less protected and more prone to vulnerabilities than final releases, so they are good targets in our search for weak points. These directory names may include:
    • Alpha
    • Beta
    • Dev
    • Development
    • QA
    • Test
  6. Web server information and configuration files are as follows:
    • config.xml
    • info
    • phpinfo
    • server-status
    • web.config
  7. Also, all directories and files marked with Disallow in robots.txt may be useful; a short sketch for extracting those entries is shown after the filtering example below.
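
The following is a minimal sketch of how the keyword lists from the preceding steps could be matched against the crawl results. It assumes the crawler's output has been exported to a plain-text file with one URL or path per line; the file name crawl_results.txt is only a placeholder.

#!/usr/bin/env python3
# A rough helper to classify crawled paths using the keyword lists above.
# Assumption: the crawl results were saved as a plain-text file with one
# URL or path per line (crawl_results.txt is only an example name).
import sys

KEYWORDS = {
    "authentication": ["account", "auth", "login", "logon",
                       "registration", "register", "signup", "signin"],
    "password recovery": ["change", "forgot", "lost-password",
                          "password", "recover", "reset"],
    "administration": ["admin", "config", "manager", "root"],
    "admin consoles": ["admin-console", "adminer", "administrator", "couch",
                       "mylittleadmin", "phpmyadmin", "sqlwebadmin", "wp-admin"],
    "test/dev": ["alpha", "beta", "dev", "development", "qa", "test"],
    "server info": ["config.xml", "info", "phpinfo",
                    "server-status", "web.config"],
}

def classify(path):
    # Return every category whose keywords appear anywhere in the path.
    lowered = path.lower()
    return [category for category, words in KEYWORDS.items()
            if any(word in lowered for word in words)]

if __name__ == "__main__":
    results_file = sys.argv[1] if len(sys.argv) > 1 else "crawl_results.txt"
    with open(results_file) as crawl:
        for line in crawl:
            url = line.strip()
            matches = classify(url)
            if matches:
                print(url, "->", ", ".join(matches))

Short keywords such as info or test will inevitably produce false positives, so the output should be treated as a starting point for manual review rather than a definitive list of targets.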
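
Similarly, the following sketch pulls the Disallow entries out of robots.txt, since those are the paths the site owner explicitly does not want visited or indexed. The target URL is a placeholder and should be replaced with the application under test.

#!/usr/bin/env python3
# A rough helper to list the Disallow entries of a site's robots.txt.
# Assumption: http://target.example.com is only a placeholder for the
# application being tested.
import sys
import urllib.request

def disallowed_paths(base_url):
    # Download robots.txt and return the paths marked with Disallow.
    with urllib.request.urlopen(base_url.rstrip("/") + "/robots.txt") as response:
        robots = response.read().decode("utf-8", errors="replace")
    paths = []
    for line in robots.splitlines():
        line = line.strip()
        if line.lower().startswith("disallow:"):
            paths.append(line.split(":", 1)[1].strip())
    return paths

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "http://target.example.com"
    for path in disallowed_paths(target):
        print(path)

The paths it prints can be appended to the crawl results and fed through the same classification script as above.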

How it works...

Some of the names listed in the preceding section, and their variations in the language the target application is written in, may give us access to restricted sections of the site, which is a very important step in a penetration test. Some of them will provide us with information about the server, its configuration, and the development frameworks used. Others, such as the Tomcat manager and JBoss administration pages, if configured incorrectly, will let us (or a malicious attacker) take control of the web server.
