How it works...

In this recipe, we covered the key precautions to take to avoid damaging information or disrupting services when executing automated scans against our target application.

The main reason special measures are required is that web application vulnerability scanners, in their default configurations, tend to crawl the entire site and use the URLs and parameters gathered during that crawl to send further payloads and probes. In applications that don't properly filter the data they receive, these probes can end up stored in the database or executed by the server, which can cause integrity problems, permanently alter or damage existing information, or disrupt services.
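To make this concrete, here is a minimal sketch of how a scanner's probe can persist: a hypothetical comment handler (the function name, table, and payload are illustrative, not from the recipe) that stores submitted input without any filtering, so whatever the scanner sends through a form field ends up in the database.

```python
# Sketch of a vulnerable endpoint: user input is stored with no
# validation, so an automated scanner's probes persist in the database.
import sqlite3

def save_comment(db, text):
    # No filtering or validation: the raw probe string is stored as-is.
    db.execute("INSERT INTO comments (body) VALUES (?)", (text,))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE comments (body TEXT)")

# A typical XSS probe a scanner might submit through a comment form.
save_comment(db, "<script>alert('xss-probe-1337')</script>")

stored = db.execute("SELECT body FROM comments").fetchall()
print(stored)  # the probe now lives alongside real data
```

If hundreds of such probes are stored during a crawl-and-attack run, legitimate records can be polluted or overwritten, which is exactly the kind of integrity damage the precautions in this recipe are meant to prevent.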

To prevent this damage, we recommended a series of actions focused on preparing the testing environment, knowing what the tools are doing and keeping them updated, carefully selecting what is to be scanned, and keeping extensive records of all actions.
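Two of these actions, selecting what is scanned and recording everything, can be sketched in a few lines. The snippet below assumes a plain list of crawled URLs; the exclusion patterns, host name, and logger name are hypothetical examples, not part of the recipe. It filters out endpoints that would alter or delete data and logs every in/out decision for the test record.

```python
# Sketch: restrict the scan scope to non-destructive URLs and keep a
# log of every decision, two of the precautions discussed above.
import logging
import re

# Hypothetical patterns for endpoints that change or delete data.
EXCLUDE_PATTERNS = [r"/delete", r"/logout", r"/admin/reset"]

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("scan-scope")

def build_scope(crawled_urls):
    """Return only URLs that are safe to probe, logging each decision."""
    scope = []
    for url in crawled_urls:
        if any(re.search(p, url) for p in EXCLUDE_PATTERNS):
            log.info("excluded from scan: %s", url)
        else:
            log.info("in scope: %s", url)
            scope.append(url)
    return scope

urls = [
    "http://testsite.local/profile?id=2",
    "http://testsite.local/account/delete?id=2",
    "http://testsite.local/logout",
]
print(build_scope(urls))
```

Most scanners offer equivalent built-in features (scope rules, exclusion lists, request logs); the point of the sketch is simply that the exclusion list and the log are prepared before the scan runs, not reconstructed afterwards.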
