Besides unit and integration testing, other forms of testing should be considered. They are described in the next sections.
User Acceptance Testing (UAT) looks at testing from a user's point of view. In the case of an API, the user is a piece of software consuming the service. Regardless of the type of user, this form of testing is important to ensure that a RESTful web service exposes a consistent and feature-complete API. UAT tends to be less automated than other types of testing. However, UAT test managers should ultimately have the final say in whether a software solution is ready for general availability.
Another important criterion in measuring the production readiness of a RESTful web service is whether it will perform in line with the expected Service Level Agreements (SLAs) under load. For example, during peak times, the service might be expected to handle 1,000 requests per second, with an average response time of no more than 250 milliseconds.
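A quick sanity check of such an SLA can be done with Little's law, which relates throughput, response time, and concurrency. The sketch below uses the example figures from the text (1,000 requests per second, 250 ms average response time) to estimate how many requests the service must handle in flight at once; the numbers are illustrative, not a substitute for a real load test.

```python
# Back-of-envelope capacity check using Little's law:
#   concurrency = arrival_rate * average_response_time
# The figures below are the SLA example from the text.
requests_per_second = 1_000
avg_response_time_s = 0.250  # 250 milliseconds

# Average number of requests in flight the service must sustain.
concurrent_requests = requests_per_second * avg_response_time_s
print(int(concurrent_requests))  # 250 requests in flight on average
```

This tells us that meeting the SLA implies servicing roughly 250 simultaneous requests at peak, which in turn informs thread-pool and connection-pool sizing.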
There are a number of commercial and open source products for testing whether a web service can handle such load. The most widely used is Apache JMeter (http://jmeter.apache.org), an open source tool. With JMeter, developers can create test plans that execute requests at defined rates and capture response times. The screenshot that follows shows the result of running a test plan that contains one call to our sample property management system to retrieve the room with ID 1:
We executed http://localhost:8080/rooms/1 1,000 times concurrently (with 10 threads), and the average response time was 11 ms. By increasing the number of threads, we can simulate a heavier load on the service.
Simulating real production load is difficult, so service designers may observe discrepancies between performance under simulated load and performance under real load. This does not diminish the value of load testing; it merely means that designers should not rely solely on load testing results to ensure that SLAs are met.