Shortcomings of integration tests

However, these tests are not sufficient to verify the application's production behavior. Differences in technology, configuration, or runtime will eventually lead to gaps in the test cases. Examples are enterprise containers of different versions, mismatches in the bean configuration once the whole application is deployed, different database implementations, or differences in JSON serialization.

Ultimately, the application runs in production, so it makes a lot of sense to verify its behavior in environments that are equivalent to production.

Certainly, it is advisable to craft several test scopes, combining tests with an isolated scope and fast feedback with broader integration tests. The shortcoming of code-level integration tests is that they often take a great amount of time.

In my past projects, integration tests that run containers, such as Arquillian tests, were usually responsible for the vast majority of the build time, resulting in builds of 10 minutes and more. This greatly slows down the Continuous Delivery pipeline, leading to slow feedback and fewer builds. An attempt to mitigate this shortcoming is to use remote or managed containers in Arquillian tests. These containers have a longer life cycle than the test run and thus eliminate the container startup time.
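The test code itself does not change with the container mode; whether it runs against an embedded, managed, or remote container is decided by the container adapter dependency on the classpath and the arquillian.xml configuration. The following is a minimal sketch of such a test, assuming hypothetical CarManufacturer and Car classes:

```java
import javax.inject.Inject;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Test;
import org.junit.runner.RunWith;

import static org.junit.Assert.assertNotNull;

@RunWith(Arquillian.class)
public class CarManufacturerIT {

    @Deployment
    public static WebArchive createDeployment() {
        // the archive is deployed to whichever container the Arquillian
        // adapter and arquillian.xml point to (embedded, managed, or remote)
        return ShrinkWrap.create(WebArchive.class)
                // CarManufacturer and Car are assumed example classes
                .addClasses(CarManufacturer.class, Car.class)
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Inject
    CarManufacturer carManufacturer;

    @Test
    public void testManufactureCar() {
        assertNotNull(carManufacturer.manufactureCar());
    }
}
```

With a remote adapter, the archive is deployed to an already running server, so the container startup no longer adds to every test run.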

Code-level integration tests are a helpful way to quickly verify application configuration, which cannot be tested using unit or component tests. They are not well suited for testing business logic.
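As a sketch of such a configuration check, the JSON mapping of a domain class can be verified directly with JSON-B, without starting a container. The Car class and its identifier property are assumptions for illustration:

```java
import javax.json.bind.Jsonb;
import javax.json.bind.JsonbBuilder;

import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertTrue;

public class CarJsonTest {

    private Jsonb jsonb;

    @Before
    public void setUp() {
        jsonb = JsonbBuilder.create();
    }

    @Test
    public void testCarSerialization() {
        // Car is a hypothetical domain class with an 'identifier' property
        Car car = new Car();
        car.setIdentifier("X123");

        String json = jsonb.toJson(car);

        // verifies the mapping configuration, such as property names
        assertTrue(json.contains("\"identifier\":\"X123\""));
    }
}
```

Such a test gives fast feedback on the mapping configuration, but the JSON-B implementation used in the test may still differ from the one shipped with the production container, which is exactly the kind of gap described earlier.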

Integration tests that deploy the whole application to simulated environments, such as embedded containers, provide certain value, but are not sufficient to verify production behavior, since these environments are not equivalent to production. Whether executed on the code level or in simulated environments, integration tests tend to slow down the overall pipeline.
