As you start to build a test suite, you may notice the runtime getting quite long. If it's so long that you aren't willing to run it at least once a day, you need to stop coding and focus on speeding up the tests, whether it involves the tests themselves or the code under test.
This assumes you have started to build a test suite using some of the following practices:
These are slow first steps toward adding tests to a system that was originally built without any automated testing. One of the trade-offs of getting moving on automated testing is writing relatively expensive tests. For instance, if one of your key algorithms is not adequately decoupled from the database, you will be forced to write a test case that sets up some tables, processes the input data, and then queries the state of the database afterwards.
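The alternative is to extract the algorithm itself into a pure function that takes plain data, so its test needs no table setup or teardown. A minimal sketch, with hypothetical names, assuming the algorithm is pricing logic:

```python
def total_with_discount(line_items, discount_rate):
    """Pure pricing logic: operates on plain (qty, unit_price) tuples,
    with no database access at all."""
    subtotal = sum(qty * unit_price for qty, unit_price in line_items)
    return round(subtotal * (1 - discount_rate), 2)

def test_total_with_discount():
    # The whole test is an in-memory check; no tables, no teardown.
    items = [(2, 10.00), (1, 5.50)]
    assert total_with_discount(items, 0.10) == 22.95

test_total_with_discount()
```

A test like this runs in microseconds; the database-backed version of the same check can easily take seconds once schema setup and cleanup are counted.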
As you write more tests, the time to run the test suite will certainly grow. At some point, you will feel less inclined to wait for your test suite to finish. Since a test suite is only good when it is used, you must pause development and refactor either the code or the test cases themselves in order to speed things up.
This is a problem I ran into. My test suite initially took about 15 minutes to run. It eventually grew to take one-and-a-half hours to run all the tests. I reached a point where I would only run it once a day and even skipped some days. One day I tried to do a massive code edit. When most of the test cases failed, I realized that I had not run the test suite often enough to detect which step broke things. I was forced to throw away all the code edits and start over. Before proceeding further, I spent a few days refactoring the code as well as the tests, bringing the run time of the test suite back down to a tolerable 30 minutes.
That is the key measurement: when you hesitate to run the test suite more than once a day, it may be a sign that things need to be cleaned up. Test suites are meant to be run multiple times a day.
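One way to keep the suite runnable many times a day, while the slow tests are being refactored, is to partition it: a fast subset that runs on every change, and a slow subset gated behind a flag for nightly runs. A sketch using Python's `unittest` (the `RUN_SLOW_TESTS` environment variable is a hypothetical convention, not a standard one):

```python
import os
import unittest

# Opt-in flag: slow tests run only when RUN_SLOW_TESTS=1 is set.
RUN_SLOW = os.environ.get("RUN_SLOW_TESTS") == "1"

class FastChecks(unittest.TestCase):
    def test_parsing(self):
        # Cheap, in-memory check: safe to run on every edit.
        self.assertEqual(int("42"), 42)

class SlowChecks(unittest.TestCase):
    @unittest.skipUnless(RUN_SLOW, "slow: set RUN_SLOW_TESTS=1 to run")
    def test_full_pipeline(self):
        # Expensive end-to-end check, reserved for the nightly run.
        self.assertTrue(True)
```

The fast subset keeps giving feedback every few minutes; the slow subset still runs at least daily, so nothing rots unnoticed.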
This happens because we have two competing interests: writing code and running tests. It's important to recognize this:
When testing takes a big chunk of our daily schedule, we must start choosing which is more important. We tend to gravitate toward writing more code, and this is probably the key reason people abandon automated testing and decide it is unsuitable for their situation.
It's tough, but if we can resist taking the easy way out, and instead do some refactoring of either the code or our tests, we will be encouraged to run the tests more often.
Exactly what to refactor is less science and more voodoo. It's important to seek out opportunities that give us a good yield, and to understand that it can be our test code, our production code, or some combination of both that needs refactoring:
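One refactoring that often gives a good yield, assuming the tests currently talk to a real database server, is swapping in a throwaway in-memory SQLite database per test, so setup and teardown cost milliseconds instead of seconds. A sketch with a hypothetical `orders` schema:

```python
import sqlite3

def make_test_db():
    """Build a throwaway in-memory database; it lives for one test only."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    return conn

def total_orders(conn):
    """The query under test, run against whatever connection it is given."""
    (total,) = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders"
    ).fetchone()
    return total

def test_total_orders():
    conn = make_test_db()
    conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                     [(10.0,), (2.5,)])
    assert total_orders(conn) == 12.5
    conn.close()

test_total_orders()
```

The trade-off: SQLite's SQL dialect differs from most production databases, so this sacrifices some fidelity for speed; a few tests against the real database should remain in the slow, nightly subset.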
Coverage data gathered from the tests can help identify candidates. All of these approaches have positive consequences for our code's quality: more efficient algorithms lead to better performance, and looser coupling helps keep our long-term maintenance costs down.