Chapter 14. Test-Driven Development

The best candidates for rapid development are systems for which requirements remain constant throughout the lifetime of the project. Sadly, such projects are a rarity, possibly to the extent that like such mythical beasts as the Yeti and the Loch Ness monster, their existence is questionable at best, and most likely fictitious.

Software projects have proven themselves prone to change at even the latest stage of the development process. A change can strike a project from a variety of directions: requirements are subject to change when use cases are enhanced, removed, or added; the system design may also change, either to reflect an update to the requirements or because of an initial discrepancy or shortcoming. Furthermore, an application’s code base is constantly in a state of flux, as mistakes are made and rectified, and enhancements and optimizations are applied.

We’ve already covered techniques that seek to both assess the impact of change and mitigate its disruptive effects. Models offer a low-cost option for measuring the consequences a change will have on the ability of the design to execute the system, while code generation helps build an adaptive foundation for the project in order that changes can be absorbed without unduly stretching out the project schedule. We’ve seen that a layered design is also important, as it confines the potential ripple effects of a change to a single part of the architecture.

Another software engineering method can be added to this arsenal of techniques to further strengthen the base of our adaptive foundation: test-driven development.

Test-driven development is an approach in which the project is protected from change by a shield of unit tests. In this chapter, we examine the benefits of test-driven practices for rapid development and identify how testing techniques contribute to productivity and agility.

The testing activity is critical to the successful delivery of a software system, and over the next two chapters we discuss the various aspects of the testing process and the role it plays in rapid development.

The focus of this chapter is on the advantages and application of test-driven development, and we introduce the unit-testing framework JUnit. Methods for autogenerating and running unit tests are covered, as is the role of mock objects in the testing process.

Testing as a Development Paradigm

Software engineers generally regard testing as a necessary evil. While all developers acknowledge the important role of testing in producing quality software, few admit to relishing the task of testing their own code, or anybody else’s for that matter.

XP has turned this perception of the testing process around, breathing new life into the art of software testing. XP has achieved this turnaround by making the writing and running of tests central to the development effort.

Note

The practices of XP are covered in Chapter 3.

For many developers, testing prior to the practices of XP was a laborious process. A test case had to be written, test data prepared, and expected results documented before any test could be executed. To make the testing process more palatable, XP seized upon the notion of using automated scripts for all developer-related testing. With XP, developers write code to test code, and writing code is something every engineer enjoys.

The XP approach to testing differs from traditional processes in that the test is written before any implementation code is produced. With the test in place, the next task is to write the minimal amount of code to pass the test—no more, no less. Writing the code to pass a test has the test driving the development process, hence the term test-driven development, or test-first development, as it is also known.
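The rhythm can be illustrated with a small, hypothetical example. `AccountValidator` and its eight-character rule are invented purely for this sketch, and plain Java checks stand in for JUnit's assert methods: the test is conceived first, then the minimal implementation is written to make it pass.

```java
// Hypothetical sketch of the test-first rhythm. AccountValidator and its
// eight-character rule are invented for illustration; plain Java checks
// stand in for JUnit asserts.

// Step 2: the minimal implementation written to make the test pass -- no more.
class AccountValidator {
    boolean isValid(String accountNumber) {
        return accountNumber != null && accountNumber.length() == 8;
    }
}

public class TestFirstSketch {
    // Step 1: the test, conceived before AccountValidator existed.
    public static void main(String[] args) {
        AccountValidator validator = new AccountValidator();
        if (validator.isValid("1234567"))
            throw new AssertionError("seven characters should be invalid");
        if (!validator.isValid("12345678"))
            throw new AssertionError("eight characters should be valid");
        System.out.println("All tests passed");
    }
}
```

Run before the validator exists and the code does not even compile; the failing test then drives the implementation into being.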

This test-driven approach moves the testing process to the forefront of the engineer’s attention. The result of this emphasis on testing is tests that prove the implementation meets the requirements and code built to pass these tests.

Test-driven development existed before XP. However, XP has popularized the practice and helped gain widespread acceptance of the merits of test-centric development.

Adopting test-driven development does not require you to embrace the XP process. XP embodies a range of interlocking practices, of which test-driven development is only one. Instead, a test-first approach to development is a valuable addition to any methodology, and its use is actively encouraged as part of the IBM Rational Unified Process (RUP).

Note

Chapter 3 covers the IBM Rational Unified Process.

The Benefits of Test-Driven Development

Following a test-driven approach to development enables the production of a comprehensive suite of automated tests over the course of the project that can be run quickly and easily. By including the testing in the build process, the impact of an application change on the code base can immediately be gauged. The cost of identifying implementation errors at the construction stage is far less than discovering them during a formal testing process by a separate quality assurance team.

Here is a summary of the benefits a test-driven approach can provide:

  • Testing requirements are considered early.

  • Tests are not omitted, since they are written first.

  • The act of writing tests serves to clarify requirements.

  • Writing testable code tends to produce better organized software.

  • The usability of interfaces is improved because developers are required to work with the interface under test.

  • Code changes can be validated immediately as part of the build process.

  • The practice of refactoring is supported.

  • A higher quality deliverable is provided to the quality assurance team, since unit tests catch defects earlier in the process.

A test-driven approach therefore provides a project with the agility it needs to incorporate change into its structure. Changes can be made by the software engineer quickly, accurately, and with impunity.

Having a comprehensive test suite in place allows invasive techniques such as refactoring to be followed safely. Thus, engineers can undertake sweeping changes to the code base, safe in the knowledge that any errors introduced by such a process will be detected immediately.

The Cost of Test-Driven Development

A test-driven approach to development can significantly improve the quality of software delivered into formal testing. However, while the theory behind test-driven development is attractive, the practice is not without its pitfalls. The writing of code, whether it is for adding functionality or writing tests, still consumes valuable project resources, namely a developer’s time. The implementation of an effective automated test suite is not a trivial task. Some of the factors to consider are these:

  • Complexity.

    Each test that forms part of a greater test suite must operate in isolation. It must not exhibit side effects that have an impact on the behavior of other tests within the suite. Achieving this level of isolation between tests is a technical challenge, since the use of common resources, specifically the database, increases coupling between tests.

  • Test coverage.

    A strategy should exist to define the scope and distribution of all tests. A project with a poorly structured test strategy is liable to unit test bloat, whereby developers generate an excessive volume of unit tests with overlapping test coverage. To avoid test duplication, ensure you define your testing strategy early in the project and communicate it to all members of the team.

  • Maintenance.

    As the code base of the application grows, so does the number of automated unit tests. Requirement and design changes are likely to result in the need to update numerous test cases. Although the benefits of the automated test may justify the maintenance overhead of tests, this additional time and cost must be factored into the project schedule.

  • The build process.

    To be truly effective, a regularly scheduled build process should execute all unit tests as part of the build. The process needs to be able to run all unit tests and report test failures accordingly. The effort required to establish and maintain unit tests as part of the build process must be factored into the project’s timeline.

Test-driven development can be undertaken effectively with the application of a sound test strategy and the use of a suitable testing framework. In the next sections, we look at the use of the open source testing framework JUnit for building test suites.

Introducing JUnit

For writing unit tests for Java, we have to look no further than JUnit, an open source testing framework initially developed by Kent Beck and Erich Gamma. JUnit has become the de facto standard for Java unit testing. It is well supported by most development tools, and a wealth of reference material, tutorials, and other literature is available on its use.

The JUnit framework provides a ready-made test harness for executing unit tests as well as an API for reporting the success or failure status of a test. The JUnit framework is available for download from http://www.junit.org.

The success of JUnit is due in part to its simple design, as JUnit is both easy to learn and work with. Listing 14-1 shows the basic structure of a JUnit test case.

Example 14-1. JUnit Test Case

package customer.admin.ejb;

import junit.framework.TestCase;

public class CustomerAdminBeanTest extends TestCase {

  protected void setUp() throws Exception {
    super.setUp();
  }

  protected void tearDown() throws Exception {
    super.tearDown();
  }

  public CustomerAdminBeanTest(String arg0) {
    super(arg0);
  }

  public final void testValidateAccount() {
    //TODO Implement validateAccount().
  }
}

The example unit test shown in Listing 14-1 was automatically generated using Eclipse for a session bean with a single business method, validateAccount(). In its current state, the code shown isn’t going to be performing much of a test, as we still have to add the testing functionality to the class. Nevertheless, the code does illustrate the structure of a JUnit test case.

To create a test that can be executed by the JUnit framework, you must write a test class responsible for conducting the individual test. This is achieved by defining a subclass of TestCase.

For each method on the class under test, a test method is required in the testing class. The method name should be prefixed with test. Methods must also be public, have no parameters, and return void. Adhering to these rules means the JUnit framework can use Java reflection to identify the test methods on the class dynamically. Without this approach, it would be necessary for all test methods to be registered with JUnit for execution. Reflection therefore simplifies the process of setting up a test case.

The basic test shown in Listing 14-1 contains only a single test method, testValidateAccount(). All the preparation work for the test can be done inside of this method—for example, instantiating objects required for the test or creating database connections. All of this test preparation, however, clutters the test code—something we must avoid, since a test should be easily readable.

For this reason, TestCase has two protected methods that can be overridden to provide setup and housekeeping tasks. The setUp() method is run before each test, while tearDown() is run once the test has completed.

tip

The setUp() and tearDown() methods are run before every test method in the TestCase. If you have a setup-type operation that needs to be performed just once for all tests within the TestCase, then you must do this in the constructor. However, be wary of all code that is placed within the constructor, as JUnit simply reports that the test case could not be instantiated if an error occurs. A more detailed stack trace can be obtained from JUnit if setUp() is used.
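The reflective discovery rules and the per-test setUp()/tearDown() cycle can be made concrete with a toy runner. This is not JUnit's actual implementation, and SampleTest is invented for the illustration; it simply shows how reflection can find the qualifying methods and wrap each one in the fixture calls:

```java
import java.lang.reflect.Method;

// Toy sketch of reflective test discovery -- not JUnit's actual code.
// Test methods are public, take no parameters, return void, and are
// prefixed with "test"; setUp() and tearDown() wrap every one of them.
public class MiniTestRunner {

    public static class SampleTest {
        public final StringBuilder log = new StringBuilder();
        public void setUp()       { log.append("setUp "); }
        public void tearDown()    { log.append("tearDown "); }
        public void testFirst()   { log.append("first "); }
        public void helper(int n) { }  // ignored: wrong name and signature
    }

    public static String run(SampleTest testCase) {
        try {
            for (Method method : testCase.getClass().getMethods()) {
                boolean isTest = method.getName().startsWith("test")
                        && method.getParameterTypes().length == 0
                        && method.getReturnType() == void.class;
                if (!isTest) continue;
                testCase.setUp();        // fresh fixture before each test
                method.invoke(testCase);
                testCase.tearDown();     // housekeeping after each test
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return testCase.log.toString();
    }

    public static void main(String[] args) {
        System.out.println(run(new SampleTest()));
    }
}
```

Because discovery is driven purely by naming and signature conventions, no registration step is needed, which is the simplification the JUnit designers were after.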

You can confirm expected results at each step of the test’s execution with the JUnit API. JUnit provides a number of assert methods for this purpose, where a failed assertion causes JUnit to report that the test has failed. Table 14-1 lists the different assertion types provided by JUnit for determining the status of an executing test.

Table 14-1. JUnit Assertion Types

Assert Type      Description

assertEquals     Asserts that two items are equal.
assertFalse      Asserts that a condition is false.
assertNotNull    Asserts that an object is not null.
assertNotSame    Asserts that two references do not refer to the same object.
assertNull       Asserts that an object is null.
assertSame       Asserts that two references refer to the same object.
assertTrue       Asserts that a condition is true.
fail             Fails a test unconditionally.

Listing 14-2 shows a complete test. In the example, the CustomerAdminBean session bean is tested from outside of the container using the remote interface of the enterprise bean.

Example 14-2. CustomerAdminBeanTest.java with Complete Test

package customer.admin.interfaces;

import java.util.Hashtable;
import javax.naming.InitialContext;
import junit.framework.TestCase;
import customer.domain.Customer;
import customer.factory.CustomerFactory;

public class CustomerAdminBeanTest extends TestCase {

  private CustomerAdminBean customerAdminBean = null;

  /**
   * Obtain a remote interface to the Customer EJB.
   */
  protected void setUp() throws Exception {
    super.setUp();

    Hashtable props = new Hashtable();

    props.put(InitialContext.INITIAL_CONTEXT_FACTORY,
       "weblogic.jndi.WLInitialContextFactory");
    props.put(InitialContext.PROVIDER_URL, "t3://localhost:7001");

    customerAdminBean = CustomerAdminBeanUtil.getHome(props).create();
  }

  /**
   * Expects the class under test to declare the Customer does not have a valid
   * account, i.e. method should return false
   *
   * @throws Exception
   */
  public final void testValidateAccount() throws Exception {

    Customer customer = CustomerFactory.getCustomer(1);

    assertFalse(customerAdminBean.validateAccount(customer));
  }

} // CustomerAdminBeanTest

The setUp() method obtains the remote interface for the session. The tearDown() method has been removed, as we won’t be performing any housekeeping in this example. The actual test is provided in the testValidateAccount() method. The expected result from the test is that the validateAccount() method on the session bean will correctly determine the Customer has an invalid account and will return a value of false. To verify this condition, the call to the session bean is wrapped in an assertFalse() statement.

With the test created, the next step is to execute the test and examine the results. Running unit tests should be as quick and painless as possible. The next section discusses some of the options for invoking test cases.

Running JUnit Tests with Eclipse

For executing test cases, JUnit uses a TestRunner. Three versions of TestRunner are available, depending on how you wish to execute the tests:

  • junit.textui.TestRunner directs all test output to stdout.

  • junit.awtui.TestRunner provides a graphical AWT-based user interface for running tests.

  • junit.swingui.TestRunner provides a Swing-based graphical user interface for executing tests.

Although the JUnit framework provides the necessary tools for executing tests, it is preferable if the testing process integrates into your chosen development workbench.

Having an IDE that can run unit tests leads toward the goal of a single workbench as a one-stop shop for J2EE development. Rather than jumping out of the IDE in order to execute test cases, it is far more productive to have the IDE perform this operation.

The Eclipse workbench, like most IDEs, has excellent support for JUnit and comes with all of the necessary JUnit libraries as part of the install. Using an IDE like Eclipse, it is a trivial matter to execute all test cases and receive immediate feedback of the results of a test run.

Note

An overview of the Eclipse workbench is provided in Chapter 13.

Figure 14-1 shows the Eclipse JUnit Fast View. This view is displayed after telling Eclipse the TestCase instance is to be run as a JUnit test. This action is invoked from the Eclipse run menu.

Eclipse JUnit Fast View.

Figure 14-1. Eclipse JUnit Fast View.

The results of the CustomerAdminBeanTest test case are displayed in Figure 14-1, and in this instance, it is a failed test. The results of the test run are represented by a colored bar at the top of the view. Green is a pass and red is a fail. Figure 14-1 shows a red bar indicating failure. The reasons for the failure are given in the lower pane. In the example, we have an assertion failure: the validateAccount() code failed to correctly recognize an invalid customer account.

Having Eclipse make the running of tests so convenient allows for a rapid development cycle. The developer can update code and launch the test for immediate feedback on the success of the change.

Being able to run tests quickly is one factor in building a rapid development environment. Another is the time taken to write the test case. If we are going to embrace a test-first approach as part of a development methodology, then we need to consider how to reduce the time taken to write the tests.

Now that you have an appreciation of what is involved in writing a JUnit test, let’s look at some of the options for expediting the development of a test suite.

Generating Unit Tests

To have a comprehensive regression-testing suite in place requires writing a lot of test cases. Full-bore test-driven development requires a unit test for every class. Taken to these levels, an approach is required to prevent the creation of test cases from becoming a chore for the developer. Given the boilerplate nature of unit tests, they are ideal candidates for code generation techniques.
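A flavor of what a test-case generator does can be sketched with reflection. AccountService is invented for the illustration, and a real generator would also emit the package declaration, imports, constructor, and setUp()/tearDown() stubs; the core idea is simply to inspect the public methods of the class under test and emit a stub test method for each:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;

// Hypothetical sketch of wizard-style test generation: inspect the public
// methods of the class under test and emit a stub test method for each.
// AccountService is invented for this illustration.
public class TestStubGenerator {

    public static class AccountService {
        public boolean validateAccount(String customerId) { return false; }
    }

    public static String generate(Class<?> classUnderTest) {
        StringBuilder src = new StringBuilder();
        src.append("public class ")
           .append(classUnderTest.getSimpleName())
           .append("Test extends TestCase {\n");
        for (Method method : classUnderTest.getDeclaredMethods()) {
            if (!Modifier.isPublic(method.getModifiers()))
                continue;  // only public methods are exposed to tests
            String name = method.getName();
            src.append("  public void test")
               .append(Character.toUpperCase(name.charAt(0)))
               .append(name.substring(1))
               .append("() {\n    // TODO implement test\n  }\n");
        }
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        System.out.println(generate(AccountService.class));
    }
}
```

The generated skeleton still leaves the substance of each test to the developer, which is why generation alone does not guarantee meaningful coverage.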

Generating Unit Tests with Eclipse

In addition to running unit tests, Eclipse makes it easy to generate a unit test for any given class. A wizard is used to lay down the code, and it inspects the methods on the class under test to determine which test methods are to be included in the test case.

The Eclipse JUnit test wizard is shown in Figure 14-2.

Eclipse JUnit test case wizard.

Figure 14-2. Eclipse JUnit test case wizard.

The wizard offers several options for configuring the JUnit test generated. It is possible to specify which methods from TestCase should be overridden, define the class under test, and nominate the package for the test. When defining the package, it is considered best practice to use the same package as the class under test. With this approach, the relationship between test and implementation is unambiguous.

Note the warning from Eclipse at the top of the wizard dialog about generating a test case for an interface. Ideally, we want a test case for each class. An interface can be implemented by any number of classes. Nevertheless, this approach does make sense if we are testing a session bean via its remote interface from outside of the EJB container. Methods of the bean itself, such as setSessionContext(), cannot be called by the test case, only by the container. It is therefore nonsensical to generate a test case based on the class, because we cannot test these container-only methods.

tip

It is a good idea to place all test cases in the same package as the class under test but in their own source directory. This makes packaging easier and avoids inadvertently deploying test cases in J2EE modules.

This tactic also provides the testing class with access to the package-level members of the class under test.

The final dialog for the wizard is shown in Figure 14-3. This screen allows the exposed methods on the class or interface to be selected for inclusion in the test case. Based on the methods selected, Eclipse creates the method stubs for writing the test, leaving the job of providing the test detail to the developer.

Eclipse test method selection dialog.

Figure 14-3. Eclipse test method selection dialog.

Generating tests in this manner may seem contrary to the principles of test-driven development. Previously, it was explained that a true test-driven approach to development involves writing the test case ahead of the class under test. How, then, can the Eclipse code generator be used for laying down the test case based on the class under test, when according to the principles of test-driven development, no actual class should exist?

There are perhaps two factors surrounding the rationale for including test code generators in IDEs. First, test case generators like the one offered by Eclipse are an indication that while developers are using JUnit for writing tests, in many cases engineers are falling back on the traditional approach of writing the test case after the class under test has been written. This test-second approach has value, but it does not emphasize the importance of the test case, and inadequate test cases may be the result.

The second factor is a question of whether we should be using test-driven development to drive the design or drive the implementation effort. There are many differing opinions on this topic.

My thoughts are that the best designs are produced by modeling, not by coding, particularly when extremely large systems are involved. Combining a model-based approach with a test-driven approach would see a minimal set of architecturally significant interfaces and classes, along with their responsibilities, defined before the main coding effort commences. With this approach, test-driven development is used to both ratify the interfaces being promoted and guide the writing of the code that sits behind each interface.

Note

Chapter 5 covers the benefits of UML models in software design.

note

A model-based approach to design does not preclude the use of code-level refactoring to further advance and refine the design.

Using a combination of modeling and testing techniques to drive the development means all architecturally significant interfaces can be legitimately defined ahead of the test cases. Ideally, our modeling tool will have generated the structure of the application for us. This raises the question as to whether the modeling tool can also generate the test cases in tandem. This approach has implications if a Model-Driven Architecture (MDA) paradigm is adopted, a subject we cover in the next section.

Unit Tests and MDA

An MDA approach enables a large percentage of the code needed for an application to be generated from a high-level platform-independent model.

Note

Model-Driven Architecture is covered in Chapter 8.

It is possible to leverage the power of MDA tools in order to support test-driven development during the construction, or implementation, phase of a project. This can be achieved by having the MDA tool generate all necessary test cases on our behalf.

This approach has significant advantages that go beyond the time savings made by having the test cases autogenerated in an IDE. Many software projects fail to build up a comprehensive test suite for the application. Often, the result of unit test development efforts is a patchwork of tests spread unevenly across the application. Consequently, some areas of the system are overtested, [1] while other areas are completely overlooked.

This problem arises because the responsibility of producing unit tests traditionally is placed on the developer. Each developer is likely to take a different view of the value of unit tests. Some developers might diligently generate tests for all their work, while others may be far less rigorous in their approach to testing.

Standards that stress the importance of testing and provide guidelines to the necessary test coverage expected can help. Nevertheless, it still falls to the individual to follow the standards and work within the guidelines.

With MDA, it is possible to make the development of a consistent test suite much less of a hit-or-miss affair. The architect should be thinking strategically as to where test coverage is required for the design. Based on these decisions, an MDA transformation mapping can be used to generate test cases for the appropriate elements within the model.

Using the MDA tool in this manner ensures a consistent approach to test coverage is taken. Developers simply need to fill in the implementation detail for each generated test case.

Generating Test Cases with AndroMDA

The cartridge system of AndroMDA makes it easy to extend the tool with the capability to generate test cases. Two options are available: either develop a new test cartridge or extend one of the existing cartridges by adding a new Velocity template.

Note

Velocity templates and the Velocity template language are described in Chapter 6.

Updating one of the existing cartridges is perhaps the easiest method for getting something in place quickly, while building a new cartridge from scratch allows tests to be built for any project type by adding the test cartridge to the AndroMDA classpath. Note, however, that the latter option is a far more challenging and time-consuming undertaking.

Regardless of which approach is taken, a Velocity template lies at the heart of the solution. To get you started, Listing 14-3 provides an example UnitTest.vsl template.

Example 14-3. UnitTest.vsl AndroMDA Cartridge Velocity Template

#set($packagename=$transform.findPackageName($class.package))
#set($remoteInterface=$str.lowerCaseFirstLetter(${class.name}))
package $packagename;

import java.util.Hashtable;
import javax.naming.InitialContext;
import junit.framework.TestCase;

public class ${class.name}Test extends TestCase {

  private ${class.name} $remoteInterface = null;

  /*
   * Perform all set up work here
   *
   */
  protected void setUp() throws Exception {
    super.setUp();

    Hashtable props = new Hashtable();

    props.put(InitialContext.INITIAL_CONTEXT_FACTORY,
       "weblogic.jndi.WLInitialContextFactory");
    props.put(InitialContext.PROVIDER_URL, "URL_AS_PROPERTY");

    // Obtain remote interface to the session bean under test
    //
    $remoteInterface = ${class.name}Util.getHome(props).create();
  }

  protected void tearDown() throws Exception {
  }

#foreach ($op in $class.operations)
  #set($testMethod = $op.getName())

  public final void test$testMethod() throws Exception {

    // TODO Add your test here

  }
#end

}

In the configuration of the MDA cartridge, associate the template with all model elements carrying a stereotype of Service. The template is suitable for any cartridge that generates EJB components for classes stereotyped as Service.

Note

Refer to Chapter 8 for information on the specifics of configuring AndroMDA cartridges.

The template generates a test case for each service element from the model. Tests take on the name of the service with Test appended to the name.

Walking through the example template, the setUp() method follows from the previous example and obtains the remote interface to the session bean generated by AndroMDA for the service. The remote interface is intended for use in each of the test methods.

A Velocity Template Language (VTL) #foreach statement is used to iterate through each operation on the model element. The name of the operation is obtained by interrogating the model. From the list of operations, the test methods for each business method on the service can be built up.

You can use the example shown in Listing 14-3 as a basis for further experimentation. AndroMDA makes all model metadata accessible from within the template, allowing the generation of sophisticated test cases. The trick is to avoid getting carried away and making the generated tests overly complicated. You’ll find a little goes a long way.

tip

Be devious and add a fail("Test not implemented") in every test method. This ensures tests fail unless the developer corrects the problem by supplying a valid test.
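Applied to the template of Listing 14-3, this tip amounts to one extra line in the generated test body. The VTL fragment below is a sketch of the modified #foreach block:

```velocity
#foreach ($op in $class.operations)
  #set($testMethod = $op.getName())

  public final void test$testMethod() throws Exception {

    // TODO Add your test here
    fail("Test not implemented");

  }
#end
```

Any generated test left untouched by a developer now shows up as a red bar rather than silently passing.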

Testing from the Inside Out

The concept of a test-driven approach to development is extremely attractive if the full benefits of the paradigm can be realized. An exhaustive set of automated tests provides the safety net necessary for rapidly accommodating change on the project. In addition, it raises software quality, which in turn reduces the number of defects detected in formal systems testing. All of these benefits combine to facilitate a rapid development process.

While the concept of test-driven development is sound, the practicalities of implementing a suite of automated unit tests at the level required represents a sizeable technical challenge for the project team.

Writing valid unit tests is not a trivial task. In some cases, writing the test can prove a greater technical challenge than writing the class under test.

One of the greatest difficulties lies in isolating the class under test. Within the boundaries of a system, an object collaborates with other objects in order to execute specific functionality. For example, an object may require a number of domain objects returned from the persistence layer. The state of these domain objects determines the behavior of the class under test.

Designing a test that provides a class with all the information it needs to perform a specific operation can be an involved process, as dependencies between components make the testing of objects in isolation problematic. Good design practice that sees component coupling carried out through strongly typed interfaces can help ease the burden of writing unit tests. However, pulling an object out from a nest of collaborating instances is far from a trivial exercise in even the most well-designed system. Moreover, complete test coverage requires that a number of different scenarios be run. This involves setting the state of collaborating objects for each test scenario.

Object isolation is not the only issue with a test-driven approach. If we stick rigidly to the rule that each class is fully tested before proceeding to the next class to be developed, we are forced to adopt a bottom-up approach, whereby all foundation classes must be implemented ahead of any classes relying on those foundation services.

A bottom-up approach is not always the most practical approach. Project scheduling may dictate that classes in upper layers of the architecture must be implemented in parallel with classes from the lower layers. This raises the question of how to test an object when the classes upon which it is dependent have yet to be written, a problem inherent to even traditional development approaches.

One solution to the problem of how to test objects in isolation and undertake a top-down approach is to use mock objects, familiarly known as mocks.

What Is a Mock?

A mock object is best described as a dummy, or stub, implementation of an actual class. Mocks take the place of the real objects a class under test relies upon. Thus, if the class under test requires a domain object in a particular state for a specific test scenario, then you can substitute the domain object with a mock. Likewise, if the domain object has yet to be implemented, a mock can take its place.

The term mock was first coined in a notable paper by Tim Mackinnon, Steve Freeman, and Philip Craig, Endo-Testing: Unit Testing with Mock Objects, [Mackinnon, 2000]. The authors invented the term endo-testing as a play on endoscopic surgery, a process that enables the surgeon to work on the patient without having to make a large incision to allow access.

Mocks may sound very similar to test stubs in that they stub out the real code. Like test stubs, a mock can be configured to return a specific result in order to conduct a test scenario. This approach makes it easier to simulate events that are difficult to produce in a real environment, for example, a database connection being unavailable or an exception being thrown from a method call.

The comparison with test stubs is valid, but mocks go beyond what is offered by the typical test stub. Unit tests operate on the public- and package-level members of a class. Although they should exercise all execution paths within the code and test all boundary conditions, they do not interact directly with the internals of the class. However, using mocks, the internals of a class can be inspected as part of the test; hence the reference to endoscopic surgery—mocks enable testing from the inside.

For the class under test, we can predict what calls we expect to be made on a mock object for a given test scenario. The mock object can record these calls. If an expected call fails to be made or the calls are made in the wrong order, then the mock can elect to fail the test. Thus, mocks allow a form of white-box testing to be undertaken as part of the unit testing process.
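The recording-and-verifying behavior described above can be sketched in a few lines of plain Java; the interface and class names here are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical collaborator interface.
interface Gateway {
    void open();
    void send(String message);
    void close();
}

// A minimal recording mock: it notes each call it receives, and verify()
// fails the test if the actual calls differ from the expected sequence.
class RecordingGatewayMock implements Gateway {
    private final List<String> expected = new ArrayList<String>();
    private final List<String> actual = new ArrayList<String>();

    void expect(String call) { expected.add(call); }

    public void open()               { actual.add("open"); }
    public void send(String message) { actual.add("send"); }
    public void close()              { actual.add("close"); }

    // Assert that exactly the expected calls were made, in order.
    void verify() {
        if (!actual.equals(expected)) {
            throw new AssertionError(
                "expected " + expected + " but was " + actual);
        }
    }
}

public class RecordingMockDemo {
    // Code under test: must open, send, and then close.
    static void transmit(Gateway gateway, String message) {
        gateway.open();
        gateway.send(message);
        gateway.close();
    }

    public static void main(String[] args) {
        RecordingGatewayMock mock = new RecordingGatewayMock();
        mock.expect("open");
        mock.expect("send");
        mock.expect("close");
        transmit(mock, "hello");
        mock.verify(); // a missing call or wrong order would fail here
    }
}
```

A mock framework automates exactly this bookkeeping; the hand-rolled version simply makes the mechanism visible.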

White-box testing is testing inside the methods of a class. Unlike black-box testing, which tests what a class does, white-box testing focuses on how a class performs its tasks. It is usually an invasive process and requires the class to be implemented in such a way that the internals can be examined while executing. Mocks offer a mechanism to inspect the internals of the class without having to structure the code in a manner that directly supports white-box testing.

Working with Mocks

Mocks require that we follow good design practices and architect our system around the use of interfaces. Divorcing an object’s implementation from its interface is sound software engineering. It also facilitates the use of mock objects because the approach enables mocks to be easily substituted in place of the real object.

When it comes to writing mocks for a unit test, you can write the mock from scratch, implementing all the methods on the interface as needed for the test. This approach is labor-intensive and means the amount of code dedicated to testing the system is prone to bloating in size. Thankfully, a number of frameworks are available that can help reduce the effort involved in producing mocks. We consider these next.

Mock Flavors

Mocks take on the interface they are mocking. With an interface defining the shape of the mock, the mock object is another ideal candidate for code generation.

Although numerous options exist for generating mock objects, there are essentially two distinct approaches: static mocks and dynamic mocks.

Static mocks can be either handwritten or the output of a mock object code generator. Several code generators for mock objects are available; MockCreator and MockMaker are two notable examples. Each of these generators creates a mock object for a given interface. Optional Eclipse plug-ins are available for each generator.

For more information on MockCreator, see http://mockcreator.sourceforge.net. To obtain MockMaker, visit the site at http://www.mockmaker.org. Both code generators are freely available. In addition, XDoclet provides a mock generator, <mockdoclet>, which generates mock objects from the @mock.generate tag.

Note

Chapter 6 covers the use of XDoclet.

By now, you should be well versed in the use of code generators and how XDoclet attributes can drive code generation. We now look at dynamic mock objects, which typically rely on Java reflection to work their magic.

Dyna-Mocks to the Rescue

As with static mock generators, a number of freely available dynamic mock implementations exist. Again, we have two notable examples: jMock and EasyMock. Both enjoy a level of popularity among developers, and each provides a reasonable level of documentation. In selecting between the two, your best option is to perform your own bake-off and determine which best suits the needs of your particular project.

jMock is maintained under the Codehaus project, so see http://jmock.codehaus.org. For EasyMock, pay a visit to http://www.easymock.org.

To show how mocks can assist in the process of unit testing and to demonstrate how dynamic mocks work, we go through an example unit test that uses a dynamic mock as the collaborating object for the class under test. The example uses EasyMock for defining the mock object.

The objective of the example test is to validate that the class under test calls the correct methods on the collaborating object. The example is based on two classes: the class under test, Invoice, and the collaborating object, which is an implementation of the interface ICustomer. Listing 14-4 shows the ICustomer interface.

Listing 14-4. The ICustomer Interface

public interface ICustomer {

  /**
   * Customer's discount entitlement
   */
  public abstract double discountRate();

  /**
   * Customer's interest on credit sales
   */
  public abstract double creditRate();

} // ICustomer

Listing 14-5 has the implementation of the Invoice class.

Listing 14-5. The Invoice Class under Test

public class Invoice {

  private double    invoiceAmount;

  private ICustomer customer;

  /**
   * Associate the invoice with a customer
   *
   */
  public Invoice(ICustomer customer) {
    this.customer = customer;
  }

  /**
   * Set the invoice amount
   */
  public void setInvoiceAmount(double invoiceAmount) {
    this.invoiceAmount = invoiceAmount;
  }

  /**
   * Calculate the customer's discount for the invoice amount
   */
  public double discount() {

    return invoiceAmount * (customer.creditRate() / 100);
  }

} // Invoice

The Invoice class doesn’t do much, but a mistake has still been made. The discount() method returns the value of the discount the customer receives off the total invoice amount. The calculation is based on the customer’s special discount rate. In this system, loyal customers get a better discount. A customer’s discount percentage is returned from the method discountRate() by the object implementing the ICustomer interface. Unfortunately, in this case, the discount() method on the Invoice class has been incorrectly implemented by using the customer’s credit rate in place of the discount rate.

It’s a silly error, but problems of this type can easily arise if care isn’t taken when using code-editor productivity features like code assist or code completion. Here the developer has inadvertently selected the wrong method from the list. Code assist is an invaluable editor feature, but you have to be careful not to get too lazy.

To detect this type of error, a test is required that confirms the correct methods are being called on the collaborating objects, in this case ICustomer. This is easily done with mocks.

Listing 14-6 gives the test case for the test, using a dynamic mock object in place of the implementation for the ICustomer interface.

Listing 14-6. The InvoiceTest Unit Test Case

import junit.framework.TestCase;
import org.easymock.MockControl;

public class InvoiceTest extends TestCase {

  protected void setUp() throws Exception {
    super.setUp();

    // Create mock based on interface
    //
    control = MockControl.createControl(ICustomer.class);
    customerMock = (ICustomer) control.getMock();

    // Prepare the class under test
    //
    invoice = new Invoice(customerMock);
    invoice.setInvoiceAmount(200.0);
  }

  public final void testDiscount() {

    // Configure the mock ready for the test
    //
    customerMock.discountRate();
    control.setReturnValue(10.0);

    // Place mock in the active state
    //
    control.replay();

    // Run the test
    //
    assertEquals(20.0, invoice.discount(), 0.0);

    // Ensure test class calls discountRate() method
    //
    control.verify();
  }

  private MockControl control;
  private ICustomer   customerMock;
  private Invoice     invoice;

} // InvoiceTest

The setUp() method of the test case is where the dynamic mock is created. Using EasyMock, two objects are created: an instance of MockControl, which takes the type of interface to be mocked as a parameter and is used for controlling the mock object, and the mock object itself, which is returned from the MockControl instance with a call to the factory method getMock().

The mock returned from the MockControl exposes the ICustomer interface, so we can substitute the mock for the real object. MockControl achieves this feat of impersonation by using java.lang.reflect.Proxy under the covers.
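A minimal sketch of this impersonation trick using java.lang.reflect.Proxy directly (the ICustomer interface is repeated here so the fragment compiles on its own; the cannedCustomer() helper is illustrative, not part of EasyMock) looks like this:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// Repeated from Listing 14-4 so the sketch is self-contained.
interface ICustomer {
    double discountRate();
    double creditRate();
}

public class ProxyDemo {
    // Proxy.newProxyInstance manufactures at runtime an object that
    // implements ICustomer, routing every call through the handler.
    // This is the mechanism a dynamic mock builds on.
    static ICustomer cannedCustomer(final double discountRate) {
        return (ICustomer) Proxy.newProxyInstance(
            ICustomer.class.getClassLoader(),
            new Class[] { ICustomer.class },
            new InvocationHandler() {
                public Object invoke(Object proxy, Method method,
                                     Object[] args) {
                    // Return a canned value for discountRate();
                    // zero for anything else.
                    if (method.getName().equals("discountRate")) {
                        return Double.valueOf(discountRate);
                    }
                    return Double.valueOf(0.0);
                }
            });
    }

    public static void main(String[] args) {
        ICustomer customer = cannedCustomer(10.0);
        System.out.println(customer.discountRate()); // prints 10.0
    }
}
```

A dynamic mock framework layers call recording and verification on top of this reflective plumbing.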

The final act of the setUp() method is to create an instance of Invoice for use in the test. The mock is passed to the Invoice object via its constructor. The Invoice instance is then seeded with an invoice amount for the test. This value is used for checking against our expected results.

The final steps for setting up the test are completed within the testDiscount() method on the TestCase. At this stage, the newly created mock is in record mode. In this state, the methods that must be invoked on the mock when the test is run can be specified. EasyMock enables this information to be specified in detail. You can specify the order of calls, the number of times each call should be made, and the expected parameters. Refer to the EasyMock documentation for a full list of its capabilities.

The test in the example isn’t overly complex. Next, we ensure that the correct method is invoked on the collaborating object and that the class under test returns the expected result.

For defining the expected method invocation, the required method is called on the mock object with the mock in the record state. In this case, the method is discountRate().

The next step is to set the result returned by the mock when the class under test invokes the method. This is achieved by calling setReturnValue() on the MockControl instance.

Now that the mock is configured, it is placed in the active state with a call to replay() on the MockControl.

A call to discount() on the Invoice object initiates the test. The called method is wrapped in a JUnit assertEquals so the expected results can be confirmed. The final call to verify() on the MockControl instructs the mock controller to assert if the methods specified when the mock was in the record state are not invoked. All that remains to do is run the test.

Here is the error reported when the test case is run.

junit.framework.AssertionFailedError:
  Unexpected method call creditRate():
    creditRate(): expected: 0, actual: 1

The mock asserts during the execution of the test that an unexpected method, creditRate(), has been called. We’ve found our bug. The troublesome code can be corrected and the test rerun to achieve the green bar indicating a successful test.
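For reference, the correction is simply to swap the call to creditRate() for discountRate(). A self-contained version of the fixed class (the CorrectedInvoice name is mine, and ICustomer is repeated so the fragment compiles on its own) follows:

```java
// Repeated from Listing 14-4 so the sketch is self-contained.
interface ICustomer {
    double discountRate();
    double creditRate();
}

public class CorrectedInvoice {
    private double invoiceAmount;
    private ICustomer customer;

    public CorrectedInvoice(ICustomer customer) {
        this.customer = customer;
    }

    public void setInvoiceAmount(double invoiceAmount) {
        this.invoiceAmount = invoiceAmount;
    }

    // The fix: use the customer's discount rate, not the credit rate.
    public double discount() {
        return invoiceAmount * (customer.discountRate() / 100);
    }
}
```

With this change in place, the unit test passes: a 10 percent discount on a 200.0 invoice yields 20.0.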

Based on the example, you can see that mocks add another dimension to the type of unit testing you can perform. Not only can the mock stand in for the real collaborating object, as a traditional test stub does, but it can also validate the class under test from the inside. This all leads to a more rigorous testing process and hence higher-quality code.

Choosing Between Static and Dynamic Mocks

Dynamic mocks are attractive—they offer the ability to create powerful test stubs at runtime. All the code relating to a test can be maintained within the test case itself, avoiding the need to support further test code for the implementation of the mock.

Despite these benefits, mocks are not suited to all test scenarios. The example shown cannot be used to test an EJB component through its remote interface, as the dynamic proxy cannot be marshaled as part of the call. Moreover, as has already been stated, testing is not a trivial undertaking, and it is likely a combination of static and dynamic mocks will make up the overall test suite.

A general heuristic to follow when working with mocks is to use dynamic mocks where possible for their convenience and setup speed. Dynamic mocks help keep the size of the code base down and can be localized to a single test. For situations in which this cannot be achieved, look to generate the mock, and if all else fails, simply write your own implementation. Static mocks can be used to test Enterprise JavaBeans, and the strong typing of a static interface can be an advantage because it enables the compiler to detect errors in the test suite.
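As a sketch of the handwritten static option, a mock for the ICustomer interface from Listing 14-4 might look like the following (the call-counting scheme shown is illustrative; ICustomer is repeated so the fragment compiles on its own):

```java
// Repeated from Listing 14-4 so the sketch is self-contained.
interface ICustomer {
    double discountRate();
    double creditRate();
}

// Hand-written static mock. Because it is compiled against the
// interface, any change to ICustomer breaks the build rather than
// surfacing as a runtime failure in the test suite.
public class MockCustomer implements ICustomer {
    private double discountRate;
    private int discountRateCalls;
    private int creditRateCalls;

    public void setDiscountRate(double rate) {
        this.discountRate = rate;
    }

    public double discountRate() {
        discountRateCalls++;
        return discountRate;
    }

    public double creditRate() {
        creditRateCalls++;
        return 0.0;
    }

    // Fail the test if the class under test called the wrong method.
    public void verify() {
        if (discountRateCalls == 0 || creditRateCalls > 0) {
            throw new AssertionError(
                "expected discountRate() to be called, not creditRate()");
        }
    }
}
```

The strong typing is the payoff; the cost is that this class must be written and maintained alongside the tests that use it.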

Summary

After reading this chapter, you should have an appreciation of the important role testing has to play throughout the software development lifecycle. The testing activity is more than a backstop at the end of the project for catching defects. It is critical to rapid application development, which can be greatly facilitated by putting in place an automated unit test suite for the project. The presence of this test suite and the adoption of a test-driven approach to development offers two major benefits:

  • Agility.

    The project team can respond quickly and accurately to any changes, whether in the form of updated end-user requirements or design modifications. All changes can be immediately validated with the unit test suite, thereby providing a safety net for potentially sweeping changes to the design and code base.

    The ability to absorb changes without adversely impacting the project’s timeframe and compromising quality is an essential ingredient for successful rapid application development.

  • Accuracy.

    The cost of resolving defects in a system increases as the application progresses through the stages of the software development lifecycle. Defects detected during a formal system test by a dedicated quality-assurance team incur a far higher penalty than those detected early in the process during implementation. By having the development team adopt a rigorous, upfront approach to testing, the quality of releases into the formal test environment is improved, thereby reducing the number of defects detected at the later stages of the project.

Unit tests also reduce the time it takes to resolve a defect, since unit tests make it easily reproducible. Furthermore, adopting a test-driven approach helps instill the importance of testing for producing quality software within the project team.

In the next chapter, we keep the focus on the testing effort but broaden the scope to include tools and techniques for automating the process of conducting functional and load-based testing.

Additional Information

The process of writing unit tests has gathered a critical mass of knowledge thanks to the widespread uptake of JUnit, and a set of best practices for writing effective unit tests has emerged from the wider developer community. To find out more about the intricacies of automated unit testing, the JUnit site found at http://www.junit.org is an excellent starting point and provides an extensive set of links to literature on the subject.

Kent Beck has published a book, Test Driven Development by Example [Beck, 2002], on test-driven development that illustrates the entire process with a complete example.

For more details on mocks, the uncontested home of mock objects on the Web is http://www.mockobjects.com. The site provides an extensive amount of information on anything and everything relating to mock objects.



[1] Some people would argue that you can never have enough testing.
