CHAPTER 12

Test Frameworks

This chapter covers the topic of test frameworks. In a broad sense, a test framework is a technology or set of tools that supports automated software testing. Most developers are familiar with one of the code-driven test frameworks, which are commonly referred to as unit testing frameworks. In this chapter, you will learn about the four most popular unit testing frameworks. Another important component to any test framework is the test runner. You will learn about a variety of test runners, from the minimalist runners to those that are well-integrated into Visual Studio. This chapter also provides an overview of the xUnit1 test pattern and explains how each of the unit testing frameworks discussed fits the pattern.

Later in this chapter you will learn about other test frameworks. These include mock object frameworks, database testing with NDbUnit, user interface testing frameworks, and acceptance testing frameworks. Specific practices are highlighted to emphasize how test frameworks offer .NET developers more effective testing and better test methodologies. Table 12-1 lists the ruthlessly helpful practices related to test frameworks.

[Table 12-1]

COMMENTARY

Unit Testing Frameworks

The fundamental component of code-driven testing is the unit testing framework. Broadly speaking, these testing frameworks are known as xUnit test frameworks because they generally fit the xUnit test pattern. The xUnit test pattern is described a little later in this chapter. For now, the main point is that the purpose of these unit testing frameworks is to provide facilities that help do the following:

  • Identify the test method
  • Identify the test classes and fixtures
  • Provide constructs with the means to
    • Perform data-driven tests
    • Perform pre-test setup and arrangement
    • Perform post-test teardown and cleanup
    • Perform test-fixture setup and teardown
    • Handle tests that expect an exception
  • Provide constructs that support
    • Making assertions
    • Skipping or ignoring tests
    • Categorizing tests

____________

1 Martin Fowler provides a nice background on xUnit at http://martinfowler.com/bliki/Xunit.html.

The unit test framework helps developers write tests that are effective and can endure as the system evolves.
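For example, two facilities from the list above, skipping tests and categorizing tests, are typically expressed with attributes. The following is a minimal sketch with illustrative names, written for NUnit (the attribute names differ in the other frameworks discussed below):

using NUnit.Framework;

[TestFixture]
public class CategorizedTests
{
    [Test]
    [Category("Integration")] // a runner can include or exclude tests by category
    public void Retrieve_WhenDatabaseIsAvailable_ExpectRows()
    {
    }

    [Test]
    [Ignore("Explain here why the test is skipped")] // reported as ignored rather than failed
    public void Save_TemporarilySkipped()
    {
    }
}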

This chapter will focus on four unit test frameworks: NUnit, MbUnit, MSTest, and xUnit.net. These are the most popular unit test frameworks for .NET development. Each of these unit test frameworks offers a somewhat different approach and benefits to unit testing software. Table 12-2 provides a description of these widely-used .NET unit testing frameworks.

[Table 12-2]

Test Runners

A test runner is an application that performs the tests. The .NET test runners typically load one or more test assemblies and recognize the test methods written with the unit test framework. This set of test methods is often referred to as the test suite.

As the test suite runs, the runner indicates progress and status when each test method does the following:

____________

2 See “Why did we build xUnit.net?” at http://xunit.codeplex.com/wikipage?title=WhyDidWeBuildXunit

  • Executes successfully
  • Fails to meet the constraint of an assertion
  • Fails because an error occurs
  • Is skipped, ignored, or inconclusive

Test runners differ in that some provide a graphical user interface (GUI) while others are console applications. The GUI runner is intended to provide a visual interface so that the person running the test can see the tests execute, monitor the results, and interact with the test runner. The console runner is intended to provide command-line execution of the test suite with the general goal of automating the test run. Both types of test runners are important in the development of a complete and comprehensive set of automated tests.

NUnit GUI and Console Runners

The NUnit test framework is widely used and supported by many third-party tools, such as continuous integration servers, coverage tools, and Visual Studio add-ins. All of the examples in Chapter 8 were written using the NUnit framework. This section starts with the assumption that NUnit is the framework you have selected to write test code. In order to remain focused on the topic of the test runner, it also assumes that the test code is already written. To follow along with the sample code, open the Lender.Slos.sln file, found in the Samples\Ch12\1_NUnit folder, from within Visual Studio. Rebuild the solution to ensure that all the code builds properly.

With the unit test code written, the challenge is to run the test code through a test runner. NUnit includes a Windows graphical user interface (GUI) runner as part of the installation. This program is a Windows Forms application named nunit.exe (or nunit-x86.exe). If NUnit is installed then it is available through a program group or desktop shortcut. Otherwise, it can be found within the NUnit installation folder, under the bin folder. A screenshot of the NUnit GUI is shown in Figure 12-1.

images

Figure 12-1. The NUnit graphical user interface (GUI)

The NUnit documentation provides a complete description of how to use the NUnit GUI runner.3 To keep things brief, this section is limited to the essential information and a basic orientation. With the NUnit GUI application running, select the File ➤ Open Project … menu item. Navigate to the Samples\Ch12\1_NUnit\Tests.Unit.Lender.Slos.Financial folder and open the Tests.Unit.Lender.Slos.Financial.nunit project file. Press the Run button.

Notice the tree view on the left-hand side of the screen; this is the list of tests found in the Tests.Unit.Lender.Slos.Financial.dll assembly. The list of tests is the test suite. There is one failing test, which is shown in Figure 12-1. When you select the failing test in the tree, the right-hand side of the screen provides detail about that failing test on the Errors and Failures tab. It is clear that the test expected the value of 0.005909, but since the actual value is 0.005908, the test failed. In addition, the Errors and Failures tab shows the stack trace, which helps reveal the line of code that caused the test to fail. Change the expected value for the third test case from 0.005909 to 0.005908, which is provided to the RatePerMonth_WithValidAnnualPercentageRate_ExpectProperRatePerPeriod method. Rebuild the solution and return to the NUnit GUI runner. Run the tests again and all of the tests should pass.

The NUnit GUI runner is straightforward to run from outside of Visual Studio. However, for debugging with Visual Studio's Start Debugging command (F5 debugging), it is a lot better to launch the NUnit GUI runner as part of the debugging session. You accomplish this by changing the project settings for the Tests.Unit.Lender.Slos.Financial project in Visual Studio. You can follow along by referring to the sample code under the Samples\Ch12\1_NUnit folder. To start, open the properties window for the test project from Visual Studio's Solution Explorer. Select the Debug tab. This is shown in Figure 12-2. In the Start Action area, select “Start an external program” and browse to the nunit.exe program. Also, in the Start Options area, the “Command line arguments” field should have the relative path to the Tests.Unit.Lender.Slos.Financial.nunit project file. Save the Visual Studio project.


images

Figure 12-2. Configuring the Visual Studio debugger to run NUnit GUI

Once Visual Studio is configured to run the NUnit GUI runner for debugging, performing F5 debugging launches the NUnit GUI runner. While the NUnit GUI is running tests, breakpoints in the test code and the code-under-test are hit when they are reached.

Practice 12-1 Use the Test Runner to Debug Both the Test Code and the Code-Under-Test

An alternative to running the NUnit GUI runner is the console runner. This is the nunit-console.exe program found under the NUnit bin folder. The NUnit console runner lets you automate the task of running tests and provides many command-line options.4 Within Visual Studio, the NUnit console can be run using the External Tools feature available under the Tools menu. Figure 12-3 illustrates how to configure the NUnit console runner as an external tool.5

images

Figure 12-3. Configuring Visual Studio external tools to run NUnit console

____________

5 This configuration assumes that you created an environment variable named NUnitRoot that points to the proper NUnit folder, for example, C:\Tools\NUnit\v2.5.10.11092\bin\net-2.0.

Once configured as an external tool, the NUnit console is run from Visual Studio by selecting the newly added Run Tests menu item under the Tools main menu. The results of the NUnit console are output to the Visual Studio Output window, as shown in Figure 12-4.

images

Figure 12-4. Output from running NUnit console as an external tool
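The console runner can also be invoked directly from a command prompt or a build script, which is how a continuous integration server typically runs the test suite. A minimal sketch, assuming the NUnit bin folder is on the path (option syntax varies slightly between NUnit versions; run nunit-console /help to list the options):

nunit-console.exe Tests.Unit.Lender.Slos.Financial.nunit /xml=TestResult.xml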

What you have learned up to this point is the minimalist's way to integrate test runners with Visual Studio. For the NUnit unit test framework, the GUI and console runners are provided for free. In the next section, you will look at the prospect of buying a commercial product that offers more integration with Visual Studio.

ReSharper Test Runner

This section continues with the premise that you have selected NUnit as your testing framework. Now, you are looking for a test runner that is well-integrated into Visual Studio to achieve greater developer effectiveness and productivity. The Visual Studio add-in test runners provide that boost. They make it very easy to run a single unit test or a subset of unit tests from a context menu, toolbar, and other places in Visual Studio. The results are displayed within purpose-built Visual Studio windows. Having the output right there helps you analyze the test results and navigate to the test code or code-under-test, whenever there is a failing test. This brings unit testing front and center within the Visual Studio IDE. This is a big help to the developer because the workflow to write test code, run test code, and resolve issues is more efficient and the cycle time is much shorter.

For unit tests written with the MSTest framework, the Visual Studio product provides many of these facilities, which we will cover in the next section. This section is about commercial test runners, most of which are Visual Studio add-ins, that focus on both supporting multiple unit testing frameworks (especially NUnit, MbUnit, MSTest, and xUnit.net) and providing “right-click access” to running tests from within Visual Studio. Let's take a closer look at one of these products: ReSharper.

ReSharper is a Visual Studio extension product from JetBrains (www.jetbrains.com). In Chapter 6, you learned that ReSharper is a general tool that provides code inspections and refactoring functionality. Another major feature of ReSharper is that it provides unit testing support that is well integrated with Visual Studio. It directly supports the NUnit and MSTest frameworks. Plug-ins are available to support MbUnit and xUnit.net.

ReSharper provides many context menus, docking windows, toolbars, sidebar marks, and more.6 To run unit tests, simply right-click a test method, test class, test project, or the entire solution, and ReSharper will run the tests defined within that context. For example, to run all the tests within a test project, simply right-click the project node in the Visual Studio Solution Explorer window and the context menu includes the Run Unit Tests choice, as shown in Figure 12-5. ReSharper provides similar context menus throughout the Visual Studio IDE. ReSharper offers many ways to support unit testing, such as helping you debug the test code and the code-under-test by selecting the appropriate context menu item.

images

Figure 12-5. ReSharper context menu for running tests

Another significant element that ReSharper adds to the Visual Studio IDE is the Unit Test Explorer window, shown in Figure 12-6. This window allows you to see all the available tests within your solution. This is important because the solution may contain a variety of tests that include unit, integration, surface, performance, and the other types of tests that you learned about in Chapter 8. It is important that the developer run all the tests that are appropriate to the work they are currently performing; just as important, the developer can avoid running tests that involve setup and configuration or are not relevant to their current work.


images

Figure 12-6. ReSharper provides the Unit Test Explorer window within Visual Studio.

When you run tests with ReSharper, a new unit test session is started. There is a Unit Test Sessions window within Visual Studio that allows you to see all the sessions that have run or are currently running. This window is shown in Figure 12-7. The tree provides an effective way to understand how the running test methods fit within the context provided by the test classes. In addition, when there are multiple test cases there is a node added to the tree for each test case. In Figure 12-7, it is clear from the status column that one test failed and the test case node identifies the culprit. By selecting a failed-test node in the tree, the Output tab below reveals the specific reason why that test failed. Notice that the Output tab also provides a stack trace, which is helpful when debugging exceptions thrown from deep in the call stack of a failing test.

Practice 12-2 Purchase a Visual Studio Add-In Test Runner to Achieve Greater Productivity

images

Figure 12-7. ReSharper provides the Unit Test Sessions window within Visual Studio.

As you learned in Chapter 11, JetBrains also sells a .NET code coverage tool, called dotCover. This product integrates well with the unit testing tools of ReSharper. As shown in Figure 12-8, the statement-level coverage is reported within the Coverage tab of the ReSharper Unit Test Sessions window. Bringing code coverage results into the Visual Studio IDE is an important productivity boost for developers writing test code. In effect, dotCover detects when a test covers a particular statement in code and, equally important, identifies and highlights code that is not covered by test code.

images

Figure 12-8. ReSharper and dotCover provide coverage reporting from within Visual Studio.

Note If profiling performance is important to you, ReSharper allows you to quickly profile performance using its unit test runner from within Visual Studio. This requires an additional purchase and installation of the dotTrace Performance product, which is a .NET profiling tool from JetBrains.

Being productive within Visual Studio with a unit test framework other than the Visual Studio Test Framework is a primary reason developers choose a commercial unit test runner that integrates with the Visual Studio IDE. These commercial add-in products also offer a lot more than running tests. Table 12-3 provides a list of products worth evaluating.

[Table 12-3]

Visual Studio Test Runner

So far, the premise has been that you and your development team are using the NUnit test framework. Let's switch gears and consider following the “Microsoft way” of developing test code. This means using the Visual Studio Test Framework and the integrated test runner. If you have Visual Studio 2010 Ultimate, Premium, or Professional you can run automated tests from within Visual Studio or from the command line.7

Writing tests with the Visual Studio Test Framework does open up a lot of possibilities for running tests from within the IDE. First, there is the Test main menu item. From there, all the tests in the solution can be run, as shown in Figure 12-9.

____________

7 For running MSTest from the command line see http://msdn.microsoft.com/en-us/library/ms182486.aspx.

images

Figure 12-9. Visual Studio test runner main menu

The Visual Studio IDE also includes a Test View window, shown in Figure 12-10. This window allows you to see all the available tests within your solution and to group and select tests. It helps you find the tests that you want to run. This is important so that developers can run only the tests that are appropriate to their current work.

images

Figure 12-10. The Visual Studio Test View window

After the tests are run, Visual Studio displays the results of the tests in the Test Results window, shown in Figure 12-11. A failing test is quickly identified and the information in the Error Message column explains why the test failed. There are toolbar buttons available to further run and debug the tests based on the results.

images

Figure 12-11. The Visual Studio Test Results window

Visual Studio test projects include test settings files and the use of diagnostic data adapters. The details are beyond the scope of this book;8 however, it is important to know how to use Visual Studio test settings to run tests under code coverage.

Within a Visual Studio solution that has a test project there is the Local.testsettings file, which runs tests locally without any of the diagnostic data adapters selected. I will briefly point out the steps to add code coverage as a diagnostic data adapter to the sample project found in the Samples\Ch12\3_VisualStudio folder. To start, open the Lender.Slos.sln file in Visual Studio. Within the Solution Explorer window there is a Local.testsettings item under the Solution Items folder. If you double-click the Local.testsettings item, Visual Studio opens the Test Settings window, shown in Figure 12-12. On the left-hand side, select the Data and Diagnostics item. For this selection, you can see that Code Coverage is a choice in the list in the Configure section. Check the Enabled checkbox and press the Apply button.

____________

8 For information on diagnostic data adapters and test settings see http://msdn.microsoft.com/en-us/library/dd286743.aspx.

images

Figure 12-12. Configuring Visual Studio tests to run under code coverage

Once the change is applied, the next thing is to double-click the Code Coverage item in the list. This brings up the Code Coverage Detail window, as shown in Figure 12-13.

images

Figure 12-13. Configuring the settings in the Code Coverage Detail window

Within the Code Coverage Detail window the specific configuration for the test settings file is established. Press the OK button to save the configuration. Close the Test Settings window.

With the code coverage settings in place, the Visual Studio Test Runner now runs the tests under coverage and provides the results in the Code Coverage Results window, as shown in Figure 12-14.

images

Figure 12-14. The Visual Studio Code Coverage Results window

Gallio Test Runner

The Gallio Project is an effort to bring a common and consistent test platform to .NET development. The Gallio Automation Platform provides test runners that run tests for a number of test frameworks, including NUnit, MbUnit, MSTest, and xUnit.net. Gallio includes the command-line runner, called Echo, and a Windows GUI named Icarus. Because Gallio integrates so many testing tools and approaches, it is definitely worth considering if you want a complete and comprehensive testing platform. More information about Gallio and this automation platform for .NET can be found on the Gallio project website at www.gallio.org.

Figure 12-15 shows the Gallio Icarus test runner interface. The list of tests is shown in a tree view on the left-hand side. The test results are summarized in the panel on the right-hand side.

images

Figure 12-15. The Gallio Icarus GUI test runner

Gallio provides a complete execution log that details the output from the test run. An example of the execution log is shown in Figure 12-16, with specific details about one failing test.

images

Figure 12-16. The Gallio test runner execution log

xUnit.net Test Runner

The xUnit.net test framework offers a number of significant capabilities. Primary among them is performance; xUnit.net is sometimes referred to as a lean framework. Although it introduces changes in terminology, xUnit.net takes a fresh approach and is very flexible. One important virtue is that xUnit.net supports the ability to extend the framework.
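To give a flavor of the terminology changes, xUnit.net drops the [Test] and [TestCase] attributes in favor of [Fact] and [Theory], and it requires no class-level attribute. A minimal sketch, assuming xUnit.net 1.x, where [Theory] and [InlineData] come from the xunit.extensions assembly:

using Xunit;
using Xunit.Extensions;

public class TerminologyFacts
{
    [Fact] // the xUnit.net equivalent of a basic test method
    public void BasicFactMethodSignature()
    {
        Assert.Equal(4, 2 + 2);
    }

    [Theory] // the xUnit.net equivalent of a data-driven test
    [InlineData(1)]
    [InlineData(2)]
    public void ParameterDrivenFactMethodSignature(int value)
    {
        Assert.True(value > 0);
    }
}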

The commercial test runners described earlier either support xUnit.net directly or there is a plug-in available. If you are not using a commercial test runner, your basic options are to use the xUnit.net GUI runner, as shown in Figure 12-17, or to use the xUnit.net console runner. Both of these runners come with the framework.

images

Figure 12-17. The xUnit.net GUI test runner

The xUnit.net console runner can be set up to run as an external tool in a manner similar to how the NUnit console is configured, as shown in Figure 12-3. The output from the xUnit.net console runner is shown in Figure 12-18.

images

Figure 12-18. Output from running xUnit.net console as an external tool

To this point you have learned about the four test frameworks and various test runners. In the next section I will broadly compare and contrast the test frameworks by examining how they fit into the xUnit test pattern.

xUnit Test Pattern

All xUnit test frameworks share the same basic architecture, referred to as the xUnit test pattern.9 First, there is the execution of one or more individual unit test methods within the test class. Also, before each test method is executed you can define a setup method, which prepares things before the test is run. After each test, a teardown method serves to clean up after the test method. Another feature is that of the test fixture, sometimes known as a test context, which initializes, sets up preconditions, and allocates resources needed to run any of the test methods. The fixture creates a known-good state before the tests run. The fixture also cleans up and disposes of any resources to return the context back to its original state. Let's look at each aspect of the xUnit test pattern in turn.

Identifying the Test Method

Each of the four unit test frameworks, NUnit, MbUnit, MSTest, and xUnit.net, uses an attribute to identify the test methods within an assembly. There is the primary attribute, which identifies a test method that has the basic test method signature. This signature is a public method without arguments that returns void. For example, using the NUnit test framework, the basic test method is defined like this:

____________

9 For more information see http://xunitpatterns.com/XUnit%20Basics.html.

[Test]
public void BasicTestMethodSignature()
{

}

When a secondary attribute is used, it identifies test methods that are driven by parameter values or some other source of data. For example, using NUnit a data-driven test is defined like this:

[TestCase("How many roads must a man walk down?", 42]
[TestCase("What is your favorite prime number?", 73]
public void ParameterDrivenTestMethodSignature(
    string ultimateQuestion,
    int expectedAnswer)
{

}

Table 12-4 lists the primary and secondary attributes used to identify test methods and data-driven tests with the four unit test frameworks.

[Table 12-4]
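For comparison, the same data-driven test written for MbUnit uses the [Row] attribute to supply parameter values. This is a rough sketch assuming MbUnit v3 (older MbUnit versions use [RowTest] instead of [Test]); MSTest has no direct equivalent and instead binds a test method to external data with the [DataSource] attribute:

using MbUnit.Framework;

[TestFixture]
public class ParameterDrivenTests
{
    [Test]
    [Row("How many roads must a man walk down?", 42)]
    [Row("What is your favorite prime number?", 73)]
    public void ParameterDrivenTestMethodSignature(
        string ultimateQuestion,
        int expectedAnswer)
    {
    }
}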

Identifying the Test Class and Fixture

The goal of unit testing is to run tests in isolation and in a fully-repeatable way. In order to accomplish this it is often necessary to write code that performs before-test tasks, which precede the call to the test method. The test framework will call your before-test method to accomplish the setup. That method establishes the necessary preconditions and initial state required to run one of the test methods within the fixture. The test class and the fixture are basically synonymous.

The idea here is that the method that prepares the preconditions is identified, usually by an attribute, and the unit test framework ensures that that code is run before each test method is run. Similarly, the code that performs the after-test tasks is identified so that the unit test framework runs that method after each test is run, regardless of whether the test passes or fails. Table 12-5 provides the list of attributes or mechanisms of the four frameworks that identify within the test class any before-test or after-test methods.

[Table 12-5]

Beyond the before-test and after-test methods, the fixture itself provides an environment and a context for the tests. The fixture setup method contains code that must be run before any of the tests in the test class are run. For example, the setup method could allocate resources to be used by all test methods within the test class. Similarly the fixture teardown method contains code that must be called after all the tests in the test class are run. This is important to ensure that the next test class starts with the test environment in a known-good state. The unit test framework supports the goal of creating and disposing of the test class's context through fixture setup and teardown methods. Table 12-6 provides the list of attributes or mechanisms to identify the fixture setup and fixture teardown methods for the four frameworks.

[Table 12-6]
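xUnit.net is the notable outlier here: it has no setup or teardown attributes. The test class constructor plays the before-test role and IDisposable.Dispose plays the after-test role, while shared fixture state is provided through the IUseFixture<T> interface in xUnit.net 1.x. A minimal sketch:

using System;
using Xunit;

public class FixtureFacts : IDisposable
{
    public FixtureFacts()
    {
        Console.WriteLine("Before-test"); // runs before each test method
    }

    public void Dispose()
    {
        Console.WriteLine("After-test"); // runs after each test method
    }

    [Fact]
    public void FactMethod_NoParameters()
    {
        Console.WriteLine("Executing 'FactMethod_NoParameters'");
    }
}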

An example may best illustrate how a test runner might use these attributes to execute the test code. Listing 12-1 shows the code for a test class written to use the NUnit test framework. The test class itself is decorated with the TestFixture attribute. Each of these methods is decorated with one of the attributes defined within the NUnit.Framework namespace. These attributes allow the NUnit test runner to determine the methods to execute and the correct order in which to execute them. The basic idea is that the test runner inspects the test assembly to find the test classes and methods using the attributes of the framework.

Listing 12-1. Unit Tests Written to Use the NUnit Test Framework

namespace Tests.Unit.Lender.Slos.Financial
{
    using System;

    using NUnit.Framework;

    [TestFixture]
    public class FixtureTests
    {
        [TestFixtureSetUp]
        public void FixtureSetup()
        {
            Console.WriteLine("Fixture setup");
        }

        [TestFixtureTearDown]
        public void FixtureTeardown()
        {
            Console.WriteLine("Fixture teardown");
        }

        [SetUp]
        public void TestSetup()
        {
            Console.WriteLine("Before-test");
        }

        [TearDown]
        public void TestTeardown()
        {
            Console.WriteLine("After-test");
        }

        [Test]
        public void TestMethod_NoParameters()
        {
            Console.WriteLine("Executing 'TestMethod_NoParameters'");
        }

        [TestCase(0)]
        [TestCase(1)]
        [TestCase(2)]
        public void TestMethod_WithParameters(int index)
        {
            Console.WriteLine("Executing 'TestMethod_WithParameters' {0}", index);
        }
    }
}

Consider how the NUnit test runner executes the test code in Listing 12-1. The first method that is executed is the FixtureSetup method, which has the [TestFixtureSetUp] attribute. It is called before any of the test methods are called.

Before each test method is called the TestSetup method is called, because the NUnit runner detects that it has the [SetUp] attribute.

Next, one of the designated test methods is executed. After each test method is executed, the TestTeardown method is called because it is attributed with [TearDown]. This cycle repeats until all the test methods in the fixture are done executing.

The last thing that the NUnit runner does is execute the FixtureTeardown method because it has the [TestFixtureTearDown] attribute. Following is the output from running these unit tests in the NUnit test runner.


Fixture setup

***** Tests.Unit.Lender.Slos.Financial.FixtureTests.TestMethod_NoParameters

Before-test

Executing 'TestMethod_NoParameters'

After-test

***** Tests.Unit.Lender.Slos.Financial.FixtureTests.TestMethod_WithParameters(0)

Before-test

Executing 'TestMethod_WithParameters' 0

After-test

***** Tests.Unit.Lender.Slos.Financial.FixtureTests.TestMethod_WithParameters(1)

Before-test

Executing 'TestMethod_WithParameters' 1

After-test

***** Tests.Unit.Lender.Slos.Financial.FixtureTests.TestMethod_WithParameters(2)

Before-test

Executing 'TestMethod_WithParameters' 2

After-test


Fixture teardown

The Visual Studio Test Framework works in much the same way as NUnit; the attributes, however, are named differently. Listing 12-2 shows a test class written with the Visual Studio Test Framework. The basic idea remains the same: the test runner inspects the test assembly to find the test classes and methods using the attributes of the framework.

Listing 12-2. Unit Tests Written to Use the Visual Studio Test Framework

namespace Tests.Unit.Lender.Slos.Financial
{
    using System.Diagnostics;

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class FixtureTests
    {
        [AssemblyInitialize]
        public static void AssemblyInit(TestContext context)
        {
            Trace.WriteLine("Test assembly initialize");
        }

        [AssemblyCleanup]
        public static void AssemblyCleanup()
        {
            Trace.WriteLine("Test assembly cleanup");
        }

        [ClassInitialize]
        public static void FixtureSetup(TestContext context)
        {
            Trace.WriteLine("Fixture setup");
        }

        [ClassCleanup]
        public static void FixtureTeardown()
        {
            Trace.WriteLine("Fixture teardown");
        }

        [TestInitialize]
        public void TestSetup()
        {
            Trace.WriteLine("Before-test");
        }

        [TestCleanup]
        public void TestTeardown()
        {
            Trace.WriteLine("After-test");
        }

        [TestMethod]
        public void TestMethod_NoParameters()
        {
            Trace.WriteLine("Executing 'TestMethod_NoParameters'");
        }

        [TestMethod]
        public void TestMethod_WithParameters_0()
        {
            TestMethod_WithParameters(0);
        }

        [TestMethod]
        public void TestMethod_WithParameters_1()
        {
            TestMethod_WithParameters(1);
        }

        [TestMethod]
        public void TestMethod_WithParameters_2()
        {
            TestMethod_WithParameters(2);
        }

        private void TestMethod_WithParameters(int index)
        {
            Trace.WriteLine(string.Format(
                "Executing 'TestMethod_WithParameters' {0}",
                index));
        }
    }
}

Consider how the Visual Studio test runner executes the test code in Listing 12-2. The first method that is executed is the AssemblyInit method, which has the [AssemblyInitialize] attribute. Only one method in an assembly can have this attribute. It is called before any of the test classes are initialized. After this, the execution is very similar to the NUnit runner. The next method that is executed is the FixtureSetup method, which has the [ClassInitialize] attribute. Before each test method is called the TestSetup method is called, because it is decorated with the [TestInitialize] attribute. Then one of the test methods is executed. After each test method is executed the TestTeardown method is called, because it is attributed with [TestCleanup]. This cycle repeats until all the test methods in the fixture are done executing. After all the tests in the class are done, the Visual Studio runner executes the FixtureTeardown method, because it has the [ClassCleanup] attribute. After all tests in the assembly have run, the AssemblyCleanup method is called. Following is the output from running these unit tests in the Visual Studio test runner.


Test assembly initialize

Fixture setup

Before-test

Executing 'TestMethod_NoParameters'

After-test

Before-test

Executing 'TestMethod_WithParameters' 0

After-test

Before-test

Executing 'TestMethod_WithParameters' 1

After-test

Before-test

Executing 'TestMethod_WithParameters' 2

After-test

Fixture teardown

Test assembly cleanup

In the sample code projects within the appropriate folder, you will find an example of the FixtureTests class written to use each of the four unit test frameworks.

Assertions

Within test code, an assertion is a statement that evaluates a condition, usually to true or false. The statement is written in the test method to indicate that the developer believes that the condition ought to be true at that point in the test. If the condition is false then the test fails and the test framework is expected to record that test as a failed test.

Assertions are the primary way that the developer makes a statement about how the code is intended to work. The assertion is expected to be true and the developer wants the test framework to fail the test when the assertion is false. In effect, this is how the developer communicates with the test framework. For example, if the developer expects that the returned result should not be null, a statement like Assert.IsNotNull(result) is written into the test code.

Classic Model of Assertions

The classic model of assertions is to use one of the helper methods of an Assert class. Most unit test frameworks provide an assertion helper class with straightforward methods for making assertions. Because the method names are so similar, once you know the method names in one framework it is not hard to figure out the name in another framework.10 Table 12-7 provides a list of the assertion types with examples from the NUnit unit test framework.

[Table 12-7]

Within the frameworks there are more helper classes that support additional test method assertions. In NUnit these helper classes include StringAssert, CollectionAssert, and FileAssert.
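The following sketch pulls several of these classic-model helpers together in one NUnit test method (the values are illustrative; the usual using NUnit.Framework directive is assumed):

[Test]
public void ClassicAssertExamples()
{
    var lastNames = new[] { "Roosevelt", "Smith", "Truman" };

    Assert.AreEqual(3, lastNames.Length);            // equality
    Assert.IsNotNull(lastNames);                     // null check
    Assert.IsTrue(lastNames.Length > 0);             // boolean condition
    StringAssert.Contains("Roose", lastNames[0]);    // string-specific helper
    CollectionAssert.Contains(lastNames, "Truman");  // collection-specific helper
    Assert.Throws<System.ArgumentNullException>(     // expected exception
        () => new System.IO.FileInfo(null));
}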

Constraint-Based Assert Model

The constraint-based assert model was introduced to NUnit in version 2.4. This model uses a single That method of the Assert helper class for all assertions.11 This one method is passed an object and a constraint to perform the assertion. It can use one of the NUnit “syntax helper” classes to provide the constraint. For example, one of the syntax helpers, Is.EqualTo, is shown in Listing 12-3. Many developers prefer this style of writing assertion statements.

____________

10 Here is a comparison of the assertion method names in different frameworks: http://xunit.codeplex.com/wikipage?title=Comparisons#assertions.

Listing 12-3. Unit Test Frameworks: Constraint-Based Assertions

    [Test]
    public void Load_WithValidFile_ExpectProperData()
    {
        // Arrange
        var expectedData = "{BEB5C694-8302-4397-990E-D1CA29C163F1}";
        var fileInfo = new System.IO.FileInfo("test.dat");

        var classUnderTest = new Import();

        // Act
        classUnderTest.Load(fileInfo);

        // Assert
        Assert.That(classUnderTest.Data, Is.EqualTo(expectedData));
    }
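The syntax helpers cover much more than equality. A few representative constraint-based assertions, assuming NUnit 2.5 or later and the usual using directives:

    [Test]
    public void ConstraintAssertExamples()
    {
        var lastNames = new System.Collections.Generic.List<string>
            { "Roosevelt", "Smith", "Truman" };

        Assert.That(lastNames, Is.Not.Null);
        Assert.That(lastNames.Count, Is.EqualTo(3));
        Assert.That(lastNames.Count, Is.GreaterThan(0));
        Assert.That(lastNames, Has.Member("Smith"));
        Assert.That(lastNames[0], Is.StringContaining("Roose"));
    }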

One big advantage to the constraint-based model is that you can implement custom constraints by writing a class that implements the IConstraint interface. Charlie Poole compares and contrasts the two assert models in a blog posting at http://nunit.net/blogs/?p=44.

Mock Object Frameworks

Chapter 8 discussed the need for fakes, stubs, and mocks. In the automated testing examples from that chapter the Moq framework is used to illustrate the application of a mock object framework.

Practice 12-3 Use a Mock Object Framework to Provide Stub and Mock Functionality

Generally speaking, a mock object framework dynamically generates a fake object, which is either a stub or mock, as the test code runs. This eliminates the need for you to write a fake implementation of an interface or create a fake subclass. Using one of these frameworks saves a lot of time and provides many features that are tedious to write into handwritten fake implementations.
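For example, the Moq framework used in Chapter 8 generates its fakes from a single Mock<T> class. The following is a rough sketch, assuming Moq (with a using Moq directive) and the IRepository<T> interfaces from the sample code, showing a stub that returns canned values and a strict mock whose interactions are verified:

// Stub: supplies canned answers so the code-under-test can proceed.
var stubStudentRepo = new Mock<IRepository<StudentEntity>>();
stubStudentRepo
    .Setup(r => r.Retrieve(It.IsAny<int>()))
    .Returns((StudentEntity)null);

// Mock: strict behavior fails the test on any unexpected interaction.
var mockIndividualRepo =
    new Mock<IRepository<IndividualEntity>>(MockBehavior.Strict);
mockIndividualRepo.Setup(r => r.Update(It.IsAny<IndividualEntity>()));

var classUnderTest =
    new Student(mockIndividualRepo.Object, stubStudentRepo.Object);

// ... act against classUnderTest ...

mockIndividualRepo.Verify(
    r => r.Update(It.IsAny<IndividualEntity>()),
    Times.Once());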

Dynamic Fake Objects with Rhino Mocks

Rhino Mocks is a dynamic mock object framework for .NET development. It provides a straightforward way to dynamically generate fake objects that implement a specified interface or derive from a specified type. In this section we will look at using Rhino Mocks to dynamically create fake objects that implement an interface.

It is important to know that Rhino Mocks has some limitations on what it can fake. In particular, it cannot

  • Intercept calls to non-virtual instance properties and methods
  • Create a mock object from a private interface
  • Create a mock object from a sealed class

The sample code for this section is found within the Samples\Ch12\5_RhinoMocks folder. The goal here is to test the Student class, which is found under the Lender.Slos.Model namespace. As you learned in Chapter 8, the internal constructor is called to create an instance of the Student in the test code. This constructor is shown in Listing 12-4. The Rhino Mocks framework is used to generate the fake repository objects that this constructor requires.

Listing 12-4. The Student Class Constructor

public class Student
{
    private readonly IRepository<IndividualEntity> _individualRepo;
    private readonly IRepository<StudentEntity> _studentRepo;



    internal Student(
        IRepository<IndividualEntity> individualRepo,
        IRepository<StudentEntity> studentRepo)
    {
        _individualRepo = individualRepo;
        _studentRepo = studentRepo;
        HighSchool = new School();
    }


}

The test code that tests the Save method of the Student class is shown in Listing 12-5. Testing the code's interaction with the repositories is not the goal of this test method; therefore, the Rhino Mocks framework is used to generate stub objects.

Rhino Mocks provides the MockRepository static class as a primary way to write arrange-act-assert pattern test methods. In Listing 12-5 the call to the MockRepository.GenerateStub<IRepository<IndividualEntity>>() method dynamically generates an object that implements the IRepository<IndividualEntity> interface. This object, which is now in the stubIndividualRepo variable, is later passed to the Student constructor. The next line adds a stub-expectation to this stubIndividualRepo object by providing a lambda expression to the Stub method. What this expression is telling the stub object is to expect a call to the Update method and the arguments can be any object of type IndividualEntity. In this way, the stub object is now dynamically primed to receive a call to the Update method.

Listing 12-5. Using Rhino Mocks to Generate Stub Repositories

[Test]
public void Save_WithAnExistingStudentImproperlyCreated_ExpectInvalidOperationException()
{
    // Arrange

    var today = new DateTime(2003, 5, 17);

    const int ExpectedStudentId = 897931;

    var stubIndividualRepo = MockRepository.GenerateStub<IRepository<IndividualEntity>>();
    stubIndividualRepo
        .Stub(e => e.Update(Arg<IndividualEntity>.Is.Anything));

    var stubStudentRepo = MockRepository.GenerateStub<IRepository<StudentEntity>>();
    stubStudentRepo
        .Stub(e => e.Retrieve(ExpectedStudentId))
        .Return(null);
    stubStudentRepo
        .Stub(e => e.Create(Arg<StudentEntity>.Is.Anything))
        .Return(23);

    var classUnderTest =
        new Student(stubIndividualRepo, stubStudentRepo)
        {
            Id = ExpectedStudentId,
            Today = today,
            DateOfBirth = today.AddYears(-19),
        };

    // Act
    TestDelegate act = () => classUnderTest.Save();

    // Assert
    Assert.Throws<InvalidOperationException>(act);
}

The next stub object that is created is the stubStudentRepo. The setup here involves two methods. The first is the Retrieve method, which is set up to return null. The second is the Create method, which is set up to return 23. Since the goal is to have the Save method throw an exception when it is called, the setup for the Create method returns 23. This ensures that the returned value won't equal the ExpectedStudentId value of 897931. These two Rhino Mocks fake objects allow the call to the Save method to proceed to the point when the exception is thrown, which is exactly what you want. In effect, these stubs stand in for the repositories that the Student class uses so that they behave the way the test wants them to behave.

Let's switch over to interaction testing using a mock object instead of a stub. In Listing 12-6, the mockIndividualRepo object is generated from the call to the Rhino Mocks framework's MockRepository.GenerateStrictMock<IRepository<IndividualEntity>>() method. There are two things to notice. First, the variable name uses the mock prefix instead of stub. This is so you remember that this fake object is used for interaction testing. Second, the generation method that is called is named GenerateStrictMock. The strict behavior means that any method calls that are not expected cause an exception. In effect, any unexpected interaction results in a failed test, which is what you want. The opposite of strict behavior is loose behavior. Generally speaking, you should use strict behavior so that the test method does not pass when unexpected interactions should fail the test. Loose behavior can hide flaws in the test code or the code-under-test.

Listing 12-6. Using Rhino Mocks to Generate a Mock Repository for Interaction Testing

[Test]
public void Save_WithAnExistingStudent_ExpectIndividualDalUpdateIsCalledOnce()
{
    // Arrange
    var today = new DateTime(2003, 5, 17);

    const int ExpectedStudentId = 897931;
    var studentEntity = new StudentEntity { Id = ExpectedStudentId, };

    var stubStudentRepo = MockRepository.GenerateStub<IRepository<StudentEntity>>();
    stubStudentRepo
        .Stub(e => e.Retrieve(ExpectedStudentId))
        .Return(studentEntity);

    var mockIndividualRepo = MockRepository
        .GenerateStrictMock<IRepository<IndividualEntity>>();
    mockIndividualRepo
        .Expect(e => e.Update(Arg<IndividualEntity>.Is.Anything))
        .Repeat
        .Once();

    var classUnderTest =
        new Student(mockIndividualRepo, stubStudentRepo)
        {
            Id = ExpectedStudentId,
            Today = today,
            DateOfBirth = today.AddYears(-19),
        };

    // Act
    classUnderTest.Save();

    // Assert
    Assert.AreEqual(ExpectedStudentId, classUnderTest.Id);
    mockIndividualRepo.VerifyAllExpectations();
}

In Listing 12-6, the line after the mockIndividualRepo object is generated sets the expectation for interaction with the mock. This expectation is set with a call to the Expect method, passing in a lambda expression. The expression conveys that the Update method is expected to be called. For this setup, the argument passed to the Update method can be any object. The next part is the Repeat property, which is set to expect that Update will be called once and only once.

At this point, the test method proceeds as a regular test method. It is not until the last line of the test method that expectations are verified. When the VerifyAllExpectations method is called on the mockIndividualRepo object, the mock object framework verifies the interactions. If the expectations set during arrangement are all met, this statement does not throw an error. However, if the expectations were not met, the Rhino Mocks framework throws an exception and the test fails.

Mock object frameworks are an effective way to dynamically generate stubs and mocks. These frameworks provide an effective mechanism to quickly establish stub object behavior that allows the test code to focus on creating the conditions that the test is trying to create. In addition, these frameworks provide mechanisms to verify that the code-under-test is interacting with dependencies in the proper and expected way.

Test in Isolation with Moles

In Chapter 2 you learned about Microsoft Research and the Pex and Moles project. The Moles framework allows Pex to test code in isolation so that Pex is able to automatically generate tests. The goal of isolation testing is to test the code-under-test in a way that is separate from dependencies and underlying components and subsystems. This section looks at how to use Moles to write tests in isolation.

Practice 12-4 Use an Isolator to Fake Difficult and Pernicious Dependencies

Note This section refers to sample code in the Samples\Ch12\6_Moles folder. The sample code is based on the Unit Testing with Microsoft Moles tutorial that comes with the Moles documentation.12

The Import class in Listing 12-7 depends on the FileSystem class. This FileSystem class is an external dependency that is difficult to fake with some of the mock object frameworks like Moq and Rhino Mocks. The reason is that the class-under-test is directly calling a static method named ReadAllText from within the Load method. The calling of a static method tightly couples the class-under-test to this class.

Listing 12-7. The Import Class Is the Class-Under-Test

public class Import
{
    public string Data { get; private set; }

    public void Load(System.IO.FileInfo fileInfo)
    {
        Data = FileSystem.ReadAllText(fileInfo);
    }
}


In this section, assume that you want to write a unit test for the Load method; also assume that rewriting the Import class is not an option. Within the FileSystem class the static method ReadAllText is defined, as shown in Listing 12-8. Since a static method like this one is not easy to fake with some mock object frameworks, the first option might be to take an automated integration testing approach.

Listing 12-8. The Static Method ReadAllText

public static class FileSystem
{
    public static string ReadAllText(
        System.IO.FileInfo fileInfo)
    {
        if (fileInfo == null) throw new ArgumentNullException("fileInfo");

        if (fileInfo.Exists)
        {
            return System.IO.File.ReadAllText(fileInfo.FullName);
        }

        return null;
    }

    public static void WriteAllText(
        System.IO.FileInfo fileInfo,
        string contents)
    {
        if (fileInfo == null) throw new ArgumentNullException("fileInfo");

        System.IO.File.WriteAllText(fileInfo.FullName, contents);
    }
}

The test code shown in Listing 12-9 includes a before-test method to create the file that ReadAllText needs and an after-test method to delete it. This allows the test code to find the file and properly perform the test. This approach is not isolated: if the file is not created as part of the setup it cannot work.

Listing 12-9. Automated Integration Testing of the Import.Load Method

public class ImportTests
{
    private const string FileName = "temporary.dat";

    private const string Data = "{BEB5C694-8302-4397-990E-D1CA29C163F1}";

    [SetUp]
    public void TestSetup()
    {
        System.IO.File.WriteAllText(FileName, Data);
    }

    [TearDown]
    public void TestTeardown()
    {

        if (System.IO.File.Exists(FileName))
        {
            System.IO.File.Delete(FileName);
        }
    }

    [Test]
    public void Load_WithValidFile_ExpectProperData()
    {
        // Arrange
        var fileInfo = new System.IO.FileInfo(FileName);

        var classUnderTest = new Import();

        // Act
        classUnderTest.Load(fileInfo);

        // Assert
        Assert.AreEqual(Data, classUnderTest.Data);
    }
}

It is not hard to imagine a situation where the internals of some method-under-test cannot be accommodated by the test or fixture setup and teardown. For example, the method-under-test might call a web service or interact with a temperamental or sensitive legacy system. In order to mock static and non-virtual methods in isolation you need a mock object framework like Moles that can create an object that stands in for the FileSystem class and allows the call to the ReadAllText method to be faked.

The sample code in the Samples\Ch12\6_MolesBegin folder contains the Lender.Slos.sln file. To follow along with this example, open the solution in Visual Studio. The test code is in the Tests.Unit.Lender.Slos.DataInterchange project. The test class ImportTests is written to perform the integration test shown in Listing 12-9. Rebuild the solution and the Load_WithValidFile_ExpectProperData test method should pass.

Expand the References node under the Tests.Unit.Lender.Slos.DataInterchange project, within the Solution Explorer window. There is a reference with the name Lender.Slos.DataInterchange. Since this assembly contains the implementation of the FileSystem class, this is the assembly that needs to be moled. Right-click the Lender.Slos.DataInterchange assembly and the context menu should contain an option to Add Moles Assembly. Select this option and rebuild the entire solution. (See Figure 12-19.)

images

Figure 12-19. Use Moles to create a code-generated stub type.

After the rebuild is successful there will be a new assembly reference named Lender.Slos.DataInterchange.Moles in the References section of the Tests.Unit.Lender.Slos.DataInterchange project. This new assembly is the moled version of the Lender.Slos.DataInterchange assembly.

By convention, the moled class name is the same as the name of the class it stands in for, with the letter M prefixed. For example, the FileSystem class has a corresponding mole class named MFileSystem.

With the mole assembly in the test project, the test method is now written as shown in Listing 12-10. Be advised, there are two new namespaces needed:

using Lender.Slos.DataInterchange.Moles;
using Microsoft.Moles.Framework;

The MolesContext.Create method is how the Moles framework provides the context for the test. This using block wraps the entire test method. In the arrangement section, the MFileSystem class includes the ReadAllTextFileInfo delegate, which is used to establish the expected behavior when the ReadAllText method is called. The completed sample code is found under the Samples\Ch12\6_MolesEnd folder.

Listing 12-10. Unit Test the Import.Load Method in Isolation with Moles

public class ImportTests
{
    [TestCase("1FBF377361CD.dat", "{BEB5C694-8302-4397-990E-D1CA29C163F1}")]
    [TestCase("A72498755DD2.dat", "{4E9C15FD-5966-4F69-8263-16E11F239873}")]
    public void Load_WithValidFile_ExpectProperData(
        string filename,
        string data)
    {
        using (MolesContext.Create())
        {
            // Arrange
            var expectedData = data;
            var fileInfo = new FileInfo(filename);

            MFileSystem.ReadAllTextFileInfo = info =>
                {
                    Console.WriteLine("filename: {0}", info.Name);
                    Assert.IsTrue(info.Name == filename);
                    return data;
                };

            var classUnderTest = new Import();

            // Act
            classUnderTest.Load(fileInfo);

            // Assert
            Assert.AreEqual(expectedData, classUnderTest.Data);
        }
    }
}

The NUnit test runner cannot directly run this test. It needs to run using the moles.runner.exe, which provides the Moles framework context. For this sample, run the moles.runner.exe program configured as an external tool as shown in Figure 12-20. The Command entry is %MolesRoot%\moles.runner.exe.13 The Arguments entry is $(BinDir)$(TargetName).dll /runner:"%NUnitRoot%\nunit-console.exe".

images

Figure 12-20. Configuring the Visual Studio external tool entry to run Moles

Make sure you are in the Tests.Unit.Lender.Slos.DataInterchange project and select Run Tests with Moles from the Visual Studio Tools menu. The Moles runner executes the tests and the output is written to the Output window, as shown in Figure 12-21.

____________

13 This assumes that you created an environment variable named MolesRoot that points to the proper Moles folder; for example, C:\Program Files\Microsoft Moles\bin.

images

Figure 12-21. Output from running Moles as an external tool

In Figure 12-21, the output writes the file name as the runner executes the test. This is because the ReadAllText method is executing the delegate method defined in the arrangement of your test code. It is obvious from the output that the Moles framework uses the MFileSystem stand-in object instead of a FileSystem object when the test code runs. The test method is now isolated from the file system and you are able to write test code in your arrangement that verifies that the code-under-test works as intended.

Note An excellent alternative to Moles is the Typemock Isolator mock object framework.14 Like Moles, it supports isolation testing of non-virtual instance methods, non-public methods, sealed classes, static classes, and more. Since it is a commercial product, be prepared to make the case for purchasing Typemock.15

Database Testing Frameworks

A database is often a major component of many software systems. To have complete confidence in the system, the entire data access layer (DAL) needs to be tested as an integrated whole with the database. This is especially true for object-relational mapping (ORM) technologies like Entity Framework and NHibernate. Once an integration testing database is in place, a big challenge is keeping the data in the database in a known state before executing each test. A database testing framework can provide the capability to effectively arrange the data in the database before and after a test method runs, which ensures that the database's state is consistent for the execution of each test.

____________

14 For more information see http://www.typemock.com/isolator-product-page.

15 You can read a comparison of Typemock, Moles, and Moq at http://blog.devdungeon.com/a-business-case-for-typemock-isolator/.

Practice 12-5 For Database Testing, Ensure That the Database State Is Consistent Before Each Test

This section looks at the NDbUnit database testing framework.16 This framework is based on DbUnit, which is used for Java development. NDbUnit is free and open source. Here are some key concepts and features to know about NDbUnit:

  • Relies on a .NET DataSet schema file (XSD) to govern its interaction with the database
  • Operates on only the subset of tables defined in the XSD schema file
  • Loads data from an XML file that adheres to the XSD constraints defined by the XSD schema file
  • Supports the following database servers:
    • Microsoft SQL Server 2005 and 2008, from Enterprise to CE
    • OLEDB-supported databases
    • SQLite
    • MySQL
    • Oracle

Note This section refers to sample code in the Samples\Ch12\7_NDbUnit folder. It is adapted from the NDbUnit web site's step-by-step tutorial.17

The sample code is found in the Samples\Ch12\7_NDbUnit folder. Open the Lender.Slos.sln file in Visual Studio to follow along with the examples.

Before working with NDbUnit, a test database must be created and available to run the tests against. You should have already created the Lender.Slos database as part of running the samples for Chapter 8. If not, refer to the instructions on creating the database found in the Samples\Ch08 folder.

____________

16 The NDbUnit project site is found at http://code.google.com/p/ndbunit/.

A crucial element to running NDbUnit is the creation of the .NET DataSet schema definition file. MSDN provides a walkthrough on how to create a DataSet with the Visual Studio DataSet designer at http://msdn.microsoft.com/en-us/library/ms171897(v=VS.100).aspx. Here is an overview of creating a DataSet from within Visual Studio:

  1. To start, right-click the Data folder under the Tests.Unit.Lender.Slos.Dal project.
  2. Select Add New Item … from the context menu.
  3. In the Add New Item window there is a Data template named DataSet. Provide the name as ExampleDataSet.xsd. Press the Add button. This adds the DataSet file to the project and opens the designer window.
  4. Browse to the Lender.Slos database from the Server Explorer in Visual Studio.
  5. Drag the Individual table from the Server Explorer into the DataSet designer surface, which adds it to the DataSet definition.
  6. Save and close the DataSet designer.

Note The Visual Studio .NET DataSet designer support files, such as ExampleDataSet.xsc, ExampleDataSet.xss, and ExampleDataSet.designer.cs, are not used by NDbUnit. You can exclude the DataSet from the project, remove these files, and just include the ExampleDataSet.xsd file in the project.

In the sample code, the DataSet schema file is named Lender.Slos.DataSet.xsd and is found under the Samples\Ch12\7_NDbUnit\Tests.Unit.Lender.Slos.Dal\Data folder. This schema only defines one table in the database, the Individual table.

Since NDbUnit loads data from an XML file, the next step is to create the XML data file. The contents of the IndividualDalTests_Scenario01.xml file are shown in Listing 12-11. Each Individual element defines one row that NDbUnit needs to insert into the database table. Within each Individual element are the elements that define the data that belongs in each column.

Listing 12-11. The XML Data Used to Populate the Individual Table

<?xml version="1.0" standalone="yes"?>
<Lender xmlns="http://tempuri.org/Lender.xsd">
  <Individual>
    <Id>1</Id>
    <LastName>Roosevelt</LastName>
    <FirstName>Theodore</FirstName>
    <MiddleName />
    <Suffix />
    <DateOfBirth>1858-10-27</DateOfBirth>
  </Individual>
  <Individual>
    <Id>3</Id>
    <LastName>Smith</LastName>
    <FirstName>John</FirstName>
    <MiddleName>Q</MiddleName>
    <Suffix>Sr.</Suffix>
    <DateOfBirth>2011-01-01</DateOfBirth>
  </Individual>
  <Individual>
    <Id>5</Id>
    <LastName>Truman</LastName>
    <FirstName>Harry</FirstName>
    <MiddleName>S</MiddleName>
    <Suffix />
    <DateOfBirth>1884-05-08</DateOfBirth>
  </Individual>
</Lender>

Now turn to the test code in Listing 12-12. The IndividualDalTests class defines the test fixture. The first method to look at is FixtureSetup. It is here that the SqlDbUnitTest object is instantiated from the connection string for the test database and stored in the _database field. As part of the fixture setup, both the XML schema file and the XML data file are read. The NDbUnit framework is now prepared to load that data into the database whenever it is asked to do so.

Once FixtureSetup has run, the unit test framework calls the TestSetup method before each test method executes. It is from TestSetup that NDbUnit is asked to perform the database operation. This one statement clears the Individual table and reloads it with the XML data, resetting the identity column to match the values in the XML data:

_database.PerformDbOperation(DbOperationFlag.CleanInsertIdentity);

Because TestSetup runs before every test method defined in the IndividualDalTests class, each test method begins with the Individual table in the same known-good state.

Listing 12-12. The IndividualDalTests Class Working with NDbUnit to Set the Database's State

using NDbUnit.Core;
using NDbUnit.Core.SqlClient;

using NUnit.Framework;

[TestFixture]
public class IndividualDalTests
{
    private const string ConnectionString =
        @"Data Source=(local)\SQLExpress;Initial Catalog=Lender.Slos;Integrated Security=True";

    private INDbUnitTest _database;

    [TestFixtureSetUp]
    public void FixtureSetup()
    {
        _database = new SqlDbUnitTest(ConnectionString);

        _database.ReadXmlSchema(@"..\..\Data\Lender.Slos.DataSet.xsd");

        _database.ReadXml(@"..\..\Data\IndividualDalTests_Scenario01.xml");
    }

    [TestFixtureTearDown]
    public void FixtureTeardown()
    {
        _database.PerformDbOperation(DbOperationFlag.DeleteAll);
    }

    [SetUp]
    public void TestSetup()
    {
        _database.PerformDbOperation(DbOperationFlag.CleanInsertIdentity);
    }
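    // Test methods, such as the one shown in Listing 12-13, are defined here.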



}

An example of a test method in the IndividualDalTests class is shown in Listing 12-13. Each of the three test cases passes because NDbUnit populated the Individual table with the records that the test method expects.

Listing 12-13. Test Method That Calls the Retrieve Method

[TestCase(1, "Roosevelt")]
[TestCase(3, "Smith")]
[TestCase(5, "Truman")]
public void Retrieve_WithScenarioDataInDatabase_ExpectProperLastName(
    int id,
    string expectedLastName)
{
    // Arrange
    var classUnderTest = new IndividualDal(ConnectionString);

    // Act
    var actual = classUnderTest.Retrieve(id);

    // Assert
    Assert.NotNull(actual);
    Assert.AreEqual(expectedLastName, actual.LastName);
}
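Because TestSetup restores the Individual table before every test, even destructive test methods are safe to add to this fixture. As a sketch, assuming the IndividualDal class also exposes a Delete method (an assumption made for this example) and that Retrieve returns null for a missing row, a test like the following verifies deletion without corrupting the data that other tests rely on:

[TestCase(3)]
public void Delete_WithExistingIndividual_ExpectRetrieveReturnsNull(int id)
{
    // Arrange
    var classUnderTest = new IndividualDal(ConnectionString);

    // Act: remove a row that NDbUnit loaded during TestSetup.
    classUnderTest.Delete(id);

    // Assert: the row is gone; TestSetup reloads it before the next test runs.
    Assert.IsNull(classUnderTest.Retrieve(id));
}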

User Interface Testing Frameworks

As discussed in Chapter 8, the benefits of a fully automated integration testing environment for stability and regression testing usually outweigh the effort it takes to set everything up and get it working together. Many developers and testers feel, for very good reason, that user interface (UI) testing is a necessary part of fully automated integration testing. However, UI test automation requires a significant investment in startup costs, development, and maintenance effort.18

Except for Windows services and embedded systems, most software systems are developed to interact with an end user. Users interact with the system through a user interface, and those interactions can be difficult to test. It is the role of a user interface testing framework to simulate and automate the user's interactions with the application. In this way, the test code can compare actual results to expected results to determine whether the software is working as intended.

Web Application Test Frameworks

Writing automated user interface tests for web applications presents a number of significant challenges. Consider the common approach of hosting the application in a web server and automating the testing through a web browser. To fully automate this through a continuous integration server requires automated deployment to the web server. The automated test framework needs to be able to open the browser and navigate to the landing page. There could be a login page and other pages that need to load just to get to the page where the first test method runs. This work is the heart of the effort needed to establish the smoke testing infrastructure you learned about in Chapter 8.

At the heart of web application test frameworks is browser automation, which simulates a human using the browser through software. For example, the test code programmatically navigates to a page, enters data into a field, and clicks a button. The results of those actions in the browser are then compared to the expected response, which could be a validation message on the page or a new record you expect to see in a database. There are many choices; the tools and technologies support different browsers and take different approaches. Table 12-7 provides a list of widely-used web application test frameworks worth evaluating, and a short browser-automation sketch follows the table.

images
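To make the browser-automation idea concrete, here is a minimal sketch that uses the Selenium WebDriver .NET bindings, one widely-used option. The page URL, element IDs, and validation message are hypothetical placeholders rather than part of the Lender.Slos sample application.

using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;

[TestFixture]
public class LoginPageTests
{
    [Test]
    public void Login_WithInvalidPassword_ExpectValidationMessage()
    {
        // Arrange: open the browser and navigate to the (hypothetical) login page.
        IWebDriver driver = new FirefoxDriver();
        try
        {
            driver.Navigate().GoToUrl("http://localhost/Lender.Slos/Login");

            // Act: simulate the user typing credentials and clicking the button.
            driver.FindElement(By.Id("UserName")).SendKeys("tester");
            driver.FindElement(By.Id("Password")).SendKeys("wrong-password");
            driver.FindElement(By.Id("LoginButton")).Click();

            // Assert: the page should display a validation message.
            var message = driver.FindElement(By.Id("ValidationSummary")).Text;
            StringAssert.Contains("invalid", message.ToLowerInvariant());
        }
        finally
        {
            driver.Quit();
        }
    }
}

In a real suite, the driver setup and teardown usually belong in the fixture's setup and teardown methods so that every test starts from a fresh browser session.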


Windows Forms and Other UI Test Frameworks

Windows Forms, Windows Presentation Foundation (WPF), Flash, Silverlight, and other UI technologies present different challenges when testing the applications that use them. Some of the challenges of deploying and running the application are different from those found in browser automation. Many of the challenges are technical, such as having the test code select an item from a third-party dropdown list. Finding the right UI testing framework depends on so many factors that the only practical strategy is for you and your team to perform proof-of-concept prototyping and feasibility studies to find an approach that works for your application. Table 12-8 provides a list of widely-used UI test frameworks worth evaluating; a brief sketch using Microsoft's UI Automation API follows the table.

images
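As a sketch of what this can look like for a Windows application, the following example uses the UI Automation API that ships with the .NET Framework (the System.Windows.Automation namespace in the UIAutomationClient and UIAutomationTypes assemblies). The executable path, window title, and button name are hypothetical, and the fixed Thread.Sleep is only a stand-in for the polling that a real framework performs.

using System.Diagnostics;
using System.Threading;
using System.Windows.Automation;
using NUnit.Framework;

[TestFixture]
public class MainWindowTests
{
    [Test]
    public void ClickingCalculate_WithMainWindowOpen_ExpectButtonIsInvoked()
    {
        // Arrange: start the (hypothetical) Windows application and wait for its main window.
        var process = Process.Start(@"Lender.Slos.WinClient.exe");
        try
        {
            Thread.Sleep(2000); // crude wait; a real framework polls until the window appears

            var mainWindow = AutomationElement.RootElement.FindFirst(
                TreeScope.Children,
                new PropertyCondition(AutomationElement.NameProperty, "Lender SLOS"));
            Assert.NotNull(mainWindow, "Main window was not found.");

            // Act: find the Calculate button and invoke (click) it.
            var button = mainWindow.FindFirst(
                TreeScope.Descendants,
                new PropertyCondition(AutomationElement.NameProperty, "Calculate"));
            Assert.NotNull(button, "Calculate button was not found.");

            var invoke = (InvokePattern)button.GetCurrentPattern(InvokePattern.Pattern);
            invoke.Invoke();

            // Assert: in a real test, inspect the resulting UI state here.
        }
        finally
        {
            if (!process.HasExited)
            {
                process.Kill();
            }
        }
    }
}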

Acceptance Testing Frameworks

In the broadest sense, acceptance testing is a series of tests that determine if the requirements and features of a software system are met. At some point in the development process a decision is made as to whether the software is ready. If the software does not meet the acceptance criteria then it is rarely put into production. The software must fulfill a set of minimum and essential requirements and must have all the necessary features. A formal acceptance phase methodically tests the software to determine if the software meets all the objectives, after which a decision is made as to whether the software is or is not acceptable. An informal acceptance phase relies on the decision-makers' judgment, based on what they know or don't know about the readiness of the system. Whether formal or informal, software that meets the acceptance criteria is the ultimate goal of the development team; it is the destination. To arrive at that destination the development team needs a navigation system to keep them on course. An acceptance test framework helps establish a navigation system to guide the team toward their ultimate goal: software that ships.

The goal of an acceptance test framework is to integrate the effort of customers, analysts, developers, and testers. For example, the customer describes the high-level need or desire for a new feature. The analyst works to understand, expand, and advance that information into a set of detailed requirements that are complete, clear, and consistent. The developers then take the detailed requirements and implement the software. The testers verify that the software meets the requirements. So, where does the acceptance test framework come in? The acceptance test framework is fed examples and scenarios. The customer provides high-level, general examples of how the new feature is expected to work. These examples communicate conditions, outcomes, behaviors, and results.

These scenarios are fed into the framework as acceptance tests. The analyst may provide more detailed and elaborate scenarios, especially for exception handling. The developers write software that satisfies the requirements made explicit in these scenarios. The testers continue to feed in test cases and scenarios as special cases, exceptions, and missing requirements are uncovered. The acceptance test framework provides a way to evaluate the software against the acceptance criteria.

The acceptance test framework ought to serve the goal of automating the acceptance testing. The advantages of automated acceptance testing include

  • Keeping developers focused on the entire set of features and requirements
  • Performing feature verification and regression testing
  • Collecting metrics that are used to measure progress toward the desired results
  • Providing guidance and direction based on acceptance criteria

Testing with Specifications and Behaviors

The primary goal of using specifications and behaviors to perform acceptance testing is to reach a common understanding between the analysts and the developers on the conditions and expected outcomes of the test cases. The key is to find a language that the analyst can use to write behavior specifications and that both the developer and the acceptance testing framework can use to verify the software. It gives everyone a common language for working together and for expressing complete, clear, and consistent expectations about system behavior and requirements.

images Practice 12-6  Acceptance Test with Business Specifications and Behaviors, Not Test Scripts

Too many people view automated testing as the execution of testing scripts. It is very important to understand and appreciate the difference between a test script and a specification.19 The primary distinction is that test scripts are instructions fed to the testing framework, while a specification is a well-structured set of explicit statements about the requirements. A script focuses on the means and the methods of testing, while a specification focuses on conditions, outcomes, and results. Perhaps the biggest distinction is that analysts and product owners often do not take full ownership and responsibility for test scripts. These scripts are often tedious to read and rarely make explicit statements about requirements and behaviors.
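As an illustration of specification-style testing, assume a framework such as SpecFlow is in use: the analyst's scenario is written in business language (shown here as a comment; it would normally live in a .feature file that SpecFlow parses), and the developer binds each statement to code. The scenario wording and the LoanPaymentCalculator class below are hypothetical examples, not part of the sample application.

using System;
using NUnit.Framework;
using TechTalk.SpecFlow;

// Scenario (written by the analyst, in business language):
//   Given a loan of 12000 dollars at 6 percent annual interest for 60 months
//   When the monthly payment is calculated
//   Then the payment should be 231.99 dollars

[Binding]
public class LoanPaymentSteps
{
    private decimal _principal;
    private decimal _annualRatePercent;
    private int _termInMonths;
    private decimal _payment;

    [Given(@"a loan of (.*) dollars at (.*) percent annual interest for (.*) months")]
    public void GivenALoan(decimal principal, decimal annualRatePercent, int termInMonths)
    {
        _principal = principal;
        _annualRatePercent = annualRatePercent;
        _termInMonths = termInMonths;
    }

    [When(@"the monthly payment is calculated")]
    public void WhenTheMonthlyPaymentIsCalculated()
    {
        _payment = LoanPaymentCalculator.CalculatePayment(_principal, _annualRatePercent, _termInMonths);
    }

    [Then(@"the payment should be (.*) dollars")]
    public void ThenThePaymentShouldBe(decimal expectedPayment)
    {
        Assert.AreEqual(expectedPayment, _payment);
    }

    // A hypothetical calculator standing in for the real business-logic module.
    private static class LoanPaymentCalculator
    {
        public static decimal CalculatePayment(decimal principal, decimal ratePercent, int months)
        {
            var monthlyRate = (double)(ratePercent / 100m) / 12.0;
            var factor = monthlyRate / (1.0 - Math.Pow(1.0 + monthlyRate, -months));
            return Math.Round(principal * (decimal)factor, 2);
        }
    }
}

The point is that the specification reads as statements about conditions and outcomes, which the analyst owns, while the binding code stays in the developer's hands.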

Table 12-9 provides a list of widely-used acceptance test frameworks that use specifications as the underlying driver of acceptance testing.

____________

19 The Concordion site describes the differences between writing test scripts and specifications: http://www.concordion.org/Technique.html.

images

Business-Logic Acceptance Testing

Often software systems are built around a set of business rules. For example, a fixed-asset accounting system has the depreciation calculation engine as a central component of the software. The requirements for this module are governed by the generally-accepted accounting principles (GAAP) and the tax accounting rules and regulations. The analysts work hard to write detailed requirements for the developers to use to build the calculation engine. Also, testers work hard to write test cases and scenarios to verify and validate that the calculation engine is developed properly. The correctness and completeness of the business logic is central to determining if the software is acceptable. The developers benefit when the test cases that the analysts and testers provide feed an automated acceptance testing framework.

Situations like this exist in many domains, such as payroll systems, life insurance, banking, benefit systems, and many more. The important point is that automated acceptance testing does not have to involve the entire system working as an integrated whole. It can be used to continuously verify and validate a single business-logic module independent of the user interface, database, or any other part of the system. This is crucial when the acceptance of this module is at the very foundation of accepting the entire system.

To continue with the fixed-asset accounting system example, the analyst could use an Excel spreadsheet to develop detailed test cases and scenarios. This could be one spreadsheet providing one scenario in a structured format that has specific values for each of the input variables and calculates the expected results. The acceptance test framework is built to read the spreadsheet, provide the parameters to the depreciation engine, and verify that the engine calculates the expected results. Over time the analyst and testers work together to build a complete and comprehensive set of spreadsheets that cover all the acceptance criteria through test cases and scenarios. The challenge is to create a purpose-built acceptance test framework that takes the spreadsheet as input, arranges the module properly, performs the calculations, and then asserts that the actual results match the expected results.

images Practice 12-7 Develop a Purpose-Built Acceptance Testing Framework to Monitor Key Business Logic

A purpose-built acceptance testing framework can be a very effective tool for the development team to monitor and control the correctness and completeness of the business-logic modules. This approach builds upon the features provided by a unit testing framework, discussed earlier in this chapter. The fixture is responsible for providing the acceptance data to the test cases, and the test code performs the specific arrangement, actions, and assertions that carry out the acceptance test.
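For example, here is a minimal sketch of such a purpose-built framework built on NUnit, assuming the analysts' spreadsheet scenarios have been exported to a CSV file and assuming a hypothetical DepreciationCalculator class; a production version would more likely read the Excel workbooks directly and cover multiple depreciation methods.

using System.Collections.Generic;
using System.IO;
using System.Linq;
using NUnit.Framework;

[TestFixture]
public class DepreciationAcceptanceTests
{
    // Each row of the (hypothetical) exported scenario file holds:
    // cost, salvage value, useful life in years, year, expected depreciation.
    private static IEnumerable<TestCaseData> AcceptanceScenarios()
    {
        return File.ReadLines(@"AcceptanceData\StraightLineScenarios.csv")
            .Skip(1) // skip the header row
            .Select(line => line.Split(','))
            .Select(fields => new TestCaseData(
                decimal.Parse(fields[0]),
                decimal.Parse(fields[1]),
                int.Parse(fields[2]),
                int.Parse(fields[3]),
                decimal.Parse(fields[4])));
    }

    [TestCaseSource("AcceptanceScenarios")]
    public void CalculateDepreciation_WithAnalystScenario_ExpectSpreadsheetResult(
        decimal cost,
        decimal salvageValue,
        int usefulLifeInYears,
        int year,
        decimal expectedDepreciation)
    {
        // Arrange
        var classUnderTest = new DepreciationCalculator();

        // Act
        var actual = classUnderTest.CalculateStraightLine(cost, salvageValue, usefulLifeInYears, year);

        // Assert
        Assert.AreEqual(expectedDepreciation, actual);
    }

    // A hypothetical stand-in for the real depreciation calculation engine.
    private class DepreciationCalculator
    {
        public decimal CalculateStraightLine(decimal cost, decimal salvage, int lifeInYears, int year)
        {
            return year >= 1 && year <= lifeInYears ? (cost - salvage) / lifeInYears : 0m;
        }
    }
}

Each row of the exported scenario file becomes its own test case, so the test run reports exactly which acceptance scenarios pass and which fail.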

Summary

In this chapter you learned about the detailed features of testing frameworks, including NUnit, MbUnit, MSTest, and xUnit.net. You also learned about mock object frameworks, especially the stubbing and the interaction testing facilities they provide. This chapter included a discussion of NDbUnit and how to use this database testing framework to control the state of the database. You also learned about some available options for UI testing frameworks and acceptance testing.

Chapter 13 discusses the many biases and aversions that influence the adoption of new and different practices.
