The Validation and Verification Processes

When you receive software, files, or database updates from your service vendors, it is up to you to make sure what you receive conforms to expectations. To do this you'll need to set up your own processes for validation and verification to ensure the quality of service. The following sections provide a guide to quality, testing, and test planning procedures.

Quality, Testing, and Test Planning

Testing and quality measures help optimize the use of resources. It is essential to define quality in the customer's or user's terms. When an evolutionary development approach is used, functionality and design might change with each iteration of the prototype. Paralleling the evolutionary development activity is the iterative process of test design.

Test design begins during requirements definition with the definition of test scenarios for each requirement. Test scenarios, iteratively defined and updated during application prototyping activities, are an excellent starting point for defining the testing process.

Categories of Testing

To define a comprehensive plan for testing all aspects of the business system under consideration, it is beneficial to include several independent dimensions of system behavior. The following dimensions must be spelled out and tested:

  • Functional Requirements— getting the job done.

  • User Interface— desired look and feel.

  • Integration— seamless interplay between components.

  • Stress— traffic volume, database, and application server performance.

  • Regression— changing the changes.

  • Acceptance— applying business scenarios.

  • Ongoing performance testing— continuing tuning of system performance.

  • Functional Requirements— getting the job done.

    Functional requirement testing verifies that the software delivers the capabilities it was designed to provide. Workflow control is evaluated. Links are verified and tested. Tests are developed to ensure that scripts, stored procedures, and triggers all work as expected to deliver the features of the Web or program interface.

    Delivery of data on request via search engines is checked. Database content—both valid and invalid data conditions—should be simulated and tested.
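
    As an illustration, a first functional test might look like the following Python sketch using pytest. The catalog module and its search_customers function are hypothetical placeholders for whatever search feature the application exposes; one valid and one invalid data condition are exercised.

    ```python
    import pytest

    from catalog import search_customers  # hypothetical application module


    def test_search_returns_matching_records():
        # Valid data condition: a known customer should be found.
        results = search_customers("Acme")
        assert any(r["name"] == "Acme Corp" for r in results)


    def test_search_rejects_malformed_input():
        # Invalid data condition: bad input should fail cleanly rather
        # than crash or return unrelated records.
        with pytest.raises(ValueError):
            search_customers("")
    ```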

  • User Interface— desired look and feel.

    User Interface testing provides a visual check on the presentation of the product's features, functionality, and associated update forms. Several categories of UI testing must be addressed:

    • Appearance includes the look and feel attributes of the system from color selection to sizing and correct usage of graphical elements.

    • Navigation conventions and visual structure of information are tested for usability. Is there a consistent visual identity on all views of the system? Can the user recognize system location on any screen and is it possible to navigate comfortably?
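
    Navigation-consistency checks of this kind can be partly automated. The following sketch uses Selenium's Python bindings (assumed to be installed, along with a Chrome driver); the page URLs and the #site-nav selector are hypothetical. It verifies that every view carries the same navigation element and a title the user can orient by.

    ```python
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    PAGES = [  # hypothetical views of the system under test
        "https://example.com/",
        "https://example.com/search",
        "https://example.com/account",
    ]

    driver = webdriver.Chrome()
    try:
        for url in PAGES:
            driver.get(url)
            # Every view should carry the same navigation bar and a
            # visible location cue, so the user always knows where they are.
            nav = driver.find_element(By.CSS_SELECTOR, "#site-nav")
            assert nav.is_displayed(), f"navigation missing on {url}"
            assert driver.title, f"no page title (location cue) on {url}"
    finally:
        driver.quit()
    ```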

  • Integration— seamless interplay between components.

    Integration testing verifies that the links between system components are working together as expected. Perform validation on data conversions and transformations between input and update mechanisms such as

    • data entry

    • conversion programs

    • update forms

    • output formats such as

      1. online search results

      2. form update returns and generated files or table entries

      3. billing and accounting reports and files

    Throughput and its optimization are tested, with emphasis on identifying bottlenecks, serialization, and scalability issues.
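
    As a concrete example, the following Python sketch validates one such conversion, checking that data captured by an input mechanism survives transformation into an output format. The conversion module and its two functions are hypothetical placeholders for the real pipeline stages.

    ```python
    from conversion import load_entry_form, export_billing_record  # hypothetical


    def test_entry_to_billing_roundtrip():
        entry = load_entry_form("fixtures/order_123.txt")
        record = export_billing_record(entry)
        # Fields captured at data entry must survive conversion unchanged.
        assert record["customer_id"] == entry["customer_id"]
        assert record["amount"] == entry["amount"]
    ```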

  • Stress— traffic volume, database and application server performance, delivery.

    Stress testing determines the number of hits or user sessions a system can handle in a given time frame. Traffic volume is simulated and load testing is performed to determine breaking points and scalability issues. The architecture (database server, Web server, application server) is tested for performance constraints and data delivery parameters. Bandwidth requirements can be tested and identified for scenarios involving a mix of low-end (14.4 Kbps), mid-range (28.8 to 56 Kbps), and high-end (ISDN, cable modem, and ADSL) user access.
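
    A minimal load-generation sketch follows, using only the Python standard library. The staging URL, hit count, and concurrency level are assumptions; a real stress test would normally use a dedicated load tool, so this only illustrates the idea of simulating concurrent traffic and measuring response times.

    ```python
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "https://staging.example.com/search?q=test"  # hypothetical endpoint


    def one_hit(_):
        # Time a single request, including reading the full response body.
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start


    # 500 hits with 50 concurrent workers; adjust to probe for breaking points.
    with ThreadPoolExecutor(max_workers=50) as pool:
        timings = sorted(pool.map(one_hit, range(500)))

    print(f"median {timings[len(timings) // 2]:.3f}s, "
          f"95th percentile {timings[int(len(timings) * 0.95)]:.3f}s")
    ```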

  • Regression— changing the changes.

    Regression testing makes sure that changes to computer programs don't have unintended consequences. Testers normally develop a set of test cases that must be rerun against new versions of software to make sure that all the old capabilities still work as specified.
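
    The maintained set of test cases can be replayed automatically against each new version. The following sketch assumes each case is stored as a JSON file pairing an input with its previously recorded expected output; the run_case entry point and the directory layout are hypothetical.

    ```python
    import json
    from pathlib import Path

    from app import run_case  # hypothetical entry point into the new build


    def test_regression_suite():
        # Rerun every saved case; all old capabilities must still work.
        for case_file in Path("regression_cases").glob("*.json"):
            case = json.loads(case_file.read_text())
            actual = run_case(case["input"])
            assert actual == case["expected"], f"regression in {case_file.name}"
    ```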

  • Acceptance— applying business scenarios.

    Acceptance testing is the business owner's opportunity to evaluate what the users of the system will encounter in business interactions with the site. Predetermined scenarios will help the owner test situations that software must accommodate. This is the time for a new set of eyes and hands to try to break the system in every conceivable way.
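
    Predetermined scenarios can also be captured in executable form so the business owner's walkthrough is repeatable. In the following sketch the scenarios, the shop module, and the place_order helper are hypothetical; the pattern is a table-driven acceptance suite using pytest.

    ```python
    import pytest

    from shop import place_order  # hypothetical business-facing API

    SCENARIOS = [
        ("regular order", {"sku": "A100", "qty": 1}, "confirmed"),
        ("zero quantity", {"sku": "A100", "qty": 0}, "rejected"),
        ("unknown product", {"sku": "ZZZZ", "qty": 1}, "rejected"),
    ]


    @pytest.mark.parametrize("name,order,expected", SCENARIOS)
    def test_business_scenario(name, order, expected):
        # Each row is one predetermined business situation to accommodate.
        assert place_order(**order) == expected, name
    ```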

  • Ongoing performance testing— continuing tuning of system performance.

    Performance testing involves ongoing rechecks of critical measures of system performance, such as response time, load times for data, run times for programs, and load balancing for networks. Links and bridges between system components require special attention in the testing process. Throughout the project lifetime, data, software, and equipment should be monitored and performance verified for requirements compliance.
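
    Such rechecks can be scripted and run on a schedule throughout the project lifetime. The following sketch asserts that one critical measure, report response time, stays within its requirement; the two-second threshold and the fetch_report function are assumptions.

    ```python
    import time

    from reports import fetch_report  # hypothetical

    MAX_SECONDS = 2.0  # assumed response-time requirement for this measure


    def test_report_response_time():
        start = time.perf_counter()
        fetch_report("monthly_billing")
        elapsed = time.perf_counter() - start
        assert elapsed <= MAX_SECONDS, f"report took {elapsed:.2f}s"
    ```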

This plan for testing all aspects of the business system assumes that program/unit testing is performed as a significant part of the software development process, and unit testing is therefore not addressed in this plan.

Recommended Approach

To establish a useful testing program, an iterative approach is recommended with the following rules:

  • Each iteration introduces additional tests taken from the seven dimensions or categories, depending on the requirements exercised in that iteration of the application prototype.

  • Some test cases will be conducted by developers, whereas others can be run either in collaboration with customer-oriented partners (sales, service, and marketing personnel) or by other team members at the business owner's request. It is important not to limit the testing process to the developer(s), but rather to select a combination of developer and external testers for each iteration.

  • External testing resources should be used to capitalize on knowledge residing outside the enterprise.

  • Both online Web access and process automation systems need to be tested.

  • At the beginning of each iteration, the scope of the iteration and the target audiences for its testing will be revisited to introduce the most effective and most needed additions to the plan.

The following is a sample form from which test scenarios may be developed:

Test Scenario Form

ID # 999

Please fill in the fields below for each test scenario executed.

Test Scenario (one or several per task, requirement, or goal, as required to cover the situation)

Iteration #:_______

Test Scenario ID#:________

Date created/last revised: __mm/dd/yy__

Test Scenario Name: _____________________________________

Component Tested (page, search, database server, and so on):_____________________________________________________

Dimension Tested (1-7 from previous definitions):______

Tester name: ____________________________________________

Scenario Description:

__________________________________________________________

__________________________________________________________

___________________________________

Information required (values, inputs, combinations):

__________________________________________________________

___________________________________

Sequence or special instructions:

__________________________________________________________

___________________________________

Expected results/outputs:

__________________________________________________________

___________________________________

Actual test results:______________________________________

__________________________________________________________

_________________________


One or more test scenarios will be developed for each business requirement, goal, or testing task from the project schedule, depending on the scope and needs of the iteration. The dimension(s) covered by the scenario also vary according to the scope and purpose of the iteration. Test scenarios are developed to provide testers with specific instructions for exercising the scenario, logging results, and adding to the set of test cases maintained for regression testing.
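
If scenarios are to be stored, versioned, and fed into the regression set, it can help to capture the form as a record in code. The following Python sketch mirrors the form's fields; the record structure itself is an assumption, not part of the form.

```python
from dataclasses import dataclass, field


@dataclass
class TestScenario:
    iteration: int
    scenario_id: str
    date_revised: str          # mm/dd/yy, as on the form
    name: str
    component: str             # page, search, database server, and so on
    dimension: int             # 1-7 from the category definitions
    tester: str
    description: str
    inputs: dict = field(default_factory=dict)  # values, inputs, combinations
    instructions: str = ""     # sequence or special instructions
    expected: str = ""         # expected results/outputs
    actual: str = ""           # actual test results
```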

Problems will be reported to the developer(s) for resolution, correction, and retesting.

Test scenario results will be logged with dates and reference numbers for tracking, and reported to the business owner for final signoff approval. Failed tests will be tracked, issues assigned, and problem resolution documented.

It is recommended that the following information be logged for test results tracking and reporting (a minimal logging sketch follows the list):

  • Test Results log (one entry per scenario)

  • Iteration #

  • Test Scenario ID#

  • Date tested

  • Test Scenario Name

  • Dimension Tested

  • Status (open/closed)

  • Test results

  • Issues

  • Action items
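
As a sketch of the logging itself, the following Python function appends one CSV row per executed scenario. The column names mirror the list above; the file name and CSV format are assumptions.

```python
import csv

FIELDS = ["iteration", "scenario_id", "date_tested", "scenario_name",
          "dimension", "status", "results", "issues", "action_items"]


def log_result(row, path="test_results.csv"):
    # Append one entry per executed scenario for tracking and reporting.
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # write the header the first time only
            writer.writeheader()
        writer.writerow(row)
```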

Identified Testing for Release I

For your first release of the service or product, it is useful to determine what Release I, Iteration 1 will address. The following bullets provide an example for a CRM application:

  • Verify user home pages and page contents

  • Validate customer data received from vendor or sending applications

  • Test customer database search results

  • Refine Web page look and feel

Testing for the first iteration prototype often focuses on the two dimensions of Functional Requirements and User Interface. Subsequent iterations address other dimensions and remaining system requirements as they are implemented.

Release I, Iteration 2 could address the following:

  • Customer Database (User Interface, Functional Requirements)

  • User Registration System (User Interface, Functional Requirements)

Release I, Iteration 3 could address the following:

  • Data entry system for company database (User Interface, Functional Requirements)

  • Walk-through of user setup and administrator scenarios for all pages and databases (Integration, Acceptance)

Release I, Iteration 4 could address the following:

  • Production System performance (Stress)

  • Regression testing for Production System (reapply selected test cases in production environment)

  • Performance tuning for database access

Together, these four iterations provide thorough coverage of all dimensions of test cases for Release I. The testing scenarios developed during this process should be maintained and updated with additional information gathered during testing, to provide a basis for testing future releases of the software.
