Appendix B. Automated Testing Terms and Definitions

Many of the terms and definitions included here are based on those found in Rational Unified Process 2001. Those taken from RUP are indicated by an asterisk.

*Architecture.

The highest-level concept of a system in its environment [IEEE]. The architecture of a software system (at a given point in time) is its organization or structure of significant components interacting through interfaces, those components being composed of successively smaller components and interfaces.

Artifact.

Any product, deliverable, or document a process creates.

*Build.

A build comprises one or more components (often executable), each constructed from other components, usually by a process of compilation and linking of source code.

*Component.

A physical, replaceable part of a system that packages implementation and conforms to and provides the realization of a set of interfaces.

Data-Driven Testing.

An automation approach in which the navigation and functionality of the test script is directed through external data; this approach separates test and control data from the test script proper.
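As a minimal sketch of the idea (illustrative names only, not tied to any specific tool): the test logic is written once, and the inputs and expected results live in an external data table, here an inline CSV string standing in for a data file.

```python
import csv
import io

# External test data: each row drives one execution of the same script.
TEST_DATA = """username,password,expected
alice,secret123,success
bob,,failure
,secret123,failure
"""

def attempt_login(username, password):
    """Stand-in for the application under test."""
    return "success" if username and password else "failure"

def run_data_driven_tests(data_source):
    """Run the single test script once per data row."""
    results = []
    for row in csv.DictReader(io.StringIO(data_source)):
        actual = attempt_login(row["username"], row["password"])
        results.append((row["username"], actual == row["expected"]))
    return results

results = run_data_driven_tests(TEST_DATA)
print(all(passed for _, passed in results))  # → True
```

Because the data is separate from the script, testers can add new cases by editing the table without touching the automation code.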

Functional Decomposition Approach.

An automation method in which the test cases are reduced to fundamental tasks, navigation, functional tests, data verification, and return navigation; also known as the Framework-Driven Approach.

Key Word–Driven Testing.

The approach developed by Carl Nagle of the SAS Institute that is offered as freeware on the Web; Key Word–Driven Testing is an enhancement to the data-driven methodology.
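A rough sketch of the keyword-driven idea (illustrative only; this is not Carl Nagle's actual framework): each test step is a row of data pairing a keyword with its arguments, and a small driver maps keywords to functions.

```python
# Simulated application state touched by the keyword implementations.
state = {}

def open_app(name):
    state["app"] = name

def enter_text(field, value):
    state[field] = value

def verify(field, expected):
    assert state.get(field) == expected, f"{field!r} != {expected!r}"

# The keyword vocabulary: names a tester writes, mapped to code.
KEYWORDS = {
    "OpenApp": open_app,
    "EnterText": enter_text,
    "Verify": verify,
}

# The test itself is pure data, editable without touching driver code.
TEST_TABLE = [
    ("OpenApp", ["OrderEntry"]),
    ("EnterText", ["customer", "Acme Corp"]),
    ("Verify", ["customer", "Acme Corp"]),
]

def run(table):
    for keyword, args in table:
        KEYWORDS[keyword](*args)

run(TEST_TABLE)
print("all steps passed")
```

This extends the data-driven approach: not only the test data but also the sequence of actions is expressed as data.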

Performance Testing.

A class of tests implemented and executed to evaluate the performance-related characteristics of the application under test, including timing profiles, execution flow, response times, and operational reliability and limits.

*Procedure.

A documented description of the course of action that is followed when performing a task; a step-by-step method that, when followed, ensures that standards are met.

Process.

A series of steps that result in a product or service; the work effort that produces a product or service.

Process Control.

Self-adjusting operations that keep a product or service in conformance with specifications.

Product.

Any artifact, deliverable, or document a process creates.

*Rational ClearCase.

Software that provides configuration management.

*Rational ClearQuest.

A defect tracking and change request management system.

*Rational Robot.

Robot is the Capture/Replay component of Rational Suite TestStudio 2001.

*Rational TestManager.

TestManager is the central console for managing all testing activities—planning, design, implementation, execution, and analysis.

*Rational Unified Process.

A software engineering process that provides a disciplined approach to assigning tasks and responsibilities within a development organization.

Specifications.

What is expected when providing a product or service to a customer.

*Test Artifact Set.

Captures and presents information related to the tests performed.

*Test Case.

A set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement.
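One way to represent this definition in code (the field names are illustrative assumptions, not a standard schema): a test case bundles its inputs, execution conditions, and expected results under a stated objective.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    objective: str                                   # e.g. the requirement being verified
    inputs: dict                                     # test inputs
    conditions: dict = field(default_factory=dict)   # execution conditions
    expected: object = None                          # expected results

tc = TestCase(
    objective="Verify requirement REQ-42: login rejects an empty username",
    inputs={"username": "", "password": "secret123"},
    conditions={"database": "seeded", "user_logged_out": True},
    expected="failure",
)
print(tc.objective)
```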

Test Condition.

The set of circumstances that a test invokes.

*Test Data.

The actual (sets of) values used in the test or that are necessary to execute the test. Test data instantiates the condition being tested (as input or as pre-existing data) and is used to verify that a specific requirement has been successfully implemented (comparing actual results to the expected results).

Test Inputs.

Artifacts from work processes that are used to identify and define actions that occur during testing. These artifacts may come from development processes that are external to the test group. Examples include Functional Requirements Specifications and Design Specifications. They may also be derived from previous testing phases and passed to subsequent testing activities.

*Test Plan.

Contains information about the purpose and goals of testing within the project. Additionally, the test plan identifies the strategies to be used to implement and execute testing and resources needed.

*Test Procedure.

A set of detailed instructions for the set-up, execution, and results evaluation for a specific test case (or set of test cases).

Test Requirement.

A statement of the objectives associated with a specific test and the criteria that must be met to ascertain the pass/fail status of the test.

Test Results.

Data captured during the execution of a test and used to calculate the key measures of testing.

*Test Script.

The computer-readable instructions that automate the execution of a test procedure (or portion of a test procedure). Test scripts may be created (recorded) or automatically generated using test automation tools, programmed using a programming language, or created by a combination of recording, generating, and programming.

*Test Strategy.

Describes the general approach and objectives of the test activities.

Test Suite.

The set of tests that, when executed, instantiates a test scenario.

*Test Workspace.

“Private” areas where testers can install and test code in accordance with the project's adopted standards in relative isolation from the developers.
