Measurement and Analysis

A Support Process Area at Maturity Level 2

Purpose

The purpose of Measurement and Analysis (MA) is to develop and sustain a measurement capability used to support management information needs.

The Capacity and Availability Management (CAM) process area calls for measuring and monitoring particular contributors to service performance and quality: capacity and availability. MA is a more general set of measurement practices that can be applied to measure and analyze anything relevant to your business goals.

Introductory Notes

The Measurement and Analysis process area involves the following activities:

• Specifying objectives of measurement and analysis so they are aligned with identified information needs and objectives

• Specifying measures, analysis techniques, and mechanisms for data collection, data storage, reporting, and feedback

• Implementing the collection, storage, analysis, and reporting of data

• Providing objective results that can be used in making informed decisions and taking appropriate corrective action

The integration of measurement and analysis activities into the processes of the project supports the following:

• Objective planning and estimating

• Tracking actual performance against established plans and objectives

• Identifying and resolving process-related issues

• Providing a basis for incorporating measurement into additional processes in the future

The staff required to implement a measurement capability may or may not be employed in a separate organization-wide program. Measurement capability may be integrated into individual projects or other organizational functions (e.g., quality assurance).

The initial focus for measurement activities is at the project level. However, a measurement capability may prove useful for addressing organization- and enterprise-wide information needs. To support this capability, measurement activities should support information needs at multiple levels, including the business, organizational unit, and project levels, to minimize rework as the organization matures.

Projects can store project-specific data and results in a project-specific repository, but when data are to be used widely or are to be analyzed in support of determining data trends or benchmarks, data may reside in the organization’s measurement repository.

Measurement and analysis of product components provided by suppliers is essential for effective management of the quality and costs of the project. It is possible, with careful management of supplier agreements, to provide insight into data that support supplier-performance analysis.

Related Process Areas

SSD Add

Refer to the Service System Development process area for more information about developing and analyzing stakeholder requirements.

Refer to the Organizational Process Definition process area for more information about establishing the organization’s measurement repository.

Refer to the Project Monitoring and Control process area for more information about monitoring project planning parameters.

Refer to the Project Planning process area for more information about establishing estimates.

Refer to the Quantitative Project Management process area for more information about applying statistical methods to understand variation.

Specific Practices by Goal

SG 1 Align Measurement and Analysis Activities

Measurement objectives and activities are aligned with identified information needs and objectives.

The specific practices under this specific goal may be addressed concurrently or in any order:

• When establishing measurement objectives, experts often think ahead about necessary criteria for specifying measures and analysis procedures. They also think concurrently about the constraints imposed by data collection and storage procedures.

• Often it is important to specify the essential analyses to be conducted before attending to details of measurement specification, data collection, or storage.

SP 1.1 Establish Measurement Objectives

Establish and maintain measurement objectives derived from identified information needs and objectives.

Measurement objectives document the purposes for which measurement and analysis are done and specify the kinds of actions that may be taken based on results of data analyses. Measurement objectives can also identify the change in behavior desired as a result of implementing a measurement and analysis activity.

Sources of measurement objectives include management, technical, project, product, and process implementation needs.

Measurement objectives may be constrained by existing processes, available resources, or other measurement considerations. Judgments may need to be made about whether the value of the result is commensurate with resources devoted to doing the work.

Modifications to identified information needs and objectives may, in turn, be indicated as a consequence of the process and results of measurement and analysis.

Sources of information needs and objectives may include the following:

• Project plans

• Project performance monitoring

• Interviews with managers and others who have information needs

• Established management objectives

• Strategic plans

• Business plans

• Formal requirements or contractual obligations

• Recurring or other troublesome management or technical problems

• Experiences of other projects or organizational entities

• External industry benchmarks

• Process improvement plans

• Recurring or other troublesome incidents

Example measurement objectives include the following:

• Reduce time to delivery

• Reduce total lifecycle costs

• Deliver the specified functionality completely

• Improve prior levels of quality

• Improve prior customer satisfaction ratings

• Maintain and improve relationships between the acquirer and supplier

• Improve the level of stakeholder involvement

SSD Add

Refer to the Service System Development process area for more information about developing and analyzing stakeholder requirements.

Refer to the Project Monitoring and Control process area for more information about monitoring project planning parameters.

Refer to the Project Planning process area for more information about establishing estimates.

Refer to the Requirements Management process area for more information about maintaining bidirectional traceability among requirements and work products.

Typical Work Products

1. Measurement objectives

Subpractices

1. Document information needs and objectives.

Information needs and objectives are documented to allow traceability to subsequent measurement and analysis activities.

2. Prioritize information needs and objectives.

It may be neither possible nor desirable to subject all initially identified information needs to measurement and analysis. Priorities may also need to be set within the limits of available resources.

3. Document, review, and update measurement objectives.

Carefully consider the purposes and intended uses of measurement and analysis.

The measurement objectives are documented, reviewed by management and other relevant stakeholders, and updated as necessary. Doing so enables traceability to subsequent measurement and analysis activities, and helps to ensure that analyses will properly address identified information needs and objectives.

It is important that users of measurement and analysis results be involved in setting measurement objectives and deciding on plans of action. It may also be appropriate to involve those who provide the measurement data.

4. Provide feedback for refining and clarifying information needs and objectives as necessary.

Identified information needs and objectives may be refined and clarified as a result of setting measurement objectives. Initial descriptions of information needs may be unclear or ambiguous. Conflicts may arise between existing needs and objectives. Precise targets on an already existing measure may be unrealistic.

5. Maintain traceability of measurement objectives to identified information needs and objectives.

There must always be a good answer to the question, “Why are we measuring this?”

Of course, measurement objectives may also change to reflect evolving information needs and objectives.

SP 1.2 Specify Measures

Specify measures to address measurement objectives.

Measurement objectives are refined into precise, quantifiable measures.

Measures may be either “base” or “derived.” Data for base measures are obtained by direct measurement. Data for derived measures come from other data, typically by combining two or more base measures.

Examples of commonly used base measures include the following:

• Estimates and actual measures of work product size (e.g., number of pages)

• Estimates and actual measures of effort and cost (e.g., number of person hours)

• Quality measures (e.g., number of defects by severity)

Examples of commonly used derived measures include the following:

• Earned value

• Schedule performance index

• Defect density

• Peer review coverage

• Test or verification coverage

• Reliability measures (e.g., mean time to failure)

• Quality measures (e.g., number of defects by severity/total number of defects)

Derived measures typically are expressed as ratios, composite indices, or other aggregate summary measures. They are often more quantitatively reliable and meaningfully interpretable than the base measures used to generate them.
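The relationship between base and derived measures can be sketched in code. The functions, names, and values below are illustrative assumptions, not drawn from any particular project or measurement standard:

```python
# Sketch: computing derived measures from base measures.
# All names and values are illustrative, not from a real project.

def defect_density(defect_count, size_kloc):
    """Derived measure: defects per thousand lines of code."""
    return defect_count / size_kloc

def schedule_performance_index(earned_value, planned_value):
    """Derived measure: SPI = EV / PV (> 1.0 means ahead of schedule)."""
    return earned_value / planned_value

# Base measures obtained by direct measurement from project records
defects = 42               # number of defects found
size = 12.5                # work product size in KLOC
ev, pv = 90_000, 100_000   # earned and planned value, in dollars

print(defect_density(defects, size))          # 3.36 defects per KLOC
print(schedule_performance_index(ev, pv))     # 0.9 — behind schedule
```

Note how each derived value is a ratio of two base measures, which is what makes it more meaningfully interpretable than either base measure alone.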

Typical Work Products

1. Specifications of base and derived measures

Subpractices

1. Identify candidate measures based on documented measurement objectives.

Measurement objectives are refined into measures. Identified candidate measures are categorized and specified by name and unit of measure.

2. Maintain traceability of measures to measurement objectives.

Interdependencies among candidate measures are identified to enable later data validation and candidate analyses in support of measurement objectives.

3. Identify existing measures that already address measurement objectives.

Specifications for measures may already exist, perhaps established for other purposes earlier or elsewhere in the organization.

4. Specify operational definitions for measures.

Operational definitions are stated in precise and unambiguous terms. They address two important criteria:

• Communication: What has been measured, how was it measured, what are the units of measure, and what has been included or excluded?

• Repeatability: Can the measurement be repeated, given the same definition, to get the same results?
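The communication and repeatability criteria can be made concrete by recording each operational definition as structured data, so every required answer has a named field. The field names and sample values below are illustrative assumptions:

```python
# Sketch: an operational definition recorded as structured data,
# so the "communication" and "repeatability" questions are answerable.
# Field names and values are illustrative.
from dataclasses import dataclass

@dataclass
class MeasureSpec:
    name: str       # what is measured
    unit: str       # unit of measure
    method: str     # how it is measured (supports repeatability)
    scope: str      # what is included or excluded

effort = MeasureSpec(
    name="development effort",
    unit="person-hours",
    method="weekly timesheet entries, rounded to 0.5 h",
    scope="includes rework; excludes training and leave",
)
print(effort.unit)   # person-hours
```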

5. Prioritize, review, and update measures.

Proposed specifications of measures are reviewed for their appropriateness with potential end users and other relevant stakeholders. Priorities are set or changed, and specifications of measures are updated as necessary.

SP 1.3 Specify Data Collection and Storage Procedures

Specify how measurement data are obtained and stored.

Explicit specification of collection methods helps to ensure that the right data are collected properly. This specification may also help further clarify information needs and measurement objectives.

Proper attention to storage and retrieval procedures helps to ensure that data are available and accessible for future use.

Typical Work Products

1. Data collection and storage procedures

2. Data collection tools

Subpractices

1. Identify existing sources of data that are generated from current work products, processes, or transactions.

Existing sources of data may have been identified when specifying the measures. Appropriate collection mechanisms may exist whether or not pertinent data have already been collected.

2. Identify measures for which data are needed but are not currently available.

3. Specify how to collect and store the data for each required measure.

Explicit specifications are made of how, where, and when data will be collected. Procedures for collecting valid data are specified. Data are stored in an accessible manner for analysis, and it is determined whether data will be saved for possible reanalysis or documentation purposes.

Specifications may also address other factors that provide information about the context in which the measurement was collected (e.g., time of measurement, age of data) to assist in validating the data for later analyses.

Questions to be considered typically include the following:

• Have the frequency of collection and points in the process where measurements will be made been determined?

• Has the timeline that is required to move measurement results from points of collection to repositories, other databases, or end users been established?

• Who is responsible for obtaining data?

• Who is responsible for data storage, retrieval, and security?

• Have necessary supporting tools been developed or acquired?
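The questions above can be answered once per measure and kept as a simple procedure record, which makes unanswered questions easy to detect before collection begins. The keys and values below are illustrative assumptions:

```python
# Sketch: answers to the collection questions captured as a
# per-measure procedure record. Keys and values are illustrative.
collection_procedure = {
    "measure": "number of defects by severity",
    "frequency": "at the end of each peer review",
    "collection_point": "peer review report form",
    "transfer_timeline_days": 2,   # from collection point to repository
    "collector": "peer review moderator",
    "storage_owner": "measurement analyst",
    "tool": "defect tracking system export",
}

# An unanswered question shows up as an empty value:
unanswered = [k for k, v in collection_procedure.items() if v in (None, "")]
print(unanswered)   # [] — every question has an answer
```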

4. Create data collection mechanisms and process guidance.

Data collection and storage mechanisms are well integrated with other normal work processes. Data collection mechanisms may include manual or automated forms and templates. Clear, concise guidance on correct procedures is available to those responsible for doing the work. Training is provided as needed to clarify processes required for the collection of complete and accurate data and to minimize the burden on those who must provide and record data.

5. Support automatic collection of data as appropriate and feasible.

Automated support can aid in collecting more complete and accurate data.

Examples of such automated support include the following:

• Time-stamped activity logs

• Static or dynamic analyses of artifacts

However, some data cannot be collected without human intervention (e.g., customer satisfaction, other human judgments), and setting up the necessary infrastructure for other automation may be costly.
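A time-stamped activity log, the first kind of automated support listed above, can be as small as the following sketch. The event names and values are illustrative assumptions:

```python
# Sketch: a minimal time-stamped activity log, one form of
# automated data collection. Event names are illustrative.
import csv
import io
from datetime import datetime, timezone

def log_event(writer, activity, value):
    """Record an activity with a UTC timestamp for later analysis."""
    writer.writerow([datetime.now(timezone.utc).isoformat(), activity, value])

buf = io.StringIO()
w = csv.writer(buf)
log_event(w, "build_started", 1)
log_event(w, "tests_passed", 347)

rows = list(csv.reader(io.StringIO(buf.getvalue())))
print(len(rows))   # 2 logged events: timestamp, activity, value
```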

6. Prioritize, review, and update data collection and storage procedures.

Proposed procedures are reviewed for their appropriateness and feasibility with those who are responsible for providing, collecting, and storing data. They also may have useful insights about how to improve existing processes or may be able to suggest other useful measures or analyses.

7. Update measures and measurement objectives as necessary.

Priorities may need to be reset based on the following:

• The importance of the measures

• The amount of effort required to obtain the data

Considerations include whether new forms, tools, or training would be required to obtain the data.

SP 1.4 Specify Analysis Procedures

Specify how measurement data are analyzed and communicated.

Specifying analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address documented measurement objectives (and thereby the information needs and objectives on which they are based). This approach also provides a check that necessary data will, in fact, be collected. Analysis procedures should account for the quality (e.g., age, reliability) of all data that enter into an analysis (whether from the project, organizational measurement repository, or other source). The quality of data should be considered to help select the appropriate analysis procedure and evaluate the results of the analysis.

Typical Work Products

1. Analysis specifications and procedures

2. Data analysis tools

Subpractices

1. Specify and prioritize the analyses to be conducted and the reports to be prepared.

Early on, pay attention to the analyses to be conducted and to the manner in which results will be reported. These analyses and reports should meet the following criteria:

• The analyses explicitly address the documented measurement objectives.

• Presentation of results is clearly understandable by the audiences to whom the results are addressed.

Priorities may have to be set within available resources.

2. Select appropriate data analysis methods and tools.

Issues to be considered typically include the following:

• Choice of visual display and other presentation techniques (e.g., pie charts, bar charts, histograms, radar charts, line graphs, scatter plots, tables)

• Choice of appropriate descriptive statistics (e.g., arithmetic mean, median, mode)

• Decisions about statistical sampling criteria when it is impossible or unnecessary to examine every data element

• Decisions about how to handle analysis in the presence of missing data elements

• Selection of appropriate analysis tools

Descriptive statistics are typically used in data analysis to do the following:

• Examine distributions of specified measures (e.g., central tendency, extent of variation, data points exhibiting unusual variation)

• Examine interrelationships among specified measures (e.g., comparisons of defects by phase of the product’s lifecycle, comparisons of defects by product component)

• Display changes over time
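The first of these uses — examining central tendency, variation, and unusual data points — can be illustrated with Python's standard library. The sample data and the two-standard-deviation threshold are illustrative assumptions, not prescribed values:

```python
# Sketch: descriptive statistics used to examine the distribution
# of a measure. Sample data are illustrative.
from statistics import mean, median, stdev

review_effort_hours = [2.0, 2.5, 3.0, 2.5, 9.0, 2.0, 3.5]

center = mean(review_effort_hours)     # central tendency
spread = stdev(review_effort_hours)    # extent of variation

# Flag data points exhibiting unusual variation (here, more than
# 2 standard deviations from the mean; a common convention, not a rule).
unusual = [x for x in review_effort_hours if abs(x - center) > 2 * spread]

print(round(center, 2), round(median(review_effort_hours), 2), unusual)
```

A flagged point such as the 9.0-hour review is a prompt for investigation, not automatically an error; it may reflect a legitimately larger work product.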

Refer to the Select Measures and Analytic Techniques specific practice and Apply Statistical Methods to Understand Variation specific practice in the Quantitative Project Management process area for more information about the appropriate use of statistical analysis techniques and understanding variation.

3. Specify administrative procedures for analyzing data and communicating results.

Issues to be considered typically include the following:

• Identifying the persons and groups responsible for analyzing the data and presenting the results

• Determining the timeline to analyze the data and present the results

• Determining the venues for communicating the results (e.g., progress reports, transmittal memos, written reports, staff meetings)

4. Review and update the proposed content and format of specified analyses and reports.

All of the proposed content and format are subject to review and revision, including analytic methods and tools, administrative procedures, and priorities. Relevant stakeholders consulted should include end users, sponsors, data analysts, and data providers.

5. Update measures and measurement objectives as necessary.

Just as measurement needs drive data analysis, clarification of analysis criteria can affect measurement. Specifications for some measures may be refined further based on specifications established for data analysis procedures. Other measures may prove unnecessary, or a need for additional measures may be recognized.

Specifying how measures will be analyzed and reported may also suggest the need for refining measurement objectives themselves.

6. Specify criteria for evaluating the utility of analysis results and for evaluating the conduct of measurement and analysis activities.

Criteria for evaluating the utility of the analysis might address the extent to which the following apply:

• The results are provided in a timely manner, understandable, and used for decision making.

• The work does not cost more to perform than is justified by the benefits it provides.

Criteria for evaluating the conduct of the measurement and analysis might include the extent to which the following apply:

• The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.

• There is selection bias in sampling (e.g., only satisfied end users are surveyed to evaluate end-user satisfaction, only unsuccessful projects are evaluated to determine overall productivity).

• Measurement data are repeatable (e.g., statistically reliable).

• Statistical assumptions have been satisfied (e.g., about the distribution of data, about appropriate measurement scales).

SG 2 Provide Measurement Results

Measurement results, which address identified information needs and objectives, are provided.

The primary reason for conducting measurement and analysis is to address identified information needs and objectives. Measurement results based on objective evidence can help to monitor performance, fulfill obligations documented in a supplier agreement, make informed management and technical decisions, and enable corrective actions to be taken.

SP 2.1 Obtain Measurement Data

Obtain specified measurement data.

The data necessary for analysis are obtained and checked for completeness and integrity.

Typical Work Products

1. Base and derived measurement data sets

2. Results of data integrity tests

Subpractices

1. Obtain data for base measures.

Data are collected as necessary for previously used and newly specified base measures. Existing data are gathered from project records or elsewhere in the organization.

Note that data that were collected earlier may no longer be available for reuse in existing databases, paper records, or formal repositories.

2. Generate data for derived measures.

Values are newly calculated for all derived measures.

3. Perform data integrity checks as close to the source of data as possible.

All measurements are subject to error in specifying or recording data. It is always better to identify these errors and sources of missing data early in the measurement and analysis cycle.

Checks can include scans for missing data, out-of-bounds data values, and unusual patterns and correlation across measures. It is particularly important to do the following:

• Test and correct for inconsistency of classifications made by human judgment (i.e., to determine how frequently people make differing classification decisions based on the same information, otherwise known as “inter-coder reliability”).

• Empirically examine the relationships among measures that are used to calculate additional derived measures. Doing so can ensure that important distinctions are not overlooked and that derived measures convey their intended meanings (otherwise known as “criterion validity”).
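These checks can be sketched in code. The valid range, sample data, and the simple percent-agreement calculation (a rough stand-in for formal inter-coder reliability statistics such as Cohen's kappa) are all illustrative assumptions:

```python
# Sketch: basic data integrity checks performed near the source
# of data. Bounds and sample data are illustrative.

def integrity_check(values, low, high):
    """Return indices of missing (None) and out-of-bounds values."""
    missing = [i for i, v in enumerate(values) if v is None]
    out_of_bounds = [i for i, v in enumerate(values)
                     if v is not None and not (low <= v <= high)]
    return missing, out_of_bounds

def agreement_rate(coder_a, coder_b):
    """Fraction of items on which two human classifiers agree —
    a simple proxy for inter-coder reliability."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

defect_severities = [2, 3, None, 1, 9, 2]          # valid range: 1..5
print(integrity_check(defect_severities, 1, 5))    # ([2], [4])
print(agreement_rate(["major", "minor", "major"],
                     ["major", "major", "major"]))  # 2/3
```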

SP 2.2 Analyze Measurement Data

Analyze and interpret measurement data.

Measurement data are analyzed as planned, additional analyses are conducted as necessary, results are reviewed with relevant stakeholders, and necessary revisions for future analyses are noted.

Typical Work Products

1. Analysis results and draft reports

Subpractices

1. Conduct initial analyses, interpret results, and draw preliminary conclusions.

The results of data analyses are rarely self-evident. Criteria for interpreting results and drawing conclusions should be stated explicitly.

2. Conduct additional measurement and analysis as necessary, and prepare results for presentation.

Results of planned analyses may suggest (or require) additional, unanticipated analyses. In addition, these analyses may identify needs to refine existing measures, to calculate additional derived measures, or even to collect data for additional base measures to properly complete the planned analysis. Similarly, preparing initial results for presentation may identify the need for additional, unanticipated analyses.

3. Review initial results with relevant stakeholders.

It may be appropriate to review initial interpretations of results and the way in which these results are presented before disseminating and communicating them widely.

Reviewing the initial results before their release may prevent needless misunderstandings and lead to improvements in the data analysis and presentation.

Relevant stakeholders with whom reviews may be conducted include intended end users, sponsors, data analysts, and data providers.

4. Refine criteria for future analyses.

Lessons that can improve future efforts are often learned from conducting data analyses and preparing results. Similarly, ways to improve measurement specifications and data collection procedures may become apparent, as may ideas for refining identified information needs and objectives.

SP 2.3 Store Data and Results

Manage and store measurement data, measurement specifications, and analysis results.

Storing measurement-related information enables its timely and cost-effective use as historical data and results. The information also is needed to provide sufficient context for interpretation of data, measurement criteria, and analysis results.

Information stored typically includes the following:

• Measurement plans

• Specifications of measures

• Sets of data that were collected

• Analysis reports and presentations

Stored information contains or refers to other information needed to understand and interpret the measures and to assess them for reasonableness and applicability (e.g., measurement specifications used on different projects when comparing across projects).

Typically, data sets for derived measures can be recalculated and need not be stored. However, it may be appropriate to store summaries based on derived measures (e.g., charts, tables of results, report text).

Interim analysis results need not be stored separately if they can be efficiently reconstructed.

Refer to the Configuration Management process area for more information about establishing a configuration management system.

Refer to the Establish the Organization’s Measurement Repository specific practice in the Organizational Process Definition process area for more information about establishing the organization’s measurement repository.

Typical Work Products

1. Stored data inventory

Subpractices

1. Review data to ensure their completeness, integrity, accuracy, and currency.

2. Store data according to data storage procedures.

3. Make stored contents available for use only to appropriate groups and personnel.

4. Prevent stored information from being used inappropriately.

Examples of ways to prevent inappropriate use of the data and related information include controlling access to data and educating people on the appropriate use of data.

Examples of inappropriate use of data include the following:

• Disclosure of information provided in confidence

• Faulty interpretations based on incomplete, out-of-context, or otherwise misleading information

• Measures used to improperly evaluate the performance of people or to rank projects

• Impugning the integrity of individuals

SP 2.4 Communicate Results

Communicate results of measurement and analysis activities to all relevant stakeholders.

The results of the measurement and analysis process are communicated to relevant stakeholders in a timely and usable fashion to support decision making and assist in taking corrective action.

Relevant stakeholders include intended users, sponsors, data analysts, and data providers.

Typical Work Products

1. Delivered reports and related analysis results

2. Contextual information or guidance to help interpret analysis results

Subpractices

1. Keep relevant stakeholders informed of measurement results in a timely manner.

Measurement results are communicated in time to be used for their intended purposes. Reports are unlikely to be used if they are distributed with little effort to follow up with those who need to know the results.

To the extent possible and as part of the normal way they do business, users of measurement results are kept personally involved in setting objectives and deciding on plans of action for measurement and analysis. Users are regularly kept informed of progress and interim results.

Refer to the Project Monitoring and Control process area for more information about conducting progress reviews.

2. Assist relevant stakeholders in understanding results.

Results are communicated in a clear and concise manner appropriate to relevant stakeholders. Results are understandable, easily interpretable, and clearly tied to identified information needs and objectives.

The data analyzed are often not self-evident to practitioners who are not measurement experts. The communication of results should be clear about the following:

• How and why base and derived measures were specified

• How data were obtained

• How to interpret results based on the data analysis methods used

• How results address information needs

Examples of actions taken to help others to understand results include the following:

• Discussing the results with relevant stakeholders

• Providing background and explanation in a memo

• Briefing users on results

• Providing training on the appropriate use and understanding of measurement results
