This chapter provides insights, guidance, and recommendations useful for organizations looking to improve their acquisition processes by using the CMMI for Acquisition (CMMI-ACQ) model.
The first set of essays describes successes and challenges experienced by organizations that have adopted CMMI-ACQ within the public sector, primarily government defense and civil agencies in the United States and in France. Included is analysis from the organization that achieved the first-ever CMMI-ACQ maturity level 5 rating.
The second set of essays discusses private-sector adoption and the unique challenges faced by industry when outsourcing IT products and services.
The third set of essays highlights some important lifecycle aspects of CMMI-ACQ that help reduce program risk. These risks can arise anywhere in the lifecycle—from planning the acquisition strategy to transitioning products and services into use.
The next set of essays covers special topics that include acquiring interoperable systems, Agile acquisition, and process improvement.
The chapter closes with a view toward how the future of CMMI might evolve and how the CMMI constellations can be used to enable enterprise-wide improvement.
Mike Phillips, Brian Gallagher, and Karen Richter
Since the first edition of this book was published, even more activity has taken place to reform acquisition in the Department of Defense (DoD). Before discussing this activity, we have updated our discussion of the earlier “Defense Acquisition Performance Assessment Report” (the DAPA report) [Kadish 2006] to reflect this new version of the model.
In his letter delivering the report to Deputy Secretary of Defense Gordon England, the panel chair, Gen Kadish, noted:
Although our Acquisition System has produced the most effective weapon systems in the world, leadership periodically loses confidence in its efficiency. Multiple studies and improvements to the Acquisition System have been proposed—all with varying degrees of success. Our approach was broader than most of these studies. We addressed the “big A” Acquisition System because it includes all the management systems that [the] DoD uses, not [just] the narrow processes traditionally thought of as acquisition. The problems [the] DoD faces are deeply [e]mbedded in the “big A” management systems, not just the “little a” processes. We concluded that these processes must be stable for incremental change to be effective—they are not.
In developing the CMMI-ACQ model—a model we wanted to apply to both commercial and government organizations—we considered tackling the issues raised in the DAPA report. However, successful model adoption requires an organization to embrace the ideas (i.e., best practices) of the model for effective improvement of the processes involved.
As the preceding quote illustrates, in addition to organizations traditionally thought of as the “acquisition system” (“little a”) in the DoD, there are also organizations that are associated with the “big A,” including the Joint Capabilities Integration and Development System (JCIDS) and the Planning, Programming, Budgeting, and Execution (PPBE) system. These systems are governed by different stakeholders, directives, and instructions. To be effective, the model would need to be adopted at an extremely high level—by the DoD itself. Because of this situation, we resolved with our sponsor in the Office of the Secretary of Defense to focus on the kinds of “little a” organizations that are able to address CMMI best practices once the decision for a materiel solution has been reached.
In reality, we often find that models can have an impact beyond what might be perceived to be the “limits” of process control. The remainder of this essay highlights some of the clear leverage points at which the model supports DAPA recommendations for improvement.
Figure 6.1 illustrates some of the issues and their relative importance as observed by the DAPA project team in reviewing previous recommendations and speaking to experts [Kadish 2006]. The issue areas in Figure 6.1 are discussed next. Although we would never claim that the CMMI-ACQ model offers a complete solution to all of the key issues addressed in the DAPA report, building the capabilities covered in the CMMI-ACQ model offers opportunities to address many of the risks in these issue areas.
Acquisition Strategy is one of the more significant additions to the CMMI Model Foundation (CMF) to support the acquisition area of interest. Multiple process area locations (including Acquisition Requirements Development and Solicitation and Supplier Agreement Development) were considered before placing the acquisition strategy practice into Project Planning. Feedback from various acquisition organizations noted that many different organizational elements are stakeholders in strategy development, but its criticality to the overall plan suggested an effective fit as part of this process area. We also found that some projects might develop the initial strategy before the formal establishment of full acquisition offices, which would then accept the long-term management responsibility for the strategy. Note that the practice does not demand a specific strategy, but does expect the strategy to be a planning artifact. The full practice, “Establish and maintain the acquisition strategy” (Project Planning specific practice 1.1), recognizes that maintenance may require updating the strategy as the acquisition environment changes. The DAPA report (p. 14) calls out some recommended strategies in the DoD environment [Kadish 2006].
Additional discussion is contained in the Acquisition Strategy: Planning for Success essay, found later in this chapter.
Although a model such as CMMI-ACQ should not direct specific organizational structures, the CMMI Product Team looked for ways to encourage improved operations on programs. Most significant might be practices for creating and managing teams with effective rules and guidelines such as those suggested in the Organizational Process Definition process area. The DAPA report (p. 76) noted deficiencies in operation. Organizations following model guidance may be better able to address these shortfalls.
The CMMI-ACQ Product Team, recognizing the vital role played by effective requirements development in acquisition, assigned Acquisition Requirements Development to maturity level 2. Key to this effectiveness is specific goal 2, “Customer requirements are refined and elaborated into contractual requirements.” The importance of this activity cannot be overemphasized, as poor requirements are likely to be the most troubling type of “defect” that the acquisition organization injects into the system. The DAPA report notes that requirement errors are often injected late into the acquisition system. Examples in the report included operational test requirements unspecified by the user. Although the model cannot address such issues directly, it does provide a framework that aids in the traceability of requirements to the source, which in turn facilitates the successful resolution of issues. (The Requirements Management process area also aids in this traceability.)
Several features in CMMI-ACQ should help users address the oversight issue. Probably the most significant help is found in the Acquisition Technical Management process area, which recognizes the need for the effective technical oversight of development activities. Acquisition Technical Management is coupled with the companion process area of Agreement Management, which addresses business issues with the supplier via contractual mechanisms. For those individuals familiar with CMMI model construction, the generic practice of reviewing the activities, status, and results of each process with higher level management and resolving issues offers yet another oversight mechanism.
We have grouped leadership, program manager expertise, and process discipline issues together because they reflect the specific purpose of creating CMMI-ACQ—to provide best practices that enable leadership to develop capabilities within acquisition organizations, and to instill process discipline where clarity might previously have been lacking. Here the linkage between DAPA report issues and CMMI-ACQ is not specific, but very general.
Alas, for complex acquisition system and PPBE process issues, the model does not have any specific assistance to offer. Nevertheless, the effective development of the high maturity elements of process-performance modeling, particularly if shared by both the acquirer and the system development team, may help address the challenges of “big A” budgetary exercises with more accurate analyses than otherwise would be produced.
In 2009, the Weapon System Acquisition Reform Act of 2009 (WSARA) was passed by Congress. This act was of interest to the CMMI-ACQ community because of its emphasis on systems engineering within DoD acquisition. The CMMI-ACQ Product Team recognized the importance of systems engineering in acquisition by giving the model a solid foundation in the Acquisition Engineering process areas: Acquisition Requirements Development, Acquisition Technical Management, Acquisition Verification, and Acquisition Validation. Acquisition Engineering is designated as a new category in CMMI-ACQ V1.3 to stress its importance.
The National Defense Authorization Act (NDAA) for fiscal year 2010 required a new acquisition process for information technology (IT) systems. This new process must be designed to include the following elements:
• Early and continual involvement of the user
• Multiple, rapidly executed increments or releases of capability
• Early, successive prototyping to support an evolutionary approach
• A modular, open-systems approach
Of course, the CMMI-ACQ model is designed with sufficient flexibility so that it may be applied to all kinds of systems and services, including IT systems. For example, the last three elements identified above are all requirements that would need to be included in the acquisition strategy developed at the very beginning of the project, as described in the Project Planning process area, discussed earlier in this essay and in the Acquisition Strategy: Planning for Success essay later in this chapter.
The first new requirement for IT system acquisition—early and continual involvement of the user—is addressed pervasively throughout the CMMI-ACQ model. Clearly, in the new IT acquisition process, the “relevant stakeholder” group would always include the user.
Involvement of all relevant stakeholders in the execution of the acquisition processes is discussed throughout all process areas in various activities. Specifically, institutionalizing a process at capability or maturity level 2 requires the generic practice, “Identify and involve the relevant stakeholders of the process as planned.”
Involvement of stakeholders is an important consideration in activities including, but not limited to, the following:
• Planning
• Decision making
• Making commitments
• Communicating
• Coordinating
• Reviewing
• Conducting appraisals
• Defining requirements
• Resolving problems and issues
A specific practice in Project Planning, “Plan the involvement of identified stakeholders,” sets up the involvement early in the project. Another specific practice, “Monitor stakeholder involvement against the project plan,” in Project Monitoring and Control ensures that the project carries out the plan for the stakeholder involvement. Finally, a specific goal, “Coordination and collaboration between the project and relevant stakeholders are conducted,” in Integrated Project Management ensures this behavior is carried throughout the project.
In 2010, Secretary of Defense Robert Gates began an initiative to deliver better value to the warfighter and the taxpayer by improving the way the DoD does business. Under Secretary of Defense (Acquisition, Technology and Logistics) Ashton Carter’s implementation of this initiative included measures focused on improving the tradecraft in the acquisition of services. When planning the CMMI-ACQ model, we made a concerted effort to cover the acquisition of both products and services. Best practices that may help the DoD in its efforts to improve the acquisition of services are found throughout the model.
Although the CMMI-ACQ model cannot address all of the problems in the complex DoD acquisition process, it can provide a firm foundation for improvement in acquisition offices in line with the intent of further improvement initiatives.
Although CMMI-ACQ is aimed primarily at the acquisition of products and services, it also outlines some practices that would be especially important in addressing systems-of-systems issues. Because of the importance of this topic in government acquisition, we decided to add this discussion and the Interoperable Acquisition section later in this chapter to assist readers who are facing the challenges associated with systems of systems.
A source document for this discussion is Interoperable Acquisition for Systems of Systems: The Challenges [Smith 2006], which describes the ever-increasing interdependence of systems necessary to provide needed capabilities to the user. As an example cited in the document, in one key satellite communications program, at least five separate and independent acquisition programs needed to be completed successfully before the actual capability could be delivered to the various military services. The increasing emphasis on net-centric operations and service-oriented architectures added to the challenge. Figure 6.2 provides a visual depiction of some of the complexity in this program.
Figure 6.2 shows how quickly complexity grows as the number of critical programmatic interfaces increases. Each dependency creates a risk for at least one of the organizations. The figure suggests that for two of the programs, a shared reporting directorate can aid in mitigating risks. This kind of challenge can be described as one in which applying recursion is sufficient to meet the demands: The system is composed of subsystems, and a single overarching management structure controls the subordinate elements. For some large program offices, for example, the system might be an aircraft with its supporting maintenance systems and required training systems. All elements may be separate programs, but coordination occurs within a single management structure. Figure 6.2, however, also shows the additional challenges evident when no single management structure has been established. Critical parts of the capability are delivered by separate management structures, often with widely different motivators and priorities.
Although the CMMI-ACQ model was not constructed to specifically solve these challenges, especially the existence of separate management structures, the complexity was familiar to CMMI-ACQ authors. Those individuals who have used the CMM or CMMI model in the past will recognize the Risk Management process area in this model. In addition to noting the existence of significant risks in managing the delivery of a system through any outside agent, such as a supplier, the model emphasizes the value of recognizing both internal and external sources of risk. The following note is part of Risk Management specific practice 1.1:
Identifying risk sources provides a basis for systematically examining changing situations over time to uncover circumstances that impact the ability of the project to meet its objectives. Risk sources are both internal and external to the project. As the project progresses, additional sources of risk may be identified. Establishing categories for risks provides a mechanism for collecting and organizing risks as well as ensuring appropriate scrutiny and management attention to risks that can have serious consequences on meeting project objectives.
Various approaches have been taken to addressing risks that span multiple programs and organizations. One of these, Team Risk Management, has been documented by the SEI [Higuera 2005]. CMMI-ACQ anticipated the need for flexible organizational structures to address these kinds of issues and chose to expect that organizations would establish and maintain teams crossing organizational boundaries. The following is part of Organizational Process Definition specific practice 1.7:
In an acquisition organization, teams are useful not just in the acquirer’s organization but between the acquirer and supplier and among the acquirer, supplier, and other relevant stakeholders, as appropriate. Teams may be especially important in a systems of systems environment.
Perhaps the most flexible and powerful model feature used to address the complexities of systems of systems is the effective employment of the specific practices in the second specific goal of Acquisition Technical Management: Perform Interface Management. The practices under this specific goal require successfully managing needed interfaces between the system in focus and other systems not under the project’s direct control. The foundation for these practices is established in Acquisition Requirements Development:
Develop requirements for the interfaces between the acquired product and other products in the intended environment. (ARD SP 2.1.2)
The inclusion of this specific goal in Acquisition Technical Management as part of overseeing effective system development from the acquirer’s perspective is a powerful means of addressing systems-of-systems problems. The following note is part of Acquisition Technical Management specific goal 2:
Many integration and transition problems arise from unknown or uncontrolled aspects of both internal and external interfaces. Effective management of interface requirements, specifications, and designs helps to ensure implemented interfaces are complete and compatible.
The supplier is responsible for managing the interfaces of the product or service it is developing. At the same time, the acquirer identifies those interfaces, particularly external interfaces, that it will manage.
Although these model features provide a basis for addressing some of the challenges of delivering the capabilities desired in a systems-of-systems environment, the methods best suited for various acquisition environments await further development in future documents. CMMI-ACQ is positioned to facilitate that development and mitigate the risks inherent in these acquisitions.
Daniel J. Luttrell and Steven Kelley
The Minuteman III system now deployed is the pinnacle of more than 50 years of rocket science applied to Intercontinental Ballistic Missiles (ICBMs). In 1954, the U.S. Air Force awarded the Systems Engineering and Technical Direction (SETD) contract to the Ramo-Wooldridge Corporation (now Northrop Grumman) for Ballistic Missile support. That contract award was the seed that eventually grew to become the ICBM Prime Integration Contract (IPIC).
As ICBMs were developed, from the early Atlas, Thor, and Titan missiles into the Minuteman and Peacekeeper systems and then continuing with the Minuteman III missiles today, Northrop Grumman has built and maintained detailed mathematical models to predict missile flight path, accuracy, and probability of success. In the beginning, these models consisted of technical papers and reports on the current system’s abilities and predictions for future systems being proposed. After the systems were fielded, the predictive models were sustained and continually upgraded with each new data set. Modeling and statistical analysis served as the basis of the original contract and remain at the heart of today’s IPIC program.
Due to Minuteman’s success, this system will maintain operational readiness for the foreseeable future. Its ability to continue its mission well beyond its original service life is evidence of its robust design and sustainment activities. Over the years, many of the components and subsystems have been replaced or upgraded, while many other elements of the system remain with limited or no refurbishment.
The Northrop Grumman Missile Systems organization, based primarily in Clearfield, Utah, has used statistical analysis to build, sustain, and improve the ICBM weapon system and the organization’s engineering and business processes since ICBMs were first developed. In 1997, the Air Force moved away from managing all of the separate pieces of the ICBM system to using a single prime integrator. Northrop Grumman was awarded the contract, and the IPIC Team (consisting of Northrop Grumman plus more than a dozen subcontractors, including Boeing, Lockheed-Martin, ATK, and Aerojet) assumed responsibility for the integration and sustainment of the end-to-end ICBM system. As the prime integrator, Northrop Grumman manages the contract and the IPIC Team shares responsibility for fulfillment of the ICBM mission with the Air Force. This contract is unique because it is based on system performance with a focus on weapon system health. This contract structure for prime integration has worked effectively over a long period of performance.
The Minuteman III is a three-stage, solid-propellant missile with a liquid-propellant fourth stage; it is housed in an underground silo. The service life of the IPIC Minuteman system is being extended through 2030, which presents a technical challenge: The system was originally fielded in the 1970s with an expected service life of 10 years.
The IPIC Team has been the sole integrator of the complete ICBM weapon system, including all subsystems, since program inception. IPIC efforts include sustainment engineering, systems engineering, research and development, modifications, and repair programs.
Through joint technical integration, the IPIC Team advises the Air Force, which makes all decisions on proposed technical approaches. This separation of responsibility allows industry, including the prime integrator and subcontractors, the freedom to deliver a best-value product that meets the customer’s weapon system performance requirements and ensures mission success. The IPIC contract has established a weapon system management approach that empowers both the government and the contractor team, holding them jointly accountable to deliver warfighting capability to the end user.
The Air Force provides incentives for the IPIC Team to maintain system performance as measured by key readiness parameters. The focus is on weapon system capability, with government direction addressing what to do, but not specifying how to do the job. Northrop Grumman has been given the flexibility to manage weapon system risk and streamline processes to achieve the desired outcome.
The IPIC contract was part of the Air Force “Lightning Bolt” acquisition initiative and delivers significant cost savings to the Air Force. While other prime integration contracts have not performed as well as expected on development efforts, Northrop Grumman’s prime integration approach on IPIC has been very successfully applied to Minuteman sustainment and life extension. This contract vehicle establishes a framework that allows for all analysis, advisory, and solution development efforts to be tied to one contract, thereby providing a unified industry position for the decisions the Air Force must make for system sustainment. To a large degree, the same industry base benefits from the ongoing sustainment and upgrade work identified by analysis under the contract. This framework allows the contractors to retain engineering talent and data developed with the original system and keep them up-to-date on system improvements as they happen. It also allows industry to bring in new personnel and train them, alongside seasoned veterans, in the systems and processes used to maintain and enhance the system.
The advantage to the Air Force of this arrangement is that it can rely on a stable team that performs well together and has done so for more than a decade. Today, the IPIC Team continues to evolve, improve teaming relationships, and deliver performance in support of the Air Force customer.
IPIC consists of two major types of efforts, as illustrated in Figure 6.3: (1) sustainment to analyze the aging Minuteman system and (2) development of modification programs to modernize the system as required to extend its service life through the year 2030. Modification programs are integrated into the IPIC contract individually as they are approved and negotiated.
Sustainment consists of sustaining engineering and weapon system assessment. The sustaining engineering function responds to issues with the operational system as they arise. The weapon system assessment function analyzes system subelement data proactively gathered from testing. Data from all sources are fed into performance models that enable the team to identify weapon system risks far enough in advance of system impact that team members can fully characterize the extent of the risk, determine appropriate short-term mitigation actions, create a plan, and ultimately mitigate the risk entirely. System risk assessment is driven by statistical models for missile performance to support the key readiness parameters: availability, reliability, accuracy, and hardness. Analyzing these performance models in conjunction with aging models enables the IPIC Team to determine whether all subsystems and the system as a whole will function correctly through the target dates (2030 and beyond).
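To make the interplay between performance models and aging models concrete, the following is a minimal sketch, assuming a hypothetical Weibull aging model. The distribution, parameters, threshold, and dates are illustrative assumptions for this essay, not the actual IPIC models, which are fit to decades of test data and are far more detailed.

```python
# Hedged sketch: projecting subsystem reliability to a target year using
# a hypothetical Weibull aging model. All parameters are illustrative.
import math

def weibull_reliability(age_years, scale, shape):
    """Probability that a unit still functions at the given age (survival)."""
    return math.exp(-((age_years / scale) ** shape))

# Hypothetical subsystem: fielded 1998; Weibull scale 60 yr, shape 2.0
# (shape > 1 models wear-out, i.e., a failure rate that increases with age).
fielded, target, threshold = 1998, 2030, 0.90
r = weibull_reliability(target - fielded, scale=60.0, shape=2.0)
print(f"Projected reliability in {target}: {r:.3f} "
      f"({'meets' if r >= threshold else 'below'} {threshold} threshold)")
```

A projection that falls below the readiness threshold before the target date would flag the subsystem for mitigation planning well before any failure is observed, which is the proactive behavior described above.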
Effective sustainment of the Minuteman force requires the government and Northrop Grumman to continually balance dual priorities: keeping the missiles on alert, while fielding vital new hardware and software upgrades that extend the service life of the system. All elements of the system are involved: millions of custom software lines, thousands of computers, all types of communications, and thousands of mission hardware items. All must be maintained and enhanced both safely and cost-effectively. Engineering productivity has been increased by consolidating contractor statements of work, eliminating redundant tasking, and employing commercial best practices. When combined with innovative procurement approaches for major subsystem upgrades, Northrop Grumman’s approach significantly reduces the overall cost of system sustainment.
Applying a systems approach has been the key factor in ICBM success for more than a half century, since the first strategic land-based deterrent was introduced in the 1950s. The ICBM Systems Approach is pictured in Figure 6.4.
Applying systems engineering requires a complete spectrum of engineering and science disciplines to understand and manage complex interdependencies. Systems engineering also supports the goal of achieving overall weapon system optimization. A full review of all considerations is completed prior to every relevant action on the system. This review employs a comprehensive weapon system risk management process. As experience has shown, the Northrop Grumman systems approach is the best way to identify and mitigate unintended consequences resulting from subsystem changes.
To further enhance the systems approach, Northrop Grumman has developed mature and disciplined processes that have improved with time to provide consistent and reliable results. Documented in an easily accessible organizational command media and program data site, these processes are essential for maintaining the quality demanded of such a mission-critical system. These processes have been tested by time and improved through measurement against industry best practices such as ISO standards, AS9100, CMMI, and Lean Six Sigma.
Domain knowledge consists of extensive and sustained expertise about the system—from the beginning of concept development through deployment and decommissioning. Northrop Grumman maintains a full history of data for all aspects of the system (i.e., ICBM performance data, system configurations, and upgrades) along with long-term human expertise in the concept, design, development, deployment, modification, infrastructure, operations, and support functions used to maintain system viability.
Engineers employed at the beginning of the ICBM program are still on the job today. Northrop Grumman systematically hires new graduates and assigns them to work alongside these experienced engineers, from whom they absorb domain knowledge; this practice ensures that the new-generation engineers will be ready to carry on when older generations retire. This domain knowledge enables thorough analysis when issues arise.
The final ingredient of ICBM’s success is total weapon system integration—the control and coordination of separate complex elements and activities (e.g., interfaces, resources, milestones) to achieve required system performance. Integration is the activity that enables IPIC to avoid gaps, overlaps, and system suboptimization. Complete integration of all related subsystem activities is vital to avoid disconnects or unintended consequences when implementing changes and upgrades.
When so many changes are taking place across the system concurrently, the system engineering, integration, and process functions become critical as means to avoid adverse impacts. Management of individual efforts is also critical to performance and reporting on cost and schedule, as each effort is tracked and rolled up to the system level for top-level customer review. The complex system and contract include parallel activities both in the execution of modifications and in deployment. Integration and optimization of deployment activity minimizes resource expenditure while ensuring that operational readiness is maintained at acceptable levels.
The IPIC mission is to maintain the total ICBM weapon system and modernize the Minuteman III to extend its useful life to 2030. The expected product of this mission is an ICBM weapon system that maintains its readiness to launch 24/7/365 while at the same time undergoing testing, evaluation, and upgrades. Success, in this case, is measured by comparing predicted results with observed results and getting no surprises from year to year. Process discipline, predictability, and improvement are vital across the enterprise to enable mission success.
Over its lifetime, the IPIC program has examined process standards and models from DoD, IEEE, ISO/AS9100, CMMI-DEV, CMMI-SVC, and CMMI-ACQ to help frame its internal process development and improvement activities. As with all programs, some standards have proved more useful than others. Nearly all share the same fundamental “Plan, Do, Check, Act” construct. Northrop Grumman arranged the AS9100 framework (a contractual compliance requirement) around the simple “Plan, Do, Check, Act” core to codify a broad process framework for management. Illustrated in Figure 6.5, this operating philosophy is the essence of what Northrop Grumman and its subcontract team must do to succeed.
This operating philosophy includes the following elements:
Define Management System: Documented on a website accessible by all Program Team members (including teammates, subcontractors, and the Air Force) are assets that include program requirements establishing local policy for performing work, implementation plans, additional clarification on how processes are to be implemented, and other plans to establish a framework for operations. The site also contains handbooks, guides, templates, and tools to support the processes.
Establish Goals and Requirements: In the Management System diagram, IPIC’s primary goal is to meet or exceed all customer expectations and achieve excellent award fees. All other goals flow from the primary goal and from government contract direction.
Develop Process Guidance: Accessed via the website, the process guidance documentation includes mandatory items plus handbooks, guides, forms, tools, and templates. Each mandatory document has a process leader, a subject matter expert, and a process reviewer who has the authority to approve or reject changes after review by the appropriate change control board.
Prepare Plans: IPIC plans are a subset of process guidance, also accessed via the web page, that is mandatory for compliance. Each program element and key function has an approved plan for execution (e.g., Measurement, Program Management, Data Management).
Perform Training: Mandatory and optional classes and online computer-based training are available, along with informal mentoring and attendance at seminars and conferences.
Perform Work and Develop Products: The primary product of assessment engineering is ongoing weapon system modeling and risk analysis. As subsystems are sustained and upgraded, new issues may arise that also require mitigation. Some mitigations may consist of small corrective efforts or simple updates to technical orders. Others may require significant product development such as the design and manufacture of one or more system components.
Retain Records: Records are retained in a variety of databases and web tools with appropriate levels of configuration control.
Measure Performance: IPIC maintains a plan describing what to measure, how to measure it, how to analyze the data, and where the data and analysis are to be reported and stored. A strong measurement and statistical analysis program is in place to measure and predict future costs and schedule performance as well as weapon system health and readiness.
Assess Compliance: Northrop Grumman has an active internal assessment program. If a recurring lack of compliance is identified, the team first evaluates the underlying process documentation to determine whether the process should be updated or improved before assuming that poor discipline is the source of the problem. Northrop Grumman measures IPIC against the ISO/AS9100 standard and the CMMI-ACQ model with both internal and external audits, and has completed most audits without negative findings. Northrop Grumman Missile Systems/IPIC was the first organization externally appraised at maturity level 5 against CMMI-ACQ, with the result published by the SEI on the Published Appraisal Results (PAR) website in 2009.
Obtain and Share Customer Feedback: Customer feedback comes in the form of both praise and constructive criticism on a daily basis. Two formal feedback mechanisms are Contract Performance Assessment Reports (CPAR) and Award Fee Letters. These reports require formal response and action items to address negative remarks. Award fee results are also analyzed for recurring themes and shared with the IPIC employees.
Conduct Management Reviews: Northrop Grumman conducts internal and external reviews at various program levels. Staff meetings, contractor Integrated Product Team (IPT) meetings, joint IPT meetings, and program-level management reviews are conducted on a recurring basis.
Perform Corrective and Preventive Actions: Corrective actions are required for cost, schedule, or technical variances, and are also created at all levels of the program and the organization to address internal or external audit findings, negative customer feedback, and other issues that arise. Preventive actions may also be required depending on the issue or in instances where the issue appears in more than one place.
Improve Processes: An ongoing improvement activity serves to enhance program performance and customer/employee satisfaction with the process architecture. IPIC maintains a database that can be used by any employee, teammate, or customer to submit a problem discovered or a suggestion for improvement. The process integration team reviews and prioritizes suggestions for action. Northrop Grumman uses the Lean and Six Sigma tool sets along with Theory of Constraints thinking processes to guide data collection and process improvement planning and execution.
The “Plan, Do, Check, Act” core and its encircling AS9100 framework provide the broad basis for the IPIC management system (as outlined in the preceding section); CMMI models, with their narrower focus, add a much-needed depth of guidance for management activities.
IPIC was successfully appraised both internally and externally against CMMI-DEV for applicable process areas (excluding Technical Solution and Product Integration). Because Northrop Grumman does not develop products on IPIC, but rather acquires them from subcontractors that develop those products from system-directed requirements, other CMMI models were investigated as they were released. After examining and ruling out the CMMI-SVC model, Missile Systems evaluated the CMMI-ACQ model. The Acquisition model is a more suitable evaluation structure for the pure acquisition work performed by Northrop Grumman on this contract.
Mathematical models and statistical analysis, developed as a technical solution, have supported the customer (i.e., the Air Force) for decades with a proven track record of success. The voice of the process on the technical side supplied good historical data to support and update technical baselines. Modeling future expectations from these baselines was a logical extension that provided solid information for technical decisions.
When Northrop Grumman began to investigate CMMI models for use on IPIC, it became evident that these same data analysis, baseline development, and mathematical modeling techniques could be applied successfully in the nontechnical arena to manage the program, the business, and the underlying processes. Significant process improvements were derived from high maturity elements of CMMI models using baselines discovered in existing program data as a source for new model development.
As an example, applying moving range analysis to program management data produced a predictive model that could accurately forecast when program work accomplishment trends were eroding compared to plan, even though the standard program metrics might show no cause for concern. Analysis of payment cycle time data, stratified by contract type, yielded stable baselines to provide the basis for another new model to analyze payment trends—a key business process output.
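As a rough illustration of the moving range technique described above, the sketch below applies standard XmR (individuals and moving range) control limits to a hypothetical weekly work-accomplishment series. The data, limits, and run rule are illustrative assumptions, not the actual IPIC model.

```python
# Minimal sketch of moving range (XmR) analysis applied to a program
# management metric, e.g., percent of planned work accomplished per week.
# Illustrative only; assumes limits derived from a stable baseline period.

def xmr_limits(baseline):
    """Compute individuals-chart control limits from a baseline series."""
    n = len(baseline)
    mean = sum(baseline) / n
    # Moving ranges: absolute differences between consecutive points.
    mrs = [abs(baseline[i] - baseline[i - 1]) for i in range(1, n)]
    mr_bar = sum(mrs) / len(mrs)
    # 2.66 = 3 / d2, where d2 = 1.128 for subgroups of size 2.
    return mean, mean + 2.66 * mr_bar, mean - 2.66 * mr_bar

def signals(series, mean, ucl, lcl):
    """Flag points outside limits, plus runs of 8 on one side of the mean."""
    outliers = [i for i, x in enumerate(series) if x > ucl or x < lcl]
    runs = []
    side = [1 if x > mean else -1 for x in series]
    for i in range(len(series) - 7):
        if abs(sum(side[i:i + 8])) == 8:  # 8 consecutive points on one side
            runs.append(i)
    return outliers, runs

# Hypothetical data: percent of planned work accomplished each week.
baseline = [98, 101, 99, 100, 102, 97, 100, 99, 101, 98]
current = [99, 97, 96, 95, 94, 93, 93, 92]  # slow erosion, no single bad week

mean, ucl, lcl = xmr_limits(baseline)
outliers, run_starts = signals(current, mean, ucl, lcl)
print(f"limits: {lcl:.1f}..{ucl:.1f}; outliers at {outliers}; runs at {run_starts}")
```

In this sketch, a run below the center line signals erosion weeks before any individual point breaches the control limits, which is exactly the kind of early warning the essay describes: the standard metrics show no single alarming week, yet the trend is detectably off plan.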
For acquisition, Northrop Grumman “discovered” another baseline in readily available data for mitigation plan development. Combining these data with an existing technical model yielded another new model around the key function of what to acquire and when. Baselines for first-pass acceptance and internal assessment processes led to further acquisition-related models that produced actionable information, enabling the organization to improve acquired product acceptance to an essentially error-free state and to gain an improved understanding of enterprise-wide process performance.
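A first-pass acceptance baseline of the kind mentioned above might be monitored with an attribute chart. The sketch below is a hypothetical p-chart for fraction of items rejected on first inspection per delivered lot; the lot sizes and counts are invented for illustration, not IPIC data.

```python
# Minimal p-chart sketch for a first-pass acceptance baseline.
# All data below are hypothetical.
import math

def p_chart_limits(failures, lot_sizes):
    """Center line and per-lot 3-sigma limits for fraction nonconforming."""
    p_bar = sum(failures) / sum(lot_sizes)
    limits = []
    for n in lot_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma))
    return p_bar, limits

# Hypothetical: items rejected on first inspection, per delivered lot.
failures = [2, 1, 0, 3, 1, 0, 6, 1]
lot_sizes = [50, 40, 45, 60, 50, 40, 50, 55]

p_bar, limits = p_chart_limits(failures, lot_sizes)
for i, (f, n) in enumerate(zip(failures, lot_sizes)):
    rate = f / n
    lo, hi = limits[i]
    flag = "INVESTIGATE" if rate > hi else "ok"
    print(f"lot {i}: rate={rate:.3f} limits=({lo:.3f},{hi:.3f}) {flag}")
```

A lot whose rejection rate exceeds its upper limit is flagged for root-cause analysis rather than blamed on the performer, consistent with the audit philosophy described later in this essay.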
Generic practices provided another opportunity to expand process maturity. Rather than institutionalizing only CMMI processes, Northrop Grumman chose to apply generic practices to all of its process areas (e.g., Human Resources, Business Development). Beyond expanding the application of CMMI within Northrop Grumman, generic practices such as objectively evaluating adherence, which corresponds to the Process and Product Quality Assurance (PPQA) process area, facilitated even more improvements by helping the organization to develop a more objective and useful QA process.
Prior to adopting CMMI, audits were focused merely on outputs, with only the government Data Item Description (DID) serving as the standard. Process audits did not “feel” objective to those being audited. With CMMI, Northrop Grumman separated product from process audits and developed standards for each. Key steps were identified for each process, and the audits became much less confrontational and easier to complete. Process audit findings are never directed toward the performer, but are focused instead on the key steps. The organization cross-trains audit volunteers who want to understand other parts of the business, thereby helping engineers learn about business processes, and business process employees learn about engineering. Audits are no longer viewed as contentious events to be avoided, but rather as a chance to discuss work and ideas with a fellow employee who wants to learn and ensure the success of the program.
The focus of Acquisition Requirements Development (ARD) and Requirements Management (REQM) in the CMMI-ACQ model versus Requirements Development (RD) and REQM in the CMMI-DEV model is a good example of how the change in focus provided a much better fit for IPIC at Northrop Grumman. In the acquisition model, the specific practices dive deep into the way in which requirements flow down the supply chain and suppliers are held accountable for meeting them, while the development model focuses on deriving lower level requirements from user needs.
Managing supplier development and delivery to requirements is a substantial activity that Northrop Grumman executes on the contract, for which no credit could be taken using the CMMI-DEV model. Solicitation and Supplier Agreement Development (SSAD) was also a good fit. Northrop Grumman (on IPIC) does not directly develop any product components, but does frequently initiate new supplier agreements. Having a process area focused on the critical aspects of the solicitation and agreement development process from an acquirer’s viewpoint is a new and useful tool for both government and industry acquirers. Acquisition accounts for 81 percent of total contract value, and “getting it done right” in the beginning pays dividends later in the acquisition.
Coming up to speed on CMMI-ACQ has been a rewarding experience. Northrop Grumman began a pilot of the model on IPIC in the summer of 2008. In the same time frame, the Air Force independently requested a SCAMPI B appraisal. Only the ACQ processes were evaluated, and even with little time to prepare and no experience with this new model, the outcome was generally positive. In May 2009, with an improved understanding of the model, Northrop Grumman conducted an internal SCAMPI C appraisal covering all process areas. Again, the results were positive, leading to the first recorded SCAMPI A maturity level 5 rating against the CMMI-ACQ model in December 2009. The Lead Appraiser was also one of the model developers.
Given this resounding success, is there anything about the CMMI-ACQ model that could be improved? It has been our experience that contract organizations such as Northrop Grumman (on IPIC) and military organizations that perform acquisition assign very small staffs to any single acquisition. Staff numbers typically range from one or two to four or five people, supported by larger organizations that provide staff augmentation for short-duration activities such as Solicitation and Supplier Agreement Development or Validation. The day-to-day work is carried out by a small team whose size is not equivalent to that of the project teams in an average development organization. Therefore, some of the process areas normally handled within a typical development project would more naturally be performed outside the acquisition project team.
Some of these process areas are obvious (CM and PPQA), but others are ambiguous (AVAL and QPM). Goals for quantitative management are driven by the larger organization, and not by the small acquisition teams. For example, Northrop Grumman’s acquisition project data for IPIC are aggregated across multiple acquisitions and not analyzed by each project. This approach is intentional because of insufficient data inside each acquisition project and because many process improvements are intended for the larger organization and not just for the specific project. Perhaps, then, QPM in the CMMI-ACQ model could be described as Quantitative Process Management (rather than Project Management), and the decision of where the practices are performed might be left to the organization.
Northrop Grumman’s unique approach to Minuteman III sustainment and modernization has provided the following objective, tangible results:
• Consistently high award fee scores—averaging 95.5 percent
• High Contractor Performance Assessment Reports (CPARs)—average score of purple or better
• Favorable cost performance—a 3 percent favorable cost variance on more than $6 billion of executed program work
• Technical performance resulting in weapon system readiness far above threshold specifications
• Consistently reliable flight test demonstrations since program inception
This contract structure, which integrates the prime integrator with customer decision makers and is enhanced by management systems and models that address organizational management as well as technical and programmatic variables, is an excellent approach for any mature, fielded weapon system.
George Richard Freeman, Technical Director, U.S. Air Force Center for Systems Engineering
When the U.S. Air Force (AF) consolidated various systems engineering (SE) assessment models into a single model for use across the entire AF, the effort was made significantly easier by the fact that every model used to build the expanded AF model was based on Capability Maturity Model Integration (CMMI) content and concepts. This consolidation also set into motion an approach toward addressing information used across the enterprise in ways that were previously unimagined.
In the mid-1990s, pressure was mounting for the Department of Defense (DoD) to dramatically change the way it acquired, fielded, and sustained weapons systems. Faced with increasingly critical General Accounting Office (GAO) reports, and with a barrage of media headlines reporting multibillion-dollar program cost overruns, late deliveries, and systems that failed to perform at desired levels, the DoD began implementing what was called acquisition reform.
Many believed that the problem was rooted in the government’s hands-on approach, which called for it to be intimately involved in requirements generation, design, and production processes. The DoD exerted control primarily through government-published specifications, standards, methods, and rules, which were then codified into contractual documents. These documents directed, in often excruciating detail, how contractors were required to design and build systems.
This approach resulted in the government unnecessarily bridling contractors. In response, the contractors vehemently insisted that if they were free of these inhibitors, they could deliver better products at lower costs. This dual onslaught from the media and contractors resulted in sweeping changes that included rescinding a large number of military specifications and standards, slashing program documentation requirements, and greatly reducing the size of government acquisition program offices.
The language of contracts was changed to specify the desired “capability,” with contractors being given the freedom to determine “how” to best deliver this capability. This action effectively transferred systems engineering to the sole purview of the contractors. The contractors were relied upon to deliver the desired products and services, yet the responsibility for delivering viable solutions remained squarely with the government.
What resulted was a vacuum in which neither the government nor the contractors accomplished the necessary systems engineering. Over the following decade, the government’s organic SE capabilities virtually disappeared. This absence of SE capability became increasingly apparent in multiple-system (systems-of-systems) programs in which more than one contractor was involved.
Overall acquisition performance did not improve; indeed, in many instances, it became even worse. While many acquisition reform initiatives were beneficial, it became increasingly clear that the loss of integrated SE was a principal driver behind continued cost overruns, schedule slips, and performance failures.
In response to these problems, on February 14, 2003, the AF announced the establishment of the Air Force Center for Systems Engineering (AF CSE). The CSE was chartered to “re-vitalize SE across the AF.” Simultaneously, major AF Acquisition Centers initiated efforts to swing the SE pendulum back the other way.
To regain some level of SE capability, many centers used a process-based approach to address the challenge. Independently, three of these centers turned to the CMMI construct and began tailoring it to create a model to be used within their various program offices, thereby molding CMMI to meet the specific needs of these separate acquisition organizations.
Recognizing the potential of this approach and with an eye toward standardizing SE processes across the entire AF enterprise, in 2006 the AF CSE was tasked to do the following:
• Develop and field a single Air Force Systems Engineering Assessment Model (AF SEAM)
• Involve all major AF centers (acquisition, test, and sustainment)
• Leverage current SE CMMI-based assessment models in various stages of development or use at AF centers
Following initial research and data gathering, in the summer of 2007 the AF CSE established a working group whose members came from the eight major centers across the AF (four Acquisition Centers, one Test Center, and three Sustainment Centers). Assembled members included those who had either built their individual center models or would be responsible for the AF SEAM going forward.
The team’s first objective was to develop a consistent understanding of SE through mutual agreement of process categories and associated definitions. Once this understanding was established, the team members used these process areas and, building on the best practices of existing models, together developed a single AF SEAM. Desired characteristics included the following:
• Be viewed as adding value by Program Managers and therefore “pulled” for use to aid in the following:
• Ensuring standardized core SE processes are in place and being followed
• Reducing technical risk
• Improving program performance
• Scalable for use by all programs and projects across the entire AF
• Self-assessment based
• Independent verification capable
• A vehicle to share SE lessons learned and best practices
• Easily maintained
To ensure a consistent understanding of SE across the AF and provide the foundation on which to build AF SEAM, the working group clearly defined ten SE process areas.
The model’s structure is based on the CMMI construct of process areas (PAs), specific goals (SGs), specific practices (SPs), and generic practices (GPs). The AF SEAM development team amplified the ten PAs with 33 SGs, 119 SPs, and 7 GPs.
Each practice includes a title, description, typical work products, references to source requirements and/or guidance, and additional considerations to provide context. While some PAs are largely an extract from CMMI, others are combinations of SPs from multiple CMMI PAs and other sources. Additionally, two PAs not explicitly covered in the CMMI model were added—namely, Manufacturing and Sustainment.
It is equally important to understand where AF SEAM fits within the assessment continuum (depicted in Figure 6.6). The “low” end of this continuum is best defined as an environment where policies are published and “handed off” to the field, with the expectation that they will be followed and in turn yield the desired outcomes. In contrast, the “high” end of the assessment continuum is best defined as an environment characterized by a high degree of independent engagement, including highly comprehensive CMMI assessments. AF SEAM was designed to target the “space” between the two (“CMMI Light”) and provides one more potential tool for use, when and where appropriate.
To aid in the establishment of a systems engineering process baseline and, in turn, achieve the various aims stated above, programs and projects undergo one or more assessments followed by continuous process improvement. An assessment focuses on process existence and use; thus it serves as a leading indicator of potential future performance.
The assessment is not a determination of product quality or a report card on the people or the organization (i.e., lagging indicators). To allow for full AF-wide scalability, the assessment method is intentionally designed to support either a one- or two-step process.
In the first step, each program or project performs a self-assessment using the model to verify process existence, identify local references, and identify work products for each SP and GP—in other words, the program or project “grades” itself. If leadership determines that verification is beneficial, an independent assessment is performed.
This verification is conducted by an external team that confirms the self-assessment results, adds other applicable information, and rates each SP and GP as either “satisfied” or “not satisfied.” The external assessment is conducted in a coaching style, which facilitates and promotes continuous process improvement.
Results are provided to the program or project as an output of the independent assessment. The information is presented in a manner that promotes the sharing of best practices and facilitates reallocation of resources to underperforming processes. The AF SEAM development working group has also developed training for leaders, self-assessors, and independent assessors, who undergo this training on a “just-in-time” basis.
Since AF SEAM was originally fielded, its use has continued to grow across the AF. Because the AF has a long history of using multiple CMMI-based models, significant cultural inroads have been made toward acceptance of this process-based approach. The creation and use of a single AF-wide Systems Engineering Assessment Model is playing a significant role in the revitalization of sound SE practices and the provision of mission-capable systems and systems-of-systems on time and within budget.
Additionally, this process-based methodology has opened up an entirely new frontier—specifically, the advancement of a disciplined systems engineering approach to other areas across the enterprise, otherwise known as integrated systems engineering: looking beyond the traditional realms.
Figure 6.7 illustrates the traditional “product” focus of systems engineering, to which AF SEAM applies. The process begins with the identification of requirements by the customer. Using the various SE processes applied to Systems Process Engineering, these requirements are broken down and allocated to various subelements that are then processed through the Acquisition process. Both areas are supported by Architecture. The end result is a Deliverable to the customer. This view, which is commonly referred to as the “product view,” illuminates the application of AF SEAM in the context of other associated activities.
An enterprise must continually make resource allocation decisions in its never-ending drive to deliver the highest “value” to the customer at the lowest possible total ownership cost (TOC), within acceptable levels of risk. To achieve this goal, the data and information that are produced within the product view must be fed back to leadership.
This provision of information to leadership is achieved through the addition of the “enterprise view” depicted in Figure 6.8. Note the addition of the Business Process Engineering activity adjacent to the Systems Process Engineering activity. Also note that although systems process engineering deals with customer requirements, more often than not these requirements are communicated through enterprise business processes.
Gaining an increased understanding of how process-based assessments are executed in the context of other influencing activities is vitally important. Understanding the context of assessments also points one toward potential avenues for expansion of process-based analysis and improvement—hence the designation of continuous process improvement (CPI) and its applicability to both Business and Systems Process Engineering.
Data and information that are created within an enterprise are used to support resource allocation and other business decisions. However, this information is incomplete without inclusion of the customer’s voice. In Figure 6.9, the customer’s voice is depicted through the addition of the “service view.” Note the feedback loops, including where they traditionally enter into the various processes. Also note the inherent challenges in transmitting this feedback to the activities that would truly impart change upon the processes supporting each of the views.
Appropriately visualizing and addressing process activities as the heartbeat of the enterprise is another aid to the AF in its goal to deliver needed capabilities on time and within budgetary constraints. The CMMI-based AF SEAM development effort, begun in 2007 with a focus on the SE processes primarily supporting the product view, set into motion an entirely new way of examining this complex enterprise.
The path ahead for AF SEAM continues to look promising. Individual users are finding value in using AF SEAM, and even in an environment of austere resources it has taken hold as the SE process analysis tool of choice at the practitioner level. This “grassroots” support is directly attributable to the firm foundation upon which the model was constructed combined with the collaborative way in which it was built. Policy makers and leaders are also looking for ways to capitalize further on this process-based leading indicator approach. Figure 6.10 depicts the current state of AF SEAM in many locations as its use continues to mature. AF SEAM is being augmented by various center variants, while a separate path takes an office through its preparations for a Unit Compliance Inspection (UCI). A UCI measures various artifacts and, therefore, is an assessment of lagging indicators.
As AF SEAM continues to mature and its use proliferates across the AF centers and individual program offices, the AF is moving toward the merging of independent leading and lagging indicator assessments. Figure 6.11 suggests how further evolution of standard processes (leading) might merge desired outcomes (lagging) into a single coordinated effort. The combination of these two assessments into a single holistic approach is intended to lessen the workload on individual offices while simultaneously providing a clearer picture of processes to leaders.
Armed with the continually improving process-based approach described in this essay, leaders are more likely to take advantage of these combined tools, which promise to yield improved information with less overall effort. This evolution will, in turn, facilitate improved allocation of resources. Finally, as the challenge of working within highly complex systems-of-systems environments becomes the norm, it is increasingly important that the associated supporting processes have the ability to integrate with one another in ways previously unimagined. The standardization of AF SEAM and the associated processes it describes across the AF enterprise represents a positive step toward achieving these aims.
Dominique Luzeaux, Eric Million-Picallion, and Jean-René Ruault
The Direction Générale de l’Armement (DGA) is part of the French Ministry of Defense and is responsible for the design, acquisition, and evaluation of the systems that equip the armed forces, the export of these systems, and the preparation of future weapon systems. Its responsibilities encompass the whole lifecycle of these systems.
The DGA is organized into seven directorates: strategy, operations, technique, international development, planning and budget, quality and modernization, and human resources. The Operations directorate is directly responsible for the design and acquisition of the equipment used by the armed forces and the corresponding budget. Operations is divided into ten divisions: land systems, helicopters, missiles, aircraft systems, naval systems, space and C4ISR systems, deterrence, nuclear and bacteriological protection, the “Coelacanthe” weapon system, and the “Rafale” weapon system. The Land Systems division, which is the focus of this essay, is in charge of all land systems (i.e., combat, transportation, and bridging vehicles; direct and indirect fire systems; ammunitions; infantry and paratrooper equipment; various equipment for troop sustainment; and training and simulation).
The DGA achieved ISO 9001 certification at the enterprise level four years ago and is engaged in continuously improving its main processes. The quality process implemented by the DGA relies on a long-term vision aimed at improving the value of acquired systems. Accordingly, the decision was made in 2007 to obtain a CMMI level rating, starting with level 2 and then addressing higher levels.
The expected benefits of CMMI-ACQ for the DGA included the following:
• Improved efficiency, achieved by measuring tried and tested practices, capturing them in documented process references, and increasing professionalism
• Organizational improvement, through optimized estimation, planning, and monitoring, as well as improved skills development and resource allocation
• Improved operational results, through better adherence to planned budgets and schedules and through productivity gains
• Recognition by partners, through objective proof of acquisition efficiency
To reach level 2 of the CMMI-ACQ model, an enterprise should implement project and acquisition management processes, such as Requirements Management (REQM), Configuration Management (CM), and Acquisition Requirements Development (ARD). These process areas implement DGA’s Stakeholder Requirements Definition Process and the Requirements Analysis Process.
These process areas concern the DGA’s core acquisition activities and are therefore of great importance to it. Indeed, the DGA’s architects take as input the needs expressed by the Joint Staff, as well as other stakeholder requirements, such as regulations and laws. They translate these needs and requirements into technical requirements, from which technical specifications are developed. These technical specifications form the core of the acquisition agreement between the DGA and its suppliers. They allow the DGA to acquire solutions that satisfy stakeholder needs within the allotted budget and schedule constraints. Furthermore, the DGA’s architects must maintain the systems’ definition throughout the whole lifecycle.
It is the systems architect’s job to account for the flow of evolving needs and to assess their impacts on the definition of the current system architecture. The architect must upgrade the current systems’ architectures, maintain operational capabilities, and manage the systems to the budget. These tasks imply that the architect analyzes the impact along all critical dimensions (i.e., operational, financial, schedule, security, and safety), decides how to implement these requirements, and develops adequate technical specifications. To achieve these goals, the architect must have an up-to-date architecture definition of the baseline system, a clear set of baseline requirements with their rationale, and appropriate systems engineering methods and tools.
In this context, systems engineering processes and activities must be defined and applied. Moreover, these activities must become increasingly agile and efficient, allowing the systems architect to master both the architecture definition of the system and its evolving requirements, to assess the effects of change requests, and to make the best decisions.
The CMMI-ACQ model gives the DGA an opportunity to improve the systems engineering practices that underpin this agility and efficiency. Current guidelines and procedures are mapped to CMMI-ACQ specific goals and practices. Where relevant documents are found to be lacking, the gaps are identified and corrected, and appropriate sets of measurements are defined and applied so that activities can reach a higher efficiency level.
The CMMI certification process was launched in 2007. The first assessment focused on management practices and highlighted the differences between the practices implemented by the DGA and the recommendations of the CMMI-ACQ model. In a first stage, three of the ten divisions were chosen. All of these divisions underwent a Standard CMMI Appraisal Method for Process Improvement (SCAMPI) class B evaluation. The goal was level 2 certification, with evaluation of activities such as process management, project management, and contract and supplier management. This evaluation process ended with a successful SCAMPI A evaluation at the end of 2009 for two divisions; evaluation of the third division was scheduled to be completed at the end of 2010.
For the Land Systems division, several major projects have been evaluated (i.e., infantry combat vehicle, soldier combat system, large-caliber indirect fire system, armored light vehicle, light battle tank, main battle tank, future crossing bridge system), which are in different phases of their lifecycle. Half of the projects have been evaluated on all nine process areas required to achieve level 2 certification. Objective evidence has been submitted to the evaluators and interviews of the integrated project teams have been conducted.
For each project, all the available documents were analyzed before the interviews were conducted. Both divisions were evaluated on very different projects, but the evaluation yielded similar results—which is not especially surprising given that the DGA has been focusing on sharing common methods for quite some time and relies on a matrix organization that fosters such a fruitful exchange. Besides, both divisions have taken advantage of DGA’s CMMI project to share and harmonize their practices and to apply common improvements on a short-term basis in a true continuous improvement process.
The bottom-up approach underlying CMMI, which consists of identifying and sharing good practices as observed in everyday work, has been widely appreciated by the DGA’s various teams. These teams are convinced that the CMMI approach supports a quality approach, an internal improvement process, and a standardization of identified best practices. For instance, the reporting spreadsheets developed for some of the projects have been generalized to the various projects.
Several DGA team members have dedicated a large part of their time to the CMMI project. (The word “project” is not misused here: The management processes have been applied in a recursive manner to the CMMI-based improvement effort with planning, resource estimation, project meetings, resource management, and so on.) Working groups have been established to find solutions when gaps between performance and the model are identified.
A special effort is dedicated to the collection of meaningful artifacts in each project. As the DGA’s experience demonstrates, the resource investment involved in CMMI-based improvement is not a negligible one. The amount of work on average for both of the DGA’s SCAMPI evaluations can be estimated at one person-day per sector per project per evaluation, if the project is already mature in its management. Because the DGA’s integrated project teams encompass various skills (e.g., project manager, technical experts, quality experts, finance experts, procurement specialists), the workload differs from one member of the team to the other; for this reason, most preparation work for the evaluations was performed by the quality experts.
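To put that rule of thumb in concrete terms: a division preparing, say, seven projects across five sectors (hypothetical figures chosen purely for illustration) would invest roughly 5 × 7 = 35 person-days per evaluation, and more if some projects are not yet mature in their management.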
Several opportunities for process improvement have been found, for example, in documentation management, resource allocation, and objective evaluation. Some of them were not a real surprise, and the CMMI project has helped to implement solutions to these issues. A side effect of the whole approach has been the forging of a much closer relationship between the evaluated divisions. They usually deal with completely different customers and systems, but within the CMMI project they had to share their practices to achieve success within the given schedule.
The main criticism that could be raised about the effectiveness of this evaluation relates to the proof collection process: the definition of “proof” seemed rather subjective, varying with the experts involved. The proof required from the divisions differed between the experts the teams relied on at the beginning of the project and the lead appraiser at the end of the project. This problem was exacerbated because both the model (CMMI-ACQ V1.2) and the context of its implementation (a state-owned procurement agency) were new. We sometimes had the feeling that we were required to submit a “proof of a proof.” A lot of time was spent on that issue, and the return on investment for that first stage was rather low, because most of the time the proofs already existed but were recorded differently.
For “systems of systems,” DGA experts based the systems engineering processes on the ISO/IEC 15288:2008 standard. However, this standard describes high-level processes and does not describe in an explicit manner the means to measure and improve them. In this context, CMMI-ACQ complements ISO/IEC 15288:2008 by defining key systems engineering processes and enabling their measurement and improvement.
Requirements Management (REQM), Acquisition Requirements Development (ARD), and Configuration Management (CM), as defined by the CMMI-ACQ model, are the keystones of DGA systems engineering processes and activities.
The Requirements Management process area details the following specific practices: Understand Requirements, Obtain Commitment to Requirements, Manage Requirements Changes, Maintain Bidirectional Traceability of Requirements, and Identify Inconsistencies Between Project Work and Requirements. This process area is of great importance to the DGA because it deals with understanding input requirements, obtaining commitments from stakeholders, and managing requirements changes. These activities are based on bidirectional traceability and on consistency between requirements and project work, and they help the DGA’s architects complete their work.
Although this process area is necessary, it is not sufficient by itself. There is a huge gap between the customer requirements from the Joint Staff, which expresses its needs from an operational capability point of view, and the contractual requirements. Integrated teams must translate these operational customer requirements into functional requirements. To do so, they use functional analysis to develop a functional architecture that supports the intended operational capabilities.
After verifying the functional architecture and functional requirements with the Joint Staff using simulations and battle labs, the team translates them into technical requirements, which will be necessary for the acquisition process (e.g., tender, selection of a contractor) to follow.
The acquisition process can be modeled as the following activities:
• Operational requirements are translated into functional requirements.
• Functional requirements are translated into technical requirements.
• Technical requirements are used as the basis of the technical specification.
Every requirement is traceable from any one level to the one above and the one below. Likewise, each requirement, at each level, is traceable between engineering activities and integration activities, and follows a complete validation process.
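To make this traceability concrete, the following is a minimal sketch in Python; the class names, identifiers, and sample requirements are illustrative assumptions, not DGA tooling or CMMI-ACQ artifacts.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    level: str        # "operational", "functional", or "technical"
    text: str
    derived_from: list = field(default_factory=list)   # upward trace links
    satisfied_by: list = field(default_factory=list)   # downward trace links

def link(parent: Requirement, child: Requirement) -> None:
    # Record the trace in both directions so analysis can walk either way.
    parent.satisfied_by.append(child)
    child.derived_from.append(parent)

def impacted_by(req: Requirement) -> list:
    # Walk downward to collect every lower-level requirement a change could affect.
    found, stack = [], list(req.satisfied_by)
    while stack:
        current = stack.pop()
        found.append(current)
        stack.extend(current.satisfied_by)
    return found

op = Requirement("OP-1", "operational", "Cross a 40 m wet gap")
fn = Requirement("FN-1", "functional", "Deploy bridge span in under 10 minutes")
tr = Requirement("TR-1", "technical", "Hydraulic launch mechanism rated MLC 70")
link(op, fn)
link(fn, tr)
print([r.req_id for r in impacted_by(op)])   # ['FN-1', 'TR-1']

Walking the links downward from an operational requirement yields every functional and technical requirement a change could touch, which is the raw material for the impact assessments described below.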
Thus the Acquisition Requirements Development process area complements the REQM process area by specifying activities in three specific goals: Develop Customer Requirements, Develop Contractual Requirements, and Analyze and Validate Requirements.
During these activities, changes happen both as a result of evolving customer requirements and from compliance with suppliers’ requirements (e.g., unavailability of a technology, changing regulation). These changes have many effects. To avoid loss of the consistency of the architecture definition, each change triggers a change request procedure. An impact assessment is made based on the architecture definition of the current baseline. A review translates this information into a change request. If the change request is accepted, it implies an evolution of the architecture definition. These activities are controlled by the Configuration Management process area. The configuration control board is very important for tracking change and controlling consistency.
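As a hypothetical sketch of this change control flow, the following models a change request that must be assessed against the current baseline before the configuration control board decides; the class, status names, and integer baseline revision are assumptions for illustration only.

from enum import Enum

class CRStatus(Enum):
    SUBMITTED = "submitted"
    ASSESSED = "assessed"
    APPROVED = "approved"
    REJECTED = "rejected"

class ChangeRequest:
    def __init__(self, cr_id: str, description: str, baseline_rev: int):
        self.cr_id = cr_id
        self.description = description
        self.baseline_rev = baseline_rev   # revision of the current architecture definition
        self.impact = None
        self.status = CRStatus.SUBMITTED

    def assess_impact(self, impacted_requirements: list) -> None:
        # Impact is assessed against the architecture definition of the current baseline.
        self.impact = {"baseline": self.baseline_rev, "requirements": impacted_requirements}
        self.status = CRStatus.ASSESSED

    def ccb_decision(self, approved: bool) -> int:
        # The configuration control board accepts or rejects the change; acceptance
        # implies an evolution (a new revision) of the architecture definition.
        if self.status is not CRStatus.ASSESSED:
            raise ValueError("impact must be assessed before the CCB decides")
        self.status = CRStatus.APPROVED if approved else CRStatus.REJECTED
        return self.baseline_rev + 1 if approved else self.baseline_rev

The guard in ccb_decision enforces the ordering described above: no change is accepted, and no new baseline revision is created, until its impact on the current baseline has been assessed.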
Once the mapping between the various activities of the systems engineering process is complete, improving the CMMI level at the enterprise level provides a mechanism for improving systems engineering. The mapping takes advantage of the measurement and continuous improvement possibilities provided by the CMMI-ACQ model, while sticking to a standardized engineering process. The various tools and technical standards developed for systems engineering can be related to the best practices promoted by CMMI-ACQ.
Claude Bolton
Much has been written about the state of program management, often called “acquisition” in the U.S. Department of Defense (DoD), and much of this discussion has not been flattering or positive. Acquisition is described as constantly behind schedule, acquiring something other than what was needed, delivering less performance than expected, and spending grossly over original cost estimates.
As a result, a number of laws, rules, policies, and regulations have been produced over the years in an attempt to fix—or at least improve—the DoD acquisition process. In addition, a number of studies, commissions, think tank reviews, and other analyses have been chartered to determine the cause of the poor state of the DoD acquisition process. At last count, more than 100 such studies had been conducted, out-briefed, and “shelved,” with no apparent actual improvement in sight.
One has to ask, “Why hasn’t there been any significant progress? Why does the DoD acquisition process continue to struggle and underperform? Why is it, according to a 2008 GAO report, that of the nearly 100 major DoD programs reviewed, the majority experienced cost overruns averaging 27 percent and ran 18 to 24 months behind schedule?”
While the GAO report’s methodology has been challenged with no rebuttal from the GAO, the fact remains that many programs are not performing adequately despite years of focus on improving the DoD acquisition process. Again, the question is “Why?” Are we not asking the right questions when it comes to DoD acquisition improvement? Are we not addressing the real root problem, but rather focusing on its symptoms? Are we collectively personifying Einstein’s definition of insanity—that is, are we continuing to do the same thing, yet expecting a different outcome? When it comes to the DoD acquisition process, Einstein may be on to something: From my vantage point, with rare exception, virtually all of the studies to date have not addressed the real problem.
“And what is the real problem?” you ask. If I were to tell you now, you might not bother to read the rest of this essay. And given that I may have to “prepare the battlefield” so that you will better understand the answer, let me give you my thoughts first and then explain the answer.
Would you be surprised if I were to tell you that all this fuss over the DoD acquisition process could end in a heartbeat and the process improve almost overnight by just following the “three R’s”? Or that actually executing acquisition in the DoD is as simple as the “three R’s”? It’s true. Don’t believe me? Let me tell you about the “three R’s” and how they, combined with CMMI-ACQ, could change DoD acquisition forever.
Most of us heard about and studied the “three R’s” when we attended grammar school: reading, writing, and arithmetic. If you could master those skills, you were well on your way to a great education and much success in the U.S. school system. Everything we learned involved the basics of the “three R’s.” Just look at what we are doing at this very moment. I am writing an essay that you are reading, so we are collectively using two of the “three R’s.” The idea is very basic, but without a good understanding of the basics early in our formative years, we would have a tough time communicating today. The same is true with arithmetic. Earlier in this essay, I shared some simple arithmetic regarding the GAO. Again, without learning the basics early in our lives, you and I would be lost by now.
What does all of this have to do with DoD acquisition? My view is that DoD acquisition, when reduced to its basics and practiced accordingly, is as simple as the “three R’s.” I know, I know. You are probably saying, “Bolton is crazy and must have lost a few brain cells once he retired.” Well, I may have lost a few brain cells, but not when it comes to acquisition and doing it right.
Let’s take a look at DoD acquisition and let’s see how the “three R’s” apply. First, I will admit that the “three R’s” for acquisition are different from those for grammar school. The “three R’s” in DoD acquisition have different meanings but are still quite simple. The following is a brief description of what I mean by the acquisition “three R’s”:
Requirements: This “R” stands for not only having formal, written operational requirements, but also—and even more importantly—understanding the requirements. In short, does everyone involved in the “big A” understand the requirement? At a minimum, does the warfighter who wrote the operational requirement understand it? Does the resource manager (i.e., the source of funding) understand the same requirement? Does the acquirer (i.e., the Program Manager) understand the same requirement? Does the test community understand the same requirement, particularly the DoD office of the Director, Operational Test and Evaluation (DOT&E)? Does the sustaining/logistics community understand the same requirement? And does the contractor (i.e., the builder) understand the same requirement? Preferably, all of the aforementioned community representatives of the “big A” should meet in the same room at the same time and make sure all have a common understanding of the formal requirement. If this is not done, do not proceed with the program.
Resources: While most would believe this “R” is only funding, my definition of “resources” takes it a bit further. Aside from adequate funding, this “R” includes appropriate management tools, practices, processes, policies, regulations, strategies, and other guidelines for all members of the “big A” team who will be involved in meeting the requirement and getting the appropriate capability to the warfighter. This need for adequate resources also includes industry.
Right People: This is perhaps the most important of the “three R’s.” Without the right people, nothing will be accomplished as planned, even if the requirements and resources are in place. The “right people” are those on both the government and industry sides of the team who are educated, trained, mentored, experienced, credible, empowered, and trusted to do the job at hand (with the resources given), meet the requirements, and deliver the capability to the warfighter.
In my 30-plus years in DoD acquisition, every program that I have led, studied, or assisted (that has succeeded) followed these “three R’s.” Those that have violated one or more of the “three R’s”, at a minimum, have not met expectations and, at worst, have failed and been terminated after much delay and cost. While we could go on and on discussing each of the “three R’s” and apply them to various good and bad programs of today, the remainder of this essay will focus on just one of the “three R’s,” including how it relates to CMMI-ACQ.
Let’s turn our attention to the second of the “three R’s”: resources. Resources include, among other things, the tools, processes, and practices needed to meet the requirements. To this end, CMMI-ACQ is a natural fit. CMMI-ACQ is a process-oriented tool that lays out an orderly way of describing a program in terms of elements that can be decomposed, managed, improved, benchmarked, and assimilated into an integrated whole: what the customer wanted, when the customer wanted it, and at the agreed-to price. “Resources,” when managed properly, actually integrate all of the “three R’s.”
For an example that illustrates this point, I will pick on an Army program that was on my watch during my tenure as the Assistant Secretary of the Army for Acquisition, Logistics and Technology (ASAALT). In this case, CMMI-ACQ was not used because it was not then available. Had it been available, the results might have been different. The program was called the Aerial Common Sensor (ACS) program. It was a joint Army–Navy program, for which the Army served as the lead. It was chartered to deliver an aircraft to the Army and Navy that would replace the Army’s Guardrail aircraft and the Navy’s EP-3 aircraft. Both of these aircraft provided airborne electronic surveillance for the services.
Because of the age of the aircraft, the increasing sophistication of electronic threat and surveillance, and the need to drive lifecycle costs down, the ACS program was conceived and approved. The pre-Milestone B activity, which included technology demonstrations, went well. It appeared as if all of the processes and the “three R’s” were in place and performing as expected. However, as Milestone B approached, changes were made to the operational requirements, which drove the size and weight of the aircraft. The effects were so dramatic that, during source selection for Milestone B, one of the competitors was eliminated because its aircraft was too heavy.
Within three months of the start of post-Milestone B activities, the program manager recommended termination of the contract. Why? The winning contractor’s aircraft weight had grown to the point that the ACS could not take off! Why did this happen? The interpretation of the operational requirements by the warfighter had changed.
Had the ACS project used the CMMI-ACQ tool from its inception and, in particular, throughout the activities leading to the Milestone B decision, the ACS would likely be flying today. Had the ACS followed the CMMI-ACQ methodology, it would have discovered the requirements issues early in its inception; it would have established procedures and processes to identify and resolve requirements challenges early in the pre-Milestone A program phase. Those new procedures and processes would have led to better use of the technology demonstrations and provided valuable knowledge about the feasibility of the warfighter requirements. All of these issues could have been resolved early in the program and led to success.
Let’s dwell here for a moment. We will consider what would have happened if the ACS had used the CMMI-ACQ tool—which actions would have taken place that would have resulted in a better outcome.
Because the ACS contract was terminated before system development and demonstration could occur, I will focus my remaining remarks on the requirements process. This focus is most appropriate because the change in the interpretation of the operational requirements at the start of the system development and demonstration (SDD) phase was the factor that ultimately caused contract termination shortly after the start of SDD as previously mentioned.
Had ACS been able to adopt and use the CMMI-ACQ model, it would have accomplished the following actions. First, it would have established an Acquisition Requirements Development (ARD) process to collaborate with the warfighter in the development and analysis of the operational requirements, and in the translation of those requirements into a contractual arrangement with industry. The model would have insisted on the ARD goals and practices specifically illustrated in the CMMI-ACQ text.
Second, it would have established an Acquisition Technical Management (ATM) process to manage all the technical reviews needed to ensure that the technical aspects of the program were on course. Establishing this process in the pre-Milestone B phase of ACS would have provided early recognition of the impacts of the changing requirements. It would have required the planning and scheduling of technical reviews early in the program history and ensured that all parties involved understood what knowledge was required and at what time—not just during the pre-Milestone B program phase, but throughout the SDD, production, fielding, and sustainment phases.
I believe CMMI-ACQ could have made a considerable difference in the ACS program and allowed it to continue successfully. Reading the purpose of each process area, and reflecting on what could have been had the ACS followed and realized each process area’s goals, leads to a clear conclusion: had this program been developed using CMMI-ACQ, the ACS would be in the Army and Navy inventories today.
Madhav Panwar
Building, managing, and acquiring information technology systems are not easy tasks. Unless the product you want is commercially available in a box and meets all of your organization’s needs, a number of tasks need to be performed to ensure that you build or acquire a system that enables your organization to meet its business objectives.
The Software Engineering Institute (SEI) has provided guidance on building and acquiring systems for more than two decades. As the technology has matured, so has the guidance. Using appropriate portions of the various models that the SEI and its partners in the CMMI Product Development team offer enables your organization to increase its success rate in implementing information technology solutions.
The U.S. Government Accountability Office (GAO) has worked with the SEI to ensure that the best practices captured in the model used earlier by the GAO, the Software Acquisition CMM, were both preserved and improved with the broadening of coverage in CMMI. Since then, the CMMI-ACQ model has proved itself to the GAO, and has become a valuable tool for our work.
At the GAO, we have been using the CMMI-ACQ model to evaluate federal agencies’ acquisition efforts. This use of CMMI enables the GAO to evaluate acquisition activities across the government using a common methodology. The GAO does not require that a certain level of CMMI certification be achieved or even recommend which practices to implement when a project is acquiring a good or service. Instead, the program manager makes those decisions based on the level of risk, criticality, funding, and other criteria.
The practices provided in CMMI are a reasonable set that assist the program manager and the organization by increasing the likelihood of success. Given our vast experience in examining processes across the government, we are confident that the practices in CMMI-ACQ will help organizations with their acquisition efforts, particularly when they are tailored to meet the specific needs of the product or service being acquired.
Over the years, the GAO has reported on the acquisition of large IT systems and their cost overruns, failures in meeting customer needs, midstream redefinition of requirements, and even outright project termination. These issues are really part of the history of IT systems in general; what the GAO reports reflects larger industry trends. Over the years, the success rate of acquisition programs has gradually improved.
The GAO has recommended that projects be defined in smaller increments, provide customers with quicker releases, and get feedback to improve the product. Defining requirements for a product that will be released in 5 to 7 years is a complex and difficult task. The recent encouragement to speed the IT acquisition cycle to match the commercial 18-month cycle puts yet more pressure on the acquisition office. When that is the case, a disciplined approach to acquisition like that contained in CMMI-ACQ is required.
CMMI-ACQ deals primarily with processes that result in the acquisition of goods and services. Today, more and more organizations are contracting out for their IT solutions. Federal agencies in the United States, for example, put out numerous solicitations for contractors to offer solutions. This competition ensures that the government gets the best value for its money and also enables the government to keep up-to-date with new technologies and work processes, thereby becoming more efficient. However, ensuring that contractors understand what is required, monitoring the development of the solution, verifying that the various parts are integrated and tested correctly, and confirming that a working solution is delivered all require the organization to follow practices such as those in this model.
The GAO has been working with various agencies to ensure that systems acquisition is done in a manner that provides the most likely chance of success. For example, the GAO sought to get a data management strategy as a key part of acquisition efforts into legislation. Furthermore, current efforts are underway to make sure that in addition to following a data management strategy, acquisition program managers routinely consult with key stakeholders and actively solicit their input during the entire acquisition lifecycle.
The CMMI models address both of these areas. For its part, the GAO uses the model practices associated with data management and involvement of stakeholders, as well as other process areas and practices in the model as appropriate, during our reviews of government acquisition efforts. For example, the GAO has used CMMI-ACQ when evaluating agency efforts to acquire software solutions; these reports are available on the GAO website (www.gao.gov/).
While there is no set of practices that will work for every possible situation, experience has shown that managing requirements is critical. Many projects proceed with vaguely defined requirements; do not consult adequately (if at all) with users, customers, and other stakeholders; and put out a solicitation only to find that it is incomplete or not what the user wanted.
Other areas that typically are not addressed adequately include risk management, communications within and outside the team, planning and tracking the work, and senior management oversight. These are all basic management activities. CMMI-ACQ includes practices that can help your organization with these and other areas of the acquisition lifecycle. The GAO recommends that organizations use this model to guide their efforts to successfully acquire solutions and to provide their customers or the organization itself with the tools needed to better accomplish the mission at hand.
Rich Frost, Ashok Gurumurthy, and Tom Keuten
Large corporations have been outsourcing all or pieces of their information technology functions for years. Any organization that is not in the IT business is likely buying some form of IT that is necessary for its business to be successful. Most companies buy their core business systems (e.g., enterprise resource planning [ERP] or time management) from commercial providers. Even IBM now buys its laptops from someone else, as the firm is no longer in that business. Organizations must buy at least some IT products and services from someone else because it just doesn’t make good business sense to make or do everything themselves. Unfortunately, IT acquisitions don’t always go well, as evidenced by the large number of articles discussing IT and outsourcing gone wrong.
Many IT functions have been outsourced by company management because executives are led to believe that they can hire other companies to develop their software, run their help desk and support, manage their infrastructure, and do anything else related to IT, all at a fraction of the cost that these companies pay for their current employees. This “story” makes sense because most companies are in business to do something other than IT, and IT service companies are in business to help other companies with their IT. However, history and statistics on IT outsourcing have shown that many of these deals fall through or are not successful, which leaves some companies with expensive deals that they want to wriggle out of.
On the one hand, many IT organizations that acquire their technology or outsource the whole IT function blame their suppliers when those acquisitions subsequently fail. They complain that the suppliers did not meet their end of the deal and could not deliver what they, the customer, expected. On the other hand, IT suppliers that have tried to help such companies complain that the customers didn’t specify what they needed and did not hold up their end of the bargain.
The IT environment is very complex in almost all commercial organizations. Many struggle with issues related to security, cost, scalability, and integration. Countless IT suppliers stand ready to provide all kinds of products and services that are intended to help solve these problems. Those products and support services must somehow seamlessly connect with others that are currently in the IT environment of the acquiring organization. Companies that mostly “insource” their IT functions also run into comparable complex connection issues, but outsourcing some or all of these functions increases that complexity. Consequently, companies that acquire more IT products and services must be exceptional at IT acquisition (which includes holding suppliers and their internal stakeholders accountable) if they expect to achieve positive results.
CMMI-ACQ is the most robust, publicly available solution for organizations that are seeking to improve their acquisition processes. To leverage this model to achieve positive results in acquisition, organizations must internalize CMMI-ACQ best practices so that they fully understand how to apply them. This essay provides some insights based on our observations when working with companies using CMMI-ACQ. It describes how using multiple suppliers increases the complexity of the commercial IT environment, and it provides tips about using preferred vendors, lessons learned on contract management, and insights on implementation. The essay concludes with some recommendations for acquirers about when and how to conduct appraisals using the CMMI-ACQ model and ideas for future versions of CMMI-ACQ.
The definition of outsourcing has matured over time and has become more complex to reflect the changing needs of businesses. When organizations outsource their enterprise IT services, new issues and risks are introduced as various “outsiders” provide IT services. Recently, organizations have started using a “multi-supplier” strategy, which means they use more than one supplier to provide IT services. This strategy adds further complexity to an outsourced environment. Because so many suppliers define a specific niche for themselves and none serves as a one-stop shop for everything a company needs, it is impossible to avoid the multi-supplier environment given today’s changing business needs.
Unfortunately, a perfect outsourcing contract may not exist, as so many unwritten expectations must be met in running a large IT environment. During the execution of contracts in a multi-supplier environment, the following issues and challenges typically arise:
• Separate contract terms and conditions and segregated responsibility across suppliers (IT solutions require integrated responsibility)
• Suppliers’ unwillingness to work with other suppliers (“not in my contract” responses)
• Suppliers’ inability to work together with other suppliers in a multi-supplier environment
• Separate supplier tools and methods for delivery of work artifacts
• Lack of a governance model across multiple suppliers to execute the organization’s policies
• Lack of incentive for suppliers to come up with innovative solutions to drive costs down (the motivation to meet the contractual commitments drives simplicity and commonality of processes, not the idea of being flexible in service delivery)
The solution to the challenges arising from implementing the multi-supplier strategy is to use an organization that provides governance oversight and integrates services from multiple suppliers. Such a solution provides structured oversight and governance across all IT suppliers on behalf of the acquirer based on proven system integration principles.
A system integrator is a person or organization that specializes in bringing together component subsystems into a whole and ensuring that those subsystems function together smoothly. The principles of systems integration should be considered in the context of a multi-supplier outsourcing environment. The resultant “office of multi-supplier governance” provides thought leadership across the IT landscape, from strategy to operations, by using its knowledge and experience of the organization’s multi-supplier environment. In addition, the office of multi-supplier governance provides the following key services to address the challenges in an outsourced multi-supplier environment:
• Multi-supplier communication management
• Cross-supplier governance
• Compliance management (e.g., standards and policies)
• Process and technology assurance
• Business-aligned IT strategy and thought leadership
• Enterprise architecture
The office of multi-supplier governance ensures that suppliers receive standard communication, operate consistently within defined policies and standards, and perform required quality checks before delivering something to the customer. This office can also provide thought leadership and research based on external best practices. Most importantly, the office of multi-supplier governance can help overcome challenges that arise when multiple suppliers are working in the same environment.
Many organizations will use multiple suppliers that can provide similar service offerings when they acquire resources either for a project (e.g., software development) or for ongoing operations (e.g., help desk). Some of these organizations maintain a list of prequalified “preferred” vendors that the organization will turn to when it has a need to acquire new products and services. This section discusses why organizations should consider having a list of preferred suppliers and some best practices to consider when engaging them. This topic is addressed here rather than more specifically in the CMMI-ACQ model because some organizations that use CMMI-ACQ may be constrained by regulatory or other organizational requirements that prevent them from being able to engage in this practice.
Development of a list of preferred vendors makes a lot of sense for some companies because it allows the acquiring organization and the suppliers to agree to terms once for a series of acquisitions rather than having to repeat the same process individually each time. This process helps the project team go faster and can provide motivation for the suppliers to invest in longer-term relationships. These elements can help ensure a successful acquisition.
During an acquisition, a lot of time can be spent in solicitation and supplier agreement development (SSAD in CMMI-ACQ). The acquiring organization needs to determine which suppliers are capable of doing the work. If the organization has acquired a similar product or service in the past, and the supplier executed its end of the deal in an acceptable manner, then it is natural that the supplier would be added to a preferred supplier list in that area. Also, if a supplier is able to handle one type of project well, it may be able to handle other types of projects. Once those capabilities are qualified, it would be natural to place that supplier on a preferred vendor list.
When the next project comes along, if the organization has a prequalified list of preferred vendors, that project will go faster. The acquisition will also go faster if the preferred vendors have all agreed in advance to contract terms and conditions. Many contracts are held up by attorneys on either side of the acquisition due to disagreements regarding terms and conditions. If terms and conditions are already agreed on, this risk is avoided. If projects can be timed so that a preferred supplier team “rolls off” one project and starts another one right away, both parties win: Acquirers will benefit from continuity of the team, and suppliers will benefit from knowing where their resources will be allocated.
As these types of win-win scenarios are developed, both suppliers and acquirers get used to working with each other, making it easy to collaborate on innovation and engage in a more productive delivery model. Suppliers may be willing to make critical investments in infrastructure. For example, suppliers may be willing to establish test laboratories with hardware configured the same way as the customer’s so that testing occurs in a customer-like environment.
To optimize the value of using preferred suppliers, it is important that agreements with these suppliers include some protections for the acquirer. For example, the acquirer may want to name specific critical employees whom the supplier needs to keep on acquirer projects unless prior written authorization is received from the acquirer not to do so. This term of the agreement will ensure that the acquirer is not paying for new employees to learn the environment and will drive continuity across engagements.
Contract terms (or at least the terms and conditions) should focus on the long-term relationship so that the acquirer can plan for and the supplier can count on more predictability across the time frame of the contract. Finally, the acquirer should specify which tools (e.g., standard test tool) are required to work in the acquirer’s environment and that the supplier must use to provide the service. As part of the tools aspect of the contract, there should be specific language about licensing for those tools so that these costs are not included for each project.
Preferred lists of suppliers can be very beneficial to organizations that engage suppliers for similar types of projects. These lists can help the project team go faster and motivate suppliers to work with acquirers to create more effective delivery models. Agreements with suppliers on these preferred lists should contain clauses that protect the acquirer and ensure that the acquirer is receiving the benefits of having preferred suppliers.
Industry surveys on outsourcing show that, in many cases, outsourcing has not delivered the benefits IT organizations initially expected. The primary drivers for organizations to outsource these operations include a lack of in-house expertise and the ability to transfer risk to the vendor. At the same time, organizations also view cost savings, flexibility, best practices, and innovation as key supporting drivers of outsourcing IT. Nevertheless, as industry research makes clear, many of these organizations do not realize the benefits they expected from outsourcing.
In the real world, outsourcing often does not deliver on its promise. There is rarely a single reason why a contract fails; instead, a combination of reasons is typically to blame. Some of these problems can be attributed to contracting processes. Stipulating basic discipline around contract management processes plays an important role in outsourcing deals. The contract lifecycle spans various phases such as contract planning, solicitation, evaluation of supplier response, negotiation, and contract finalization.
Communication plays a vital role across the contract management lifecycle. Historically, IT organizations have focused on different models and frameworks within the software engineering and project management disciplines. As IT organizations have adopted outsourcing as their key strategy, it has become clear that similar process discipline is needed in the contract management area. Some of the major challenges faced in the outsourcing industry are addressed by CMMI-ACQ process elements, which clearly stipulate many of these activities within the contract management process.
We often hear from outsourcing organizations that the expected cost savings are not achieved, because hidden costs are associated with services that were not properly considered when the contracts were written. If proper discipline is followed in creating a solicitation package (which is then used to seek proposals and responses from potential suppliers), these ambiguities in contracts can be avoided.
Solicitation packages need to be reviewed with the appropriate stakeholders to ensure that they provide the correct scope of coverage. Involving the right people in developing and reviewing the solicitation package is critical to avoid discrepancies in contract coverage. For example, not specifically covering the decommissioning of a system in a contract for developing a new solution may lead to potential discrepancies in the contract and may trigger contract change requests and lead to additional costs. Involving the right people in developing and reviewing the solicitation package is covered in the CMMI-ACQ model in the Solicitation and Supplier Agreement Development (SSAD) process area. (CMMI-ACQ reference: The SSAD process area specifies preparation, review, and distribution of solicitation packages by involving the right stakeholders. This process area helps the acquirer avoid discrepancies in the contract.)
Outsourcing adds a level of rigidity to normal organizational activities because contracts are binding and suppliers may choose not to accommodate changes without charging the customer additional cost. This problem becomes more significant when a multi-year contract is in place, but the contract does not react to the changes that the acquirer needs to act on based on its changing user requirements. When developing contracts, there is a need to create a win-win situation for both acquirer and supplier. The SSAD process area in the CMMI-ACQ model describes how supplier agreements should elaborate on how scope changes and supplier agreement changes can be determined, communicated, and addressed. The agreement may be either a stand-alone agreement or part of a master agreement. When it is a part of a master agreement, the project agreement may take the form of an addendum, work order, or service request to the master agreement. (CMMI-ACQ reference: The SSAD process area specifies how to establish and maintain a supplier agreement.)
Some outsourcing problems are attributed to a lack of contract management governance. Problems in this area may arise because of improper communication, incorrect document versioning, or inadequate reviews by senior management during the contract lifecycle. By adopting the generic practices of the CMMI-ACQ model, important practices such as communication planning, configuration control, stakeholder and senior management reviews, and process verification can address issues in these areas early on, allowing the parties to avoid contract disputes. (CMMI-ACQ reference: The generic practices are intended to ensure that you address issues that could lead to contract disputes or problems.)
Often, contract approvals take too long to complete. If contracting plans and strategies are not clearly known and communicated by the acquisition organization, the supplier may waste resources by not planning to provide services as soon as the contracts are signed. SSAD emphasizes communicating with potential suppliers concerning the forthcoming solicitation. The acquirer outlines the plans for the solicitation, including the projected schedule for solicitation activities and expected dates for responses from suppliers, which helps suppliers plan to bring their resources on board once contracts are signed. (CMMI-ACQ reference: The SSAD process area specifies means to establish and maintain the supplier agreement.)
There are many ways to leverage the CMMI-ACQ model. It is important that the implementation of the practices identified in CMMI-ACQ matches the culture and goals of the organization. When applying CMMI-ACQ in large organizations with a wide variety of acquisitions, ranging from commercial off-the-shelf (COTS) software and custom-developed software to infrastructure and hardware, the following points should be considered:
• Develop the process and tailoring approach in a way that ensures that projects do only the right things and neither skip quality steps nor add non-value-added activities into the process.
• Wherever possible, automate process steps so that they automatically collect metrics while the project teams are doing their work.
• Provide oversight where necessary, at the appropriate level of the organization.
When organizations keep processes lean but rigorous and optimized for project teams to deliver, they are more likely to see greater returns on their investment in their adoption of CMMI-ACQ practices.
Even though organizations may pursue a wide variety of acquisitions, each acquisition project can follow a common process that serves as its foundation. Each acquisition project will need to plan the acquisition, elicit and manage customer requirements, engage suppliers through a supplier agreement, verify work products, validate supplier deliverables, and monitor the project against its plan and risk factors. Each of those steps can be either very complex or quite simple, however—its level of complexity will match the characteristics of the acquisition.
For example, if the acquirer simply wants to purchase a new computer server, those requirements are quite different from the requirements for a new handheld device that service technicians can use for requesting parts and tracking outstanding jobs. A requirements specification is needed in both situations, but the latter would include a significant number of user interface and mobility requirements that would not be considered important when purchasing the typical server. When standardizing a process (Organizational Process Definition [OPD] in CMMI-ACQ) for an organization that has such a variety of projects, it is important to implement tailoring to the work products themselves; they need to be tailorable to match what a project needs to be successful.
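One lightweight way to realize such tailoring is a table, owned by the organization’s standard process, that maps each acquisition type to the work products a project must produce; the sketch below uses assumed names and is an illustration in the spirit of OPD, not an artifact of the CMMI-ACQ model itself.

# Hypothetical tailoring table: small purchases get short-form documents,
# while richer acquisitions get full-form ones with extra requirement sections.
TAILORING = {
    "commodity server": {"requirements_spec": "short form", "ui_and_mobility_reqs": False},
    "handheld device":  {"requirements_spec": "full form",  "ui_and_mobility_reqs": True},
}

def required_work_products(acquisition_type: str) -> dict:
    # Fail loudly for acquisition types with no tailoring guidance yet.
    try:
        return TAILORING[acquisition_type]
    except KeyError:
        raise ValueError(f"no tailoring guidance defined for {acquisition_type!r}")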
Processes that have standard quality checkpoints can also have very positive effects on project performance. Throughout any acquisition project, it is important to have someone outside the project review the team’s work to ensure its alignment with standard processes and overall quality. During the assessment that takes place at these checkpoints, another opportunity for tailoring opens up.
For major investments that include significant budgets and high complexity levels, these independent reviews will likely need to occur at the highest levels of the organization. However, the head of the organization does not have to review the smaller projects with lower levels of complexity. Sometimes team members for a project must review its progress with large numbers of stakeholders to ensure that they all have visibility. Often, these same stakeholders are invited to reviews of other projects that don’t require such robust governance. Thus the reviews an organization conducts can also be tailored to fit the project’s unique situation, which helps to ensure that the project doesn’t do things that do not add value and the organization’s leaders still receive the level of visibility necessary to provide the right level of oversight.
Automating processes that are implemented based on CMMI-ACQ can significantly increase productivity and decrease project costs. For example, we worked with one company that had a robust tollgate process at the end of each project phase, in which business customers and IT management would review project status and decide whether to proceed. Because this was a global company with project team members and executives located all over the world, significant effort was required to set up tollgate meetings that fit into all of the stakeholders’ schedules, and the meetings themselves were very expensive given the seniority of the attendees.
To solve this problem, the company implemented a virtual tollgate process that allowed project teams to show their progress in an online system and allowed tollgate approvers to vote when it was convenient. Reminders were sent out to those persons who did not submit a vote, and if the project information was not clear, the tollgate approver could request a formal meeting. The default situation was to conduct all voting through the virtual tollgate, but there was always an opportunity to have the meeting if the project required more scrutiny or the approvers just needed in-person discussion. By automating this process, the company was able to reduce the cost of the meeting infrastructure and administration, and it made the process more convenient for both the project teams and approvers.
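A sketch of how such a virtual tollgate might be modeled appears below; the class, voting rules, and escalation behavior are assumptions for illustration, not the company’s actual system.

class VirtualTollgate:
    def __init__(self, project: str, approvers: list):
        self.project = project
        self.votes = {a: None for a in approvers}   # None means "has not voted yet"
        self.meeting_requested = False

    def vote(self, approver: str, approve: bool, needs_meeting: bool = False) -> None:
        if approver not in self.votes:
            raise KeyError(f"{approver} is not an approver for {self.project}")
        self.votes[approver] = approve
        # Any approver can escalate to a formal meeting if the material is unclear.
        self.meeting_requested = self.meeting_requested or needs_meeting

    def pending_reminders(self) -> list:
        # Approvers who have not yet voted receive a reminder.
        return [a for a, v in self.votes.items() if v is None]

    def decision(self) -> str:
        if self.meeting_requested:
            return "hold formal tollgate meeting"
        if self.pending_reminders():
            return "waiting for votes"
        return "proceed" if all(self.votes.values()) else "do not proceed"

The default path is asynchronous voting; any single approver can force a formal meeting, mirroring the fallback the company kept for projects that needed more scrutiny.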
Organizations that are following the CMMI-ACQ model will at some point strive to gather metrics (if they don’t consider them when they first get started). Project teams typically will not mind providing or reviewing metrics if the data are collected as a part of doing normal project work. If the project has to do additional work to gather the metrics and report them, however, team members rarely see the value in doing so and often will do whatever possible to not participate.
For example, many organizations write change requests on forms. As part of the change request analysis, the team often determines the impact to customer requirements as a result of the requested change. If the project then needs to report metrics related to requirements volatility, team members have to go back through the change request forms and determine which ones affected business requirements. If the project instead has a requirements management tool that tracks changes, oftentimes volatility data can be pulled directly from that tool.
Alternatively, if the change request is part of a change management system and data about business requirements changes can be summarized, metrics can be generated rather quickly. Neither of these potential solutions requires the project team members to do extra work to create metrics. Minimizing non-value-added work and keeping processes lean helps to ensure that projects follow processes because they will find them valuable rather than obstructive.
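As a concrete illustration, here is a minimal sketch of pulling a requirements volatility metric straight from change-request records. It assumes, hypothetically, that each record already captures whether the change affected business requirements and how many requirements it touched, which is exactly the data the analyst produces during normal change request analysis.

```python
from dataclasses import dataclass


@dataclass
class ChangeRequest:
    cr_id: str
    affects_requirements: bool  # captured during normal CR analysis
    requirements_touched: int = 0


def requirements_volatility(change_requests, baseline_count):
    """Fraction of the baselined requirements touched by change
    requests; 0.15 means 15 percent of the baseline changed."""
    touched = sum(cr.requirements_touched
                  for cr in change_requests
                  if cr.affects_requirements)
    return touched / baseline_count
```

Because the inputs are by-products of work the team already does, reporting this metric adds no extra burden to the project.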
Another key area to consider when implementing CMMI-ACQ practices is supporting projects with the right level of oversight. Much in the same way that project documents and processes can be tailored, the amount of oversight from senior management through technical leadership can be customized based on the needs of the project. For instance, the CIO of a large organization may be required to attend the project review for the largest IT project in the company, but his or her attendance is likely not necessary for review of the project that is simply upgrading some servers in a data center.
Many organizations invite people to reviews because they have been invited before or because they might know something that could help the project. Other organizations address many projects at a single meeting so that they can be reviewed sequentially. With this policy, half of the people in the meeting care about only half of the projects, yet they are required to attend the whole time. There are many opportunities to tailor oversight, just as there are many opportunities to tailor processes and work products. Each of these opportunities should be evaluated when adopting CMMI-ACQ practices.
Organizations that go through major changes like adopting CMMI-ACQ practices often experience many stops and starts. One major reason for delays in making changes is the tendency to implement processes that force projects to do things that do not add value or that create extra work. The more that organizations can improve their tailoring (from processes to work products to approvals), and the more they can automate the collection of metrics so that this task does not create an extra burden for projects, the faster they will be able to implement CMMI-ACQ practices and see the benefits of adoption. Those organizations that do not consider these opportunities may see some of the challenges that many organizations before them have encountered and, one hopes, will have the wherewithal to notice history repeating itself so that they can change course.
Acquiring organizations have been using CMMI-DEV appraisal results for years to qualify suppliers. To get the best result for the customer, both the acquirer and the supplier must be capable. Acquirers should conduct appraisals of their own organization using CMMI-ACQ to identify opportunities to reduce cost, improve quality, and increase customer satisfaction.
As mentioned earlier in this section, one of the motivating factors for commercial organizations to outsource software development and other IT functions is to reduce cost. Vendors that perform those functions as their primary business should be able to do so less expensively than a company that focuses on providing energy, building trucks, or selling clothing. However, to ensure that costs are as low as they can be, the acquirer must be effective at communicating requirements and engaging suppliers. An appraisal using the SCAMPI method with CMMI-ACQ as the source model will help organizations identify gaps in their ability to elicit requirements, communicate those requirements to suppliers, and subsequently engage them through contracts. If the requirements are vague, conflicting, or not testable, then suppliers bidding on such engagements will raise their prices to account for the work that will inevitably have to be done to clarify and potentially rewrite the acquirer-developed requirements. If the acquirer can eliminate defects in requirements by closing gaps identified in appraisals, the acquirer can expect to see its costs of acquisitions go down, as suppliers can bid more aggressively due to the clarity of acquirer intent.
Acquirers can also increase their quality by conducting appraisals and identifying and closing gaps related to verification and process and product quality assurance from an acquirer’s perspective. Suppliers develop a perception of the acquirer’s level of quality expectations from the initial engagement, which then builds through the contracting phase and into the delivery phase. If the acquirer does not enforce internal verification steps for the work products that the acquirer develops (e.g., requirements specifications, contract statements of work), then it will become apparent to the supplier that the company has a low expectation—and the supplier will likely deliver products and services geared toward that expectation.
Conversely, if the acquirer produces very clear deliverables that have been reviewed and are well understood, the supplier will develop a perception that the acquirer expects a high level of quality and will strive to match that level of quality so that it can impress the acquirer and win more business. If the supplier also knows that its own quality results may be reviewed by the acquirer at any time (which should be specified in any supplier agreement), the supplier team will have additional motivation to achieve higher quality on its end, resulting in a higher-quality product or service being delivered to the acquiring customer.
Just as suppliers develop perceptions of acquirers, so customers of acquirers can develop perceptions of their acquisition support. For example, many commercial organizations that outsource large portions of their IT function still retain a team of employees to manage the suppliers engaged by the company. These teams often gather requirements from the business and engage suppliers to develop solutions to meet these requirements.
If these internal IT teams do not know how to professionally engage their customers to elicit requirements and appropriately manage expectations through project planning, monitoring, and control, the customers may develop the perception that the acquirer teams are in the way and not able to help them. They may even question whether the internal team roles can also be outsourced to someone who can help them. SCAMPI appraisals of these internal IT teams using the CMMI-ACQ model can identify some of these potential pitfalls and help IT leadership understand where to invest in additional capability for better customer engagement. These investments can lead to higher performance by the internal IT team and result in a higher level of customer satisfaction.
Many IT organizations that outsource much of their software development and other functions have been looking for a mechanism that can assess whether they are effective at their remaining function—namely, engaging suppliers to deliver solutions. A CMMI-ACQ appraisal helps acquirers ensure that they have their side of the transaction in order, similar to how results from a CMMI-DEV appraisal can give acquirers confidence that suppliers are prepared to deliver. Acquisitions can fall apart if either side is not prepared, so acquirers need to conduct internal appraisals similar to the way they demand appraisals from their suppliers. Acquirers conducting appraisals of their own organization using CMMI-ACQ can identify many performance opportunities, including those related to cost reduction, quality improvement, and higher customer satisfaction.
Like all great concepts, the CMMI-ACQ model needs to continue to evolve to support industry needs as they change. Many of the underlying principles focused on project management, supplier engagement, and acquisition verification and validation have stood the test of time and will likely continue to do so for many years. At the same time, CMMI-ACQ does not address some things that are important to commercial organizations that acquire technical products and services—but a revised version could address those issues in the future.
Areas not yet foreseen will also likely emerge that CMMI-ACQ will need to address. This section discusses practices that industry needs and that CMMI-ACQ could address, or that organizations will have to address in other ways to meet their objectives. These practices include “pre-project” analysis and work as well as management of the organizational portfolio of projects. This section also addresses integration with other CMMI models as another opportunity to improve and operationalize the use of CMMI-ACQ.
CMMI-ACQ is focused on improving acquisition projects. The process areas in this model describe practices that projects should follow and practices that organizations should follow to help projects be successful. These practices are all valuable, and they should be implemented by commercial organizations that acquire technical products and services. However, to ensure that the project adds value to the organization, it has to be the right project in the first place.
CMMI-ACQ today does not specifically address project selection by the organization. There are many project investments an organization can make. Organizations that select the projects best aligned to their goals are more likely to achieve those goals. This selection may be based on criteria such as return on investment, risk, strategic fit, or ability to deliver based on available capacity in terms of resources and capability. Each project investment should be evaluated against standard criteria to ensure that the right projects for the organization are selected in the first place.
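One common way to operationalize such selection is a weighted scoring model. The sketch below is only an illustration; the criteria, weights, and ratings are assumptions that an organization would replace with values calibrated to its own goals.

```python
# Illustrative weights (must sum to 1.0); "risk" is rated so that a
# higher number means lower risk, keeping all criteria aligned.
WEIGHTS = {"roi": 0.4, "strategic_fit": 0.3, "risk": 0.2, "capacity_fit": 0.1}


def project_score(ratings: dict[str, float]) -> float:
    """Weighted score from criterion ratings on a 0-10 scale."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)


candidates = {
    "replace-billing-system": {"roi": 8, "strategic_fit": 9, "risk": 4, "capacity_fit": 6},
    "upgrade-data-center":    {"roi": 5, "strategic_fit": 4, "risk": 8, "capacity_fit": 9},
}

# Rank candidate investments by score, highest first.
ranked = sorted(candidates, key=lambda p: project_score(candidates[p]), reverse=True)
```

Applying the same criteria to every candidate is what makes the comparison defensible; the specific weights matter less than their consistent use.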
Every organization also has scarce resources. CMMI-ACQ captures many good specific and generic practices at the project level. If projects follow these practices, they should have adequate resources to allow them to be effectively and successfully executed. If all of the projects in the organization follow these practices, then the organization should perform at a higher level of maturity. For that to happen effectively, organizations must have a resource management system in place that accomplishes the following:
• Enables senior management to make decisions about which projects should have resources allocated to them
• Provides visibility into the resources that will be finishing their projects and potential projects where they could be used
• Identifies over- or underutilized resources for resource leveling
If resource management happens only at the project level, then the organization will not optimize its resources and may experience higher costs or lower performance. In the future, CMMI-ACQ could include practices for resource management at an organizational level in addition to the project level to help organizations in this area.
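A minimal sketch of the visibility such a system provides appears below. The assignment records and dates are hypothetical; the point is that organization-level allocation, roll-off, and utilization questions can all be answered from one shared set of records rather than from each project’s private plan.

```python
from collections import defaultdict

# Hypothetical records: (person, project, fraction of time, end date)
assignments = [
    ("alice", "crm-rollout", 0.6, "2024-09-30"),
    ("alice", "erp-upgrade", 0.6, "2024-11-15"),
    ("bob",   "crm-rollout", 0.4, "2024-09-30"),
]


def utilization(assignments):
    """Total allocated fraction per person; above 1.0 is overallocated,
    well below 1.0 is underutilized -- input for resource leveling."""
    load = defaultdict(float)
    for person, _project, fraction, _end in assignments:
        load[person] += fraction
    return dict(load)


def rolling_off(assignments, by_date):
    """People with assignments ending by a date -- candidates for
    allocation to upcoming projects."""
    return sorted({p for p, _proj, _f, end in assignments if end <= by_date})


print(utilization(assignments))               # {'alice': 1.2, 'bob': 0.4}
print(rolling_off(assignments, "2024-10-01")) # ['alice', 'bob']
```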
The CMMI Product Team has done a fantastic job in ensuring continuity across the CMMI model series. Recognizing that certain core process areas are common to development, acquisition, and services helps customers of those models align their operations using a common language. As the set of models evolves, more guidance on how customers of the models might integrate their efforts to work with one another will be necessary to ensure that when customers and developers decide to work together, a positive result ensues for both organizations.
For example, both CMMI-DEV and CMMI-ACQ include Project Planning (PP) as a process area, and most people would agree that development and acquisition organizations need to be effective in these activities. To effectively implement these process areas when the customer is using the CMMI-ACQ model and the supplier is using the CMMI-DEV model, it is important to clearly understand “who is planning what” to ensure that nothing is missed and that significant duplication of effort does not occur. This understanding requires effective communication between the two organizations, which could be more focused if more guidance is provided in CMMI.
From another perspective, when companies cross their organizational boundaries to work with other companies, they typically enter that partnership with the desire to obtain a positive financial outcome for both companies. CMMI-ACQ has excellent guidance in how to engage suppliers and manage their contractual agreements. The CMMI-DEV model has a Supplier Agreement Management (SAM) process area, but it does not have a Customer Agreement Management process area. If developers don’t have the engagement skills necessary to work with their customers, they may not deliver a positive financial return for their own company or develop a positive customer relationship—even if they have great development teams that produce quality products.
CMMI-ACQ is a very powerful tool for commercial organizations that engage suppliers to acquire technical products and services for their own use, for their customers’ use, or as components that are integrated into a larger overall solution. To realize the most value from investing in the practices referenced by the model, an organization must take into account many factors, including how to manage multiple suppliers, contracts, preferred vendors, appraisals, and tailoring for both rigor and speed.
As with many great offerings, there are numerous opportunities to continue to improve the CMMI-ACQ model. Because CMMI-ACQ is part of the CMMI model series and integrated with the CMMI Product Suite, it is the most robust publicly available solution for organizations seeking improvement in this area. CMMI-ACQ will be very useful to commercial organizations that buy technical products and services for many years to come.
Mike Phillips and Brian Gallagher
Since the release of CMMI-ACQ, various types of nongovernment organizations have found the practices contained in this constellation to be a better fit for their quality improvement efforts than the CMMI-DEV predecessor. This essay explores some of the industry examples, and then considers some future possibilities for use of this model.
From a historical perspective, the initial contributions to CMMI were made by industry representatives who noted that the contracting “arm” of their organizations often had contract experts but lacked the technical skill set for ensuring effective technical content in the contract, as well as the skills needed to oversee progress during the supplier’s development lifecycle. One organization created a group, called “procurement engineering,” to assist this effort. The industry sponsor of CMMI-ACQ, General Motors’ CIO, had a similar need for new competencies when the company divested its software development unit and created a “hub and spoke” network of software suppliers. Another essay in this chapter discusses this particular company’s current use of the model.
Another essay describes a different path to acquisition. In this case, a defense contractor had been closely involved with a government sponsor in developing critical elements of the DoD’s ballistic missile capabilities. Strategic changes have led to a need to extend the lifecycles of the existing systems rather than developing new systems. This requires a commitment to careful modernization of the capabilities, in the face of very limited resources. The government chose to award a contract to one company to oversee all of these acquisition efforts, and the essay describes some of its specific uses of CMMI-ACQ’s high maturity capabilities.
Some current industry users of the constellation must deliver whole systems to clients, but engage in little or no development of the elements to be delivered. In some cases, they may have significant integration efforts. In others, they may be better characterized as assemblers of the subsystems, with integration aspects being relatively “low risk.” An example might be a company that provides railway cars for urban transit. Many suppliers are at work producing alternative subsystems, but typically only a few companies deliver the final integrated vehicles. CMMI-ACQ provides a valuable “lens” for the oversight of the collection of suppliers that demand the most management attention.
Where might some new applications of the constellation next be seen? One opportunity that appears likely to emerge is when an organization assumes responsibility for the “enterprise,” including management of the acquisition of a large suite of services from various supplying organizations. Just as the CMMI-DEV constellation covers the development activities of organizations for both products and services, so CMMI-ACQ is designed to cover the activities necessary for the acquisition of both products and services. To date, most of the “early adopters” seem to be acquiring products. Acquiring services adds a dimension that has been challenging for many organizations on both sides of these agreements. Satisfying service agreements often brings up significant issues, as most of these deals require service level agreements that are difficult to apply successfully. Incidents associated with perceived failures to deliver the needed service levels abound. Now that we have a complementary constellation with CMMI-SVC, we believe there are new areas between acquirers and service providers that are ripe for improvement.
Modern engineering practices, new technologies, and architectural patterns may provide interesting opportunities for industry. Cost savings or efficiencies may be realized in commercial organizations that acquire components and deliver value-added products and services to the general public, those that acquire products and services for internal use, and those that acquire and integrate products for government use. Take, for example, a service-oriented architecture (SOA) pattern, which logically decouples services from the applications built on them through an enterprise infrastructure. This decoupling allows acquirers to develop acquisition strategies for each of these three parts (services, applications, infrastructure) that optimize contractual relationships and acquisition methods for each acquired part of the enterprise. For example, an acquirer may want to encourage quick reaction or agile approaches when working with application developers, and a plan-driven approach for infrastructure developers. While this approach gives the acquirer more flexibility, it also demands more sophistication, integration skill, and systems engineering skill on the part of the acquisition team.
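The decoupling itself can be shown in miniature. In the hypothetical sketch below, applications depend only on a service contract, so the service implementation (and the infrastructure it runs on) can be acquired under a separate agreement and replaced without touching the applications.

```python
from abc import ABC, abstractmethod


class GeocodingService(ABC):
    """Enterprise service contract: applications code against this
    interface, not against any particular supplier's system."""

    @abstractmethod
    def coordinates(self, address: str) -> tuple[float, float]: ...


class VendorAGeocoder(GeocodingService):
    """Delivered under one supplier agreement; swapping suppliers means
    replacing this class, not the applications that consume it."""

    def coordinates(self, address: str) -> tuple[float, float]:
        # ... call the supplier-provided service here ...
        return (0.0, 0.0)
```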
The move toward “cloud computing,” in which resources can be shared among multiple users transparently and “on demand,” provides some interesting challenges to acquirers as well. When an organization manages an enterprise that requires general-purpose computing power to run applications and needs to dynamically allocate resources from one project to another based on demand, this approach appears to provide value. The acquirer would then need to construct an acquisition strategy for the cloud resources with a clear eye toward critical quality attributes of the enterprise. For example, nonfunctional requirements such as performance, availability, reliability, security, and assurance would all need special attention and may pose challenges when acquiring cloud computing for an enterprise. Additionally, both expected growth and exploratory growth scenarios would aid in determining cloud computing requirements. As with taking advantage of an SOA pattern, cloud computing allows an acquirer to decouple infrastructure procurement from application procurement, with the potential of entering into agreements with suppliers that have particular expertise in application development or cloud computing independently.
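Quality attributes such as availability only become manageable in an agreement when they are defined and measured precisely, and the arithmetic is simple enough to sketch. The target and outage figures below are hypothetical.

```python
def availability(uptime_minutes: float, total_minutes: float) -> float:
    return uptime_minutes / total_minutes


def sla_met(measured: float, target: float = 0.999) -> bool:
    """Compare measured availability against a contracted target
    (0.999 is the familiar "three nines")."""
    return measured >= target


# A 30-day month is 43,200 minutes; 90 minutes of outage:
measured = availability(43_200 - 90, 43_200)  # ~0.99792
print(sla_met(measured))                      # False: misses 99.9%
```

Ninety minutes of outage, less than it might sound over a month, already breaks a three-nines target, which illustrates why acquirers must think through measurement windows and exclusions before entering a cloud agreement.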
As with any new engineering practice, technology, or architectural pattern, the hype is usually larger than the initial benefits. A sophisticated acquirer will carefully weigh the benefits using structured decision processes (see the Decision Analysis and Resolution process area) and pilot any new approach using a fact-based piloting approach (see the Organizational Process Focus and Organizational Performance Management process areas).
Brian Gallagher
Adapted from “Techniques for Developing an Acquisition Strategy by Profiling Software Risks,” CMU/SEI-2006-TR-002
CMMI-ACQ’s Project Planning process area asks a program to establish and maintain its own acquisition strategy. There are also references to the acquisition strategy called out in many other CMMI-ACQ process areas.
What is an acquisition strategy and how does it relate to other planning documents discussed in CMMI-ACQ?
The Defense Acquisition University (DAU) defines acquisition planning as follows:
Acquisition planning is the process by which the efforts of all personnel responsible for an acquisition are coordinated and integrated through a comprehensive plan for fulfilling the agency need in a timely manner and at a reasonable cost. It is performed throughout the lifecycle and includes developing an overall acquisition strategy for managing the acquisition and a written acquisition plan (AP).
Acquisition planning is the act of defining and maintaining an overall approach for a program. Acquisition planning guides all elements of program execution to transform the mission need into a fielded system that is fully supported and delivers the desired capability. The goal of acquisition planning is to provide a road map that is followed to maximize the chances of successfully fielding a system that meets users’ needs within cost and schedule constraints. Acquisition planning is an iterative process; feedback loops impact future acquisition planning activities.
An acquisition strategy, when formulated carefully, is a means of addressing program risks through program structure. The DAU defines acquisition strategy as follows:
An acquisition strategy is a business and technical management approach designed to achieve program objectives within the resource constraints imposed. It is the framework for planning, directing, contracting for, and managing a program. It provides a master schedule for research, development, test, production, fielding, modification, postproduction management, and other activities essential for program success. The acquisition strategy is the basis for formulating functional plans and strategies (e.g., test and evaluation master plan [TEMP], acquisition plan [AP], competition, systems engineering, etc.).
The Defense Acquisition Guidebook [DoD 2010] describes an acquisition strategy as “a comprehensive, integrated plan that identifies the acquisition approach and describes the business, technical, and support strategies that management will follow to manage program risks and meet program objectives. The Acquisition Strategy should define the relationship between the acquisition phases and work efforts, and key program events such as decision points, reviews, contract awards, test activities, production lot/delivery quantities, and operational deployment objectives.”
The guidebook goes on to state that the acquisition strategy “evolves over the phases and should continuously reflect the current status and desired end point of the phase and the overall program.”
A program’s best acquisition strategy directly addresses its highest-priority risks. High-priority risks can be technical if no one has yet built a component that meets some critical aspect of the system, or if mature components have never been combined in the way required. Risks can be programmatic if the system must be designed to accommodate predefined cost or schedule constraints. Risks can be mission related when the characteristics of a system that meets the need cannot be fully articulated and agreed on by stakeholders. Each program faces a unique set of risks, so the corresponding acquisition strategy must be unique to address them.
The risks a program faces evolve through the life of the program. The acquisition strategy and the plans developed based on that strategy must also evolve. Figure 6.12 shows the iterative nature of acquisition strategy development and planning.
The process usually starts with an articulation of user needs requiring a materiel solution and identification of alternatives to satisfy those needs (e.g., an analysis of alternatives [AoA]). A technology development strategy is then developed to mature the technologies required for the selected alternative. Next, the acquisition strategy is developed to guide the program. As the strategy is executed, refinements are made. In the DoD, the acquisition strategy is required and formally reviewed at major program milestones (e.g., Milestone B [MS B]).
The terms acquisition planning, acquisition strategy, and acquisition plan are frequently used interchangeably, which causes confusion. A program’s acquisition strategy is different from its acquisition plan, but both are artifacts of acquisition planning.
All strategies can be plans, but not all plans can be strategies. In the context of acquisition, strategies are high-level decisions that direct the development of more detailed plans, which guide the execution of a program. Careful planning of what is to be done, who will do it, and when it will be done is required.
Developing an all-encompassing acquisition strategy for a program is a daunting activity. As with many complex endeavors, often the best way to begin is to break the complex activity down into simpler, more manageable tasks. When developing an acquisition strategy, a program manager’s first task is to define the elements of that strategy. When defining strategy elements, it is useful to ask the question, “Which acquisition choices must I make in structuring this program?”
Inevitably, asking this question leads to more detailed questions such as the following:
• Which acquisition approach should I use?
• Which type of solicitation will work best?
• How will I monitor my contractor’s activities?
The result of these choices defines the acquisition strategy, as summarized in Table 6.1. Identifying strategy elements is the first step in this process.
One of the most important strategy considerations is defining the acquisition approach, which in turn dictates the approach the program will use to achieve full capability. Typically, this approach is either evolutionary or a single step, as illustrated in Figure 6.13. (The evolutionary approach is the DoD’s preferred approach.)
For readers who are experienced with development approaches, the decision to use a single-step or evolutionary approach seems natural and is driven by program risk. When acquiring a capability that has been developed many times before and for which there are known solutions, a single-step approach may be warranted.
However, most systems are more complex and require an evolutionary approach. The most important aspect to understand is the difference between the acquisition approach and the development approach used by the supplier. You can use a single-step acquisition approach even if the developer is using an incremental development approach.
At times, a different approach may be selected for different parts of the program. Figure 6.14 depicts the acquirer using a single-step approach while the developer is using an incremental approach. Figure 6.15 shows the acquirer using an evolutionary approach and the developer using a mix of single-step, incremental, and evolutionary (spiral) approaches.
The acquisition approach is just one of many decisions an acquirer makes when developing the acquisition strategy. Most choices are made based on a careful examination of risk. Each project is unique, so each acquisition strategy is unique. In the end, the acquisition strategy becomes one of the most important guiding artifacts for all stakeholders.
The decisions made and documented in the acquisition strategy help to mitigate the highest risks to the program and provide a road map that all stakeholders can understand, share, and analyze to help the project succeed.
Mike Phillips
When creating the CMMI-ACQ model, we wrestled with finding ways to properly discuss the myriad agreements that are needed for an acquisition program to be successful. In this essay, I offer some thoughts related to the multiple types of agreements possible in acquisitions based on my experiences in government acquisition over the past 30 years.
Central to a successful acquisition process is the key contractual agreement between the acquirer and the supplier. To describe the agreements for this relationship, we built on the excellent foundation provided by the Supplier Agreement Management (SAM) process area from CMMI-DEV and legacy documents such as the Software Acquisition CMM published in the 1990s [SEI 2002, SEI 2006a]. However, the product team knew that other agreements must be carefully developed and managed over the life of many acquisitions, and that the model had to cover these types of agreements as well. Figure 6.15 is a reminder of the need for multiple agreements given the complexity of many of today’s acquisitions.
In many government organizations, the acquisition discipline is established and maintained in specific organizations that are chartered to handle acquisition tasks for a variety of using organizations. Often the acquirers do not simply provide products to customers that the supplier delivers “as is,” but instead integrate delivered products with other systems or add services such as training or upgrades, all of which may come from other agencies, departments, or organizations not typically thought of as suppliers. These other acquirer activities increase as product complexity increases.
Because of this known complexity, the product team knew that the definition of supplier agreement had to be broad and cover more than just a contract. So, the product team settled on the following definition: “A documented agreement between the acquirer and supplier (e.g., contract, license, or memorandum of agreement).”
The focus of the Acquisition Requirements Development process area is to ensure that customer needs are translated into customer requirements and that those requirements are used to develop contractual requirements. Although the product team agreed to call these “contractual” requirements, they knew the application of these requirements had to be broader.
Thus, contractual requirements are defined as follows: “The result of the analysis and refinement of customer requirements into a set of requirements suitable to be included in one or more solicitation packages, formal contracts, or supplier agreements between the acquirer and other appropriate organizations.”
In another essay, we mention how, in many of our acquisitions, understanding the performance expected in delivering services is important. The service level agreements that are a vital element of the larger contractual agreement have often led to dissatisfaction on the part of both the buyer and the supplier. The disciplined application of the practices contained in ARD can assist in improving the understanding of expectations on both sides of the acquisition team.
Practices in the Solicitation and Supplier Agreement Development and Agreement Management process areas are particularly helpful as the need for other types of documented agreements increases with system complexity. An example of these kinds of agreements is the program-manager-to-program-manager agreement type that the U.S. Navy uses to help integrate combat systems and C4I in surface ships during new construction and modernization.
A recent—and noteworthy—example emerged as we prepared the second edition of this book. As I participated in a workshop with the Navy’s Program Executive Office for C4I in San Diego, the commander was somewhat distracted by the challenge he was facing. Haiti had just suffered a massive earthquake, and a Navy task force was to be quickly assembled to provide aid. This situation required rapid, and new, configuration of the various systems that needed to be integrated for a specific task force to meet the specific needs of the deployment.
While the amount of development required for this “system of systems” was modest, the challenge of ensuring adequate integration of the systems chosen and installed on the ships to be deployed was exceptional. The professional discipline shown by the PEO C4I workforce and their SSC engineering integration teams led to an effective and rapid response to the Haiti catastrophe. Events like these remind us of the value of a commitment to quality so that we can meet such challenges successfully.
Another helpful process area to use when integrating a large product is Acquisition Technical Management. Specific goal 2 is particularly useful because it focuses on the critical interfaces that likely have become the responsibility of the acquisition organization. Notes in this goal state: “The supplier is responsible for managing the interfaces of the product or service it is developing. However, the acquirer identifies those interfaces, particularly external interfaces, that it will manage as well.”
This statement recognizes that the acquirer’s role often extends beyond simply ensuring a good product from a supplier. In the introductory notes of this process area, the acquirer is encouraged to use more of CMMI-DEV’s Engineering process areas if “the acquirer assumes the role of overall systems engineer, architect, or integrator for the product.”
Clearly many acquisitions are relatively simple and direct, ensuring delivery of a new capability to the customer by establishing contracts with selected suppliers. However, the model provides an excellent starting point for the types of complex arrangements that often are seen in government today.
Mike Phillips
Having been a military test pilot and director of tests for the B-2 Spirit Stealth Bomber acquisition program for the U.S. Air Force (USAF), I am particularly interested in the interactions between the testing aspects contained in the two companion models of CMMI-DEV and CMMI-ACQ. This essay provides some perspectives that will be helpful for those of you who need to work on verification issues across the acquirer–supplier boundary. With this release, we can also reflect on the relationship between CMMI-ACQ and CMMI-SVC, because the acquirer also has to consider some of the unique challenges when the supplier is providing services.
When we created the process areas for the Acquisition category, the product team knew we needed to expand the limited supplier agreement coverage in SAM (in the CMMI-DEV model) to address the business aspects critical to acquisition. We also needed to cover the technical aspects that were “close cousins” to the process areas in the Engineering process area category in CMMI-DEV. We tried to maximize the commonality between the two models so that both collections of process areas would be tightly related in the project lifecycle. For a significant part of the lifecycle, the two models represent two “lenses” observing the same total effort. Certainly testing seemed to be a strongly correlated element in both models.
As you read through the CMMI-ACQ process areas of Acquisition Verification and Acquisition Validation, notice that the wording is either identical or quite similar to the wording in the CMMI-DEV process areas of Verification and Validation. This commonality was intentional—that is, it was meant to maintain bridges between the acquirer and supplier teams, which often must work together effectively.
Throughout my military career, I often worked on teams that crossed the boundaries of government and industry to conserve resources and employ the expertise needed for complex test events such as flight testing. Combined test teams were commonly used. Often these teams conducted flight test missions in which both verification and validation tasks were conducted. The government, in these cases, helped the contractor test the specification and provided critical insight into the usability of the system.
Each model, however, must maintain a distinct purpose from its perspective (whether supplier or acquirer) while maximizing desirable synergies. Consider Validation and Acquisition Validation first. There is little real distinction between these two process areas because the purpose of both is the same: to satisfy the ultimate customer with products and services that provide the needed utility in the end-use environment. The value proposition is not different: Dissatisfied customers are often as upset with the acquisition organization as with the supplier of the product that has not satisfied the users. Thus the difference between Validation and Acquisition Validation is quite small.
With Verification and Acquisition Verification, however, the commonality is strong in one sense—both emphasize assurance that work products are properly verified by a variety of techniques—but each process area focuses on the work products produced and controlled in its own domain. Thus, whereas Verification focuses on the verification of supplier work products, Acquisition Verification focuses on the verification of acquirer work products.
The CMMI-ACQ product team saw the value of focusing on acquirer work products such as solicitation packages, supplier agreements and plans, requirements documents, and design constraints developed by the acquirer. These work products require thoughtful verification using powerful tools such as peer reviews to remove defects. Given that defective requirements are among the most expensive to find and address because they often are discovered late in development, improvements in verifying requirements have great value.
Verifying the work products developed by the supplier is covered in Verification and is the responsibility of the supplier, who is presumably using CMMI-DEV or CMMI-SVC. The acquirer may assist in verification activities because such assistance may be mutually beneficial to both the acquirer and the supplier. Nevertheless, the verification of the product or service developed by the supplier is ultimately performed by the supplier.
The acquirer’s interests during supplier verification activities are captured in two other process areas. During much of development, supplier deliverables are used as a way to gauge development progress. Monitoring that progress and reviewing the supplier’s technical solutions and verification results are covered by Acquisition Technical Management.
All of the process areas discussed thus far (Verification, Validation, Acquisition Verification, Acquisition Validation, Acquisition Technical Management, and Agreement Management) have elements of the stuff we call “testing.” Testing is, in fact, a subset of the wider concept often known as “verification.” I’ll use my B-2 experience as an example.
Early in B-2 development, the B-2 System Program Office (SPO) prepared a test and evaluation master plan (TEMP). This plan is required in DoD acquisition and demands careful attention to the various methods that the government requires for the contractor’s verification environment. The B-2 TEMP outlined the major testing activities necessary to deliver a B-2 to the using command in the USAF. This plan delineated a collection of classic developmental test activities called “developmental test and evaluation.” From a model perspective, these activities would be part of the supplier’s Verification activities during development. Fortunately, we had the supplier’s system test plan, an initial deliverable document to the government, to help us create the developmental test and evaluation portions of the TEMP. Receiving the right supplier deliverables to review and use is part of what the Solicitation and Supplier Agreement Development process area is about.
The TEMP also addressed operational test and evaluation activities, which included operational pilots and maintenance personnel to ensure operational utility. These activities map to both Validation and Acquisition Validation. From a model perspective, many of the operational elements included in the TEMP were the result of mission and basing scenarios codeveloped with the customer. These activities map to practices in Acquisition Requirements Development.
The TEMP also described a large number of ancillary test-related activities such as wind tunnel testing and radar reflectivity testing. These earlier activities are best mapped to the practices of Acquisition Technical Management. They are the technical activities the acquisition organization uses to analyze the suppliers’ candidate solution and review the suppliers’ technical progress. As noted in Acquisition Technical Management, when all of the technical aspects have been analyzed and reviewed to the satisfaction of the acquirer, the system is ready for acceptance testing. In the B-2 example, a rigorous physical configuration audit and functional configuration audit were conducted to account for all of the essential elements of the system. The testing and audits ensured that a fully capable aircraft was delivered to the using command. The physical configuration audit activities are part of the “Conduct Technical Reviews” specific practice in Acquisition Technical Management.
Much of the emphasis in CMMI-ACQ reminds acquirers of their responsibilities for the key work products and processes under their control. Many of us have observed the value of conducting peer reviews on critical requirements documents, from the initial delivery that becomes part of a Request for Proposal (RFP) to the oft-needed engineering change proposals. My history with planning the testing efforts for the B-2 leads me to point out a significant use of Acquisition Verification associated with the effective preparation of the TEMP. The test facilities typically were provided by the government, often including a mixture of facilities operated by the DoD and other government agencies such as NASA and what is now the Department of Energy.
Many people have stated that “an untestable requirement is problematic.” Therefore, to complete a viable TEMP, the acquisition team must determine the criteria for requirements satisfaction and the verification environment that is needed to enable requirements satisfaction to be judged accurately. Here are a few of the questions that I believe complement Acquisition Verification best practices to help improve the quality of the government work product called the TEMP, or other work products like it in industry:
• What confidence do we have that a specific verification environment—whether it is a government site, an independent site, or the contractor site—can accurately measure the performance we are demanding?
• Are results obtained in a constrained test environment scalable to represent specified performance requirements?
• Can limited experience be extrapolated to represent long-term needs for reliability and maintainability?
• When might it be more cost-effective to accept data gathered at the suppliers’ facilities?
• When might it be essential to use an independent (government or buyer) facility to ensure confidence in test results?
These and similar questions demonstrate the need for early attention to Acquisition Verification practices, particularly those under the first Acquisition Verification specific goal, “Prepare for Verification.” These practices, of course, then support specific goal 3 activities that verify acquisition work products. These work products, in turn, often provide or augment the verification environment in which the system being acquired demonstrates its performance. These questions also help to differentiate the need to ensure a viable verification environment for the acquiring organization from the more familiar need to support the suppliers’ verification of work products.
Now that we have CMMI-SVC to more fully describe the services environment, it makes sense to extend this thinking into that specific domain. In some cases, the extension is easy: The service is developed and delivered much like a tangible work product. More challenging is the situation in which the service to be provided will span a significant period of time, often with multiple service elements being included within the agreement.
Acquisition Verification can assist in the preparation of an effective service level agreement to be part of the contract. Often this work product needs to be a joint effort between the buyer and the supplier of the services. The questions listed previously can guide efforts to ensure satisfaction of the service level agreement. Such collaboration aids both the acquirer and the supplier of the services by ensuring that expectations can be described, measured, and satisfied to the benefit of both parties to the agreement. Based on my own experience, this is an area with major opportunities for research and improvement.
Brian Gallagher
CMMI-ACQ includes two practices—one each in Project Planning and Project Monitoring and Control—related to transitioning products and services into operational use. As shown in Figure 6.16, one of the primary responsibilities of the acquirer, in addition to the acceptance of products and services from suppliers, is to ensure a successful transition of capability into operational use, including logistical considerations initially as well as throughout the life of the product or service.
Planning and monitoring the transition to operations and support activities involves the processes used to transition new or evolved products and services into operational use, as well as their transition to maintenance or support organizations. Many projects fail during later lifecycle phases because the operational user is ill prepared to accept and integrate the capability into day-to-day operations. Failure also stems from the inability to support the capability delivered due to inadequate initial sparing or the inability to evolve the capability as the operational mission changes.
Maintenance and support may not remain the responsibility of the original supplier. For example, sometimes an acquisition organization decides to maintain the new capability in-house. In other situations, economic or other reasons may make it sensible to look for other potential providers of support activities. In these cases, the acquisition project must ensure that it has access to everything required to sustain the capability, including all design documentation as well as development environments, test equipment, simulators, and models.
The acquisition project is responsible for ensuring that acquired products not only meet their specified requirements (see the Acquisition Technical Management process area) and can be used in the intended environment (see the Acquisition Validation process area), but also can be transitioned into operational use to achieve the users’ desired operational capabilities and can be maintained and sustained over their intended lifecycles.
The acquisition project is responsible for ensuring that reasonable planning for transition into operations is conducted (see Project Planning specific practice 2.7, “Plan for Transition to Operations and Support”), clear transition criteria exist and are agreed to by relevant stakeholders, and planning is completed for product maintenance and support of products after they become operational. These plans should include reasonable accommodation for known and potential evolution of products and their eventual removal from operational use.
This transition planning should be conducted early in the acquisition lifecycle to ensure that the acquisition strategy and other planning documents reflect transition and support decisions. In addition, contractual documentation must reflect these decisions to ensure that operational personnel are equipped and prepared for the transition and that the product or service is supportable after delivery. The project also must monitor its transition activities (see Project Monitoring and Control specific practice 1.8, “Monitor Transition to Operations and Support”) to ensure that operational users are prepared for the transition and that the capability is sustainable once it is delivered.
Adequate planning and monitoring of transition activities is critical to success when delivering value to customers. The acquirer plays an important role in these activities by setting the direction, planning for implementation, and ensuring that the acquirer and supplier implement transition activities.
Craig Meyers
The term interoperability has long been used in an operational context. For example, it relates to the ability of machines to “plug and play.” However, there is no reason to limit interoperability to an operational context.
Recent work at the SEI has broadened the concept of interoperability to apply in other contexts. One result of this work is the Systems of Systems Interoperability (SOSI) model [Morris 2004], a diagram of which appears in Figure 6.17. This model is designed for application to a systems-of-systems context that includes more and broader interactions than one typically finds in a program- and system-centric context.
Taking a vertical perspective of Figure 6.17, we introduce activities related to program management, system construction, and system operation. What is important to recognize is the need for interaction among the various acquisition functions. It is this recognition that warrants a broader perspective of the concept of interoperability. Figure 6.17 shows three different aspects of interoperability. Although we recognize that operational interoperability is the traditional use of the term, we also introduce the concepts of programmatic interoperability, constructive interoperability, and interoperable acquisition [Meyers 2008].
Programmatic interoperability is interoperability with regard to the functions of program management, regardless of the organization that performs those functions.
It may be natural to interpret programmatic interoperability as occurring only between program offices, but such an interpretation is insufficient. Multiple professionals collaborate with respect to program management. In addition to the program manager, relevant participants may include staff members from a program executive office, a service acquisition executive, independent cost analysts, and so on. Each of these professionals performs a necessary function related to the management of a program. In fact, contractor representation may also be part of this context. The report “Exploring Programmatic Interoperability: Army Future Force Workshop” [Smith 2005] provides an example of the application of the principles of programmatic interoperability.
The concept of constructive interoperability may be defined in a similar manner as programmatic interoperability. Constructive interoperability is interoperability with regard to the functions of system construction, regardless of which organization actually performs those functions.
Again, it may be natural to think of constructive interoperability as something that occurs between contractors, but that notion would limit its scope. For example, other organizations that may participate in the construction of a system might include a subcontractor, a supplier of commercial off-the-shelf (COTS) products, independent consultants, an independent verification and validation organization, and a testing organization. In addition, a program management representative may participate in the function of system construction. Each of these organizations performs a valuable function in the construction of a system.
An example of programmatic interoperability is the case in which a program is developing a software component that will be used by another program. For this transaction to occur, the program offices must interact on tasks such as schedule synchronization. Other interactions may be required, such as joint risk management. In addition, as noted earlier, other organizations will play a role. For example, the development of an acquisition strategy may require interaction with the organizations responsible for its approval.
This example may be applied to the context of constructive interoperability as well. For one program office to provide a product to another, a corresponding agreement must exist among the relevant contractors. Such an agreement is often expressed in an associate contractor agreement (such an agreement would be called a supplier agreement in CMMI-ACQ). It is also possible that the separate contractors would seek out suppliers in a collective manner to maximize the chance of achieving an interoperable product.
The preceding example, although focusing on the aspects of interoperability, does not address the integration of those aspects. This topic enlarges the scope of discussion and leads us to define interoperable acquisition.
Interoperable acquisition is the collection of interactions that take place among the functions necessary to acquire, develop, and maintain systems to provide interoperable capabilities.
In essence, interoperable acquisition is the integration of the various aspects of interoperability. For instance, in regard to the previous example, although two program offices may agree on the need for the construction of a system, such construction will require interactions among the program offices and their respective contractors, as well as among the contractors themselves.
There are clearly implications for acquisition in the context of a system of systems and CMMI-ACQ. For example, CMMI-ACQ includes the Solicitation and Supplier Agreement Development and Agreement Management process areas, which relate to agreement management. As the previous discussion illustrates, the nature of agreements—their formation and execution—is important in the context of a system of systems and interoperable acquisition. These agreements may be developed to serve agreement management, or to aid in the construction of systems that must participate in a larger context.
Other areas of relevance deal with requirements management and schedule management, as discussed in the reports “Requirements Management in a System of Systems Context: A Workshop” [Meyers 2006a] and “Schedule Considerations for Interoperable Acquisition” [Meyers 2006c], respectively. Another related topic is that of risk management considerations for interoperable acquisition. This topic is covered in the report “Risk Management Considerations for Interoperable Acquisition” [Meyers 2006b]. Some common threads exist among these topics, including the following:
• Developing and adhering to agreements that manage the necessary interactions
• Defining information, including its syntax, but also (and more importantly) its semantics (illustrated in the sketch following this list)
• Providing mechanisms for sharing information
• Engaging in collaborative behavior necessary to meet the needs of the operational community
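To make the second thread concrete, the following hypothetical sketch shows why agreeing on syntax alone is not enough: two programs can exchange a structurally identical message and still disagree about what a field means unless the semantics (here, the unit of measure) are part of the agreement.

```python
from dataclasses import dataclass
from enum import Enum


class DistanceUnit(Enum):
    METERS = "m"
    FEET = "ft"


@dataclass
class TrackReport:
    """A shared message type: the syntax (field names and types) is the
    easy part; the unit of 'altitude' is the semantics that makes the
    exchange safe."""
    track_id: str
    altitude: float
    unit: DistanceUnit


def altitude_in_meters(report: TrackReport) -> float:
    """Normalize on receipt so downstream logic sees one meaning."""
    if report.unit is DistanceUnit.METERS:
        return report.altitude
    return report.altitude * 0.3048  # feet to meters
```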
Each of these topics must be considered in the application of CMMI-ACQ and CMMI-DEV to a systems-of-systems context, perhaps with necessary extensions, to meet the goals of interoperable acquisition. Therefore, the application of maturity models to acquisition in a systems-of-systems context is quite relevant!
Brian Gallagher
Who can argue that being agile is a bad thing? The implication of not embracing agility is that one appreciates or values bureaucracy and rejects the ability to react quickly to changing environments. Operational parts of organizations, from military fighting units to customer-facing service providers, demand systems that provide operational agility. Developers prefer the increased control that small teams enjoy when implementing agile development methods. Acquisition professionals want to act with speed, agility, and purpose and ensure the systems they acquire meet the evolving needs of their customer while allowing developers the ability to meet technical challenges in new and innovative ways.
Given these desires, the acquisition, development, and operational communities collectively nod their heads in agreement that we need to be more agile. Unfortunately, the interpretation of what agile really means becomes clouded depending on an organization’s mission. If the mission is to acquire operationally relevant systems, develop world-class solutions for customers, or execute a set of operational tasks, the thoughts on what agility means will be unique and probably not commonly shared with other enterprise partners.
Figure 6.18 indicates how acquirers, developers, and operational units can all act with agility, and how together, if properly motivated, they can be part of an agile enterprise that ensures each part of the enterprise works in harmony to achieve greater objectives.
The first step in evolving into an agile enterprise is to clearly understand what is needed as an output from the enterprise and then make informed decisions about how much agility, coupled with discipline, is needed to achieve those objectives. For example, if an end user needs a system with the operational characteristic of agility, more time and effort might be required to analyze the problem space, negotiate the solution trade space, and design a system with agility as an operational characteristic.
It doesn’t naturally follow that the developer needs to employ formal Agile development methods to deliver an operationally agile system. Consider the following quote by Blaise Pascal: “I have made this [letter] longer than usual, only because I have not had the time to make it shorter.”7 In other words, sometimes it takes longer to deliver a product with certain important characteristics—like a letter with the attributes of being shorter in length and easier to read. Conversely, while the use of Agile development methods may not provide the final solution more quickly than traditional methods, there is a higher probability that the system developed iteratively, with lots of end-user interaction and early-and-often deliveries of working versions, will be more operationally acceptable than a system developed using traditional methods.8
The clear conclusion is that the acquirer plays an important role in meeting operational agility needs, in the success of its developers’ use of Agile development methods, in the way it uses agile concepts to improve its internal acquisition responsibilities, and in ensuring that all of its partners can participate in and benefit from operating within an agile enterprise construct.
Most end users of systems and services, whether they are competing in a private commercial environment or a government or military environment, are facing more challenges today than ever before. Competition in the private sector drives organizations to move more quickly to market, to adjust to unforeseen events, and to anticipate market needs. Government end users, whether members of an operational military unit, an agency providing social services, or even an intelligence agency helping to defeat irregular aggressors, are facing similar challenges. They can no longer afford stodgy systems that all too soon become brittle.
When end users discuss “agility,” they are usually describing their ability to use the delivered product or service in new and evolving ways without large cost or time impacts. They want the flexibility to quickly reconfigure their systems and services themselves to meet new challenges. What they are describing are quality attributes that are important to meeting their business or mission needs. In the CMMI-ACQ process areas of Acquisition Requirements Development (ARD), Solicitation and Supplier Agreement Development (SSAD), and Acquisition Technical Management (ATM), the acquirer helps elicit the real needs of the end user, translates those needs into contractual requirements that can be fulfilled by developers, and incrementally evaluates progress toward meeting those needs.
Usually, acquirers are very good at capturing functional requirements. However, “agility” needs are not typically described in a functional sense. Instead, their impact drives the acceptability of the system and most certainly will drive the architectural decisions made by both the acquirer and the developer as they analyze requirements and possible solution alternatives. One way to capture the “agility” needs of the end user is through the use of a Quality Attribute Workshop (QAW).9 QAWs provide a structured method for identifying a system’s architecture-critical quality attributes—such as agility, availability, performance, security, interoperability, and modifiability—that are derived from mission or business goals.
These attributes are prioritized and further refined with scenarios that make abstract concepts such as “agility” much more concrete. For example, the end user at a satellite ground station might want an agile system. Through the use of an approach like the QAW, that desire can be explicitly described in the form of an operationally relevant stimulus, expected response, and response measure. When probed, the abstract agile need first articulated by a satellite ground operator might turn out to be the expectation that when the mission evolves (stimulus), the system will allow insertion of third-party applications or services with no impact to current operations (response) within one month from the identification of the mission change (response measure).
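To make the scenario form concrete, the following is a minimal sketch of how such a scenario might be recorded. The class and field names are illustrative (they simply mirror the stimulus/response/response-measure structure described above); they are not an official QAW schema.

```python
from dataclasses import dataclass

@dataclass
class QualityAttributeScenario:
    attribute: str         # quality attribute, e.g., "agility" or "availability"
    stimulus: str          # the operationally relevant event
    response: str          # the behavior the system is expected to exhibit
    response_measure: str  # how achievement of the response is judged
    priority: int = 0      # assigned during QAW prioritization

# The satellite ground station example from the text:
scenario = QualityAttributeScenario(
    attribute="agility",
    stimulus="the mission evolves",
    response="third-party applications or services can be inserted "
             "with no impact to current operations",
    response_measure="within one month of identifying the mission change",
    priority=1,
)
print(scenario.attribute, "->", scenario.response_measure)
```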
Scenarios like this example can be captured, prioritized, and provided, along with the set of functional requirements, to developers during a competitive source selection activity (involving both ARD and SSAD process areas). The developers can then make key architectural decisions and propose solutions to meet those needs. Another possible approach is to include operational representatives in the program’s risk management process to ensure risks to unique end-state agility needs are considered and mitigated proactively.
As previously mentioned, the use of Agile development methods, especially for software-intensive systems, provides a higher probability of delivering operationally relevant capabilities. The move toward Agile methods has its roots in the writing of The Agile Manifesto (see Figure 6.19). A group of well-respected leaders in the software engineering community, frustrated over the current state of software development, boldly penned the manifesto, which in turn inspired dramatic changes in some approaches to software development, and later system development.
Upon reading the Manifesto, it quickly becomes apparent that adoption of Agile methods by developers requires not only changes in the way developers behave, but also changes in the way acquirers interact with developers to enable their success. In the past, some acquirers have made the mistake of requiring their developers to use Agile methods without changing their own interaction with the development team.
In one program, the direction provided was for the developers to “Go Agile!” It became the battle cry, and the developers enthusiastically accepted the challenge. It quickly became evident that the cry should have been “Let’s go Agile!” (including the acquirers as well as the developers): the development team, which expected “shoulder to shoulder” reviews and a strong acquirer presence during daily planning activities, was instead faced with an insistence on an arm’s-length relationship and high-ceremony reviews following a standard “waterfall” lifecycle. Table 6.2 provides some guidance on how the acquirer can evolve its oversight role to increase the probability of success of the development team.
The use of agile approaches and methods within the development community is fairly mature. It is no longer the case that only a brave developer tries an agile approach. Instead, use of agile methods is relatively routine, and agile method failures are becoming hard to find. Even so, the acquisition community is still trying to understand what it means to operate an acquisition program or organization in an agile way. The key is to embrace the principles of agility and interpret them for the unique aspects of the acquisition environment, rather than simply adopting the precise practices of agile development.
Table 6.3 lists the principles of agile system development and suggests how these principles might be interpreted for the acquirer.11
With a clear understanding of the roles of the acquirer, the developer, and the operational end users and their unique needs for agility, the enterprise can optimize delivery of operationally significant value while meeting critical program timelines. The success of any member of the enterprise depends on the success and cooperation of every other member. The acquisition organization plays a critical role in ensuring that delivered systems meet the agility needs of the operational end user and in enabling its development partners to successfully employ Agile development methods, increasing the probability of meeting end-user needs. By interpreting the principles of Agile system development to meet its own needs, an acquisition project can improve its internal focus, operation, and ability to satisfy stakeholder needs.
Mary Ann Lapham
This essay is adapted from the SEI report “Considerations for Using Agile in DoD Acquisition,” which I coauthored with Ray Williams, Bud Hammons, Dan Burton, and Fred Schenker.12
Before discussing the use of Agile in DoD, I must provide some context and an assumption we used when we wrote the report. First, within the Agile community of practice, the term “Agile” with a capital “A” is used to represent the entirety of agile concepts and methods used in that community. We have adopted this usage in this essay. Second, we define Agile as follows:
An iterative and incremental (evolutionary) approach to software development which is performed in a highly collaborative manner by self-organizing teams within an effective governance framework with “just enough” ceremony that produces high quality software in a cost effective and timely manner which meets the changing needs of its stakeholders.13
We assumed suppliers are either software providers or software-intensive systems providers. Our initial research was designed to determine whether the use of Agile is beneficial to the DoD and, if so, to identify the barriers that exist in the acquisition environment and how these barriers might be addressed. Along the way, we found a school of thought that claimed users could and should embrace both the CMMI model and Agile methods [Glazer 2008]. It is not a stretch to conclude that they should embrace the use of CMMI-ACQ as well. However, some accommodation of the concepts used within Agile will most likely be required to make full use of CMMI-ACQ. With all this in mind, let’s explore Agile use within the DoD.
The Agile philosophy has existed for many years and, in fact, is based on concepts that have been around for decades. This philosophy achieved its greatest success in small to mid-sized commercial applications. There has been limited documented use of these concepts in the DoD or larger government arena.
In recent years, Agile has matured, personnel have become more skilled in applying Agile concepts, and some DoD contractors have started to build internal Agile capabilities and use Agile on DoD programs. Some DoD acquisition programs have proposed and used Agile processes, attempting to benefit from contractor capabilities, but without (as yet) any formal DoD guidance, templates, or best practices.
Given this backdrop, can an Agile approach yield a better product developed within cost and schedule parameters? If barriers interfere with the ability of the DoD to adopt Agile, how can they be addressed?
Our interviews and research indicate that Agile can benefit the DoD. Agile is another tool that can provide both tactical and strategic benefits. The tactical benefits (i.e., lower cost, adherence to schedule, higher quality) are important; however, the strategic benefits (i.e., responsiveness, the ability to rapidly adjust to the current situation) might be of even greater value. These benefits can be a huge factor in today’s world, where the DoD needs to get results faster and be better aligned with changing needs.
In fact, the literature14 available about Agile use cites impressive results. Even if actual experience provides savings for DoD programs on the low end of the spectrum described in this publication, the savings can be significant. We also found that the DoD 5000 series regulations do not prohibit the use of Agile. In fact, the IEEE15 is currently working on a standard for Agile development. To date, the standard is unpublished, but the fact that the IEEE has deemed it worthy of a standard indicates a step in the direction of obtaining formal guidance for Agile use.
Our research revealed that in the current, traditional “waterfall” method commonly employed within the DoD, there is an established practice that uses some form of controlled experimentation. Current waterfall practices create experimental code or prototypes that are later thrown away. Agile methods involve building software iteratively, refining or discarding portions as required to create increments of the product. The goal is to have working code at the end of each iteration that can be deployed. Some programs in the DoD today are employing Agile techniques to do just this.
Agile processes are based on good ideas derived from successful industry practices. We believe the DoD should embrace Agile methods for some programs and traditional waterfall methods for others. There is no “one size fits all” Agile process. Like any set of processes, Agile methods must be tailored to fit the situation and context. For example, Agile teams responsible for developing high-risk, core components of a software architecture might adopt less aggressive release schedules than Agile teams developing less critical pieces of the software system. Some Agile teams might pick a two-week iteration cycle, whereas others might determine that their optimal iteration cycle is three weeks. Agile is not a silver bullet, but rather another technique to be included in the Program Management Office’s and contractor’s arsenal.
Sometimes a hybrid approach of waterfall and Agile methods is best for a program. For example, due to safety considerations, some mission-critical systems might require certain traditional milestones and documentation deliverables. However, Program Management Offices (PMOs) might work with Agile development teams on a hybrid approach that bridges these requirements with the need for agility and responsiveness. Perhaps the PMO would agree on fewer, less formal reviews and delivery of smaller sets of high-value documentation in exchange for getting a beta version that the user can start evaluating in the field more quickly.
Moving to Agile requires considerable work by the DoD entity (i.e., PMO, DoD, OSD, and perhaps Congress) and is not without hurdles. The most notable of these obstacles are described next.
Each lifecycle phase (e.g., Materiel Solution Analysis, Technology Development, Engineering and Manufacturing Development, Production and Deployment, and Operations and Support) presents unique challenges and opportunities. Some phases lend themselves to the use of an Agile approach better than others. Consider the Agile processes and practices you want to use early in the acquisition lifecycle; it is critical to make sure that contractually binding documents, such as RFPs and Statements of Work (SOWs), support those processes and practices. For example, if you embrace the Agile philosophy, you must determine how to meet standard milestone criteria such as those for the Preliminary Design Review (PDR) and Critical Design Review (CDR). Typically, the types of documentation expected at these milestone events are not produced using Agile methods. Thus you should create expectations and criteria that reflect the level and type of documentation that are acceptable for those milestones, yet work within Agile constraints.
A central concept in Agile is the small, dynamic, high-performing team. The challenge lies in providing an environment that fosters the creation of self-forming or dynamic teams in a culture that is accustomed to static, centralized organizational structures. To complicate this issue further, the software team might be a small part of an overall system procurement for a tank, ship, or plane. The system environment might call for centralized configuration management, highly defined legacy interfaces, and a predetermined architecture, all of which constrain the software. This environment, then, should be treated as a constraint by the Agile team and can provide boundaries within which the Agile team must operate. These system boundaries could act to encapsulate the Agile team environment.
Access to end users can be complex and difficult when dealing with any single service, and can become even more complex with multi-service programs. Agile developers need end users to speak with a single voice that can commit to changes for the product being developed. In some Agile approaches, the “one voice” is a product owner or manager who brings decisions to the Agile team that have been made through collaborative interaction with the end users. Within the DoD, the acquisition organization typically acts as the proxy for end users, and only duly warranted personnel can commit the program. To mitigate these issues, invite end users to all demonstrations, where they can provide feedback that becomes binding only with PMO approval. The end users need to work closely with the PMO, as with any acquisition.
While Agile concepts may not be new, the subtleties and nuances of each Agile method can be new to the uninformed PMO. To overcome this lack of knowledge, train the PMO staff before starting work with suppliers using Agile methods and employ an experienced coach or knowledgeable advocate for the PMO to help guide the staff throughout the process. Set aside funding for initial and ongoing training and support.
Traditional development methods have well-defined oversight mechanisms. Agile oversight methods are less defined and require more flexibility to accommodate the fluidity of Agile implementation. Decide on specific oversight needs in advance. One aspect of the Agile management philosophy is that the manager is more of a team advocate than an overseer; the management function of roadblock remover is critical to the success of an Agile team. Update selected day-to-day PMO activities to support this type of change.
Typically, documentation is used by the PMO throughout the development cycle to monitor the progress of the contractor. Documentation produced by Agile teams is “just enough” to support the development effort and provide continuity for the team; it is usually not sufficient for capstone reviews or for monitoring progress. For this reason, it is important to create different ways to meet the same PMO monitoring objectives while leveraging the advantages of Agile.
Agile rewards and incentives are different from those associated with traditional methods. In the DoD, the challenge is finding ways to incentivize teams and individuals to support Agile goals such as innovation, rapid software delivery, and customer satisfaction. At the same time, rewards that incentivize the wrong things should be eliminated. For example, rather than rewarding contractors for fixing defects, reward them for early delivery of beta software to a limited set of users in a constrained environment. Using this approach, the beta users get to test the product in the field sooner while providing feedback that helps to improve the quality of the software. Also consider incentives that encourage a collaborative relationship between the PMO and the contractor’s staff.
The composition of the PMO staff may need to change to accommodate the use of an Agile approach. Consider adding a knowledgeable Agile advocate or experienced coach to the team. Although end-user representatives are essential for Agile success, these positions are often difficult to fill quickly and consistently. Consider using rotating personnel to fill the end-user positions.
Another challenge is keeping high-performing Agile teams together long enough to achieve peak performance. This issue arises because developers change at the end of a contractual period of performance. Consider putting key Agile technical developers or technical leads under a separate contract vehicle or hiring them to work for the government organization itself.
The overall organizational culture needs to support the Agile methodology in use. The Agile culture runs counter to the traditional waterfall culture in everything from oversight and team structure to end-user interaction throughout development. This change in approach requires a mindset change for the PMO and other government entities such as OSD. To employ any of the Agile concepts, these organizations must plan for them, anticipate the changes needed in their environment and business model, and apply the hard work to make the changes a reality. Organizational change management is essential during the transition to Agile.
These hurdles were identified by analyzing the software development environment and determining what needed to change to support adoption of Agile methods. However, these same hurdles exist if one looks at using Agile from an acquisition perspective. If the acquisition and development methods do not work together and complement each other, then the program is doomed before it begins. The tenets of CMMI-ACQ and CMMI-DEV can be and should be applied to meet the goals of a program willing to use Agile. The hurdles that exist need to be addressed and eliminated. As stated before, why not embrace both?
Brian Gallagher
Most process improvement approaches are based on a similar pedigree that traces back to the same foundation established by process improvement gurus such as Shewhart, Deming, Juran, Ishikawa, Taguchi, Humphrey, and others. The improvement approaches embodied in CMMI, Lean Six Sigma, the Theory of Constraints, Total Quality Management, and other methods all embrace an improvement paradigm that can be boiled down to these simple steps.
1. Define the system you want to improve.
2. Understand the scope of the system.
3. Define the goals and objectives for the system.
4. Determine constraints to achieving objectives.
5. Make a plan to remove the constraints.
6. Learn lessons and do it again.
These simple steps comprise an improvement pattern that is evident in the “Plan, Do, Check, Act” (PDCA) improvement loop; Six Sigma’s “Define, Measure, Analyze, Improve, Control” (DMAIC) improvement methodology; the U.S. military’s “Observe, Orient, Decide, Act” (OODA) loop; and the SEI’s “Initiating, Diagnosing, Establishing, Acting, and Learning” (IDEAL) method shown in Figure 6.20.
All of these improvement paradigms share a common goal: to improve the effectiveness and efficiency of a system. That system can be a manufacturing assembly line, the decision process used by military pilots, a development organization, an acquisition project, or any other entity or process that can be defined, observed, and improved.
One mistake individuals and organizations make when embarking on an improvement path is forcing a decision to choose one improvement methodology over another, as if the approaches were mutually exclusive. An example is committing an organization to using CMMI, Lean Six Sigma, or the Theory of Constraints before understanding how to take advantage of the toolkits provided by each approach and selecting the tools that make the most sense for the culture of the organization and the problem at hand. The following case study illustrates how one organization took advantage of using the Theory of Constraints with CMMI.
One government agency had just completed a Standard CMMI Appraisal Method for Process Improvement (SCAMPI) appraisal with a CMMI model on its 12 most important customer projects—all managed within a Project Management Organization (PMO) under the direction of a senior executive. The PMO was a virtual organization consisting of a class of projects that the agency decided needed additional focus due to their criticality from a customer perspective. Although the entire acquisition and engineering staff numbered close to 5,000 employees, the senior executive committed to delivering these 12 projects on time, even at the expense of less important projects. Each project had a well-defined set of process requirements, including those for start-up and planning activities, and was subject to monthly Program Management Reviews (PMRs) to track progress, resolve work issues, and manage risk. With its clear focus on project success, the PMO easily achieved CMMI maturity level 2.
On the heels of the SCAMPI appraisal, the agency was directed to move much of its customer-facing functionality to the Web, enabling customers to take advantage of a wide variety of self-service applications. This web-based system represented a new technology for the agency, as many of its employees and contractors had experience only with mainframe or client/server systems. To learn how to successfully deliver capability using this new technology, the senior executive visited several Internet-based commercial organizations and found that one key factor in their success was the ability to deliver customer value within very short time frames—usually between 30 and 90 days. Armed with this information, the executive decreed that all Internet-based projects were to immediately establish a 90-day delivery schedule. Confident in this new direction and eager to see results, he asked that the Internet projects be part of the scope of projects in the next SCAMPI appraisal to validate the approach.
The agency’s process improvement consultant realized that trying to do a formal SCAMPI appraisal on an organization struggling to adopt a new, more agile methodology while learning a new web-based technology and under pressure to deliver in 90 days was not the best option given the timing of the recent changes. She explored the idea of conducting a risk assessment or other technical intervention, but found that the word risk was overloaded in the agency context. Thus she needed to employ a different approach. She finally suggested using the Theory of Constraints to help identify and mitigate some of the process-related constraints facing the agency.
The Theory of Constraints is a system-level management philosophy that suggests all systems may be compared to chains. Each chain is composed of various links with unique strengths and weaknesses. The weakest link (i.e., the constraint) in the chain is not generally eliminated; rather, it is strengthened by following an organized process, thereby allowing for improvement in the system. By systematically identifying the weakest link and strengthening it, the system as a whole experiences improvement.
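As a rough illustration of this weakest-link idea, the following sketch repeatedly finds and strengthens the weakest link in a notional chain. The link names and strength values are invented for illustration; they are not part of any Theory of Constraints tooling.

```python
# Notional chain: each link's strength rated on a 1-10 scale (illustrative).
chain = {"requirements": 7, "estimation": 3, "contracting": 5, "testing": 6}

def weakest_link(links):
    """The constraint is the link with the lowest strength."""
    return min(links, key=links.get)

for cycle in range(1, 4):
    constraint = weakest_link(chain)
    print(f"Cycle {cycle}: strengthen '{constraint}' (strength {chain[constraint]})")
    chain[constraint] += 2  # an organized strengthening effort, not elimination
```

Each pass strengthens (rather than removes) the current constraint, after which a different link may become the weakest; this mirrors the organized, iterative process described above.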
The first step in improving the system is to identify all of the system’s constraints. A constraint is an obstacle to the successful completion of an endeavor. Think about how constraints would differ depending on the following endeavors:
• Driving from Miami to Las Vegas
• Walking across a busy street
• Digging a ditch
• Building a shed
• Acquiring a new space launch vehicle
• Fighting a battle
Constraints are not context free; you cannot know your constraints until you know your endeavor. For the Internet project endeavor, the senior executive selected a rather wordy “picture of success”:
The Internet acquisition projects are scheduled and defined based on agency priorities and the availability of resources from all involved components, and the application releases that support those initiatives are planned, developed, and implemented within established time frames using established procedures to produce quality systems.
The problem the consultant faced was how to systematically identify the constraints. Where should she look, and how would she make sure she covered all the important processes employed by the agency? Because the agency was familiar with CMMI, she decided to use the practices in CMMI as a taxonomy to help identify constraints. Instead of looking for evidence of compliance with the practice statements as one would in a SCAMPI appraisal, she asked interviewees to judge how well the practices were implemented and the impact of implementation on successful achievement of the “picture of success.” Consider the difference between the two approaches as demonstrated in an interview session:
Question: “How do you establish estimates?”
Answer: “We follow a documented procedure ... involve stakeholders ... obtain proper sign-off and commitment ...”
Question: “Given your ‘picture of success,’ do you have any concerns about the way you establish estimates?”
Answer: “Our estimates are based on productivity rates that are twice what our contractors have ever been able to accomplish. There’s no way we’ll meet our current schedule.”
After six intense interview sessions involving 40 agency personnel and contractors, 103 individual constraint statements were gathered and affinity-grouped into the following constraint areas:
• Establishing, Communicating, and Measuring Agency Goals
• Legacy Versus Internet
• Lack of Trained and Experienced Resources
• Lack of Coordination and Integration
• Internet-Driven Change
• Product Quality and Integrity
• Team Performance
• Imposed Standards and Mandates
• Requirements Definition and Management
• Unprecedented Delivery Paradigm
• Fixed 90-Day Schedule
Further analysis using cause-and-effect tools helped to identify how each constraint affected the others. This analysis also allowed the consultant to produce the hierarchical interrelationship digraph depicted in Figure 6.21.
The results of this analysis helped the senior executive decide which constraints needed to be resolved to help improve the quality of the delivered systems. The higher up in the digraph an element sits, the more impact affecting it can achieve.
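A small sketch of the reasoning behind that observation: if the digraph is represented as edges from each constraint to the constraints it drives, then the constraints with the largest downstream reach offer the most leverage. The edges below are invented for illustration; Figure 6.21 shows the agency’s actual relationships.

```python
# Illustrative influence edges: constraint -> constraints it drives.
influences = {
    "Imposed Standards and Mandates": ["Fixed 90-Day Schedule"],
    "Fixed 90-Day Schedule": ["Team Performance", "Product Quality and Integrity"],
    "Lack of Trained and Experienced Resources": ["Team Performance"],
    "Team Performance": ["Product Quality and Integrity"],
    "Product Quality and Integrity": [],
}

def downstream(node, graph, seen=None):
    """Collect every constraint reachable from this one (its total impact)."""
    seen = set() if seen is None else seen
    for nxt in graph.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            downstream(nxt, graph, seen)
    return seen

for constraint in influences:
    print(f"{constraint}: affects {len(downstream(constraint, influences))} other(s)")
```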
The senior executive decided that the “Imposed Mandates” constraint was outside his span of control. However, many of the other constraints were either directly within his control or at least within his sphere of influence. He decided to charter improvement teams to tackle the following constraints:
• Establishing, Communicating, and Measuring Goals
• Internet Versus Legacy (combined with Coordination and Integration)
• Internet-Driven Changes
The improvement teams used a well-defined improvement approach that helped them explore each constraint in detail, establish an improvement goal, perform root cause analysis, identify barriers and enablers to success, and establish strategies and milestones to accomplish the goal.
The process improvement consultant recognized that using CMMI alone was not appropriate for the challenges facing the agency, and imported concepts and tools from another improvement approach—the Theory of Constraints—to help the agency recognize and remove critical process-related constraints. She sought to establish improvement strategies and helped the agency avert a pending crisis, setting its personnel on an improvement course that served them well. Today, the agency’s PMO manages both legacy and Internet-based projects, serves as an exemplar for other agencies, and demonstrated its process prowess by successfully achieving CMMI maturity level 3.
This case study also illustrates how using CMMI for process improvement fits into the DoD-wide Continuous Process Improvement (CPI)/Lean Six Sigma (LSS) Program, which includes the Lean, Six Sigma, and Theory of Constraints tools and methods for process improvement [DoD 2008].
Mike Phillips
As we are finishing the details of the current collection of process areas that span three constellations, this essay is my opportunity to encourage “continuous thinking.” My esteemed mentor as we began and then evolved the CMMI Product Suite was our Chief Architect, Dr. Roger Bate. Roger left us with an amazing legacy. He imagined that organizations could look at a collection of “process areas” and choose ones they might wish to use to facilitate their process improvement activities.
Maturity levels for organizations were adequate, Roger thought, but not as interesting to him as being able to focus attention on a collection of process areas for business benefit. Small businesses have been the first to realize the advantage of this approach, as they often find the full collection of process areas in any constellation daunting. An SEI report, “CMMI Roadmaps” (www.sei.cmu.edu/library/abstracts/reports/08tn010.cfm), describes some ways to construct thematic approaches to effective use of process areas from the CMMI for Development constellation.
As we created the two new constellations, we took care to refer back to the predecessor collection of process areas in CMMI for Development. For example, in CMMI for Acquisition, we note that some acquisition organizations might need more technical detail in the requirements development effort than we provided in Acquisition Requirements Development (ARD)—in fact, they might need to “reach back” to CMMI-DEV’s Requirements Development (RD) process area for more assistance.
In CMMI for Services, we suggest that the Service System Development (SSD) process area is useful when the development efforts are relatively limited, but the full engineering category in CMMI-DEV may be useful if major service systems are being created and delivered.
Now, with three full constellations available for addressing the complex organizations many of you have as targets for process improvement, many “refer to” possibilities exist. With the release of the V1.3 Product Suite, we will offer the option to describe satisfaction of process areas drawn from anywhere in the CMMI portfolio. What are some of the more obvious expansions?
We have already mentioned two expansions—namely, ARD using RD and SSD expanded to capture RD, TS, PI, VER, and VAL. What about situations in which most of the development is done outside the organization, but final responsibility for effective systems integration remains with your organization? Perhaps a few of the acquisition process areas would be useful beyond SAM. A simple start would be to investigate using SSAD and AM as a replacement for SAM to get the additional detailed help. Also, ATM might offer some good technical assistance in monitoring the technical progress of the elements being developed by specific partners.
As we add the contributions of CMMI-SVC to the mix, several process areas offer yet more ways to expand. In V1.2 of CMMI-DEV, for example, we added informative material in Risk Management to begin to address concerns about continuity of operations after some significant risk event occurs. Now, with CMMI-SVC, we have a full process area that provides more robust coverage of continuity concerns. (For those users who need even more coverage, the SEI now has the Resilience Management Model [RMM] to give the greater attention that some financial institutions and similar organizations have expressed as necessary for their process improvement endeavors. For more, see www.cert.org/resilience/rmm.html.)
Another expansion worthy of consideration is inclusion of the contributions in the Service Systems Transition (SST) process area. Organizations that may have been responsible for development of new systems—and maintenance of existing systems until the new system can be brought to full capability—may find the practices contained in SST to be a useful expansion, as this part of the lifecycle is the subject of only limited coverage in CMMI-DEV. In addition, CMMI-ACQ added two practices to PP and PMC to address planning for the transition into use, so the CMMI-ACQ versions of these two core process areas might mesh nicely with the SST expansion.
A topic that challenged the development team for V1.3 was improved coverage of “strategy.” Those of us with acquisition experience knew the criticality of an effective acquisition strategy to program success, so the practice was added to the CMMI-ACQ version of PP. In the CMMI-SVC constellation, Strategic Service Management (STSM) has the objective “to get the information needed to make effective strategic decisions about the set of standard services the organization maintains.” With minor interpretation, this process area could assist a development organization in determining which types of development projects should be in its product development line. The constellation authors also added a robust practice in the CMMI-SVC version of PP (Work Planning) to “provide the business framework for planning and managing the work.”
Two process areas essential for service work were seriously considered for insertion into CMMI-DEV V1.3—Capacity and Availability Management (CAM) and Incident Resolution and Prevention (IRP). In the end, expansion of the CMMI-DEV constellation from 22 to 24 process areas was determined to be less valuable than continuing our efforts to streamline coverage. As a result, these two process areas offer another opportunity for the type of expansion explored in this essay.
Those of you who have experienced appraisals have likely seen the use of target profiles that gather the collection of process areas to be examined. Often these profiles specifically address the necessary collections associated with maturity levels, but this need not be the case. With the release of V1.3, we have ensured that the reporting system (SEI Appraisal System [SAS]) is robust enough to allow depiction of process areas from multiple constellations. As other architecturally similar SEI models, such as the RMM mentioned previously, grow in use, we will be able to develop profiles using the mixtures that give process improvement value to a growing range of complex organizations.
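As a closing illustration, a cross-constellation target profile can be thought of as nothing more than a mapping from each constellation to the process areas selected from it. The abbreviations below come from the CMMI models, but this particular mixture is hypothetical, not a recommended profile.

```python
# A hypothetical target profile spanning all three constellations.
target_profile = {
    "CMMI-ACQ": ["ARD", "SSAD", "AM", "ATM"],
    "CMMI-DEV": ["RD", "REQM", "PP", "PMC"],
    "CMMI-SVC": ["SST", "STSM"],
}

for constellation, process_areas in target_profile.items():
    print(f"{constellation}: {', '.join(process_areas)}")
```

An appraisal team could use a profile like this to scope which process areas are examined, regardless of the constellation each one comes from.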