1. David Cleland and H. Kerzner, A Project Management Dictionary of Terms (New York: Van Nostrand, 1985), p. 187.
2. Max Wideman, First Principles of Project Management, http://www.maxwideman.com/papers/principles/principles.pdf.
3. The notion of capabilities-based planning originated in the defense domain. The Joint Requirements Oversight Council (JROC) in the U.S. Department of Defense defines the needed capabilities of equipment and services. In the IT domain, it is rare to speak about needed capabilities. Instead, projects usually start with requirements, without any knowledge about how the project’s outcome will support a needed business capability.
4. Coupling and cohesion are critical attributes of capabilities and the requirements that result from them. Two components are loosely coupled when changes in one never or rarely require a change in the other. A component exhibits high cohesion when all its capabilities are strongly related to one another. The higher the cohesion and the lower the coupling of a system, a design, or a function, the more robust its capabilities are in the presence of change and the more flexible the individual components are to that same change.
5. Implementing Capabilities Based Planning Within the Public Safety and Security Sector: Lessons from Defence Experience, DRDC Centre for Security Science, DRDC CSS TM 2011-26, December 2011.
6. James A. Dewar, Assumption-Based Planning: A Tool for Reducing Avoidable Surprises (Cambridge: Cambridge University Press, 2002).
7. Ralph R. Young, The Requirements Engineering Handbook (Norwood, MA: Artech House, 2003).
8. I. F. Hooks and K. A. Farry, Customer-Centered Products: Creating Successful Products Through Smart Requirements Management (New York: AMACOM, 2001).
9. Jeffrey O. Grady, Systems Requirements Analysis (New York: McGraw-Hill, 1993).
10. Michael G. Christel and Kyo C. Kang, “Issues in Requirements Elicitation,” Technical Report CMU/SEI-92-TR-012, September 1992, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213.
11. Ian Sommerville and Pete Sawyer, Requirements Engineering: A Good Practice Guide (New York: John Wiley & Sons, 1997).
12. Garry Roedler and Cheryl Jones, Technical Measurement: A Collaborative Project of PSM, INCOSE, and Industry, INCOSE-TP-2003-020-01, pp. 9–10.
13. The notion of a big, visible chart has been around for some time. These types of charts hung in the hallways of TRW, 1 Space Park, Redondo Beach, California, in the late 1970s. They were produced to show the process flow of the program, the master schedule, and who was late in delivering their outcomes to the satellite we were building. Recently, the agile software development community has been using this term. It also is a concept used in the eXtreme Programming World: “. . . by putting this chart in the public area, everybody gets a reminder of where the team is on this, and visual feedback of the team’s progress,” http://c2.com/cgi/wiki?BigVisibleChart.
14. Charts on the wall are very effective ways to communicate complex ideas about the project. The size and visibility of these charts provide accountability to the stakeholders, focus the team on the right things, and convey a sense of urgency.
15. The discussion in this chapter is the basis of Earned Value Management. In later chapters, Earned Value Management is explained in more detail. For now, this will serve as an introduction to the concept of measuring project performance by integrating cost, schedule, and the technical performance of the products or services produced by the work efforts.
1. See the Standish Reports for IT, the Government Accountability Office (GAO) reports on high risk and troubled programs, and the construction industry annual reports on project performance.
2. Mr. Blaise Durante retired from the Air Force in October 2012. A member of the Senior Executive Service, he served as deputy assistant secretary for acquisition integration, Office of the Assistant Secretary of the Air Force for Acquisition, Washington, D.C., where he directed the development of weapon system acquisition policy, including program direction.
3. American Baseball Coaches Association, Practice Perfect Baseball, Bob Bennett, ed. (Champaign, IL: Human Kinetics, 2009).
4. Susie White, 10 Principles of Garden Design (London: Vivays Publishing, February 2012).
5. Noel Sproles, “Formulating the Measures of Effectiveness,” Systems Engineering, 5, no. 2 (2002).
6. Frederick P. Brooks Jr., “No Silver Bullet—Essence and Accident in Software Engineering” (Chapel Hill: University of North Carolina, 1986), p. 13. http://faculty.salisbury.edu/~xswang/Research/Papers/SERelated/no-silver-bullet.pdf.
7. James A. Dewar, Assumption-Based Planning: A Tool for Reducing Avoidable Surprises (Cambridge: Cambridge University Press, 2002).
8. Arnoud De Meyer, Christoph H. Loch, and Michael T. Pich, “Managing Project Uncertainty: From Variation to Chaos,” MIT Sloan Management Review (Winter 2002). http://sloanreview.mit.edu/article/managing-project-uncertainty-from-variation-to-chaos/.
9. Christopher Alberts and Audrey Dorofee, “Mission Risk Diagnostic (MRD) Method Description,” Technical Note CMU/SEI-2012-TN-005, February 2012, http://www.sei.cmu.edu/reports/12tn005.pdf.
10. This quote is attributed to Kent Beck, Extreme Programming Explained: Embrace Change (Reading, MA: Addison-Wesley Professional; U.S. ed., 1999).
11. The Concept of Operations is a document describing the characteristics of a proposed system from the point of view of an individual who will use that system. It is used to communicate the quantitative and qualitative system characteristics to all stakeholders. A ConOps generally evolves from a concept and describes how a set of capabilities may be employed to achieve desired objectives or end state.
1. Technical Performance Measures (TPM) are attributes used to determine how well a system or subsystem element is satisfying or is expected to satisfy a technical requirement or goal. Selection of TPMs should be limited to critical technical thresholds or parameters that, if not met, put the project’s cost, schedule, and performance at risk. Garry Roedler and Cheryl Jones, Technical Measurement: A Collaborative Project of PSM, INCOSE, and Industry, INCOSE-TP-2003-020-01, pp. 9–10.
2. Key Performance Parameters (KPP) are critical subsets of the performance parameters representing those capabilities and characteristics so significant that failure to meet the threshold value of performance can be cause for the concept or the system to be reevaluated or the project to be reassessed or terminated. Each KPP has a threshold and objective value. KPPs are the minimum number of performance parameters needed to characterize the major drivers of operational performance, supportability, and operability. Garry Roedler and Cheryl Jones, Technical Measurement: A Collaborative Project of PSM, INCOSE, and Industry, INCOSE-TP-2003-020-01, p. 11.
3. Michael G. Christel and Kyo C. Kang, “Issues in Requirements Elicitation,” Technical Report CMU/SEI-92-TR-012, September 1992, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213. This paper describes the critical failings of how we elicit requirements and how these failings set the stage for project failure.
4. The ConOps idea is very close to the standard use case and scenario approach developed by Alistair Cockburn several years ago. Alistair Cockburn, Writing Effective Use Cases (Reading, MA: Addison-Wesley, 2000), http://www.amazon.com/Writing-Effective-Cases-Alistair-Cockburn/dp/0201702258.
5. MAJ Mark W. Brantley, USA, LTC Willie J. McFadden, USA, and LTC Mark J. Davis, USA (Ret), “Expanding the Trade Space: An Analysis of Requirements Tradeoffs Affecting System Design,” Acquisition Review Quarterly (Winter 2002).
6. Ralph R. Young, The Requirements Engineering Handbook (Norwood, MA: Artech House, 2003).
7. Michael G. Christel and Kyo C. Kang, “Issues in Requirements Elicitation,” Technical Report CMU/SEI-92-TR-012, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213.
8. Paired comparison analysis, Borda ranking, and Analytical Hierarchical Process are all ways to mathematically rank comparisons of information for decision making. This approach should be used for any prioritization, ranking, or selection among choices.
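Of the ranking methods the note mentions, Borda ranking is the simplest to illustrate. The sketch below is a minimal example, not a prescribed tool; the requirement names and voter preferences are hypothetical, chosen only to show the mechanics.

```python
# Borda ranking: each stakeholder orders the options best-first; with n
# options, a first-place vote is worth n-1 points, second place n-2, and
# so on. The option with the highest total wins.

def borda_scores(rankings):
    """rankings: list of ordered preference lists (best first)."""
    n = len(rankings[0])
    scores = {}
    for ranking in rankings:
        for position, option in enumerate(ranking):
            scores[option] = scores.get(option, 0) + (n - 1 - position)
    return scores

# Hypothetical stakeholder votes over three candidate requirements.
votes = [
    ["Req A", "Req B", "Req C"],
    ["Req A", "Req C", "Req B"],
    ["Req B", "Req A", "Req C"],
]
scores = borda_scores(votes)                      # {"Req A": 5, "Req B": 3, "Req C": 1}
ranked = sorted(scores, key=scores.get, reverse=True)
```

Unlike the ad hoc "gut feel" prioritization it replaces, this method makes every stakeholder's full preference order count, not just their first choice.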
9. Peter Sims, Little Bets: How Breakthrough Ideas Emerge from Small Discoveries (New York: Free Press, 2011).
10. In later chapters, we’ll look in detail at reference class forecasting because it is a powerful approach not only for forecasting but also for addressing the anchoring and adjustment processes needed to produce credible estimates and identify risks and their mitigations. Reference class forecasting is used in oil and gas industries, NASA, and Defense Department cost and schedule estimating processes. We will apply it to our sample projects.
11. Audrey J. Dorofee, Julie A. Walker, Christopher J. Alberts, Ronald P. Higuera, Richard L. Murphy, and Ray C. Williams, Continuous Risk Management Guidebook (Pittsburgh, PA: Carnegie Mellon University, Software Engineering Institute, 1996).
12. David P. Gluch, A Construct for Describing Software Development Risk, CMU/SEI-94-TR-14, ADA284922, Carnegie Mellon University, 1994.
1. Peter Weill and Jeanne W. Ross, IT Governance: How Top Performers Manage IT Decision Rights for Superior Results (Cambridge, MA: Harvard Business School Press, 2004).
2. Aaron J. Shenhar and Dov Dvir, Reinventing Project Management: The Diamond Approach to Successful Growth and Innovation (Cambridge, MA: Harvard Business School Press, 2007).
3. Ibid.
4. Aaron J. Shenhar et al., “Project Success: A Multidimensional Strategic Concept,” Long Range Planning 34, no. 6 (December 2001), pp. 699–725.
5. If you wish to read more about governance processes, a good place to start is the reference books by Weill and Ross. Another is the Balanced Scorecards for projects.
6. J. Gido and J. P. Clements, Successful Project Management (Cincinnati, OH: South-Western College Pub., 1999).
7. Eric Norman, Shelly Brotherton, and Robert Fried, Work Breakdown Structures: The Foundation for Project Management Excellence (Hoboken, NJ: Wiley, 2008).
8. “Total Cost Management Framework: An Integrated Approach to Portfolio, Program, and Project Management,” AACE International, 2006.
9. In defense contracting, MIL-STD-881C defines how the work breakdown structure should be built.
10. A Guide to the Project Management Body of Knowledge (PMBOK® Guide)—Fourth Edition, An American National Standard ANSI/PMI 99-001-2008 (Newtown Square, PA: Project Management Institute, Inc., 2008).
11. Thomas C. Powell, “Strategic Planning as Competitive Advantage,” Strategic Management Journal 13, no. 7 (October 1992), pp. 551–558.
12. This person is called a control account manager. The control account has a budget, a period of performance, assigned resources, and a detailed description of the work to be performed during that period of performance. The summation of all the budgets in the control accounts equals the Performance Measurement Baseline budget for the project. Management Reserve is outside of this budget, since Management Reserve is not assigned to actual work.
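The budget relationship the note describes can be stated in a few lines. This is a minimal sketch; the control account names and dollar amounts are hypothetical.

```python
# The Performance Measurement Baseline (PMB) budget is the sum of the
# control account budgets. Management Reserve (MR) sits outside the PMB
# because it is not assigned to actual work.

control_accounts = {"design": 120_000, "build": 340_000, "test": 90_000}
management_reserve = 55_000

pmb = sum(control_accounts.values())       # 550_000: the PMB for the project
total_budget = pmb + management_reserve    # MR is held outside the PMB
```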
13. The concept of a “single point of integrative responsibility” can be found in introductory managerial finance books, such as Scott Besley and Eugene Brigham, Essentials of Managerial Finance. This single point of responsibility does not exist when there is shared ownership, which results in confusion about the status of the project.
14. Milestones have to be used with care. In ancient Rome, the Emperor Augustus placed a gilded pillar at the center of the Forum, the Millarium Aureum. This marked the starting point for a system of roads, all of which led to Rome. Every mile (mille—Latin for 1,000—the distance a Roman Legion covered in 1,000 paces) of road was marked with a stone “millarium” or milestone. The milestones had several purposes. In our project paradigm, milestones should not be just rocks on the side of the road that we look at as we pass—rather, they should have tangible meaning.
15. Douglas Adams, English humorist and science fiction novelist (1952–2001), best known for his Hitchhiker’s Guide to the Galaxy.
1. Lean construction is a method of incrementally developing the constructed outcome using the principles of “Agile.” http://www.leanconstruction.org/.
1. Curtis R. Cook, Ph.D., PMP, “Project Management in Research and Development,” Energy Facility Contractors Group (EFCOG), Project Management Working Group, 2010; and Curtis R. Cook, Just Enough Project Management (New York: McGraw-Hill, 2005).
2. Mutually exclusive means two events or conditions cannot occur at the same time. Collectively exhaustive means at least one of the events or requirements must be present. The concept of MECE comes from Ethan Rasiel and Paul Friga, The McKinsey Mind: Understanding and Implementing the Problem-Solving Tools and Management Techniques of the World’s Top Strategic Consulting Firm (New York: McGraw-Hill, 2001). This concept is a “rule” for assessing requirements. It is not a product unto itself. The lack of overlap means there is no confusion among the elements of the requirements. No gap means nothing is left out.
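The MECE "rule" for a set of requirements can be checked mechanically. The sketch below is illustrative, assuming requirements are grouped into sets and identified by hypothetical IDs.

```python
# MECE check: the groups are mutually exclusive (no requirement appears
# in more than one group) and collectively exhaustive (every requirement
# in the universe appears in some group).

def is_mece(groups, universe):
    all_items = [item for group in groups for item in group]
    mutually_exclusive = len(all_items) == len(set(all_items))      # no overlap
    collectively_exhaustive = set(all_items) == set(universe)       # no gap
    return mutually_exclusive and collectively_exhaustive

universe = {"R1", "R2", "R3", "R4"}
good = [{"R1", "R2"}, {"R3", "R4"}]               # no overlap, no gap: MECE
overlapping = [{"R1", "R2"}, {"R2", "R3", "R4"}]  # R2 in two groups: not MECE
```

No overlap means there is no confusion about which element owns a requirement; no gap means nothing is left out.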
3. RACI is responsible, accountable, consulted, and informed. Responsible are those doing the work to achieve the outcomes. Accountable are those ultimately answerable for the correct and thorough completion of the deliverable or task and the one who delegates the work to those responsible. Consulted are typically subject-matter experts. Informed are those who are kept up-to-date on progress, usually on completion of the task or deliverable from the work packages.
4. Separation of concerns is a computer science term that is applicable to projects as well. It means dividing the activities of the project into distinct sections, minimizing the coupling between these sections, and maximizing the cohesion of the sections within the overall project. The value of this approach is the simplification of the management processes. Work is “coupled” at the interfaces only. Cohesion is maintained across the boundaries between the sections so all the participants have a clear and concise understanding of what products are to be produced, what their measures of performance are, and who is accountable for the delivery and the performance.
5. Coupling and cohesion are also computer science terms. Coupling describes the relationships between “objects,” and cohesion describes the relationships within “objects.” In our case, these “objects” are the work packages and their products or services. We want to maximize the cohesion within the collection of work packages, so everyone working on the same capability and the requirements for that capability is “on the same page” about what “done” looks like. We want to minimize the coupling between work packages or groups of work packages to limit the impact of poor performance in one set of work packages on other sets of work packages.
1. Office of Management, Budget, and Evaluation, Project Management Practices Rev E, June 2003, US Department of Energy, Work Breakdown Structure.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.
6. Tim Lister is a fellow of the Cutter Business Technology Council and Business Technology Strategies practice and a senior consultant with Cutter’s Agile Product & Project Management and Government & Public Sector practices. Tim coined the phrase “Risk Management is how adults manage projects” in a presentation at the Boston SPIN (Software Process Improvement Network), http://www.boston-spin.org/slides/039-Jan2004-talk.pdf. This quote can be the basis of everything you do around risk management.
7. Arnoud De Meyer, Christoph H. Loch, and Michael T. Pich, “Managing Project Uncertainty: From Variation to Chaos,” MIT Sloan Management Review (Winter 2002).
8. Many books and papers guide us in assigning numbers to the probability of something occurring and then ranking the risks using those numbers. They also guide us in assigning numbers to the impact. The two numbers are then multiplied together to get a risk score. This approach, called ordinal ranking, must be avoided if we are to have a credible Risk Management Plan. “Ordinal” ranking is a relative comparison of two items: “This risk is larger than that risk, because one has a 1 assigned and the other has a 2 assigned.”
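The pitfall the note warns about is easy to demonstrate. In this hypothetical example, three very different risks receive identical ordinal scores, so the multiplied score carries no information about which response is appropriate.

```python
# Multiplying ordinal probability and impact ranks yields a score that
# looks quantitative but cannot distinguish risks that demand very
# different responses. The risks and rank assignments are hypothetical.

risks = {
    "late vendor delivery": {"probability": 1, "impact": 4},  # rare, severe
    "requirements churn":   {"probability": 4, "impact": 1},  # frequent, mild
    "key staff loss":       {"probability": 2, "impact": 2},  # moderate both
}

scores = {name: r["probability"] * r["impact"] for name, r in risks.items()}
# Every risk scores 4: the ordinal product collapses three distinct
# risk profiles into a single indistinguishable number.
```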
9. Louis Anthony Cox Jr., “What’s Wrong with Risk Matrices?” Risk Analysis 28, no. 2 (2008), pp. 497–512.
10. This paradigm comes from the federal government’s acquisition processes. The starting point for information about this and other program management topics is https://dap.dau.mil/acquipedia/Pages/Default.aspx.
11. Philip M. Morse and George E. Kimball, Methods of Operations Research (Los Altos, CA: Peninsula Publishing, 1970).