
Case Study 1.5

Using Human Performance Technology (HPT) to Select Projects That Yield Results

Topic: Change Management

Susan M. Pavelek, B.B.A., CPT, Internal Consultant, Oxy, Inc., Colleyville, Texas, USA

Background

During a time when training organizations appeared to be frequently outsourced rather than maintained as internal departments, the training team within a mid-size mortgage company was interested in expanding its capabilities and services by providing non-training solutions to address performance issues. The hope was to develop a process for selecting internal performance improvement projects that would yield demonstrable, value-added results. Furthermore, the team wanted to incorporate human performance improvement best practices to ensure success. The team had recently developed a long-term departmental strategic plan and realized that a gap existed in how incoming projects were selected for the team's involvement. The team engaged department leaders to sponsor the shift from a traditional training provider to a more performance-focused group offering a broader set of services.

Critical Business Issue

Historically, when the team was contacted by various department leaders to engage in a project, the request usually came in the form of a client-determined training solution. In addition, the incoming project requests typically surpassed the team's capacity to provide support. The team needed a way to differentiate the projects so that the projects it engaged in would add value to the organization. It also wanted to better balance the workload among team members, as project support was assigned based on individual team members' skill in specific topic areas, not on their capacity to take on more work. No collective process existed to track or manage projects; all projects were individually tracked and monitored. Along with the new processes, the team wanted to instill a performance improvement mindset throughout the company's leadership team.

Intervention

The team prioritized the strategic plan items and selected two for immediate action: (1) the team wanted to ensure its decisions were made using objective data (rather than its existing first-come, first-served approach), and (2) it wanted to establish a well-defined method for accepting and dispersing projects (to better balance team workload). The processes essential to enable this transition required design, development, and deployment.

While implementing the new processes for managing incoming projects was iterative and took place over approximately six months, the team collaborated to quickly focus on designing them. The team dedicated Friday mornings to collaborative work sessions. These weekly sessions were used not only to learn performance improvement methods, but also to design and develop the methods and tools needed to be successful. Through these work sessions, the team designed the tools it would use to collect and analyze customers' requests for training (and non-training) services, along with a new approach to managing incoming performance improvement projects on behalf of the collective team.

Focus on Outcomes or Results

The entire new process was designed to focus on outcomes and potential results rather than activities. Expected outcomes were gathered up-front using a new form designed by the team. Once a project was accepted and prioritized by the team, resources were assigned, including a project owner, and while the type of project might differ, consistent tools were used to track and report on project results. Data were collected during the initial customer meeting on the cost of the problem and the cost of the proposed solution, and, where possible, an estimate of the financial value (or return) the customer expected to gain was determined.

Focus on Systems View

Prior to implementing the new process, input was collected by team members individually, each using his or her own questions and style. In most cases, follow-up phone calls were required to collect additional information so that the team member had a complete picture of the situation. The new approach targeted “actual” and “desired” performance and focused on the three areas that must be considered in human performance improvement initiatives: the work, workplace, and worker. The new approach involved implementation of five new methods and/or tools. Each of the following tools is explained in more detail in the sections below:

  1. Customer Discovery and Rating Tool. A consistent form/tool to gather incoming project request data.
  2. @Team. A sub-team used to manage the entire process, track project data, and consistently report results.
  3. Experience/Interest Matrix and Team Availability Log. A tool used to compare the experience level of all team members, and a process to better balance team capacity and workload.
  4. Statement of Work. A standard statement of work (SOW) process to serve as an internal customer contract.
  5. Customer Report Card and Lessons Learned Template. To gather customer feedback and identify recommendations for further improvements.

Focus on Value

The team had already changed many of its training deliverables to focus on performance outcomes rather than simply responding to a request to train on a specific topic. To further this effort, the team adopted performance-based approaches to managing projects and designed forms/tools and processes that enabled it to capture financial data on both the cost of the solution (or project) and, if the project was selected by the team, the resulting value of the completed project to both the customer and the company. Team members developed consulting and questioning skills that not only grew the team's collective capabilities, but also helped the customer focus on and articulate the potential value of the project.

Focus on Establishing Partnerships

During the team's transition from training-focused activities to a more performance-based mindset (focused on outcomes rather than prescribed training topics), it had already laid the groundwork for partnering with customers to concentrate on value-added results. Frequent requestors became accustomed to thinking about how the training might improve the department's performance results (such as improved cycle time in processing work, improved team relationships, improved sales percentages, and so forth). The new approach to managing training and non-training projects was well received by key customers throughout the company. The team also partnered with the departmental leadership team to help communicate and market the new performance-focused approach to managing internal projects.

When a team member received an incoming call requesting services, he or she scheduled the initial customer meeting and invited key decision-makers to attend to ensure the interviewing team members gained a clear picture of the request. In addition, the caller was asked to bring any data that might serve as evidence of the problem and/or cause(s) to better describe/support the request. By inviting decision-makers to attend the initial meeting, the team eliminated the need for multiple follow-up phone calls/meetings, thereby avoiding potential project delays. Customers were not accustomed to this approach, but most expressed approval for including the right people in the initial meetings, which helped save time and clarified expectations up-front.

Be Systematic in Assessment of Need, Opportunity, or Challenge

The team designed and implemented a consistent form/tool, the Customer Discovery and Rating Tool, to be used by all team members to collect important, comprehensive customer data surrounding each project request. The tool included a rating/scoring section whereby the same criteria could be applied to all projects, as well as a section to estimate the financial value of the expected project outcomes. The Customer Discovery and Rating Tool was not only used to assess all incoming project requests, but was also used to assess all internally initiated projects (driven by the training department). The tool was used as an agenda during the initial one-hour customer meeting to discuss the project. A sample is shown below.

Be Systematic in the Analysis of Work, Worker, Workplace, and Worldview to Identify the Causes or Factors That Limit Performance

The Customer Discovery and Rating Tool was designed to gather data surrounding the work, worker, and workplace (department-specific) and included questions to explore the project's link to key organizational strategic goals (the larger, strategic focus served as the “worldview”). During the initial customer meeting, the team's interviewers asked a series of performance-based questions (and prompts) in an attempt to gain insight into the problem and its possible cause(s), as well as what the customer thought about a potential solution. In addition, estimated costs of implementing the proposed solution, along with projected savings, revenue, or productivity gains (financial benefits), were entered into an online template containing pre-formulated, company-specific data. This gave the team the information it needed to make a data-driven decision.
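
A minimal sketch of the kind of net-benefit arithmetic such a template might perform is shown below; the field names and figures are illustrative assumptions, since the case does not reproduce the actual template or its company-specific data.

    # Hypothetical sketch of the net-benefit arithmetic an intake template might
    # perform; field names and figures are illustrative, not the actual template.

    def estimate_project_value(solution_cost: float,
                               projected_annual_benefit: float) -> dict:
        """Summarize the simple cost/benefit figures gathered at the intake meeting."""
        net_benefit = projected_annual_benefit - solution_cost
        roi_percent = (net_benefit / solution_cost * 100) if solution_cost else float("inf")
        return {
            "solution_cost": solution_cost,
            "projected_annual_benefit": projected_annual_benefit,
            "net_benefit": net_benefit,
            "roi_percent": round(roi_percent, 1),
        }

    # Example: a $40,000 solution expected to return $120,000 in first-year benefits.
    print(estimate_project_value(40_000, 120_000))
    # {'solution_cost': 40000, 'projected_annual_benefit': 120000,
    #  'net_benefit': 80000, 'roi_percent': 200.0}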

Upon conclusion of the customer meeting (provided complete information was obtained), the interviewing team discussed the project details and applied the rating criteria portion of the tool. The team began with two qualifying questions that were prerequisites to any project it would agree to accept. Provided the project passed the qualifying questions, the team continued the rating process, assigning a numeric rating (0 to 9) to each of six critical items outlined on the tool. The team discussed to what degree each of the six items applied to the project being rated in order to determine the appropriate rating. The rating criteria appear in the rating section of the Customer Discovery and Rating Tool shown below.

Once the project was rated and scored, the team would make an accept/decline decision based on how its data compared with other projects. The project data were also used to prioritize the newly accepted project relative to other projects in work and to determine the team resources and other project support that would be required.
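
The accept/decline and scoring flow described above can be illustrated with a brief, hypothetical sketch; the criterion names, ratings, and point values below are placeholders, not the team's actual criteria.

    # Illustrative sketch of the rating flow: two pass/fail qualifying questions,
    # then 0-9 ratings on six criteria plus financial-return points.
    # Criterion names below are placeholders, not the team's actual wording.

    def score_project(aligns_with_mission: bool,
                      aligns_with_strategy: bool,
                      criterion_ratings: dict,
                      financial_points: int):
        """Return an overall project score, or None if a qualifying question fails."""
        if not (aligns_with_mission and aligns_with_strategy):
            return None  # failed a qualifying question: do not continue rating
        if any(not 0 <= rating <= 9 for rating in criterion_ratings.values()):
            raise ValueError("Each of the six criteria is rated on a 0-9 scale.")
        return sum(criterion_ratings.values()) + financial_points

    example_score = score_project(
        aligns_with_mission=True,
        aligns_with_strategy=True,
        criterion_ratings={"criterion_1": 7, "criterion_2": 5, "criterion_3": 8,
                           "criterion_4": 3, "criterion_5": 6, "criterion_6": 9},
        financial_points=120,  # e.g., an estimated $120K expected return
    )
    print(example_score)  # 158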

Be Systematic in the Design of the Solution; Be Systematic in the Development of the Solution

An internal project sub-team (the @Team) was formed and charged with managing the new processes on behalf of the department. Team members would rotate on and off the @Team.

Upon acceptance of the project, the @Team assigned project resources using a team availability log combined with an experience/interest matrix. For the team availability log, all team members (including the leadership team) used a simple spreadsheet to enter the number of hours (or anticipated hours) already assigned to existing projects, or ongoing operational work. Based on the data added by each team member, the spreadsheet was programmed to reflect the balance of available work hours for each individual. The @Team used the log to assign work.
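
A minimal sketch of the balance calculation such a log might perform, assuming a hypothetical 40-hour weekly capacity and illustrative member names (the case does not document the actual spreadsheet layout or formulas):

    # Hypothetical sketch of the availability-log math: each member records hours
    # already committed, and the log shows the balance of hours left to assign.
    WEEKLY_CAPACITY = 40  # assumed standard work week; the actual figure is not given

    committed_hours = {
        "member_a": {"Project X": 12, "operational work": 18},
        "member_b": {"Project Y": 30},
        "member_c": {"operational work": 8},
    }

    available = {
        member: WEEKLY_CAPACITY - sum(assignments.values())
        for member, assignments in committed_hours.items()
    }

    # The @Team would assign new project work to whoever has the most open hours.
    for member, hours in sorted(available.items(), key=lambda item: -item[1]):
        print(f"{member}: {hours} hours available")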

CUSTOMER DISCOVERY AND RATING TOOL

SECTION 1: ISSUE IDENTIFICATION

A team description of the request, as recorded on the incoming project request log.

  1. What brings you here today? (Probe for a description of the problem or need.)
  2. What evidence is there that this problem exists? (Probe for objective data, descriptions, and/or metrics.)
  3. How is the current process/method/condition being done? What are the current results of this process/method/condition?
  4. How should the process/method/condition be done? (May be expressed as experienced insight or simply as opinion.) What should the results be?

SECTION 2: PERFORMANCE ISSUE CAUSES

The identified performance issue is caused by needs in one or more of the following areas: organization, worker, work environment, or work.

  1. How does this type of performance align with (corporate/departmental) key performance indicators (KPIs)?
  2. Does this type of performance align with any regulatory compliance requirements?

If yes, the project priority will rate higher.

  3. Do the workers know they are expected to do the process/method/task(s)?
  4. Do the workers know how to perform the needed process/method/task(s)?
  5. Have the workers successfully performed the process/method/task(s) in the past, resulting in the desired results being achieved?
  6. What factors in the work environment negatively impact performance? (That is, workers know they are expected to perform the process/method/task(s) and are attempting to do so to the desired proficiency, but do not have the resources needed to succeed, such as time/opportunity, rewards, equipment, or access codes, or they experience unintended negative consequences, and so forth.)
  7. What factors related to the work itself negatively impact performance? (For example, workers know they are expected to perform the process/method/task(s), know how to perform them successfully, and have the required resources, but are not completing the task(s), or are completing them but not to the desired proficiency.)

SECTION 3: PROPOSED SOLUTIONS

  1. What other factors will help describe the situation?
  2. What has already been done to improve the issue?
  3. What is the proposed solution? (May encompass multiple interventions.)
  4. What is the timeframe for the proposed solution? How negotiable is it?
  5. Does this situation and/or proposed solution impact other departments? If so, which ones?
  6. How many people are affected by this situation and/or proposed solution?
  7. If completed, what would success “look like”? (Probe for objective descriptions and meaningful metrics.)

CUSTOMER DISCOVERY AND RATING TOOL

Qualifying Questions

  1. Does the project align with our team mission? If NO, do not continue rating.
  2. Does the project align with our team's strategic imperatives? If NO, do not continue rating.

[Rating criteria items 1 through 6, each rated 0 to 9]

Subtotals:

  7. List the expected financial return and its justification logic (cost reductions, revenue generation, reduced cycle time, reduced defect rate, increased value, estimated number of people impacted). Ranges should correlate to the projected financial return, for example:

     Projected Financial Return    Points
     $0–$5,000                     0–5
     $5,001–$25,000                6–25
     $25,001–$50,000               26–50
     $50,001–$75,000               51–75
     $75,001–$100,000              76–100
     Over $100,000                 Correlate based on the estimate (e.g., $120K = 120; $149K = 149)

Overall Project Score:__________
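
A minimal sketch of how the return-to-points ranges above could be applied is shown below; linear interpolation within each band is an assumption, and returns over $100,000 are correlated directly to the estimate in thousands, following the example given (e.g., $120K = 120).

    # Sketch of the return-to-points mapping above. Within each band the exact point
    # assignment is not specified; linear interpolation is assumed here. Over
    # $100,000, points correlate directly to the estimate (e.g., $120K -> 120).

    def financial_return_points(expected_return: float) -> int:
        """Convert a projected financial return (in dollars) to rating points."""
        if expected_return > 100_000:
            return round(expected_return / 1_000)
        bands = [  # (dollar low, dollar high, point low, point high)
            (0, 5_000, 0, 5),
            (5_001, 25_000, 6, 25),
            (25_001, 50_000, 26, 50),
            (50_001, 75_000, 51, 75),
            (75_001, 100_000, 76, 100),
        ]
        for low_d, high_d, low_p, high_p in bands:
            if low_d <= expected_return <= high_d:
                fraction = (expected_return - low_d) / (high_d - low_d)
                return round(low_p + fraction * (high_p - low_p))
        raise ValueError("Expected return must be a non-negative dollar amount.")

    print(financial_return_points(60_000))   # roughly 61
    print(financial_return_points(120_000))  # 120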

The experience/interest matrix (Pavelek, 2004) was a self-assessment tool used to identify the team's individual proficiency surrounding forty-seven human performance interventions. Over time, the team expressed a concern that projects were being assigned strictly based on experience and that several team members were not being given the opportunity to learn new skills.

Likewise, team members with a high level of skill in a specific area expressed an interest in helping others learn the skill in order to expand the overall team's capability (and give themselves a break!). As a result of this feedback, an interest component was added to the matrix. In addition to experience, team members self-rated their interest in developing proficiency in each of the forty-seven human performance interventions. Implementing the experience/interest matrix helped the team better balance its capacity and workload, as well as create an atmosphere of continual team development. An example is shown below.
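
In addition to the sample matrix shown further below, a minimal sketch of how such a matrix could drive assignments follows; the 1-to-5 scales and the lead/learner pairing rule are illustrative assumptions rather than the team's documented method.

    # Hypothetical sketch: each intervention holds (experience, interest) ratings per
    # team member; assignments pair a strong lead with an interested learner.
    # The 1-5 scales and the pairing rule are illustrative assumptions.

    matrix = {
        "process redesign": {"member_a": (5, 2), "member_b": (2, 5), "member_c": (3, 3)},
        "job aids":         {"member_a": (1, 4), "member_b": (4, 1), "member_c": (2, 5)},
    }

    def suggest_pairing(intervention: str):
        """Pair the most experienced member (lead) with the most interested other member (learner)."""
        ratings = matrix[intervention]
        lead = max(ratings, key=lambda member: ratings[member][0])      # highest experience
        learner = max((m for m in ratings if m != lead),
                      key=lambda member: ratings[member][1])            # highest interest
        return lead, learner

    print(suggest_pairing("process redesign"))  # ('member_a', 'member_b')
    print(suggest_pairing("job aids"))          # ('member_b', 'member_c')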

Also upon project acceptance, the @Team implemented a statement of work (SOW) using an internal template (Pavelek, 2004) to serve as an internal customer contract. The @Team would meet with the customer a second time (usually this second meeting was scheduled during the initial meeting) to review the SOW, outline project details, clarify consultant/customer responsibilities, articulate projected outcomes, and set expectations for ongoing communication and partnering. If the project was not accepted, the @Team contacted the customer to explain the decision and the reasoning for turning down the project.

Two other forms/tools were designed, developed, and implemented to gather customer feedback after the project was delivered and to identify recommendations for further improvements. The customer report card was used to gather information from the customer surrounding his or her experience and satisfaction with the project team and deliverables (Level 1) and included a section on actual financial improvements realized as a result of the project (Level 4). The lessons learned template was used by the assigned project team to summarize project statistics and recommend changes to help improve future projects of a similar nature.

Sample Experience/Interest Matrix

[Matrix of team members' self-rated experience and interest across the forty-seven interventions]

Be Systematic in the Implementation of the Solution

Once finalized, @Team processes and tools were applied to all projects, and job aids and procedure documents were created to ensure consistency.

Finally, the team implemented a project log to track project-specific details, monitor progress, and document actual hours spent on project support. The @Team provided monthly and quarterly presentations to the other team members outlining project summary statistics/results, customer report card results, and lessons learned data for completed projects. The @Team also included recommendations for improvements to team processes and project execution approaches based on the report card and lessons learned feedback.


Lessons Learned from the Case

  • The new methods resulted in actual financial returns of approximately $3.3 million in the first year.
  • The team experienced challenges when attempting to gather accurate and meaningful financial-related data upon project completion. In many cases, the team used a productivity savings formula for valuing time.1 The team customized the formula to include company-specific information to more accurately reflect improved productivity results (a generic sketch of this type of calculation appears after this list).
  • Adding the interest component to the experience/interest matrix provided an added bonus and enabled expanded team capability through project support assignments.
  • The team integrated technology and automated processes where possible to streamline project management/tracking.
  • Although the @Team reported actual results to the other team members, it did not include a process to adequately report actual project results to senior leaders.
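
As referenced in the lessons learned above, a generic sketch of one common way to value time saved follows; it is an illustration only, not the customized formula the team used and not the exact formula from Hale (1998).

    # Generic productivity-savings illustration (not the team's customized formula
    # and not the exact formula from Hale, 1998).

    def productivity_savings(hours_saved_per_person_per_week: float,
                             people_affected: int,
                             loaded_hourly_rate: float,
                             weeks_per_year: int = 48) -> float:
        """Annual value of time saved, using a fully loaded hourly labor rate."""
        return (hours_saved_per_person_per_week
                * people_affected
                * loaded_hourly_rate
                * weeks_per_year)

    # Example: 2 hours per week saved for 50 people at a $45 loaded hourly rate.
    print(f"${productivity_savings(2, 50, 45):,.0f} per year")  # $216,000 per year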

Citation

1. Hale, 1998

Reference

Hale, J.A. (1998). The performance consultant's fieldbook: Tools and techniques for improving organizations and people. San Francisco: Pfeiffer.

Susan M. Pavelek, B.B.A., CPT, became involved with performance technology mid-career after an extensive background in operations/customer service in the travel and transportation field. She has led global training/organizational development teams to higher levels of performance through application of HPT methods and tools. Susan now serves as an internal consultant at Oxy, Inc., and can be reached at 817.715.5353, or [email protected].
