5

Reporting and Using the Results

After the analysis has been completed, perhaps the most important step in the ROI process model is communicating the results of the evaluation to stakeholders. The results from the study will provide insight into the effectiveness of the technology-based learning program, help secure funding for future programs, build credibility for the ROI Methodology, and increase support for learning through technology. However, this will not occur unless the results are properly communicated to the appropriate stakeholders and actions are taken to use the results appropriately. This chapter provides a brief overview of these issues.

THE IMPORTANCE OF REPORTING RESULTS

This final step in the ROI Methodology is critical. Evaluations of technology-based learning programs are essentially useless if the results are never communicated. Communicating the evaluation results of the program allows for improvement and provides the necessary feedback to those interested in the outcomes of the learning. In this way, others in the organization can understand the value the programs bring to the organization. Communication can be a sensitive issue—there are those who will support the technology-based learning program regardless of the results. There are others, however, who are skeptical regardless of what the data show. Some will form their opinions about the program based on how the results are communicated. Different audiences need different information, and the information needs to be presented in a variety of ways to ensure that the message comes across appropriately. There are five steps to take into account when planning your communication strategy:

1.  Define the need for the communication.

2.  Identify all the audiences for the communication.

3.  Select the media for the audiences.

4.  Develop the information to be communicated.

5.  Communicate the results and evaluate the results of communication.

IDENTIFY THE NEEDS

There are a variety of needs that can be addressed through the communication process. Those needs range from getting approval for programs, to satisfying curiosity about what the program is all about. Sometimes it is necessary to gain additional support and affirmation for programs, or to gain agreement that a change or improvement in a program needs to occur. Often, the purpose of communicating the results of the program is to build credibility for the programs. Many times, the report reinforces the need to make changes to the system to further support the transfer of learning to implementation and business impact. Communicating the results can also serve to prepare the learning team for changes in the organization or, better yet, to give the staff the opportunity to increase their influence.

Communication is often conducted to enhance the entire process as well as to emphasize a specific program’s importance to the organization. The communication process is used to explain what is going on, why something might or might not have occurred, and the goals to improve a program when it results in a negative ROI.

When a pilot learning program shows impressive results, use this opportunity to stimulate interest in continuing the program as well as interest among potential future program participants. The communication process can also be used to demonstrate how tools, skills, or new knowledge can be applied to the organization. Table 5-1 provides a list of possible needs that can be addressed through the communication process. The next step is to identify the audience who can best help address that need.

TABLE 5-1. Reasons for Communicating Results

1.  Reasons related to the technology-based learning:

  Demonstrate accountability for expenditures.

  Secure approval for a learning program.

  Gain support for all learning programs.

  Enhance reinforcement of the learning program.

  Enhance the results of future learning programs.

  Show complete results of the learning program.

  Explain a learning program’s negative ROI.

  Seek agreement for changes to a learning program.

  Stimulate interest in learning through technology.

  Encourage participation in learning programs.

  Market future learning programs.

2.  Reasons related to management:

  Build credibility for the learning team.

  Prepare the learning team for changes.

  Provide opportunities for the learning team to increase influence.

3.  Reasons related to the organization:

  Reinforce the need for system changes to support learning transfer.

  Demonstrate how tools, skills, and knowledge add value to the organization.

  Explain current processes.

IDENTIFY THE AUDIENCE

When the purpose for communicating results is clear, the next step is to determine who needs to hear the results in order to satisfy the communication need. If the need for communicating results is to secure approval for a new program, consider the client or the top executive as the target audience. If the purpose of communication is to gain support for a program, consider the immediate managers or team leaders of the targeted participant group. If the purpose of communication is to improve the program, target the designers and developers. If it is important to demonstrate accountability for technology-based learning programs, then the target audience would be most—if not all—employees in the organization. It is important to consider the purpose of the communication to determine the appropriate audience. Listed below are the key questions to ask to make this determination:

Is the potential audience interested in the program?

Does the potential audience really want to or need to receive this information?

Has someone already made a commitment to this audience regarding communication?

Is the timing right for this message to be presented to this audience?

Is the potential audience familiar with the program?

How does this audience prefer to have results communicated to them?

Is the audience likely to find the results threatening?

Which medium will be most convenient to the audience?

There are four primary audiences who will always need the results of the ROI studies communicated to them:

The learning and development team should receive constant communication of the results of all levels of evaluation. Level 1 and 2 data should be reported to the learning and development team immediately after the program is implemented. This provides them the opportunity to make adjustments to the program prior to the next offering.

Participants are a critical source of data. Without participants, there are no data. Level 1 and 2 data should always be reported back to participants immediately after being analyzed. A summary copy of the final ROI study should also be provided to participants. In doing so, they see that the data they are providing for the evaluation is actually being used to make improvements to the program. This enhances the potential for additional and even better data in future evaluations.

Participants’ managers are critical to the success of learning programs. Without managers’ support, it will be difficult to get participants engaged in the program, and the successful transfer of learning will be jeopardized. Reporting the ROI study results to immediate managers demonstrates to them that employees’ participation in the program yields business improvement. Managers will see the importance of their own roles in supporting the learning process from program participation to application.

The client (the person or persons who fund the program) should always receive the results of the ROI study. It is important to report the full scope of success, and clients want to see the learning program's impact on the business, as well as the actual ROI. While Level 1 and 2 data are important to the client to some extent, it is unnecessary to report these data to the client immediately after the program is implemented. The client's greatest interest is in Level 4 and 5 data. Providing the client with a summary report for the comprehensive evaluation will ensure that the information clearly shows that the program is successful and, in the event of an unsuccessful program, that a plan is in place to take corrective action.

SELECT THE MEDIA

Consider the best means for communicating the results. As in other steps in the ROI Methodology, there are many options—meetings, internal publications, electronic media, program brochures, case studies, and formal reports. The choice of media is important, especially in the early stages of implementing the ROI Methodology. Make sure to select the appropriate medium for the particular communication need and target audience.

Meetings

When considering meetings as the medium for communication, look at staff meetings and management meetings. If possible, plan for communication during normal meeting hours so as to avoid disrupting the audiences' regular schedules. This approach does present the risk of having to wait until some future meeting when the report can be added to the agenda. In most cases, however, key players will be interested enough in the ROI study that securing a slot on the earliest possible meeting agenda should not be a problem. Another option is a panel discussion in which a participant, and perhaps that participant's manager, discusses the program; panel discussions can occur at regularly scheduled meetings or at a special meeting focused on the program. Best practice meetings are another opportunity to present the results of the learning program. These meetings highlight the best practices in each function within the organization. This might mean presenting the ROI study in a panel discussion at a large conference, with managers who oversee learning programs and managers from a variety of organizations. Business update meetings also present opportunities to provide information about the program.

Internal Publications

Internal publications are another way to communicate to employees. Use these internal publications—newsletters, memos, postings on bulletin boards—to report program progress and results, as well as to generate interest in current and future programs. Internal hard copy communications are the perfect opportunity to recognize participants who have provided data or responded promptly to questionnaires. If incentives were offered for participation or for prompt responses to questionnaires, mention this in these publications. Be sure to accentuate the positive and announce compliments and congratulations generously.

Electronic Media

Electronic media, such as websites, intranets, and group emailing, are important communication tools. Take advantage of these opportunities to spread the word about the activities and successes related to programs. When using group email, whether organization-wide or targeting certain audiences, make sure that message content is solid and engagingly crafted.

Brochures

Program brochures are another way to promote learning offerings. Reporting results in a brochure that describes a program’s process and highlights successes can generate interest in a current program, stimulate interest in coming programs, and enhance respect and regard for the team who owns the programs.

Case Studies

Case studies are an ideal way to communicate the results of an ROI evaluation. They demonstrate the value that learning brings to the organization and provide others an opportunity to learn from your experience. There are multiple outlets for case studies, including books (such as this one) and learning courses offered within an organization. The ROI Institute uses case studies as a key component in training others in evaluation. Through case studies, others can learn what worked and what didn't.

Formal Reports

A final medium through which to report results is the formal report. There are two types of reports—micro-level reports and macro-level scorecards—that are used to tell the success of programs. Micro-level reports present the results of a specific program and include detailed reports, executive summaries, general audience reports, and single-page reports. Macro-level scorecards are an important tool in reporting the overall success of technology-based learning programs.

DEVELOP THE REPORT

There are five types of reports to develop to communicate the results of the ROI studies. These include the detailed report (which is developed for every evaluation project), executive summary, general audience reports, single-page reports, and macro-level scorecard.

Detailed Reports

The detailed report is the comprehensive report that details the specifics of the program and the ROI study. This report is developed for every comprehensive evaluation conducted. It becomes the record and provides the opportunity to replicate the study without having to repeat the entire planning process. It is possible to save time, money, effort, and a great deal of frustration by building on an existing study. The detailed report contains six major headings:

need for the program

need for the evaluation

evaluation methodology

results

conclusions and next steps

appendices.

Need for the Program

Define and clarify the objectives for the program, making sure that the objectives reflect the five levels of evaluation. Objectives should relate to the participants’ perspective, describe what participants are intended to learn, reflect how they are intended to apply what they have learned, and reflect the outcomes that the knowledge and skills gained in the program will have on the organization. Objectives also present the target ROI and how that particular target was determined.

Need for the Evaluation

Typically, if the program is intended to influence Level 4 measures, this in itself presents a need. In some cases, the Level 4 measures were never developed, so the intent of the evaluation is to understand the influence the program has had, or is having, on the organization. The intent may also be to understand the extent to which the program achieved its objectives. Sometimes the need for the evaluation stems from the request of an executive. Whatever the reasons, clearly state them in the report. Although this report will be distributed to key audiences, it will also serve as the tool to refer to in future evaluations and to describe what happened during this particular evaluation.

Evaluation Methodology

This clear and complete description of the evaluation process builds credibility for the results. First provide an overview of the methodology. Then describe each element of the process, including all options available at each step, which option(s) were chosen, the reasons for those choices, and all actions and activities related to each step.

For the data collection section of the report, detail how the data were collected, why those data were collected, from whom the data were collected and why those particular sources were used, when the data were collected, and why those data collection procedures were selected. Also display a completed, detailed copy of your data collection plan.

After the data collection plan has been described, explain the ROI analysis procedures. Clearly state the various ways the effects of the program could have been isolated and explain why the chosen method was selected. In essence, answer the question, "Why did you do what you did?" When explaining data conversion, describe how the monetary values for the Level 4 impact measures linked to the program were developed, again laying out the range of possibilities for data conversion and explaining why the selected techniques were chosen.

Finally, address the cost issue and provide the cost categories included in the ROI analysis. At this point, do not include the actual cost of the program; if the cost is introduced too early, the audience will focus solely on the cost and their attention will be lost. As with data collection, provide a detailed copy of the ROI analysis plan so that the audience can see a summary of exactly what happened.
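To make the cost discussion concrete, the tally of fully loaded costs can be sketched as a simple sum over categories. The category names and dollar amounts below are hypothetical, purely for illustration; the real categories come from the program's ROI analysis plan.

```python
# Hypothetical cost categories and amounts, for illustration only; an actual
# report would list the categories from the program's ROI analysis plan.
cost_categories = {
    "needs assessment": 5_000,
    "design and development": 30_000,
    "delivery and technology platform": 25_000,
    "participant salaries and benefits": 28_000,
    "evaluation": 12_000,
}

# A fully loaded cost is simply the sum of every category; nothing is left out.
fully_loaded_cost = sum(cost_categories.values())
print(f"Fully loaded program cost: ${fully_loaded_cost:,}")
```

In the report itself, the categories would appear here without dollar values; the totals are held back until after the monetary benefits are presented, as described above.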

Results

In this section, the program that has undergone a rigorous evaluation will shine! Provide the results for all levels of evaluation, beginning with Level 1, reaction and planned action. Explain the intent for gathering reaction data, provide the specific questions the reaction data answer, and report the results. Then move on to Level 2, learning. Explain why it is important to evaluate learning, present the key questions that learning data answer, and report the results. Next, move on to Level 3, application and implementation. This is one of the greatest parts of the story. Provide evidence that what was taught was used. Discuss how frequently and effectively participants have applied the knowledge and skills gained in the program, and how the support system enabled them to apply what they learned. Discuss the barriers to learning transfer, behavior change, and application. It is important to explain what happened. For example, if the work environment did not support learning transfer, report that here. Also explain that when the evaluation process revealed a problem, such as a support system that was not helping, action was taken by consulting those who could suggest how things might be changed to better support the program next time.

Next, discuss Level 4, business impact, including how the program positively influenced specific business outcomes. Reinforce the fact that the effects of the program were isolated; it must be clear to the audience that other influences that might have contributed to these outcomes were taken into account. Describe the options for isolation and explain why those options were chosen.

Then, report on Level 5, ROI. First, explain what is meant by ROI, clearly defining the ROI equation. Address the benefits of the program, the Level 4 measures, and how they were achieved. Explain how the data were converted to monetary value and detail the monetary benefits of the program. Then, report the fully loaded costs. Recall that earlier in the evaluation methodology section of the report, the cost items were detailed, but a dollar value was not identified. It is here, after monetary benefits are reported, where the dollar values of the costs are outlined. The readers have already seen the benefits in dollar amounts; now provide the costs. The pain of a very expensive program is relieved because the audience can clearly see that the benefits outweigh the costs. Finally, provide the ROI calculation.
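The two standard Level 5 calculations, the benefit-cost ratio and the ROI percentage, can be sketched in a few lines. The figures below are hypothetical and purely illustrative; they are not drawn from any study in this chapter.

```python
# Hypothetical figures for illustration only.
monetary_benefits = 240_000   # Level 4 measures converted to monetary value
fully_loaded_costs = 100_000  # fully loaded program costs

# Benefit-cost ratio compares total benefits to total costs.
bcr = monetary_benefits / fully_loaded_costs

# ROI expresses the NET benefits (benefits minus costs) as a
# percentage of the fully loaded costs.
roi_percent = (monetary_benefits - fully_loaded_costs) / fully_loaded_costs * 100

print(f"BCR = {bcr:.2f}:1")          # BCR = 2.40:1
print(f"ROI = {roi_percent:.0f}%")   # ROI = 140%
```

Note the difference between the two: a BCR of 2.40:1 corresponds to an ROI of 140 percent, because ROI counts only the net benefits above the money invested.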

The last section in the detailed report concerns intangible benefits, which are those items that are not converted to monetary value. Highlight those intangible benefits and the unplanned benefits that came about through the program. Reinforce their importance and the value they represent.

Conclusions and Next Steps

Develop and report the program conclusions based on the evaluation, answering these questions:

Was the program successful?

What needs to be improved?

Explain the next steps, clearly pointing out the next actions to be taken with regard to the program. Those actions could include continuing the program, adding a different focus, removing elements of the program, changing the format, or developing a blended learning approach to reduce the costs while maintaining the benefits achieved. Clearly identify the next steps and set out the dates by which these steps will be completed.

Appendices

The appendices include exhibits, detailed tables that could not feasibly be included in the text, and raw data (keeping the data items confidential). The final report is a reference for readers as well as a story of success for others.

Throughout the report, incorporate quotes—positive and negative—from respondents. While it is tempting to leave out negative comments, ethically, they should not be omitted and including them enhances the credibility and respect for the report. By developing a detailed comprehensive report, there will be a backup for anything communicated during a presentation. When conducting a future ROI study on a similar program, the road map is now clear. Table 5-2 presents a sample outline of a detailed report.

TABLE 5-2. Impact Study Outline for Detailed Report

Executive Summary

Another important report is the executive summary. The executive summary follows the same outline as the detailed report (although it omits the appendices), and each section and subsection is not developed in such great detail. Clearly and concisely explain the need for the program, the need for the evaluation, and the evaluation methodology. Always include the ROI Methodology prior to the results so that the reader understands and appreciates it. The understanding and appreciation build credibility and respect for the results. Report the data from Level 1 through Level 5 and include the sixth measure of success—the intangible benefits. The executive summary is usually 10 to 15 pages long.

General Audience Reports

General audience reports are a great way to describe the success of programs to employees. They may be published in organizational publications, such as newsletters or in-house magazines; reported in management and team meetings, where a brief review of the report can be communicated; and, finally, published as case studies. Case studies can be published internally and externally. There are many opportunities to publish the case study outside the organization, including trade or association publications and academic research publications. The key is to tell the story to show that the programs are working, and that when they don't work, steps are taken to improve them.

Single-Page Reports

A final micro-level report is the single-page report. The success of a program should not be communicated using a single-page report until the audience understands the methodology. If an audience sees the ROI of a program without an appreciation for the methodology used to arrive at the number, they will fixate on the ROI and never notice, much less develop regard for, the information developed at the other levels of evaluation. Therefore, use single-page reports with great care; that said, they are an easy way to communicate results to the appropriate audiences on a routine basis.

Macro-Level Scorecards

Macro-level scorecards can provide the results of the overall impact of learning and development programs, such as technology-based learning programs. These scorecards provide a macro-level perspective of success and serve as a brief description of a program evaluation in contrast to the detailed report. They show the connection between the program’s contribution and the business objectives. The method of isolation is always included on the report to reinforce that credit is given where credit is due. The scorecard integrates a variety of types of data and demonstrates alignment between programs, strategic objectives, and operational goals.

COMMUNICATE RESULTS AND EVALUATE THE RESULTS OF COMMUNICATION

A final step in the communication process is communicating the results, using the selected media and information, and then evaluating the results of that communication. While it is important to evaluate the results of the program itself, knowing how successful you are at communicating those results is just as important. Your program may have been flawless, resulting in well over 100 percent ROI; but if the communication was poorly done, your success may never be known.

So, how do you evaluate the success of your communication? Just like you evaluate your learning program. You observe reaction to the information and the communication process, ask participants if they know what the data mean and understand your evaluation process, follow up on actions taken as a result of the communication, observe subsequent impact (such as funding for a new program), and, if you choose, calculate the ROI on your communication process. How you communicate, to whom you communicate, and when you communicate are critical elements to your overall evaluation strategy.

Remember, there are no perfect ROI studies—someone will find an improvement opportunity in everything you do. As long as you follow the process and the standards, keep your application of the ROI Methodology consistent, and clearly communicate your approach, your results are put into the context of methodology—credible and reliable. With that in mind, good decisions can be made about programs. The case studies in the following chapters provide examples of how the ROI Methodology has been used to evaluate technology-based learning programs in various organizations.

DELIVERING BAD NEWS

One of the most difficult obstacles to overcome is delivering inadequate, insufficient, or disappointing news. Addressing a bad-news situation is an issue for most project leaders and other stakeholders involved in a project. Table 5-3 presents guidelines to follow when addressing bad news. As the table makes clear, the time to think about bad news is early in the process, but without ever losing sight of the value of the bad news. In essence, bad news means that things can change, and need to change, and that the situation can improve. The team and others need to be convinced that good news can be found in a bad-news situation.

TABLE 5-3. Delivering Bad News

• Never fail to recognize the power to learn from and improve on a negative study.

• Look for red flags along the way.

• Lower outcome expectations with key stakeholders along the way.

• Look for data everywhere.

• Never alter the standards.

• Remain objective throughout the process.

• Prepare the team for the bad news.

• Consider different scenarios.

• Find out what went wrong.

• Adjust the story line to: “Now we have data that show how to make this program more successful.” In an odd way, this puts a positive spin on data that are less than positive.

• Drive improvement.

USING THE DATA

Too often, projects are evaluated and significant data are collected, but nothing is done with the data. Failure to use data is a tremendous obstacle, because once the project has concluded, the team has a tendency to move on to the next project or issue and get on with other priorities. Table 5-4 shows how the different levels of data can be used to improve projects. It is critical that the data be used—they were essentially the justification for undertaking the project evaluation in the first place. Failure to use the data may mean that the entire evaluation was a waste. As the table illustrates, many reasons exist for collecting the data and using them after collection. These can become action items for the team to ensure that changes and adjustments are made. Also, the client or sponsor must act to ensure that the uses of data are appropriately addressed.

TABLE 5-4. How Data Should Be Used

FINAL THOUGHTS

This chapter discussed a crucial area of the ROI Methodology, reporting results. When an ROI analysis has been completed successfully, the results must be communicated to various individuals with interest in the project. Proper communication of results is imperative for successful implementation of the ROI Methodology. The final chapter in part I discusses what is necessary to achieve business results from a design perspective.

For more detail on this methodology, see The Value of Learning: How Organizations Capture Value and ROI and Translate Them Into Support, Improvement, Funds (Phillips and Phillips, 2007, Pfeiffer).
