7

Measuring Transformation, Not Just Participation

In an age where concepts like big data are front and center, L&D measurement must evolve from measuring participation to measuring transformation. In the Track Action, select measures that indicate progress on the desired on-the-job behavior and KPI changes, as defined by the Change Action. This is how you will confirm that the learning cluster is working for everyone—learners, the business, and even L&D. With these data, L&D shifts from providing the standard backward-looking report card to providing leading data that the business and L&D can use for continuing transformation.

With the Track Transformation of Everyone’s Results Action, C-level leaders stop asking the backward-looking question:

“Is the money we spend on L&D worth it?”

and start asking the forward-thinking question:

“What is the data telling us we need to conquer next to meet talent needs?”

Today, whether we are talking about online or offline learning, we in L&D most often report learner usage and reaction. We measure how many people use a particular platform or have completed a particular course or class. We measure satisfaction through end-of-class “smile sheets” or the “Like” button next to online course offerings. Yet, as emphasized in the Change Action (in chapter 3), where you put your focus is where you are going to go. When it comes to measures, is focusing on learner reaction and usage where we want to go? If we tell our leaders and employees that this is what L&D contributes to the enterprise, will they appreciate and value our work?

By putting the focus on usage and reaction, we are saying that what’s important is the volume of people who use our products and how they feel about the in-the-moment experience. Of course, after many years of seeing such data, this is exactly the data that organizational leaders now expect. But what do we learn about business impact from such metrics? In this Action, L&D shifts the focus to on-the-job transformation by reporting changes both in KPIs and in employee behavior and attitudes. By doing so, we share a fuller story about what’s important: sustained performance improvement, plus forward-looking data that help shape future learning initiatives.

In this chapter, we will first look at a different approach to L&D measurement, an approach that tracks transformation for everyone—learners, the business, and L&D. Then we’ll discuss the Track Tool, which helps break down the measurement selection process, step by step. Finally, we’ll describe changes in our measurement approach that can shape the modern learning approach.

A Common L&D Story

Cast of Characters:

The Scene: The leadership team has gathered for its regular meeting. Today’s agenda includes the HR year-end report and budget request for next year.

“Next up on the agenda is the HR report,” Doug said to the leadership team. “Chris, you’re up!”

“Thanks, Doug,” Chris said. “As you know, HR is composed of several departments. I’ve invited the leaders of each department to join us and tell you their plans for the upcoming year. First up is Marissa from Learning and Development. Marissa?”

“Thanks, Chris. Thank you all for giving me the opportunity to share the great work we are doing in L&D,” Marissa said as the L&D scorecard was handed out. “As you can see from our scorecard, we are on track for every measure. We look at how many employees we serve, the efficiency of our training development process, and the effectiveness of the training we provide based on employee ratings, tests, and usage.”

Scorecard for L&D: September 2019–August 2020

Marc interrupted. “Marissa, it’s great to see that you have met all your goals. And it’s nice that you saved some cash by pulling the training instructors from in-house versus external suppliers. But I have to ask, what is the overall value we’re getting for the budget you have? Your operating budget is relatively stable, and the annual project budget is based on the projects that the leadership team requests or agrees to do. Can you give us a sense of what we are getting for that money?”

“Now let’s be fair here, Marc,” said Raj, trying to take the heat off Marissa. “For one, we get new hire training that my new hires require. Without that, recruiting couldn’t get anyone to accept our offers.”

“Plus,” Chris joined in, “there is a huge cost avoidance of fines from government agencies if we did not have the compliance training in place and someone who can pull the training stats for auditors.”

“OK, I get it,” Marc relented. “But you can’t blame your CFO for wanting a few more hard-core numbers. And, Chris, when I see that employees average an 84 percent score on compliance training, that concerns me. Shouldn’t they be getting it 100 percent correct every time?”

The conversation devolved and got off track. Chris called time so the next department could present. She asked Marissa to work with her on a memo to address the concerns raised in the meeting. Later, Marissa told her team, “I got through the C-suite meeting. It went about the same as most years. Chris will do a good job of protecting our budget, as she has always done.”

The Issues: How can the OK-LCD model make a difference for Marissa and Chris? As you explore this chapter, form your own opinion on how to help L&D prevent some of the awkward concerns and questions raised in this C-suite meeting:

• Are L&D goals meaningful to leadership?

• Why does L&D’s worth come up repeatedly, even though leaders intuitively recognize the impact and necessity of having a learning function?

• How can L&D become a competitive advantage for the company, instead of simply reacting to C-suite requests?

The Action Explained

Recall from the Change Action in chapter 3 how we in L&D have been constricted in defining our goals because we’ve limited ourselves to what can be accomplished by the end of a course. While L&D has long attempted to measure business impact and ROI, that impact has been difficult to demonstrate. After all, how would anyone know if one particular learning asset had a sustained impact on performance on the job? Trying to isolate the impact of a single program is one reason many L&D teams give up on tracking transformation. With the OK-LCD model, it’s time to make another go of it. The Track Action is about new measurement thinking, made possible with learning clusters; using measures to report L&D’s impact; measuring the combined effect of multiple learning assets; and choosing measures that are relevant for the three learning touchpoints.

The New Possibility With Learning Clusters

By delivering learning clusters with learning assets that meet the many moments of learning need, we have shortened the distance between L&D’s work and employee application on the job. This reduces the unpredictability of what happens after learners leave the classroom (Figure 7-1). Now, with the broader scope of learning clusters, we can have more confidence that L&D products are affecting performance on the job. However, by increasing the number of learning assets in our product line, the number of possible measures also increases. We need to be intentional about what measures we track so that we can report an accurate, meaningful picture of the impact of a learning cluster without being buried in data.

Figure 7-1. Reducing the Unpredictability of Measuring Impact

The Track Action is about selecting measures for a learning cluster that track the transformation we set out to achieve, as captured in the strategic performance objective. To track transformation in an impactful way, consider the main reasons measures are important. Powerful measures can:

• Articulate meaningful targets so learners, L&D, and the business know the “finish line” that they are currently shooting for (KPIs, on-the-job behaviors).

• Demonstrate value that invites further support through funding and resources.

• Spread awareness of the learning cluster and its availability to learners.

• Provide internal feedback so L&D knows what is working for learners and what needs improvement.

There’s no doubt metrics are important. Now we need to figure out how to make them work for us.

Emphasizing Metrics That Show and Tell Transformation

One of the key principles behind the OK-LCD model is to change on-the-job behavior. When starting on the Track Action, review the Change Action, which describes the desired behaviors and business metrics for the learning cluster. For those of you familiar with Kirkpatrick’s Levels of Evaluation (see the sidebar for a quick refresher), you might notice the consistency between the OK-LCD philosophy on measures and the Kirkpatrick levels. In our experience, L&D struggles for resources because its value isn’t clear. That’s why our emphasis in the Track Action is to measure and report Kirkpatrick Levels 3 and 4. These are the data the business cares about most!

But our measures may fall short if we don’t build awareness. First, learners must be aware that the learning cluster exists. Be sure to create a marketing plan or system so that employees will use your products. Confirm that your plan or system works by measuring awareness of the learning cluster. Next, ensure that users of your products know that the product comes to them through L&D. Every learning cluster, every learning asset, and every L&D system should have L&D’s logo or brandmark on it. L&D can then include metrics like Net Promoter Score (NPS) to gain the customer perspective on our L&D products.

NPS is based on research by Frederick Reichheld and Bain & Company, which found that a single question is most effective in understanding customer loyalty: “How likely are you to recommend xyz to another?” (Reichheld 2003). While NPS doesn’t tell you why a score is low or what to do about it, high NPS scores indicate viral enthusiasm about a product, so much so that many customers would be willing to stake their personal reputation on its value. This is an idea worth reapplying for our learning clusters. Simply by reporting a good NPS, we gain traction and awareness that can help spread transformation.
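To make the metric concrete, here is a minimal sketch of the standard NPS calculation in Python (the ratings are purely illustrative). Respondents answer on a 0–10 scale; promoters score 9–10, detractors 0–6, and NPS is the percentage of promoters minus the percentage of detractors.

```python
def net_promoter_score(ratings):
    """NPS from 0-10 'How likely are you to recommend...?' ratings.

    Promoters score 9-10, detractors 0-6; NPS is the percentage of
    promoters minus the percentage of detractors (range -100 to +100).
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Illustrative survey responses for one learning cluster
print(net_promoter_score([10, 9, 9, 8, 7, 6, 10, 3]))  # 25.0
```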

Finally, to meet L&D’s need for improvement feedback, and to meet baseline expectations of measurement, Levels 1 and 2 should still be tracked. However, the primary focus is on metrics that reflect performance change across the target learner groups, and on the resulting business impact.

Kirkpatrick’s Levels of Evaluation

In 1960, Don Kirkpatrick proposed four levels for the evaluation of training that are very useful for the L&D industry. The four levels are:

Level 1: Reaction. The degree to which [learners] find the [learning asset] favorable, engaging, and relevant to their jobs.

Level 2: Learning. The degree to which [learners] acquire the intended knowledge, skills, attitude, confidence, and commitment based on their participation in the [learning asset].

Level 3: Behavior (or Application). The degree to which [learners] apply what they learned during training [or through use of the learning asset] when they are back on the job.

Level 4: Results (or Impact). The degree to which targeted outcomes occur as a result of the training and the support and accountability package [or as a result of the learning cluster].

Other methodologies have built on Kirkpatrick’s work. For example, Phillips’s ROI Methodology adds two other levels: Level 0—which measures the number of training programs and attendees, audience, costs, and efficiencies—and Level 5, ROI, which compares monetary benefits with the costs of the program.
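As a worked illustration of Phillips’s Level 5 (the numbers are hypothetical, not from a case in this book): a program that costs $50,000 and produces $80,000 in monetary benefits has an ROI of ($80,000 − $50,000) ÷ $50,000 × 100 = 60 percent.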

Measuring the Combined Effect of Multiple Assets

When it comes to measurement, a key difference between the OK-LCD model and past models is that the Track Action considers the impact of multiple learning assets that are organized into a learning cluster. This is something very new for L&D. Existing methodologies are strictly focused on evaluating one learning asset, and, in practice, only formal learning assets. OK-LCD practitioners measure Levels 3 and 4—Behavior and Results—at the overall learning cluster level, instead of the individual learning asset level. This is absolutely critical.

In the past, L&D’s success or failure often hinged on usage of a single learning asset. If people were not using that asset, it might be removed, updated, or left collecting dust. In the learning cluster world, however, L&D designs assets knowing that not everyone will use each asset equally! That is intentional, part of our work in the Learn Action. Now, instead of seeing low usage as a failure on the part of L&D, we celebrate the ability to meet nuanced learner needs by reporting the success of the full learning cluster instead. But we can still remove assets that aren’t working: metrics like NPS and Kirkpatrick Levels 1 and 2 tell us whether an asset is meeting a persona need, and whether its inclusion in the learning cluster needs to be reviewed.

Choosing the Right Metric for Each Learning Touchpoint

This Action highlights how Reaction, Learning, and Behavior measures may manifest differently depending on whether the learning asset sits at a social, formal, or immediate learning touchpoint. For example, we know from the neuroscience of learning and the AGES model that emotion has a lot to do with learning. Consider how a learner’s emotions might differ during a coaching call with a peer versus covering the same topic in an e-learning course. When it comes to learning assets that belong to the three different learning touchpoints, what you measure and how you measure it could look very different. As you measure these differences, you will discover what works well for your learner personas to drive the desired behavior.

Here’s an example. Consider a YouTube-like video as an immediate learning asset. How could we evaluate its effectiveness at the Application level? The metric could be as simple as asking if the learner was able to complete the task at hand after watching the YouTube video. Alternatively, you could measure whether the learner had to seek out additional materials to achieve success. In this case, one metric could be to directly ask the learner: “Was this video all you needed to complete your task?” Another method would be to use background analytics to indirectly measure and track how many people stop the video early, or move on to another asset, instead of completing the video and getting back to the work task at hand.
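Here is one way the indirect, analytics-based measure might look: a minimal sketch in Python, assuming a hypothetical event log that records what fraction of each video a viewer watched (all names and numbers are illustrative).

```python
from collections import defaultdict

# Hypothetical analytics events: (video_id, viewer_id, fraction_watched)
events = [
    ("ladder-safety", "u1", 1.00),
    ("ladder-safety", "u2", 0.35),  # stopped early
    ("ladder-safety", "u3", 0.95),
    ("forklift-check", "u1", 0.20),
    ("forklift-check", "u4", 0.15),
]

def completion_rates(events, threshold=0.9):
    """Share of views that reached `threshold` of the video's length."""
    views = defaultdict(int)
    completes = defaultdict(int)
    for video, _viewer, fraction in events:
        views[video] += 1
        if fraction >= threshold:
            completes[video] += 1
    return {video: completes[video] / views[video] for video in views}

print(completion_rates(events))
# e.g., {'ladder-safety': 0.667, 'forklift-check': 0.0}
```

A video most viewers abandon early is a prompt to investigate, not an automatic failure; the learner may have found the one step they needed in the first thirty seconds.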

This same how-to content could look very different in a social or formal learning setting. For example, at the end of a class, you might only be able to measure the Learning level. You might use a knowledge check or a practice simulation to see if learners can complete the task on their own. The metric is simply the number of people who pass the knowledge check or complete the simulation. An alternative for in-class activities is to ask the trainer to complete an after-class evaluation, with questions such as, “In your opinion, did at least 80 percent of participants demonstrate competency during the in-class activity?”

Finally, look for measures that track L&D efficiency and effectiveness. For example, a simple number-of-hits measure on a reading asset can provide clues about user preferences, as well as about whether L&D needs to increase awareness that the asset exists. If no one is clicking a learning asset, and you have confirmed awareness through other means, maybe it’s time for L&D to reduce its own workload by eliminating assets like this. A caution: Don’t drop a learning asset type just because only 10 percent of learners use it. It’s highly possible that those 10 percent of learners really value and need that asset. Keep this in mind as you design your measures so you can get the full picture of what your learners want and need to enable them to learn and behave in a way that drives results.
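To put that caution into practice, one approach is to flag an asset for review only when both usage and satisfaction are low, so a niche but well-loved asset survives. A minimal sketch, with hypothetical data, field names, and thresholds:

```python
# Hypothetical per-asset metrics: share of learners using the asset, and
# the share of "Was this helpful?" responses that were positive.
assets = {
    "quick-ref-pdf": {"usage": 0.10, "helpful": 0.92},  # niche but valued
    "intro-webinar": {"usage": 0.08, "helpful": 0.31},  # low use, low value
    "practice-sim":  {"usage": 0.64, "helpful": 0.88},
}

def review_candidates(assets, min_usage=0.15, min_helpful=0.5):
    """Flag assets where BOTH usage and helpfulness fall below threshold."""
    return [name for name, m in assets.items()
            if m["usage"] < min_usage and m["helpful"] < min_helpful]

print(review_candidates(assets))  # ['intro-webinar']
```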

When you complete this fifth and final Action, you will have selected the measures and systems you will be tracking, so that you can begin to Track the Transformation of Everyone’s Results: L&D, your learners, and your business!

The OK-LCD Story Continues

Cast of Characters:

The Scene: Jon just shared his proposal for a learning cluster for compliance, including new measures. Marissa is giving him feedback.

“I’m liking this new learning cluster approach for compliance training, Jon,” Marissa said after hearing his proposal. “In particular, I like how you are planning out the measures. Let me try to summarize what you are proposing to do differently. Tell me if I’ve got this right:

“First, everything is now tracked on our upgraded LMS. Our dashboard shows what percentage of employees—and managers—are behind on completing compliance tests. Monthly reports flag issues that we can follow up on. Measures are no longer a once-a-year event, after the fact, when it is too late to intervene and make improvements.

“Second, you added some learning assets that fall outside of the formal compliance e-learning course and the related test. These would reach learners at the immediate and social learning touchpoints. I can see how these would provide spaced learning throughout the year, in between the annual e-learning course and the test. I’m particularly fond of your ‘compliance story of the month’ concept. Soliciting stories from across all the employees, internal auditors, and managers can provide great insights for all of us. These stories are a different form of measurement: a qualitative measure that says behavior change is happening.

“Third, you have added some measures—the thumbs-up and comments feature for the video bites, and online job aids—to indicate if employees are finding these materials, and if employees think they are useful.

“Lastly, you are suggesting we get the internal audit scores, by department, and correlate them with the percent on-time completion of compliance training and with the percentage of correct answers on the compliance tests. This may provide leverage for getting some recalcitrant organizations to comply. I suspect that the numbers will show better audit scores for those organizations that have a higher percentage of employees who use the training.”

Jon nodded. “You’ve got it! I’d like to experiment this first year with a few other measures, especially for the overall impact of the learning cluster. But that’s most of it. It feels good to be able to go beyond the once-a-year ‘compliance mill’ that everyone in the industry seems to be doing, and provide support for employees so that compliance training is more user friendly.”

The OK-LCD Difference: By completing the Track Action, Jon was able to suggest measures that showcase the fuller story of the learning cluster’s impact on the business, rather than simple usage or reaction metrics!

The Action’s Impact

The Track Action sits at the far right of the model because it is the last Action you take before rolling out your learning cluster to the organization, even though you’ve been considering these measures since the Change and Surround Actions. It’s also an ongoing Action that provides feedback and clues for improving the previous four Actions over time, as indicated by the feedback arrow in the model graphic.

Many organizations do not track results, relying instead on intuition, gut feelings, and historical procedures as they design their training programs. Other organizations track Level 1 to determine if learners like a training, based on the idea that if they like it, they are more likely to be learning from it. Others use Level 2 evaluations to prove that participants learned something. But what is the value to the company if participants like it, learn from it, and yet don’t do it? L&D can do better. We can prove to ourselves and others that our training drives behavior that drives results. And what if it doesn’t? Well, that’s learning, too. And who would want to spend their career building things that don’t work? Let’s dig into the measures, find out what drives behavior and results, and go build those things (Figure 7-2).

Figure 7-2. Summary of Track Action Mindset Shifts

The Action Implemented

The Track Tool can help you collect your learning from other Actions and convert it into strong measures that demonstrate the impact of L&D’s product—the learning cluster. In this section we will walk you through the nuances of the three-step tool (Figure 7-3).

Figure 7-3. Track Tool

Learning Cluster Name: ______________________________________________________

Learning Cluster Strategic Performance Objective:

Step 1: Select Measures for the Overall Learning Cluster

The purpose of this set of measures is to determine if the approach is working. Can you identify a trend as you build a critical mass of employees who are changing their behavior such that the KPIs improve?

Step 2: Select Measures for Key Learning Assets

The purpose of this set of measures is to determine if L&D is meeting learner needs and to identify assets whose metrics tell a story that gets attention.

Step 3: Select a Few Measures to Share

1.

2.

3.

4.

5.

There are three main steps for the Track Tool. The first is to consider how to measure the learning cluster by reviewing the strategic performance objective developed in the Change Action. The second is to identify different methods and metrics for key learning assets, in part by thinking through differences in learning touchpoints. Finally, make strategic choices to narrow down the measures and metrics, and from this reduced list of measures, select a handful that the L&D team would like to report to leadership for this learning cluster.

Step 1: Review the Strategic Performance Objective

Identify what is most important to measure for the learning cluster by reviewing your strategic performance objective. If you did a thorough job on the Change Action, the KPI and behaviors portions will outline “what to measure” for the performance of the overall learning cluster. Identify measures of the behavior change that stakeholders expect, then determine how to get these data; you will likely need to partner with other parts of the organization to obtain them. Such behavior change measures may be a valuable early predictor that the KPI is about to improve.

Step 2: Review Each Learning Asset

What learning touchpoints do the learning assets fall under? List ways you can get Level 1 Reaction evaluations for each learning asset, with a focus on gathering feedback for asset improvement. Here are some ideas (a sketch of one way to record these signals follows the list):

• Videos: Include a thumbs-up/thumbs-down feature with a comment box. Precede the comment box with a question, such as “Was this helpful?”

• Job Aids: On interactive PDF links, include a “Was this helpful? yes/no” question with an open box for inputting “Why or why not?”

• Peer Coaching or Mentoring: Have a quick two- or three-question evaluation sheet each party fills out at the end of an interaction. For virtual meetings, this could be part of a mobile app or an online social community tool such as Microsoft Teams.

• Any Asset: Measure NPS by simply asking, “Would you recommend this learning asset to others?”
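One way to keep these Level 1 signals comparable across very different asset types is to land them all in a shared record format. A minimal sketch, with every field name illustrative:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Level1Response:
    """One learner reaction, in a shared shape across asset types."""
    asset_id: str      # e.g., "forklift-video-03"
    touchpoint: str    # "immediate", "social", or "formal"
    helpful: bool      # thumbs-up / "Was this helpful?"
    comment: str       # free-text "Why or why not?"
    collected_on: date

responses = [
    Level1Response("forklift-video-03", "immediate", True,
                   "Quick and clear", date(2024, 3, 4)),
    Level1Response("peer-coaching", "social", False,
                   "Hard to schedule", date(2024, 3, 6)),
]

# Helpfulness rate for one asset
video = [r for r in responses if r.asset_id == "forklift-video-03"]
print(sum(r.helpful for r in video) / len(video))  # 1.0
```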

Then go beyond Level 1 by thinking about how learners can demonstrate growth in performance for each learning asset. For social assets, one way is to let others—perhaps an SME, a peer, or the learner’s manager—remark on the growth in performance; think of ways to capture such remarks within your context. For assets at the immediate learning touchpoint, consider self-evaluation or self-reflection on performance using an L&D-provided rubric. For formal learning touchpoints, try face-to-face demonstrations of competence.

Consistent qualitative data are important in demonstrating Level 3 Behavior and Level 4 Results. For example, if a learning cluster supports a culture change, and one learning asset is a discussion forum, you could create a weekly word cloud of the forum responses to see how the nature of the language changes over time. Imagine how much more powerful these data would be to stakeholders than simply reporting the number of posts in the forum. This is where emerging technology like big data can really help. (See the Future Technology sidebar to learn more.)
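As a minimal sketch of the word-cloud idea, assuming forum posts are available as plain text tagged by week (the posts, stop-word list, and layout are all illustrative):

```python
from collections import Counter
import re

# Hypothetical forum posts: (iso_week, text)
posts = [
    ("2024-W01", "Still confused about the new approval process"),
    ("2024-W01", "The process feels slow and confusing"),
    ("2024-W06", "Used the new process today and it saved me an hour"),
    ("2024-W06", "Coaching my team on the process, going well"),
]

STOP = {"the", "and", "a", "an", "it", "me", "my", "on", "about", "of", "to"}

def weekly_top_terms(posts, n=3):
    """Top-n terms per week -- the data behind a weekly word cloud."""
    by_week = {}
    for week, text in posts:
        words = [w for w in re.findall(r"[a-z']+", text.lower())
                 if w not in STOP]
        by_week.setdefault(week, Counter()).update(words)
    return {week: counts.most_common(n) for week, counts in by_week.items()}

for week, terms in weekly_top_terms(posts).items():
    print(week, terms)
```

Charting or word-clouding the weekly counts then shows whether the language is shifting from, say, “confusing” toward “coaching” as the culture change takes hold.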

Step 3: Make a Selection

Finally, look through all the methods and metrics you’ve planned to track. Which ones are most likely to resonate with stakeholders and demonstrate the impact of L&D’s work toward the goals of the strategic performance objective? Remember, people pay attention to what you share. Here, you really want to separate the metrics that are important for you in L&D from those that are worth sharing with stakeholders and employees to gain sponsorship and awareness of your contributions to their success.

Level 1 evaluation metrics typically become a part of L&D’s ongoing improvement strategy, but this type of measure does not necessarily add value for stakeholders and employees. Be careful that you do not expend all your resources responding to Level 1 suggestions for improvements. Instead, step back and look at the bigger picture, selecting Level 3 and 4 evaluation metrics that demonstrate to stakeholders and employees the impact and value of L&D. In your reports and scorecards, always include a measure on the impact of the learning cluster.

The Track Tool represents our current approach to measuring the impact of learning clusters. It’s a framework to get you thinking about how different learning assets could require different measures, and how to look at the combined impact of multiple assets on performance change. We encourage you to dig deeper into the topic of measurement, possibly by exploring other proven processes to identify metrics such as:

• Phillips’s ROI Methodology, which is used by more than 5,000 organizations around the world, systemizes and expands on Kirkpatrick’s Levels of Evaluation work. The basis of the methodology is to collect data for a single learning asset at each level of evaluation: efficiency/cost, reaction, learning, application, impact, and ROI (where ROI is strictly a monetary value).

• Basarab’s Predictive Evaluation model, which centers on identifying the intended goals of a learning asset and the behaviors that need to be adopted by learners. The L&D team then strives to predict the percentage of learners who will buy into the intended goals and behaviors, making it possible to calculate the impact on the business.

As the L&D industry moves forward, we need to identify ways to modify these rigorous methods to measure the combined effects of multiple learning assets. If you know of a measurement method, or have ideas to share, please reach out and let us know! Sharing among our own L&D networks is a great way for all of us to continue to build respect for our profession. Then, check back on LearningClusterDesign.com to see which ideas are gaining traction in our industry.

Future Technology—Big Data and L&D

Big data is big news. And now, more than ever, it’s available for L&D to use if we learn how.

Volume of Data

Data are everywhere: sales, marketing, operations, research, LMS, internal websites, and more. If your SPO connects to something significant for the business (as it should!), there are probably already metrics being tracked as part of the business strategy. Reflect on the KPIs in your strategic performance objective and find out who is tracking them, and how the data are being tracked. Suggestion: Explore possibilities with IT, which typically knows where the data are. Tap into your customer’s data source or look to other organizations in your company.

Data Tools

Data science, machine learning, and deep learning (a subset of machine learning in artificial intelligence) offer some powerful tools that help businesses dive into their data. These tools help uncover patterns and future trends that, if capitalized upon, can maximize the bottom line. Look for historical patterns and use them to help predict future needs of learners and the business through the process of extrapolation.
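For example, a simple linear fit over historical usage is enough to extrapolate a near-term trend. A minimal sketch with illustrative numbers:

```python
import numpy as np

# Hypothetical monthly active users of a learning cluster, months 1-6
months = np.array([1, 2, 3, 4, 5, 6])
users = np.array([120, 150, 210, 260, 330, 370])

# Fit a line (least squares) and project two months ahead
slope, intercept = np.polyfit(months, users, 1)
for m in (7, 8):
    print(f"month {m}: ~{slope * m + intercept:.0f} projected users")
```

Real forecasting warrants more care (seasonality, confidence intervals), but even a rough projection turns a backward-looking usage report into a forward-looking capacity conversation.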

L&D Capability

What do you need to learn to make big data work for you? Here are some skills that can help:

• Data strategy skills. Make sure you are asking the right question, because the question determines the answer you will get! Also, determine the most efficient, accurate way to obtain the data.

• Data visualization skills. When you have a lot of data, creating a meaningful presentation can be complex, but visuals can simplify complexity.

• Critical thinking skills. Become adept at questioning validity and any biases in the data set or in the interpretations.

• Technology platform skills. Seek certificates or training in data and visualization systems such as Tableau, xAPI, SAS, or other software commonly used at your organization.

These are just some of the aspects of big data that may help you understand employee needs, culture trends, and emerging business issues. L&D can get ahead of requests for more training by using data to strategically choose what learning to build and what infrastructure to develop to support talent development. A word of caution: Big data can create the expectation that you must have all the data before moving forward. But that’s not how it works. Use what you have, and then extrapolate. It’s also important not to use the numbers without thinking. Communicate the full story by sharing not only your data but also your interpretation of it and your process for forming conclusions. Share your stories and your data with others to gain alternative interpretations and concepts. Then use the data to gain support and move forward with your recommendations.

Keep watch on this area of big data, and if you spot a chance to build your big data muscles, take it!

In Practice

After discovering learning cluster design through our two-day workshop, Sravani Tammiraju at Visa developed and implemented a learning cluster for new hire onboarding. For the Track Action, she decided to tell the story of transformative results in several unique ways. Out of all the measures and metrics she could have reported, she focused on three things: Net Promoter Score, qualitative data, and demonstrated competency through a capstone project. The first time she piloted her learning cluster, the learning cluster received an outstanding NPS of 53. For perspective, an NPS rating above 0 is good, above 50 is excellent, and above 70 is world class.

She gathered a round of feedback and made some changes. After the second implementation, the learning cluster received a best-in-class Net Promoter Score of 87! NPS doesn’t just represent a number for her. She knows the learning cluster is demonstrating the value of L&D to employees across the company because, as she relates, “We have people who have been working for two years asking if they can register for this new hire program. It’s all spread through word of mouth.”

In addition to the power of NPS, she is capturing performance transformation through a promotional video. It provided a way to market the L&D program and the company while creating excitement, buy-in from senior leadership, and global adoption: “One of the things that we did that we’ve never ever done before was to have someone recording the students as they went through the week. We then consolidated that into a quick two-minute video at the end of the week. We sent the video to senior leaders, who were very pleased with the learning experience early career professionals were going through.” This video approach did several things. It provided participants with a resource for spaced learning after the onboarding program, and it shared with leaders and others the value of the learning experience in a very impactful way. As a bonus, this video is now used in recruiting efforts so candidates can see the learning opportunities new hires receive in their first few months of onboarding.

Sravani’s new hire program has demonstrated competency development, especially on the technical side. As part of the new hire learning cluster, the learners develop innovative solutions to a current list of challenges in the payments industry, challenges raised by Visa’s vice presidents and senior vice presidents. At present, one such idea from a group of six early career professionals has led to a patent application filing through the innovations team at Visa. This learning cluster has been a game changer for Visa and for these hires in the first three months of their careers! Visa’s leaders are convinced of the power of the learning cluster. According to Sravani, “The response [from leadership] has been phenomenal. The collaboration between leaders and the learning organization has been phenomenal.”

By completing the fifth and final Action, Track Transformation of Everyone’s Results, Sravani was able to tell the story of what she contributed to the company as an L&D professional. In doing so, she gained further sponsorship, raised awareness of L&D’s success throughout the organization, and exceeded expectations for new hires’ on-the-job behavior change. When the story is that powerful, and that convincing, meeting L&D’s budgetary needs becomes a foregone conclusion.

Final Note

In this chapter, we shared the new possibilities for evaluating your work based on the new job of taking responsibility for behavior change in the workplace, not just meeting the end-of-class objectives. As the L&D industry shifts into developing and delivering learning clusters, we can be confident of our impact back on the job and on business KPIs and ROI. We talked about how to address the unique challenges of measuring the impact of multiple learning assets and how each learning touchpoint calls for different methods and metrics of measurement. We walked through the tool to help guide your work in this Track Action. To support you further, consider using some of the learning assets listed here to supplement or reinforce what you have gained from this chapter:

• The Track Tool. We’ve included the most recent version for you in the appendix. For future evolutions, head to LearningClusterDesign.com/Book-Bonus.

• The OK-LCD Learning Cluster (see chapter 9) that we designed to help you grow your capability to use the model in your workplace.

• The chapter 8 example of the OK-LCD model with the five Actions and associated Tools in practice.

• The chapters on the other Actions in the OK-LCD model, so that you can see how each Action supports the others.

Ultimately, this fifth and final Action puts your focus on owning the story stakeholders and employees will hear about successful learning, performance improvement, and your contribution to everyone’s results. Because of the core principles of multiple assets and targeting behavior change on the job, the OK-LCD model opens new doors for L&D to gain sponsorship, meaningful insights, and recognition as an essential culture-builder in the organization. Rather than focusing only on attendance, usage, and satisfaction, the tool in this final Action encourages you to paint a fuller, more accurate picture of the power of your learning clusters.

Reflect

Does your organization measure the impact of learning initiatives today? How do your current measures compare with those described in this chapter?

What one or two ideas from this chapter empower you to put the Track Transformation of Everyone’s Results Action into practice?

How can you shape the story of L&D’s work results and contributions differently, or better, than you are doing today?

How can the OK-LCD model make a difference in the following areas for our story characters:

• crafting L&D goals that are meaningful to leadership

• eliminating questions about L&D’s worth, impact, and necessity

• moving L&D from an order taker to a competitive advantage for the company?

Apply

Consider the latest learning initiative you are working on. Brainstorm some measures that most likely:

• Demonstrate performance change in the workplace.

• Help you learn about stories that demonstrate the learning cluster’s success in the workplace.

Look at the Track Tool in the appendix or download a copy from LearningClusterDesign.com/Book-Bonus. Fill out each section to gain deeper practice with the Track Action.
