3

More Lies About Instructional Design

Mindy Jackson

The truth about learning is that it’s hard to create experiences that are relevant to diverse people. And supporting the transfer of learning is even harder. Our brains forget. The neurological mechanisms for memory encoding and retrieval require time and practice to develop our internal network of knowledge. And everyone’s neural network—and its structures of mental schema—is unique to her experiences. Whether it’s one-to-many training or one-to-one instruction, creating relevant and effective learning experiences is hard.

I try to be candid about what I know about learning. But some people don’t want to listen to reality. They require immediate remedies for their performance and development needs. They seek expedient solutions, development shortcuts, and quick-acting salves. They want to believe that providing an annual 30-minute program on network security means that everyone logs off the computer before stepping away from their desks, or that an eight-hour seminar prepares the sales force for handling questions on new product features, or that two weeks of onboarding teaches a technician how to troubleshoot a breakdown of a $1 million industrial machine. We know it takes more than that to transfer new skills and knowledge: continuing performance support, job aids, mentoring, access to experts and knowledge bases, performance plans and reviews, and so on.

Most lies about instructional design are but fantasies of an alternate reality—a truth we wish existed. The six lies exposed in this chapter result as much from self-deception as from a misunderstanding of the inherent challenges of good instructional design.

Lie #1: A Good Training Program Will Solve the Problem

Attend training and voila! Go forth, lead, perform, and succeed.

A large multinational company wanted a high-quality 20-minute e-learning experience for its employees and customers. The topic was a new process innovation, and they had identified one learning objective: “to change the way the world does business.” It was a terrific 20-minute program. Have you felt the change?

While I was pitching the promise of a day-long transformative executive leadership program, a potential client asked me whether people should attend the program just once. I told the client yes, quickly adding that we hoped the program would be foundational to change and that the organization would follow up with support, incentives, and more training. It’s difficult to defend the merits of a single training event having a deep or lasting effect on your organization—and that’s why training programs need to be designed with a longer view. One-and-done training is rarely a remedy, and often part of the problem, but it can be one element of an effective, broad-based solution.

Research from cognitive science validates the dulling effects of time (forgetting curves) and the strengthening of memory through spaced practice and repetition (spacing effect). Rather than cramming learning into a short time span, effective learning requires repeated exposure and practice over time. So a monthly seminar series or a set of e-learning courses, combined with performance support tools, is more effective than a one-shot approach.
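To make the spacing effect concrete, here is a minimal sketch of an Ebbinghaus-style exponential forgetting model that compares a one-shot event with a spaced schedule. The decay form is standard, but the specific constants, the rule that each review doubles memory stability, and the review schedule itself are illustrative assumptions, not calibrated values.

```python
# A minimal sketch of why spaced practice beats a one-shot event. The
# exponential decay form is Ebbinghaus-style; the constants and the rule that
# each review doubles memory stability are illustrative assumptions.
import math

def retention(days_since_review: float, stability: float) -> float:
    """Estimated fraction retained after a gap: R = exp(-t / s)."""
    return math.exp(-days_since_review / stability)

def simulate(review_days: list[int], horizon: int = 60,
             base_stability: float = 5.0) -> float:
    """Estimated retention at `horizon` days after the first training event."""
    stability = base_stability
    last_review = 0
    for day in review_days:
        stability *= 2.0        # each spaced exposure strengthens the memory
        last_review = day
    return retention(horizon - last_review, stability)

one_shot = simulate(review_days=[0])              # single training event
spaced = simulate(review_days=[0, 7, 21, 42])     # reinforced over six weeks
print(f"one-shot: {one_shot:.1%}, spaced: {spaced:.1%}")
```

Even under these rough assumptions, the spaced schedule ends the 60-day horizon with far higher estimated retention, which is the argument for a monthly series with performance support over a single event.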

Granted, there are times when you want a new learning program to start with intensity. That day-long transformative executive leadership program was meant to catalyze behavior change by creating a sense of urgency, emotional motivation, and impact. It was an intense introduction to certain business problems and new ways of thinking about leading people to deal with those problems. It was an acceleration event, so follow-up and additional support are needed to sustain any change.

Inherent to the lie that one good training program can solve the problem is a self-deception akin to believing that drinking from a fire hose refreshes. Large amounts of content consumption do not equate to large amounts of learning transfer. Organizations thirst for the knowledge and skills their workforce needs to stay current and agile amid the fast-changing disruptions and innovations of business. When the need is now, it is human nature to seek shortcuts—often by trying to accomplish too much at once.

Providing learning—both formal training experiences and informal on-the-job experiences—and performance support requires a long-term view. Rather than being treated as peripheral occasions, learning and performance support yield the best results when integrated into the ecosystem of the workplace and its work practices.

The truth is that learning requires more than one acquisition event. Evergreen knowledge and skills require continuing reinforcement and support.

Lie #2: ADDIE Is Dead

For a five-letter mnemonic device representing a process model, ADDIE (analysis → design → development → implementation → evaluation) sure receives a lot of criticism and hate. But try to think of any project that would not benefit from an inquisitive and iterative process model of define it, design it, make it, try it, and revise it. Alas, DDMTR is neither an attractive nor a memorable mnemonic. That is all ADDIE is—it’s simply much easier to say, remember, and, apparently, hate. I hear complaints that ADDIE is not an agile and iterative development process. Whoever said ADDIE had to be strictly linear?

I recently completed a personal project that was an ADDIE process: creating a family photo album for my brother’s birthday. I started by examining the boxes of photos inherited from our mother and sequencing the photos by time and event. I thought a lot about thematic representations and groupings of relatives we didn’t know (how would I tell that part of the family story through the photos?). Next, I sampled some different patterns for page layouts and started placing the photos. I discovered new things I liked while working, so I made adjustments to previous pages. When I thought I was done, I looked over the entire book and then swapped several of the pages and photos around again. This is a simple example of ADDIE.
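To underline that the process need not be linear, here is a minimal sketch of that photo album project expressed as an ADDIE loop, in which evaluation findings feed the next pass of analysis and design. The sample issues, the phase log, and the stopping rule are hypothetical stand-ins, not a prescribed workflow.

```python
# A minimal sketch of ADDIE run as an iterative loop rather than a strict
# waterfall. The sample issues, phase log, and stopping rule are illustrative
# stand-ins only.

def evaluate(pass_number: int) -> list[str]:
    # Pretend the review surfaces one new revision after the first pass only.
    return ["swap photos on pages 3 and 7"] if pass_number == 1 else []

def run_addie(open_issues: list[str], max_passes: int = 5) -> list[str]:
    """Cycle through the five phases until evaluation stops surfacing issues."""
    log, passes = [], 0
    while open_issues and passes < max_passes:
        passes += 1
        log.append(f"analyze:   pass {passes}, gaps = {open_issues}")
        log.append(f"design:    plan fixes for {len(open_issues)} issue(s)")
        log.append("develop:   build or revise the pages")
        log.append("implement: review the whole album")
        open_issues = evaluate(passes)       # evaluation feeds the next pass
        log.append(f"evaluate:  new issues = {open_issues}")
    return log

for entry in run_addie(["no theme for unknown relatives", "timeline out of order"]):
    print(entry)
```

Nothing in the loop forces a single linear sweep; the rigidity critics complain about comes from how teams choose to run the process, not from the model itself.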

People say ADDIE takes too long. I finished my beautiful photo album in two days, and I’m happy with the results. Would I be as satisfied if I’d just put the pictures into the page sleeves without any forethought, organization, or trial and error? Certainly not. The process I used—simple and flexible—helped me create a better result.

The “ADDIE is dead” lie arises from a wish rather than a truth. Companies wish they could shorten the time to delivery by cutting analysis and design time and moving directly into development. Take the A for analysis and D for design away from ADDIE, and what is left? DIE. ADDIE is not dead; it’s just been reduced to DIE. (How apropos for this new acronym.) But without analysis and design, the training is apt to fail. Engaging and effective learning programs come from informed and intentional design decisions. The best results are accomplished through purposeful planning and reasoned choices aligned to needs and goals. Isn’t this true for any worthwhile project?

Rather than pushing learning projects toward best results, businesses too often place the imperative on driving projects to completion—often using artificial deadlines. My experience has been that timelines drive training development more than any other factor—and that rarely leads to the best possible outcome. In essence, there is an inverse relationship between the time spent on analysis and design and the level of risk associated with the initiative. As the amount of time spent increases, risk goes down; spend less time, and the risk goes (way) up.

In more than 20 years working as an instructional designer, I can recall only three large projects that were allocated enough time and resources for analysis and design. And those three are the best examples in my professional portfolio. They are my exemplars not only because I was proud of the end products, but also because they were huge successes. Their success was validated by the satisfaction, performance gains, and returns on investment the organizations saw. Two of the three returned more than $1 million on the investment within six months of the program launch, and all were on time, on budget, and aligned to business objectives.

I don’t understand what people mean when they say that Agile is a better development process than ADDIE. Agile development means tasks are broken into short phases of work that are assessed, adapted, and reassessed. The ADDIE process model is iterative and agile, and helps designers attend to the important components of instruction. It is scalable to any project budget, schedule, or scope. Its biggest challenge is the lack of flexibility in those who use it. The methodology doesn’t dictate rigid linearity. That comes from users who don’t think about how ADDIE can be best used to solve instructional or performance problems.

Lie #3: Mobile Is the Killer App for Training

Mobile devices are fantastic tools for training and development. Laptops, tablets, and smartphones all enable mobile learning. If the tool is portable and accessible anytime, it can be called a mobile device. That’s why books and training manuals also can be categorized as mobile. But when customers ask for a mobile solution, they’re usually referring to handheld telecommunication devices.

I love the story from John Seely Brown, former chief scientist of Xerox Corporation and director of its Palo Alto Research Center, about providing mobile phones to Xerox service repair technicians (many years before cellphones were commodities). At Xerox, a reorganization separated technicians accustomed to co-location and social access to one another. They regularly shared field experiences and war stories about managing machine breakdowns and customers nearing meltdowns. Not only had they formed a community of practice, according to Brown, but their informal mentoring and workplace learning had boosted problem identification and reduced repair times. Once the technicians were separated, field performance dropped. Mobile phones thus reconnected the technicians to their community of practice.

Mobile is an anywhere, anytime access technology. Mobile can tether us to social networks, information hubs, knowledge management systems, just-in-time training, and other forms of performance support. But is mobile the killer app for training?

With HTML5 and responsive design guidelines, webpages can scale for legibility on different screen dimensions, like a 25-inch computer monitor or a four-inch phone screen. But the limits of extreme screen scalability are mostly forgotten by customers who want their employees to be able to take a training program on their phones—while waiting for a flight, eating a sandwich, or killing time before a meeting. Just because training can be delivered to a mobile device does not mean it will deliver the same quality experience on every mobile device.

Enspire, the company for which I work, mostly creates immersive learning experiences that put the learner in simulations of real-world contexts and within interactive case studies. Sure, simulations can be delivered on a small screen, but the size limits the realism and depth of interactions. Notice how most games played on cell phones are based on patterns and repetitive movements within abstract environments. Finer and nuanced details are sacrificed for usability within the constricted screen space.

Often we put technology to the wrong purposes when we use a tool in ways that don’t match its natural affordances. An affordance is a quality of an object that provides useful interaction with the object, such as a doorknob to be twisted and a door handle to be pushed or pulled. I like the analogy of a pencil: It’s a great tool for writing on paper, but not such a good tool for punching holes into paper. You can do it, but the holes are apt to be ragged and misaligned. That’s an example of a technology being extended beyond the affordances of its design—beyond its purpose.

Mobile has many uses for training and development, especially for just-in-time resources and performance support. But it can be misused as well. Sadly, mobile’s quintessential affordance—social access—is mostly underused by learning and development. As the Xerox repair technicians’ story showed, mobile is a killer app for accessing expertise in your organization. As a social learning tool, mobile can accelerate learning curves, expand knowledge sharing, increase the uptake of process norms, and support a connected community of practice within your organization. Mobile can expand the reach of your training and learning programs, but it is often not the best choice for delivering your training and learning experiences.

Lie #4: Big Data Analytics Provide Adaptive Learning

Don’t be fooled by the grand pronouncements (and buzzwords) that big data analytics will revolutionize instruction. This lie is based on high hopes that data analytics will result in more personal and relevant instructional experiences, while overlooking the enormous hurdles to implementation.

Data analytics have been effective tools for business planning, especially for sales and marketing. But the proof cases for data-driven learning analytics are nascent. To be sure, I’m optimistic. Imagine a future in which data-driven instruction personalizes and adapts learning in real time, based on individual differences and preferences. That’s the true promise of computer-based instruction. But let’s move beyond the speculative to what is known and doable today.

Here are the broad basics of what I know about data analytics. Collecting data provides a record of past events. By analyzing these data, we can tell a story about the past. And by using algorithms—based on statistical inference models—we can then calculate probabilities.

But what else is required to yield useful meaning from data analytics? Consider data collection. How confident are you in the reliability and validity of the data you collected? Do the data points you tagged consistently and accurately measure the skill, knowledge, or behavior you intended to measure? We can drown ourselves in data points from clickstreams. The story you get from the data depends on the data selected and on the suppositions you made about the data’s value. It’s a sticky proposition, but it gets even stickier.

Now consider what must happen within the “black box” to extrapolate data into an adaptive learning path (Figure 3-1). We know the operations within the black box present several difficult engineering problems. The system must somehow generate a path from statistical inferences about past data and the learner’s current knowledge and skills, and then correlate instructional goals to the available curriculum. To do this, the system will require learning maps (canonical knowledge structures that trace skill enablers and knowledge dependencies), accurate learner profiles of present-state needs, and rules of logic (Bayesian networks) to reconcile the multiple inputs.

Figure 3-1. Black Box

I use the black box analogy so we don’t have to concern ourselves with its internal workings. Suffice it to say, serious consideration and effort are required for data inputs. So let’s look at the other side of the equation. What does the black box produce? Learning analytics? Branching through instructional assets? A curricular sequence?

No. The black box doesn’t output curriculum and instruction; rather, it calculates probabilities as outputs (if-then-else logic). At best, we might say the system recommends a pathway based on the input data points and their correlations. You, or your instructional design team, must develop all the instructional activities needed to create adaptive paths. Think about how much instruction must be built to support skill and knowledge paths based on individual differences. Think about an adaptive system reacting appropriately to 30, 300, or 3,000 employees.
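For a sense of what those probability outputs actually look like, here is a minimal, single-skill sketch in the spirit of Bayesian knowledge tracing. The guess, slip, and learn parameters, the prior, the mastery threshold, the response history, and the module names are all illustrative assumptions, not a production model.

```python
# A minimal, single-skill sketch of the kind of probability the "black box"
# outputs: an updated estimate of mastery after each observed response, in the
# spirit of Bayesian knowledge tracing. Parameter values, the threshold, and
# the module names are illustrative assumptions.

GUESS, SLIP, LEARN = 0.20, 0.10, 0.15      # assumed guess, slip, learning rates

def update_mastery(p_mastery: float, correct: bool) -> float:
    """Apply Bayes' rule to one response, then a chance of learning."""
    if correct:
        evidence = p_mastery * (1 - SLIP) + (1 - p_mastery) * GUESS
        posterior = p_mastery * (1 - SLIP) / evidence
    else:
        evidence = p_mastery * SLIP + (1 - p_mastery) * (1 - GUESS)
        posterior = p_mastery * SLIP / evidence
    return posterior + (1 - posterior) * LEARN

p = 0.30                                    # assumed prior for this learner
for correct in [True, False, True, True]:   # observed responses on tagged items
    p = update_mastery(p, correct)

# The system still only recommends; designers must have built both modules.
next_module = "practice set B" if p < 0.80 else "advanced case study"
print(f"P(mastery) = {p:.2f}; recommend: {next_module}")
```

Even this toy recommendation presumes that someone has already tagged assessment items to a skill and built both modules it chooses between, which is exactly the development burden the black box does not remove.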

With data we can make better decisions. With data we can be more responsive to individual and group needs. And with data we can identify opportunities to improve or optimize instructional pathways. The lie I’m attempting to dispel is the hyperbole surrounding big data learning analytics: the notion that big data is an easy, or even sufficient, solution for providing adaptive learning. Beyond the data—and a brilliant statistician, of course—are colossal requirements: learner profiles, learning maps, data reliability and validity testing, and multiple curricula and instructional pathways. Someone has to make it all; it doesn’t come in a box.

Lie #5: Khan Is King: Long Live Content Curation

Khan Academy is a fabulous online repository of learning objects, as well as full-fledged curricula and instructional pathways that address many subjects. (Note: Khan Academy is making progress using big data analytics to adapt learning pathways for math and science by analyzing use patterns and generating knowledge maps from its millions of online users.) The open educational resources movement provides unprecedented access to share, use, and reuse high-quality training materials. But when I hear suggestions that content curators are replacing instructional designers, I want to yell back: “What about your context?” And so the age-old debate continues between content and context.

Indeed, there are many opportunities for content curation, such as using open-source content to target foundational knowledge and skills. For instance, a nonfinancial manager needs to understand how to read P&L spreadsheets and calculate cash flow statements. Khan Academy has training videos about that. But the curated content does not offer anything on how your company shares costs among its departments.

Take another example. A sales team seeks to improve performance. It finds a free, online sales cycle management program. An excellent find. The course provides an orientation to the seven stages of the sales cycle. But that course is not going to teach the ins and outs of the team’s products, services, or vision of the customer experience. Many of the training parts can be cobbled together from multiple sources. Perhaps it’s a bit disjointed from the team’s own business context, but it can work for some uses.

Regardless, most organizational training needs revolve around the intricate details of your business processes, customer needs, and supply chain partners. Even with the off-the-shelf resources, your organization needs instructional designers to gather requirements, conduct task analysis, establish learning goals and objectives, locate or create the instructional materials and performance support, and much more.

I’m excited by the prospects of content curation combined with crowdsourcing. In this way, content vetting could be done on a larger scale, with employees rating the usefulness of instructional materials. With an eye on the context of their own daily work and learning needs, employees could filter out irrelevant content pieces and bring forth the best. The truth is, context matters.

Lie #6: People Learn Differently Now

I’m often asked how instructional technologies are changing the way people learn. Sometimes my first, and rather coy, response is: “Not much at all.” Human brains function today just as they did 10,000 years ago. The internal mechanisms of human learning and memory have not suddenly evolved. Today’s generation doesn’t learn differently from previous generations. What is different, however, are innovative instructional technologies and a better understanding of human cognition. The strategies and tools for supporting learning and memory are rapidly changing.

So it’s a wonderful time to be in training and education. Great new tools permeate the classroom and the workplace. In fact, the influx of new educational technology products—podcasts, Twitter, motion graphics, games, simulations, social networks, social collaboration, intranets, hangouts, wikis, Wikipedia, e-books—makes choosing which ones to use daunting, but exhilarating. Some are new, some are different, and some are not. People may not learn all that differently today, but we certainly have more choices for access points and instructional methods.

From K–12 and postsecondary teachers, I often hear stories of tablet technologies and audience response systems flipping the classroom from content-centric to learner-centric experiences. Project-based learning and peer instruction are widely adopted in our best universities.

Michael Starbird, a brilliant mathematician and teacher at the University of Texas at Austin, told me that too many instructional activities and too much classroom time are devoted to getting to the right answer. He suggests that we focus on investigating mistakes and raising essential questions about a problem. Studies show that this teaching strategy—productive failure—draws learner attention to critical features and leads to enduring understanding far better than direct instruction.

People are not learning differently; we are teaching more effectively.

Conclusion

In Lies About Learning, I wrote a chapter rebuking the lies that devalued instructional design as a process discipline:

   Instructional design is irrelevant.

   All you need is a subject matter expert.

   Instructional design is a front-end process only.

   Instructional design takes too long and costs too much.

   Instructional design makes learning tedious and not very fun.

   Instructional design is out of touch with the dynamics of business.

This time I’ve written about the common lies I hear from clients and learning industry pundits. I bet you’ve heard them too:

   A good training program will solve the problem.

   ADDIE is dead.

   Mobile is the killer app for training.

   Big data analytics provide adaptive learning.

   Khan is king: Long live content curation.

   People learn differently now.

These lies, and many others, continue to proliferate due to misconceptions and misinformation from people with their own agendas, often related to selling a product. Allocating the time and resources to instructional design provides the best opportunity for delivering effective learning and performance improvements. To think otherwise is to indulge in fantasy or denial.

Human resource development should not happen on the periphery—it belongs within the ecosystem of the enterprise to provide and sustain learning, collaborative social experiences, and performance support. With new technologies, better instructional methods, and scientific insights into human learning, instructional design can, and should, lead the way.
