8. Watch the Game Film

“Look in the mirror, and don’t be tempted to equate transient domination with either intrinsic superiority or prospects for extended survival.”

Stephen Jay Gould

The Pro Football Hall of Fame inducted Raymond Berry on July 28, 1973. That moment capped a remarkable career in which he had caught a record six hundred thirty-one passes. He teamed with quarterback Johnny Unitas to form one of the great quarterback-receiver tandems in football history. Together, they led the Baltimore Colts to two National Football League (NFL) championships.1

Berry’s story is remarkable because of the rather unexceptional start to his career.2 In high school, he played for a team coached by his father, yet he did not become a starter until his senior year. He was a skinny kid who lacked dazzling speed. He suffered from nearsightedness and a bad back. Berry wore special shoes because one of his legs was longer than the other. At Southern Methodist University, he caught a total of thirty-three passes in his entire career. The Baltimore Colts chose him in the twentieth round of the 1954 NFL draft; no other team expressed interest.

In Berry’s first pro season, he caught only thirteen passes. The team did not fare much better, finishing in fourth place in its division. The offense ranked next to last in the league. Berry worked relentlessly, though, to improve his game. He watched countless hours of film, dissecting how the best receivers in the game excelled at their craft. He studied the Colts’ opponents in great detail, trying to detect their tendencies and vulnerabilities on film. Today, all NFL coaches and players study film endlessly, but at the time, Berry was an exception. Teammates found his methods rather bizarre. Berry developed a wide array of maneuvers for getting open against defenses despite his limited speed—eighty-eight different moves by his estimation. Then, he practiced these moves relentlessly. He would simulate an entire game by himself during the off-season, trying to run each pattern “within inches of how they were diagrammed.”3

Johnny Unitas joined the Baltimore Colts at the start of Berry’s second season, having been cut by the Pittsburgh Steelers the previous year. Unitas hoped to catch on as a backup quarterback. Soon, Unitas and Berry formed a bond—two eager young players desperate to improve, make the team, and contribute in a meaningful way. In the evenings, Berry asked Unitas to study film with him in his apartment. The two men stayed on the field after regular team practice for hours, working on each pass pattern repeatedly. Berry named each type of catch. He wanted to run perfect patterns, and he practiced making the most challenging catches repeatedly. Berry once described the importance of these practices to one of his teammates: “He (Unitas) has to know that after three and two-tenths seconds, this is where you are going to be. You’ve got to time it up with him. It’s like music. The same beat has to be playing in all of our heads.”4

The two players soon became starters for the Colts. In 1958, Unitas led the league in passing touchdowns, and Berry topped the NFL in receptions.5 That same year, Unitas and Berry led the Colts to the NFL championship game against the New York Giants. On December 28, 1958, in Yankee Stadium, the Giants led the Colts 17–14 with two minutes and twenty seconds remaining in the contest. The Colts offense had the ball at their own fourteen-yard line, far from where they needed to be to attempt a game-tying field goal. After more than a minute of trying to attack the Giants’ defense, the Colts had made little progress. With just seventy-five seconds remaining, they stood at the twenty-five-yard line. Time was running out.

Unitas hoped to throw to Berry along the sidelines on the next play, according to Mark Bowden, who wrote a wonderful book about this historic championship contest.6 However, Giants defensive coach Tom Landry anticipated the play. Just before Unitas took the snap, a Giants linebacker shifted out to line up near Berry, as instructed by Landry. Now, Berry faced two defenders, which typically meant that a pass play to him would not succeed. Berry had not seen such a defensive maneuver by the Giants when he studied film of their previous games. However, in that moment, he recalled a film session with Unitas several years earlier, in which the two men had noticed this defensive strategy employed by a different opponent. Berry and Unitas had detected a problem with this defensive scheme, and they had concocted a way to capitalize on that weakness.

Now the two men had an opportunity to employ their counterstrategy, but they could not speak with one another. They stood many yards apart, with the play about to begin. Berry simply gazed in Unitas’s direction, hoping that they were on the same page. As the ball was snapped, Berry did not run the pattern that Unitas had called for in the huddle. Instead, he ran a different pattern, the one they had devised several years earlier in his apartment. As Berry made his move, Unitas anticipated precisely what his receiver would do in that situation. Unitas connected with Berry on a pass play that covered twenty-five yards. It was like music, the same beat playing in both men’s minds. Several plays later, the Colts had tied the game. Ultimately, they prevailed in overtime in a game many still consider the greatest ever played. Berry finished the contest with a then-record twelve catches for one hundred seventy-eight yards and a touchdown. The skinny kid who barely made the team a few years earlier had become a record-breaking champion.7

Athletes not only study film on the competition; they watch themselves perform too. They study film of their own performances to identify problems and flaws. Baseball player Tony Gwynn became a pioneer in the use of video in his sport. When Gwynn joined the San Diego Padres in 1982, all football teams had adopted Berry’s film-study methods, but baseball players had not. In his second season, Gwynn fell into a miserable slump, partly due to a wrist injury suffered during the winter. Gwynn purchased a video cassette recorder for $500, and his wife Alicia began videotaping each of his at-bats. He reviewed the tapes and identified the flaws in his hitting approach. Gwynn said, “I came back from a trip and looked at the tapes and I knew immediately what was wrong. From that point on, I hit like .350 and had a 25-game hitting streak.”8 He never again finished a season of his long and storied career with a batting average lower than the one he compiled that year, which was still a very good .309.9

Gwynn became a fanatic about using video to study his swing, as well as the pitchers whom he opposed. He carried videos with him wherever he traveled. Teammates nicknamed him “Captain Video.” At first, they viewed his near-obsession with video as rather odd, much like Berry’s teammates had. Over time, his peers became believers. After reviewing the video, Gwynn practiced with intensity. Baseball coach Dave Engle once said, “You could take the next five guys who put the most time in, and added together, they would not put as much time in as Tony.”10 He did not just try to hit the ball during batting practice; he imagined a particular situation and practiced the precise swing he would use in that circumstance. Alternatively, Gwynn might focus on correcting a mechanical flaw that he noticed on video; he would practice that particular refinement over and over. During practice, he focused as much, if not more, on the process of hitting as on the outcome of each swing.

In 1984, Gwynn’s first full season using video, he won the batting title (for highest batting average in the National League). By the end of his career, he had earned that honor eight times, tying the National League record.11 He entered the Hall of Fame on July 29, 2007 as one of the greatest pure hitters in the history of baseball.12 His peers recognized him as one of the most astute students of hitting that the game had ever seen.

The story of these two athletes demonstrates two important lessons regarding how effective problem-finding can lead to superior performance. First, we see the value of “watching the game film.” Like Berry and Gwynn, companies should study their past performance, as well as their rivals’ performance. They should search for problems and vulnerabilities that can be exploited. Of course, many companies do engage in benchmarking and competitor intelligence. They also conduct “after-action reviews” to identify problems experienced during a major project or initiative. However, the promise of these learning activities often remains unrealized. Firms encounter a series of common pitfalls that make these activities far less productive than they can be. In this chapter, we will take a look at these pitfalls and identify ways in which leaders can avoid them.

Second, we learn from these two athletes’ stories that elite performers do not excel simply due to innate talent. They hone their skills through a great deal of practice. In fact, research in many different fields shows that individuals achieve greatness through hard work, not simply raw talent. It takes a particular type of preparation to truly excel, though; scholars have described it as “deliberate practice.” Berry and Gwynn adopted this approach to preparation and skill refinement. Through their practice regimens, they discovered the small problems and flaws that prevented them from achieving their potential. Some individuals work very hard, but they adopt the wrong practice techniques. Research demonstrates that elite performers engage in an immense amount of highly effective deliberate practice over their careers. This chapter explains deliberate practice and describes how it facilitates effective problem-finding. Moreover, we will explain why many firms do not provide employees with sufficient opportunities for deliberate practice, or why they encourage the wrong types of training and preparation. We also will take a look at how some companies have provided effective practice opportunities for their employees.

After-Action Reviews: Promise and Peril

Many companies have tried to conduct lessons-learned exercises after the completion of major projects. Outside of sports, the U.S. Army became one of the first large organizations to develop a systematic approach to postmortem analysis. The Army developed its after-action review (AAR) procedure in the 1970s, although widespread adoption did not take place for a number of years. Each reflection-and-review process focuses on four fundamental questions:

• What did we set out to do?

• What actually happened?

• Why did it happen?

• What will we do next time?13

Harvard Professor David Garvin has conducted extensive research on the Army’s use of AARs. He reports that the Army now conducts these lessons-learned exercises routinely. The Army has learned that these reviews must become “a state of mind where everybody is continuously assessing themselves, their units, and their organizations and asking how they can improve.”14 AARs must be conducted immediately after a mission has been completed so that all participants can recall the key events easily and accurately. Garvin points out that the process requires skilled facilitators and a willingness on the part of military leaders not to dominate the discussions, and even to admit their own mistakes. Finally, the Army works very hard to create a climate of openness and candor, and facilitators actively discourage finger-pointing and the assignment of blame during these reviews.15

Other organizations have adopted the Army’s techniques. For instance, many hospitals try to conduct lessons-learned exercises after medical accidents. The Joint Commission on Accreditation of Healthcare Organizations (JCAHO) now requires hospitals to conduct thorough reviews following serious medical accidents, which health care professionals describe as sentinel events. Many hospitals also have expanded the use of such reviews to less serious incidents, going well beyond the mandate of the accreditation body. For instance, at Children’s Hospital in Minneapolis, the Patient Safety Steering Committee chose to conduct “focused event studies” after a wide range of less serious incidents, as well as “near misses”—instances in which an accident was narrowly averted and no harm came to a patient.16 Dr. Chris Robison, Associate Director of Medical Affairs, serves as one of the facilitators of this review process. Like the Army, Children’s focuses on establishing clear ground rules for how these reviews should be conducted, and it follows a structured procedure for analyzing each incident. Here’s how Robison kicked off the review of a morphine overdose incident that took place at the hospital:

“We have several objectives today: to understand what happened, to identify opportunities for improvement, and to support the caregivers, patient, and family that were involved. Today, we will focus primarily on documenting the process flow of yesterday’s events. We have three ground rules for this discussion. First, it is a blameless environment; we are not here to find a scapegoat but to identify failures in our operating system. We want to reveal all of the issues and problems in an open discussion. Second, this process is confidential. Please do not reveal the name of the patient or the identity of the caregivers. Third, we ask you to think creatively about how to improve our systems and processes. Try to envision the patient as your own child and to identify systems that you would like to have in place to ensure your child’s safety.”17

During the session, Robison asked questions in order to identify, understand, and diagram the sequence of events that led to the morphine overdose. As people spoke, Robison recorded the details carefully on a whiteboard. He found that creating a visual aid, such as a process flow diagram documenting the sequence of events, helped facilitate a constructive, fact-based discussion. Robison tried to ensure that the physicians did not dominate. He frequently asked, “Have we documented the process accurately? Are we missing something?” Ultimately, the group agreed on the detailed sequence of events that had taken place, and from there, it identified a number of opportunities for improvement.18

While some organizations have employed AARs quite effectively, most firms struggle to capture the true value of such lessons-learned exercises. Attempts to review projects become blame games in some companies. In others, the procedure becomes slow, cumbersome, and bureaucratic. The review process drags out far too long; in some cases, it does not even begin until well after the project has been completed. At that point, memories of key events have become foggy, and hindsight biases cloud people’s perspectives. Individuals write lengthy reports on the lessons learned from a project, and the binders collect dust on someone’s bookshelf. Little follow-up occurs to ensure that improvement ideas are implemented. As noted organizational learning expert Peter Senge has said, “The Army’s After Action Review is arguably one of the most successful organizational learning methods yet devised. Yet, most every corporate effort to graft this truly innovative practice into their culture has failed because, again and again, people reduce the living practice of AARs to a sterile technique.”19

Why do many AAR processes fail? Many firms stumble for the reasons just cited: the inability to create a climate of candor, a lack of skilled facilitators, and a poor follow-up process for ensuring that improvement ideas are implemented. The problems extend beyond those usual suspects, though. First, many organizations study past projects in a compartmentalized fashion. A small group of people comes together, but its members do not necessarily see the entire picture. For instance, at one firm in my research, a group of marketing managers gathered to study a failed product launch. However, they did not involve key individuals from other organizational units, including operations, logistics, and procurement. As a result, the group did not understand the entire system of activities involved in the launch, or the interconnections among the work of people in multiple functions. They missed the fact that some problems occurred during the handoffs from one unit to another. They found themselves jumping to conclusions about the mistakes that had occurred without understanding all the facts.

Dr. Robison has learned that assembling the right group of people, from diverse units of the organization, is essential to the success of an after-action review. People need to develop a systemic perspective about failures. Moreover, you need to come to a clear understanding of the facts before trying to ascertain cause-and-effect relationships. He explains:

“I don’t think we could have gotten as thorough an understanding of what happened to Matthew if I had talked with people individually. There was so much point and counterpoint during the meeting. We saw the event from the nurse’s perspective and then from the respiratory therapist’s perspective and then the doctor’s. It is not that people only perceive things consistent with their viewpoint, but that they have actually only touched one part of the elephant. I have found that most people think that they know where the failure was and what failed. However, when they come into one of these meetings, they realize that there were ten possible defenses in the system. They come to understand its complexity. They recognize that there were aspects of the situation they didn’t even know existed. These focused event analyses develop disciples that then go out into the organization understanding the complexity of medical accidents.”20

After-action reviews also fail because people do not recall accurately what happened, and they have not kept a complete record of the key events in a particular project or initiative. When possible, the Army compiles extensive audio and video recordings during its training exercises, as well as during some actual missions, so as to have an objective record of activities to examine during the review process. The Army obtains data from instrumentation technology, and it employs observers who record key events.21 In essence, it compiles a “game film” much like a coach or athlete does in sports. In so doing, the Army does not rely only on individuals’ recollections, which may be incomplete or biased. The videotape never lies, as many coaches say.

Hospitals have the patients’ medical charts that they can review, as well as the results of various tests and procedures. These archival documents provide objective evidence that helps individuals compile an accurate picture of what occurred. During the Columbia space shuttle’s final mission, NASA taped key Mission Management Team meetings, and it stored emails and other key documents. That record of events enabled the Columbia Accident Investigation Board to piece together precisely what had transpired during the mission as it tried to determine the causes of the tragedy.22 Airlines, of course, have flight data and voice recorders on every plane. Most companies cannot videotape activities and events, but they can consider what evidence will be needed to conduct an effective after-action review as they launch a major project. In so doing, firms can plot their data-collection strategy. Managers can encourage employees to store key documents, record minutes after crucial meetings, and track key metrics and milestones as a project is planned and executed. Everyone involved in a project should be asking: What evidence should I be collecting that will enable us to perform a useful after-action review in the future?23

Many firms only conduct postmortems—they study failures, but not successes. However, many small problems and mistakes occur even during the most successful projects. If these issues are not addressed, they may escalate and contribute to a major failure in the future. Moreover, many companies examine projects in isolation. They fail to compare and contrast a particular initiative with other projects either inside or outside the organization. Comparison helps protect against spurious conclusions. When we study a single project, it becomes rather easy to jump to conclusions about what factors contributed to that outcome. However, we may not have identified the correct cause-and-effect relationship; we attribute the outcome to the wrong factors. Examining how the same behaviors and activities played out in multiple situations, perhaps some more successful than others, enables us to refine our attributions and conclusions. We develop much richer models of cause and effect.

Research supports this contention that after-action reviews should invoke comparisons among multiple projects, and that enterprises should not study only failures. Tel Aviv University scholars Shmuel Ellis and Inbar Davidi examined after-event reviews conducted by Israeli military forces. They compared soldiers who conducted post-event reflection exercises after both successful and unsuccessful navigation exercises with soldiers who reviewed only failures. Ellis and Davidi discovered that “contemplation of successful events stimulated the learners to generate more hypotheses about their performance.”24 The soldiers who systematically analyzed both successes and failures developed richer mental models of cause and effect. Perhaps most importantly, these soldiers performed better on subsequent missions.25

After-action reviews fail to achieve promised results for one final reason. Organizations often do not identify near-miss incidents and review them systematically. Near misses occur in all sorts of enterprises, and they represent powerful opportunities for learning and reflection. However, many individuals simply breathe a sigh of relief when a near miss occurs. They do not surface the issue for discussion and evaluation. The results can be disastrous. In April 1994, two U.S. fighter jets mistakenly shot down two American Black Hawk helicopters on a humanitarian mission in northern Iraq’s no-fly zone. As it turned out, a near miss had occurred a short time before this tragic incident. The near miss never surfaced at higher levels of the organization; officers did not have an opportunity to analyze it closely. If they had, the tragedy might have been averted.26

Scholars James March, Lee Sproull, and Michal Tamuz have argued for the value of studying near misses quite closely. They pointed out that “Organizations learn from experience, but learning seems problematic when history offers only meager samples of experience.”27 For instance, an airline rarely experiences a fatal aviation accident. It becomes difficult to learn from experience when the sample size is so limited. Moreover, we have a tendency to “overgeneralize” the lessons from a single, yet quite memorable, episode in an organization’s history. Near misses protect us against this type of flawed learning; they provide an opportunity to increase the sample size from which we can derive lessons. With respect to the field of aviation, these scholars explained that “Information on near-accidents augments the relatively sparse history of real accidents and has been used to redesign aircraft, air traffic control systems, airports, cockpit routines, and pilot training procedures.”28

At Children’s Hospital, nurses initiated “good-catch logs” to document near misses and trigger further analysis. The logs were kept in locked medication rooms on each floor of the hospital. If nurses “caught” a problem that could have resulted in an accident, they described the situation in the log. Nurses felt comfortable with this process because they could record events anonymously. As one staff member noted, “Here, nurses can report accidents waiting to happen.”29 Good-catch logs are a perfect example of proactive problem-finding. Teams in each unit periodically reviewed the logs and then initiated process improvements based on an analysis of these near-miss incidents. As nurses realized that their entries often led to concrete changes, they became more comfortable with writing in the logs. One nurse explained, “Now we feel like someone is listening and doing things about our concerns.”30 Every organization should strive to create its own version of this hospital’s good-catch log if it wants to discover problems proactively and improve its learning processes.

Competitor Intelligence: Promise and Peril

Like the star athletes Raymond Berry and Tony Gwynn, companies need to study the competition as well. They have to compile a game film that can be dissected and analyzed. Such evaluation enables us to spot our own problems as well as the weaknesses and vulnerabilities of our rivals. Many firms engage in competitor intelligence and benchmarking. However, these activities do not always prove as useful as leaders expect. Competitive-intelligence expert Leonard Fuld explains:

“Sometimes, people just get in the way of valid intelligence because their minds block out reality. There is a great psychological component to analyzing and convincing others of critical intelligence. For too many managers, denial, rationalization, groupthink, or not-invented-here attitudes are among the reasons why a competitive revelation never bubbles to the surface.”31

Let’s take a closer look at how and why many attempts to analyze our competitors do not bear fruit. First, many firms engage in highly generic analysis. They conduct SWOT analysis (strengths, weaknesses, opportunities, and threats), but that exercise leaves them with a laundry list rather than a clear understanding of which issues matter most.32 They also define capabilities and vulnerabilities too broadly. For instance, a firm might categorize a rival as having a stronger supply chain management capability. However, it does not dig deeper to understand whether its competitor’s advantage lies in procurement, inbound logistics, outbound logistics, inventory management, or some other activity. Instead, companies ought to compile a precise record of how a rival conducts a particular project or initiative. For instance, an effective competitor analysis might track a rival’s new product launch in great detail and then compare and contrast that effort with the firm’s own recent launch. Such efforts give the analysis more precision and depth, and they provide opportunities for valuable direct comparisons.

Many firms involve a narrow set of employees in the competitive-intelligence process. They have an individual or small unit, often within the corporate strategic planning organization, responsible for gathering data about rivals. They do not take advantage of the fact that many frontline employees are learning about the competition on a daily basis. Effective firms tap into and synthesize that fragmented local knowledge. They involve frontline employees not only as data gatherers, but also as analysts who can help senior leaders derive conclusions from this data. The frontline employees often do not have the same blinders that may distort the judgments and conclusions of senior executives. Since senior executives have set the current strategy, they may not be as willing to admit where rivals have outmaneuvered them.

Benchmarking efforts frequently entail the creation of a mountain of quantitative data. The numbers compare and contrast organizations using a multitude of metrics. However, firms may become lost in the numbers and ignore crucial qualitative information about the competition. The numbers also may deceive us, since it proves difficult to make perfect apples-to-apples comparisons among firms’ financial results. Leonard Fuld argues that overemphasizing the numbers creates a “one-dimensional blindness” in competitor intelligence. He advocates careful attention to “soft, qualitative information.”33 For instance, many airlines have tried to understand the secrets of Southwest Airlines’ success. Without question, Southwest has made a number of strategic trade-offs that have enabled it to build a unique business model that cannot be easily imitated. However, the company’s success hinges as much on its culture as on its hard assets and investments. As company founder Herb Kelleher has said, “What keeps me awake at night are the intangibles. It’s the intangibles that are the hardest thing for a competitor to imitate, so my biggest fear is that we lose the esprit de corps, the culture, the spirit. If we ever do lose that, we’ll have lost our most important competitive asset.”34

Perhaps most importantly, we must remember to assess the leaders of our rival firms, rather than treating the organization as a monolithic black box. Who is making the decisions at the top, and what is their mindset? Do we understand their historical pattern of behavior vis-à-vis key rivals? Is the firm publicly held, or is it a privately held, family-owned enterprise? Such qualitative factors must be considered as we assess the competition. Think of a typical football coach. He does not simply assess the statistics of his opponents. He also wants to know how that rival behaved in a wide range of situations, including any prominent tendencies that the opposing coaches have exhibited throughout their careers. He uses the game film to look far beyond the numbers.

Finally, competitive-intelligence efforts falter because organizations adopt a narrow perspective. They focus too intently on their direct rivals. Companies often pay insufficient attention to potential new entrants, firms that offer substitute products, and suppliers or buyers who might forward- or backward-integrate. As an example, consider how Polaroid might have performed a competitor analysis in the early 1990s. In terms of instant cameras, Polaroid had a dominant position. After years of having a virtual monopoly in the U.S., Polaroid had watched Kodak enter the instant-camera market in the mid-1970s, but then exit in 1986. However, in terms of substitutes, Polaroid should have had many other firms on its radar screen. One-hour photo processing was becoming more widespread. Kodak had introduced one-time-use disposable cameras in 1987, and other firms had followed. Sales had become substantial by the early 1990s. Digital camera technology was emerging. With the emergence of digital technology, new electronics and computing firms, not previously in the camera business, stood poised to enter the market. In short, an effective competitor analysis would have entailed a wide-ranging look at potential rivals.35

The best firms do not stop there. They try to learn from firms well beyond their industry. They compare themselves against firms that will never become competitors, but that have a process or approach worth exploring. Consider how many product-design firms operate. When developing a new product, they do not simply study how other companies have designed that item. For instance, Pentagram, a California-based design firm, took an interesting approach to developing the high-end Fuego barbeque grill: They visited a luxury car dealer that sold Lamborghinis and Bentleys. They came up with ideas for how to fashion the grill’s temperature gauge as well as how to add the look and feel of luxury to the grill. That out-of-the-box comparison enabled the designers to spot problems with more-traditional barbeque grills that detracted from their appearance and functionality.36

Deliberate Practice

Tony Gwynn and Raymond Berry not only watched a great deal of film on their opponents; they also spent an enormous amount of time practicing their craft. Sometimes, they began by working to resolve a problem they had already identified. On other occasions, focused repetitions helped them discover a problem that hindered their performance. Through practice, they developed a more refined mental model of the cause-and-effect relationships that drove their performance; they could spot the problems that led to failure much more easily.

K. Anders Ericsson and his colleagues have studied star performers in many fields, such as athletics, chess, and music.37 Ericsson chose those fields because performance in them can be measured quite precisely over time. His research demonstrates that “important characteristics of experts’ superior performance are acquired through experience and that the effect of practice on performance is larger than earlier believed possible.”38 Put another way, “experts are always made, not born.”39

Ericsson documents how elite performers practice for an incredible amount of time during their lifetimes. For instance, one study examined three groups of violinists of differing abilities at the Music Academy of West Berlin. The best young violinists, as evaluated by the school instructors, accumulated an average of 7,410 hours of practice by age eighteen. That exceeded the next-best group by more than 2,000 hours and the least-talented set by 4,000 hours.40

The hours alone do not determine success, though. Elite performers do not simply exhibit extraordinary diligence and determination. They engage in what Ericsson calls “deliberate practice.” Fortune magazine writer Geoffrey Colvin explains how a golfer such as Tiger Woods approaches practice far differently from those who hit the links a few weekends each summer:

“Simply hitting a bucket of balls is not deliberate practice, which is why most golfers don’t get better. Hitting an eight-iron three hundred times with a goal of leaving the ball within twenty feet of the pin eighty percent of the time, continually observing results and making appropriate adjustments, and doing that for hours each day—that’s deliberate practice.”41

When elite performers engage in deliberate practice, they set a specific performance improvement goal, and they engage in a task that provides immediate feedback. Moreover, deliberate practice involves focusing on the things that elite performers don’t do well. In our leisure sports, by contrast, many of us tend to practice the things we already do well. Ericsson and his colleagues point out that “Research across domains shows that it is only by working at what you can’t do that you turn into the expert you want to become.”42 Consider the example of basketball legend Larry Bird. When he entered the National Basketball Association, he did not have a strong left-handed shot. He worked on it relentlessly over the years. As it turned out, he made some of the most clutch shots of his career with his left hand in critical playoff games. In 1981, the Boston Celtics faced the Philadelphia 76ers in the final game of the Eastern Conference Championship Series. With the game tied and less than a minute left, Bird drained a difficult left-handed bank shot to give the Celtics a lead that they would not relinquish. The practice paid off handsomely.43

Deliberate practice consists of extensive repetition of the very same activity, so as to hone a particular skill. It emphasizes focus over variety in the building of skills—working on one thing at a time. As famous tennis instructor Vic Braden said, “Losers have tons of variety. Champions just take pride in learning to hit the same old boring winning shots.” Finally, deliberate practice means paying close attention to your technique, not simply the results you achieve. As Braden argues, “You have to be process-oriented.”44

Can business leaders engage in deliberate practice to improve their performance? Colvin concludes that “Many elements of business, in fact, are directly practicable. Presenting, negotiating, delivering evaluations, deciphering financial statements—you can practice them all.”45 Ericsson concurs. He points out that even the most accomplished leaders can practice skills such as persuasive communication. He notes, “Bear in mind that even Winston Churchill, one of the most charismatic figures of the twentieth century, practiced his oratory style in front of a mirror.”46

Many companies fail to capitalize on opportunities to build deliberate practice into their employee-development programs. Far too many corporate universities continue to incorporate a substantial dose of passive learning into their programs. Passive learning consists of instruction in which the participant sits waiting for the teacher to impart wisdom. We certainly do not develop expertise in key managerial skills by listening to someone lecture us on a particular topic. We have to get our hands dirty and work on our skills to improve.

Some firms have embraced active-learning methods, such as simulations and experiential exercises. These employee-development techniques provide real opportunities for deliberate practice. Participants work through a realistic scenario, try out particular methods and techniques, and receive rapid, constructive feedback. For years, airline pilots have honed their skills through complex, realistic simulations. Today, these simulation methodologies have begun to spread to a wide variety of industries and firms.

Video game technology has fueled growth in the development and use of realistic simulations that provide opportunities for deliberate practice. Consider the case of Hilton Garden Inn. In January 2008, the company launched Ultimate Team Play, an interactive training simulation for its hotel employees. The game puts staff members in a virtual hotel. Employees take on roles such as front desk, housekeeping, food and beverage, and maintenance personnel. They perform tasks such as answering the phone, checking guests in and out of their rooms, and the like. They encounter various scenarios and must respond to guest requests. The game produces a SALT (Satisfaction and Loyalty Tracking) score for the virtual hotel, based on how effectively the staff members perform during the simulation. The SALT metric is the actual tool used to evaluate Hilton Garden Inn locations. Adrian Kurre, senior vice president at the company, explains the value of the SALT metric: “Including SALT was key because it really emphasizes to the entire team that no matter what role they have or what job they do, each person ultimately affects the guest’s overall hotel experience.”47 By using this simulation, employees can work on key skills while receiving immediate feedback. They can repeat similar scenarios many times. Most importantly, they do not have to practice on actual customers; the virtual hotel provides a way for inexperienced personnel to improve without sacrificing the actual experience of Hilton Garden Inn guests.

UPS has adopted an even more far-reaching approach to creating opportunities for deliberate practice.48 UPS encountered a problem several years ago, when young workers seemed to be taking longer to achieve proficiency in key skills. Many of them quit in their first few months at the company. These Generation Y employees did not seem to enjoy UPS’s standard training methods. For years, UPS taught hundreds of rules and policies to its new drivers in a lengthy series of lectures. The company has since transformed its training practices to address the unique ways in which Generation Y workers tend to gather information, communicate, and learn. UPS shifted to an approach that emphasizes hands-on learning.

UPS opened its new $34 million, 11,500-square-foot Integrad training center in Landover, Maryland, in 2007. The facility consists of a series of hands-on learning tools. For instance, at one station, the company has placed a transparent UPS truck filled with packages. Instructors explain and then demonstrate how to load and unload a truck safely and efficiently. They show employees the company’s incredibly precise rules and policies in action, rather than simply lecturing about them. Employees then have multiple opportunities to practice these tasks. Individuals identify problems that hinder their performance, and they try to correct these issues. At another station, UPS has created a slip-and-fall simulator. This rather fun exercise helps employees learn how to adjust their bodies as they begin to fall, so as to prevent serious injury. By reducing accident rates, UPS saves a great deal of money. Finally, the outdoor parking lot at this facility simulates a community, and trainees have an opportunity to drive a truck and serve customers. The town has model homes and stores, several street signs, and a UPS drop box. Employees drive through the town and practice conducting various tasks, while others play the role of customers. As the employees practice, the instructors provide them with rapid feedback on their performance. Over time, the instructors ratchet up the difficulty of the tasks. Although UPS adopted this approach with Generation Y employees in mind, the principles apply to people of all ages. We all benefit from hands-on learning opportunities. Active learning beats passive learning; deliberate practice enables people of all generations to improve and refine their skills.

Looking in the Mirror

Bill Parcells has achieved remarkable success as a professional football coach. He won two Super Bowls with the New York Giants, and he turned around several other losing franchises. With the New England Patriots, he inherited a team that had won only two games and lost fourteen the previous season. Several years later, Parcells took the team to the Super Bowl. With the New York Jets, he took over a team that had won only one contest the previous season. Within two seasons, Parcells had the Jets competing in the conference championship game. Even with the Dallas Cowboys, where he enjoyed less success, he managed to take the team to the playoffs twice in four years. The franchise had won only five games in each of the three seasons prior to his arrival.49

Many people have noted that Parcells rarely seems happy about his team’s performance, regardless of whether they have won or lost. In fact, he often appears especially dour after a victory. His protégé, Bill Belichick, adopts a similar approach now as head coach of the New England Patriots—a team he has led to three Super Bowl championships. On many occasions, Belichick criticizes his team even when they win. He focuses on the team’s mistakes as he dissects the game film. He drives his players hard in practice, not letting them become complacent after a win. Both men seem to find it hard to enjoy victory.

Parcells and Belichick offer a lesson for all leaders. We certainly do not propose that leaders should become miserable after their successes. However, they can take a hard look in the mirror after both success and failure. Leaders can watch the film, searching for the problems and mistakes, even when the outcome was highly successful. They can refine all the organization’s critical learning and review processes. When many of us look in the mirror, particularly after a successful venture, we see a very positive image. Belichick and Parcells stare into the mirror, always looking for the warts. They search for problems consistently and relentlessly. All leaders need to help their organizations look in the mirror without being blinded by success. As noted evolutionary biologist Stephen Jay Gould once said, “Look in the mirror, and don’t be tempted to equate transient domination with either intrinsic superiority or prospects for extended survival.”

Endnotes

1 http://www.profootballhof.com/hof/years.html.

2 Mark Bowden has written a marvelous book about the 1958 NFL championship game between the Baltimore Colts and the New York Giants. He profiles Raymond Berry at length in that book. This account draws from his meticulous research. See Bowden, M. (2008). The Best Game Ever: Giants vs. Colts, 1958, and the Birth of the Modern NFL. New York: Atlantic Monthly Press.

3 http://www.profootballhof.com/hof/member.jsp?player_id=25.

4 Bowden. (2008). p. 68.

5 http://www.pro-football-reference.com/.

6 Bowden. (2008).

7 Ibid.

8 http://www.sportingnews.com/archives/gwynn/hit_masters_prep.html.

9 http://www.baseball-reference.com/.

10 Kepner, T. “Mets salute hitter who can’t be beaten.” New York Times. August 15, 2001.

11 http://www.baseballlibrary.com/ballplayers/player.php?name=tony_gwynn_1960.

12 Sandomir, R. “At Hall of Fame, Day Dedicated to Two Icons.” New York Times. July 30, 2007.

13 Garvin, D. (2000). Learning in Action: A Guide to Putting the Learning Organization to Work. Boston: Harvard Business School Press. p. 107.

14 Garvin, D. (2000). p. 106.

15 Garvin, D. (2000).

16 Edmondson, A., M. Roberto, and A. Tucker. (2002). “Children’s Hospital and Clinics (A).” Harvard Business School Case Study No. 9-302-050.

17 Edmondson, A., M. Roberto, and A. Tucker. (2002). p. 2.

18 Ibid.

19 http://www.gojlc.com/articles/After-Action-Reviews-Ray-Jorgensen.pdf.

20 Edmondson, A., M. Roberto, and A. Tucker. (2002). p. 10.

21 Garvin, D. (2000).

22 For more information on the Columbia shuttle accident, see the multimedia case study I developed with Professors Richard Bohmer and Amy Edmondson, our research associates Erika Ferlins and Laura Feldman, and an amazing technical team led by Melissa Dailey. Roberto, M., R. Bohmer, A. Edmondson, L. Feldman, and E. Ferlins. (2005). “Columbia’s Final Mission, The.” Harvard Business School MultiMedia Case Study No. 9-305-032. For a scholarly treatment of the disaster, see our book chapter: Edmondson, A., M. Roberto, R. Bohmer, L. Feldman, and E. Ferlins. (2005). “The Recovery Window: Organizational Learning Following Ambiguous Threats in High-Risk Organizations.” In M. Farjoun and W. Starbuck (eds.). Organization at the Limit: Lessons from the Columbia Disaster. London: Blackwell Publishers. For an article about the tragedy written for a managerial audience, see Roberto, M., R. Bohmer, and A. Edmondson. (2006). “Facing Ambiguous Threats.” Harvard Business Review. November: 106–113.

23 At times, firms may wish to break a large project down into smaller pieces to reduce risk, enhance flexibility, and maximize opportunities for learning. In the strategic management literature, scholars have described a concept called “real options.” The notion is that a firm might choose not to embark on a major project in one giant leap. Instead, it may make a small investment with the option to proceed with the remainder of the project at some future date. In the period prior to “exercising the option,” the firm can learn a great deal about the situation. The learning will help managers determine if and when to proceed. Moreover, an effective after-action review of the first phase can help improve the implementation of the remainder of the project. Firms, therefore, might wish to look for the “real options” embedded in major investment opportunities, and then utilize after-action reviews within these projects to help maximize the value of the options. For more on this concept, refer to the following two books: Dixit, A. and R. Pindyck. (1994). Investment under Uncertainty. Princeton, NJ: Princeton University Press; Trigeorgis, L. (1996). Real Options: Managerial Flexibility and Strategy in Resource Allocation. Cambridge, MA: MIT Press.

24 Ellis, S. and I. Davidi. (2005). “After-event reviews: Drawing lessons from successful and failed experience.” Journal of Applied Psychology. 90(5): 857–871. The quote is found on page 866.

25 Ibid.

26 Snook, S. (2000). Friendly Fire: The Accidental Shootdown of U.S. Black Hawks Over Northern Iraq. Princeton, NJ: Princeton University Press.

27 March, J., L. Sproull, and M. Tamuz. (1991). “Learning from samples of one or fewer.” Organization Science. 2(1): 1–13. The quote is on page 1.

28 Ibid, p. 5.

29 Edmondson, A., M. Roberto, and A. Tucker. (2002). p. 12.

30 Ibid.

31 Fuld, L. (2006). The Secret Language of Competitive Intelligence. New York: Crown Business.

32 Kenneth Andrews was one of the pioneers in the field of strategic management who popularized the SWOT framework for strategy formulation. See Andrews, K. (1987). The Concept of Corporate Strategy. 3rd Edition. Homewood, IL: Irwin.

33 Fuld, L. (2006). p. 57.

34 Low, J. and P. C. Kalafut. (2002). The Invisible Advantage: How Intangibles Are Driving Business Performance. New York: Perseus Books. p. 79.

35 For an interesting examination of Polaroid’s response to the rise of digital imaging technology, see Tripsas, M. and G. Gavetti. (2000). “Capabilities, Cognition and Inertia: Evidence from Digital Imaging.” Strategic Management Journal. 21: 1147–1161.

36 The story of Pentagram’s development of what became the Fuego line of high-end barbeque grills is chronicled in a Discovery Channel documentary titled “The Launch: A Product Is Born.” It aired in April 2004, and it is now available for instructors to purchase. Faculty members may find it very useful when teaching about creativity, innovation, and new-product development. You might want to use it as a comparison case vis-à-vis the ABC News video of IDEO designing a new supermarket shopping cart. Students can examine the similarities and differences between the two firms’ approaches.

37 Ericsson, K. A., R. Krampe, and C. Tesch-Romer. (1993). “The role of deliberate practice in the acquisition of expert performance.” Psychological Review. 100(3): 363–406.

38 Ericsson, K. A., R. Krampe, and C. Tesch-Romer. (1993). p. 363.

39 Ericsson, K. A., M. Prietula, and E. Cokely. (2007). “The making of an expert.” Harvard Business Review. July–August: p. 114–121. The quote is on page 115.

40 Ericsson, K. A., R. Krampe, and C. Tesch-Romer. (1993).

41 Colvin, G. “What it takes to be great.” Fortune. October 30, 2006.

42 Ericsson, K. A., M. Prietula, and E. Cokely. (2007). p. 116.

43 Bird, L. and B. Ryan. (1990). Drive: The Story of My Life. New York: Bantam. Sports fans, particularly Bostonians, will remember another remarkable left-handed shot that Bird made in a crucial playoff game. In Game 7 of the 1988 Eastern Conference Semifinals, the Celtics squared off against the Atlanta Hawks. In the fourth quarter of that game, Bird and Dominique Wilkins went toe-to-toe in a remarkable scoring duel. At one point, Bird drove into a crowd and hit an unbelievable left-handed scoop shot while being fouled. He finished the three-point play, of course! Bird scored twenty points in total during that quarter, and the Celtics prevailed.

44 Williams, P. “Vic Braden’s mental mojo experience.” New York Times. October 29, 2006.

45 Colvin, G. (2006).

46 Ericsson, K. A., M. Prietula, and E. Cokely. (2007). p. 117.

47 http://www.reuters.com/article/pressRelease/idUS77640+28-Jan-2008+BW20080128.

48 Hira, N. “The making of a UPS driver.” Fortune. November 12, 2007.

49 http://www.pro-football-reference.com/.
