Chapter 13. Bringing It All Together

In this section, we’ve laid out all of the major techniques in Lean UX, from framing your work as a business problem statement to defining success in terms of outcomes for both the business and the user. We’ve shown you how to capture the essence of what you know about your users as proto-personas. We’ve talked about how to get started figuring out your solutions. And we’ve shared ways to write and test your hypotheses. The Lean UX Canvas gives you a concise, single-page way to organize your work as you apply these techniques in the real world.

That said, the real world is a messy place, and every project you work on will be messy in its own way. (That’s one of the reasons to use the canvas in the first place!) To wrap up this section, we thought we’d share some stories of teams using Lean UX in this very messy real world. You’ll see how well Lean UX works—in fact, how well-suited it is to addressing the messiness of the real world.

In these stories, you’ll see that some of these teams used the canvas. Others just used some of the Lean UX techniques that are embedded in the canvas, without using the canvas itself to organize their work. As we’ve said before, this is fine. Take these tools and make them your own. We hope these stories will give you some ideas and inspiration for how to do just that.

The Lean UX Canvas in the Enterprise

Recently we heard from a product and design leader at a large enterprise software company in Silicon Valley. This company develops a cloud computing platform to help companies manage digital workflows for enterprise operations. They got in touch because they wanted to tell us how they’d been using the Lean UX canvas to kick off projects and new initiatives. We thought it was a great story to share.

A team at this company was working on the second release of a product that had received good feedback from customers. The product had a novel way of displaying data—a display that was really beautiful, that tested well, and that generated really positive feedback from customers. The product used an unusual UI element: a custom-built map that helped users visualize work processes and spot opportunities for process improvements.

In addition to generating positive feedback from customers, this UI showed really well internally. Stakeholders loved it and supported the idea of enhancing it. So it seemed like a natural choice to build on the success of this feature in the next version, and, in fact, that’s what the team planned to do: they were going to really lean into this feature for the second release.

Still, they had recently come across the Lean UX Canvas as a tool and liked the way that it spurred methodical thinking about the work they needed to do. They decided to work through a Lean UX Canvas for this initiative. When they did, an interesting thing happened: when they came to Box 4, they had a breakthrough.

In order to complete Box 4, the team had to turn back to the feedback data that they had already collected from customers. This often happens when you work on a Lean UX Canvas: you have to find the information that you need to complete a section. Sometimes that means you need to do additional research, and sometimes it simply means you have to review the research you’ve already done.

In this case, as the team reviewed their data, they noticed an important pattern in the user feedback they’d overlooked: customers were telling them they loved the map, but they wanted more from the product than simply data display. They wanted the product to be more proactive. They were saying essentially, “The display is really beautiful, but we want you to highlight where we should be paying attention.”

The process of completing a Lean UX Canvas forced the team to stop and methodically examine the data they had collected. One team member told us, “We actually had the feedback. We just weren’t listening to it!” He continued: “When we used the canvas to go through the data about what was materially important to our customers, we realized that we had simply been ignoring the data!”

So the team spent time interpreting this feedback. They really dug in and worked out what they thought the feedback meant and what it meant about the outcomes customers were seeking. When they went back to the customers to validate these ideas, customers responded positively.

After that, the team made an easy decision: they needed to change their priorities. One team member told us that the conversation on the team was clear. “When the voting came down, it wasn’t ‘make the map better’; it was ‘let’s do what our customers really find valuable.’”

In the pilot for the second release, “we got really high marks from our customers.” Those high marks helped the team get to broad release even faster. “We probably got there six months sooner because of this feedback.”

Validately: Validating Your Product with Customer Interviews and a Two-Day Prototype

Serial entrepreneur Steven Cohn got the idea to launch Validately because of the challenges he faced as an entrepreneur doing his own user research. When he surveyed the landscape, he discovered that other user researchers faced similar challenges. Researchers were typically using free tools like Skype and Google Docs to conduct their studies. He knew these tools weren’t designed for user research and that they made the process inefficient. Some teams supplemented these tools with the best-known service provider in the space, usertesting.com. He knew that there was an opportunity here.

Steven and his team used customer interviews to discover these users’ needs and goals (Box 4 of the Lean UX Canvas). In these conversations, they discovered where they should focus when they asked researchers what they did with the results their studies generated. “That’s where the bulk of my work is,” researchers told him. In fact, he learned that the bulk of a researcher’s work began once the actual research was over.

The Validately team learned that, following a round of customer interviews or usability tests, researchers had to go back to a lengthy and disorganized Google Doc of notes and connect those notes to timestamps in the video of the study. They would use these timestamps (and some kind of video editing tool) to create video clips, assemble these clips into a highlight reel, and finally share that reel with their teams, clients, and stakeholders. All this just to share what they learned during the study.

The team conducted dozens of interviews to validate this problem. They learned that this work—creating reports and video highlights—frequently accounted for more than 50% of the total effort devoted to each study. Steven knew he’d found a problem worth solving.

The next step was to work on the solution. (Box 5 of the Lean UX Canvas.) Steven and team created a prototype in InVision that hypothesized what a streamlined tool that combined note taking, time tracking, and report and highlight reel creation could look like. They spent two days creating this prototype before showing it to customers.

The prototype almost single-handedly answered the question “How are you better than usertesting.com?” Once potential customers saw the value of the streamlined tool Validately wanted to build, they were immediately interested. At this point, Steven and team asked for one more type of “feedback.” They asked people to commit on the spot: to purchase the product not in the future but right now, even though it wasn’t ready for delivery. The contract offered a cancellation clause if delivery didn’t happen, but otherwise, Steven used the prototype as a selling tool for a service that did not yet exist. The pitch worked, which helped Validately validate that their solution was the right one. They converted enough customers to build a thriving business in the gaps left by free tools like Google Docs and Skype. Validately went on to be a huge success, ultimately selling to UserZoom in 2019.

The MVPs they used here—customer conversation followed by a two-day prototype—helped Steven and team gather three levels of validation:

Time

Will people give us 30 minutes of their time to discuss this problem? If not, the problem we’re solving is not important enough to them and likely not a space we want to play in.

Social

Will the people we speak with take it to their boss, team, infosec, procurement, and others in the organization? Will they socialize it and endorse it internally? To understand this, they always asked, “Will you introduce me to others in the organization who might be interested in this tool?” Again, if this resulted in no introductions, that was also a signal.

Money

If the prospective customer gave their time and lent their reputation, they’d be asked to purchase. This was the ultimate validation, as it resulted in an actual sale.

Kaplan Test Prep: Using Lean UX to Launch a New Business

Kaplan Test Prep has been helping students in the US prepare for standardized tests like college and medical school entrance exams since 1938. Now, with education transforming on a nearly annual basis, Kaplan has to continuously reinvent how it delivers value to its customers. Lee Weiss, currently a senior vice president at the company, has helped Kaplan do exactly this for more than 20 years. Most recently, in the fall of 2018, Lee started thinking about a new idea to reinvent Kaplan’s university partnerships business. He wanted to make it possible for universities to create online courses for high school students on the topics of college and career readiness.

Their idea was to create a safe way for high school students to learn about various career paths and try out different universities at the same time. Lee and his colleagues spent a few weeks sketching out some ideas of how this could work and put their thoughts down in a PowerPoint deck. This deck became their first experiment—their MVP—to help them test their first hypothesis: that leadership would be interested in their plans.

In early 2019, they met with the leadership team to test their idea. Kaplan’s leaders were excited about this new idea and gave Lee and his colleague Liz Laub the green light to focus solely on this new idea. Kaplan’s leadership team told Lee and Liz, “Take the next 90 days and see if you can get this concept off the ground.”

Lee and Liz started by asking themselves the first key question of Lean UX: What’s the most important thing we need to learn first? (Box 7 on the Lean UX Canvas.)

They realized they wouldn’t have a business at all if they couldn’t get any universities interested. With their biggest risk identified, they were able to move on to the second key Lean UX question: What’s the least amount of work we need to do to learn it? (Box 8 on the Lean UX Canvas.)

They started talking to universities to see if they’d be interested in partnering with them on this initiative—even though the initiative itself did not yet exist. Within 90 days, the team had spoken to 20 different universities and ended up with 2 of the biggest universities in the US interested in the idea—all through simple, and at times serendipitous, conversation. This was great news, as it would give Lee and Liz enough information to make a more detailed request of leadership: a request for budget to bring a product to market.

Before they could do that, though, there were other questions they’d need to answer: What would students and parents want? What would their solution look like? (These are the kinds of questions you capture in Box 7 of the canvas.)

This is where the obstacles started to appear: their assumptions about their product offering (Chapter 9) started getting smashed on the rocks of reality. Initially, they’d hoped to build a product that allowed teachers and students to interact together in real time. Unfortunately, time zones got in the way of that, something they learned quickly through early and continuous conversations with students. So they pivoted to asynchronous courses and quickly faced a new challenge: how to build an engaging and high-value course.

They assumed that the best way to do this would be to build cohort-based communities. They tested their assumptions by starting to create early versions of this offering. While this received strong positive feedback from the students who were involved in the tests, there was still demand from both parents and students for live mentoring and support. They addressed this by hiring former students from these universities as mentors. These pieces were the foundations of a product that met both synchronous and asynchronous needs.

There was one last piece of the puzzle they had to explore in their initial 90-day period, though: their hypotheses about the kind of organization they would need to build and support this business. It wouldn’t do them any good to solve this need in the market with a compelling product if they couldn’t create a viable, sustainable business. (These service-design considerations should be part of your solution definition in Box 5.)

Here again the team made a series of assumptions about who they’d need to hire, how they should price the product, how much it would cost to run it, and, of course, what shape the product should take. Lee notes that nearly all of these assumptions were wrong in hindsight, but they now had enough information to go back to leadership with a more detailed request: a request for budget—$700,000 to be exact—to get the product built and to market. The request was approved, with one caveat: you have one year to break even.

The team got started. Using nothing more than conversations with students, teachers, and administrators at their two (now signed) customers, the team created an initial curriculum of three courses. The goal was to make the highest quality courses they could as quickly as possible.

With the curriculum in place, the team needed operational systems to support the work—CRM, learning management systems, content management systems, etc. They selected systems with a single mandate: use whatever got them up and running the fastest. They stitched together a lightweight experience using SaaS products—all outside the Kaplan tech ecosystem, which would have slowed down their ability to run quick tests. (This is a great use of the No-Code MVP technique.)

The first two-week course had eight students in it, all of whom got the course for free. All eight finished the course and gave strong positive feedback about the product quality and overall experience. Now it was time to acquire the first paid cohort. There wasn’t a lot of interest at first. Lee, Liz, and the team were starting to worry. Their application was long, the cost was high, and the $50 application fee—copied from what they’d seen elsewhere—all seemed to be keeping potential customers from submitting their applications.

The next experiments were clear to the team—remove the application fee, shorten the application process, and lower the price. They were able to do this for two reasons. First, because they were an internal innovation team, they had decision-making authority to explore pricing. So they cut their prices in half. The second reason that they could experiment like this? They were working in one-week sprints. So even if a decision was catastrophic, the longest they had to live with it was one week.

The team didn’t have to wait long, though. The day they made the changes, they got more applications than the prior two weeks combined. Sales revenue quintupled. As the product saw stronger traction, the team began to worry about maintaining product quality as they scaled. They defined a set of outcome-based metrics to guide their decision making and keep quality high: they wanted to make sure that every student logged in within 48 hours of any course starting, they wanted to see an 80% course completion rate, and they wanted to maintain a Net Promoter Score (NPS) of 50.
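The NPS target mentioned above refers to the standard Net Promoter Score calculation: the percentage of promoters (scores of 9–10 on a 0–10 “would you recommend us?” survey) minus the percentage of detractors (scores of 0–6). A minimal sketch of that arithmetic (the function and sample responses are our own illustration, not Kaplan’s data):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    computed from 0-10 survey responses."""
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical cohort: 6 promoters, 3 passives (7-8), 1 detractor
# -> 60% promoters - 10% detractors = NPS of 50
print(nps([10, 10, 9, 9, 9, 9, 8, 8, 7, 4]))  # 50
```

An NPS of 50, the team’s threshold, means promoters outnumber detractors by 50 percentage points.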

By using outcomes as their north stars, evidence-based decision making as their engine, and short cycles to keep themselves on track, the team has managed to build a business unit that now employs more than 30. They proved that following data and instinct and slowly scaling up decisions allows you to make the best course corrections along the way. When you couple this data with an autonomous team and a clear, outcome-based mandate, the results speak for themselves.
