Chapter 23

Ten Crowdsourcing Blunders to Avoid

In This Chapter

arrow Preparing for the crowd’s mistakes

arrow Protecting yourself from failure with a trial run

arrow Guarding your reputation

The basic idea behind crowdsourcing is simple. You have a problem that you can’t solve by yourself or work that’s too much for you to do. You turn to the crowd and ask the crowd for assistance. If you ask in the right way, the crowd comes to your aid and all’s as right as can be.

However, obvious crowdsourcing blunders exist, roots along the path that can trip the unsuspecting crowdsourcer. But forewarned is forearmed, so steer well clear of the following mistakes as you prepare for your first attempt at crowdsourcing.

Thinking Crowdsourcing Is Easy

Hubris brought down the Greeks and it can also bring down the crowdsourcer. If you think that crowdsourcing is so easy that you don’t have to pay attention to details, then you are overconfident and heading for a fall.

example.eps Liang, a graduate student in psychology, learned about crowdsourcing as he was doing his doctoral study. Assured by other students that crowdsourcing platforms are easy to use, he decided to use the crowd at Amazon’s Mechanical Turk to proofread his thesis. He created a collection of microtasks by dividing the paper by paragraphs. He posted the paragraphs on Mechanical Turk with the request to rewrite the paper into standard English and offered $0.25 (£0.15) per paragraph. When the tasks were complete, Liang assembled the work into a complete paper and submitted it. Judgement came quickly. His professor said the paper was terrible. Some of the paragraphs had been written properly but most had not. For some, the crowdworker had changed only one word (and not always for the best). For other paragraphs, the worker had copied text from Wikipedia or just typed nonsense. Liang’s blunder was to crowdsource without first thinking about the right way to do it.
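The mechanics of Liang’s first step are simple enough to sketch. The Python below (the blank-line delimiter, instruction text and payout are illustrative assumptions, not anything specific to Mechanical Turk) splits a document into one proofreading microtask per paragraph. Note what the code doesn’t solve: the clear instructions and quality checks that Liang skipped are exactly the hard parts.

```python
def split_into_microtasks(text, reward_per_task=0.25):
    """Split a document on blank lines into one proofreading task per paragraph."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    return [
        {
            "index": i,
            "text": paragraph,
            "instructions": "Rewrite this paragraph into standard English.",
            "reward_usd": reward_per_task,
        }
        for i, paragraph in enumerate(paragraphs)
    ]

doc = "First paragraph.\n\nSecond paragraph.\n\nThird paragraph."
print(len(split_into_microtasks(doc)))  # 3
```

Splitting by paragraph also strips away the surrounding context, which may be one reason the rewrites Liang got back were so uneven.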

warning_bomb.eps You make the mistake of thinking that crowdsourcing is too easy when you forget about the details of the process or believe that those details will magically take care of themselves. If you don’t think about the details of how to divide your job, how to engage the crowd and how to check the results, you’re likely to get work that’s done badly, delivered late and filled with errors.

remember.eps Whenever you crowdsource, remember that crowdsourcing is a powerful technique that requires attention to detail. You need to think about your job and the kind of approach that you need. Choose an appropriate platform and have a means of checking the quality of the work you receive. Above all, think about what you’re doing.

Failing to Review the Work of the Crowd

Some people believe that you can always trust the work of the crowd. After all, trusting souls are out there who believe in the fundamental virtue of the common man, and who transfer that trust to crowdsourcing. The truth, though, is that you can’t trust the work of the crowd any more than you can trust the work of any other group of human beings. You always need a mechanism to check the work that the crowd does.

example.eps If you want an example of why you should check the work of the crowd, talk with any experienced crowdsourcer. You’ll hear stories of tasks undone, work copied from websites and arbitrary answers given to clearly written questions. To try to understand the quality of crowdsourced work, Michael, a crowdsourcing researcher in California, did an experiment. He put a job on Amazon’s Mechanical Turk that asked the crowdworker to flip a coin and report whether the coin landed heads up or tails up. He did 100 replications of the experiment and found that 70 of the crowdworkers responded that the coin landed heads up. Standard statistical theory states that only 50 of the crowdworkers should have reported that the coin landed heads up, with a margin of error of 15. It would be extremely unusual for as many as 70 workers’ coins to land heads up. From this, Michael concluded that some of the crowdworkers never bothered to flip a coin and were lying.
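A quick calculation shows why Michael was suspicious. For 100 fair coin flips, the probability of seeing 70 or more heads is tiny; here is a minimal sketch using only the Python standard library:

```python
from math import comb

def prob_at_least(heads, flips=100):
    """P(X >= heads) when X ~ Binomial(flips, 0.5), i.e. fair coin flips."""
    return sum(comb(flips, k) for k in range(heads, flips + 1)) / 2 ** flips

# 70 or more heads out of 100 fair flips is wildly unlikely.
print(f"{prob_at_least(70):.1e}")  # under 1 in 10,000
```

With odds that long, the honest explanation (100 fair coins really did land that way) is far less plausible than the dishonest one (some workers never flipped a coin at all).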

warning_bomb.eps If you commence any form of crowdsourcing without planning to check the results of the workers, you’re committing a blunder. This problem’s most common in microtasking. In macrotasking, self-organised crowds and crowdcontests, you can usually check the work of the crowd fairly easily, and in crowdfunding, the money is as good as the bank says it is. In microtasking, however, you need a process for checking each result.

You may be tempted to believe that you can check the results of microtasks by duplicating or triplicating the work and then comparing the results. Duplicate tasks are definitely better than single tasks, but they still aren’t perfect. First, you can’t duplicate every task. You can’t double-check Michael’s experiment by duplicating the coin flips, for example. Second, sometimes duplicate tasks merely duplicate errors. Two people can do the same task and both do it wrongly.
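When duplication does make sense, a common pattern is to send each task to several workers and accept an answer only when enough of them agree, flagging the rest for manual review. A minimal sketch (the agreement threshold is an assumption you’d tune to your job):

```python
from collections import Counter

def majority_answer(answers, min_agreement=2):
    """Return the most common answer if at least min_agreement workers
    gave it; otherwise return None so the task can be reviewed by hand."""
    answer, votes = Counter(answers).most_common(1)[0]
    return answer if votes >= min_agreement else None

print(majority_answer(["cat", "cat", "dog"]))   # cat
print(majority_answer(["cat", "dog", "bird"]))  # None
```

This inherits the second weakness described above: if two workers make the same mistake, the majority vote cheerfully duplicates the error.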

remember.eps When you put a job to the crowd, have a way of checking the quality of the work you receive back. If you’re doing macrotasking or crowdcontests, or even working with self-organised crowds, you can probably review all the submissions yourself. When you’re microtasking, though, you may need a more sophisticated process that allows the crowd to check the work or that duplicates the work.

Not Knowing Who’s in the Crowd

Believing that the crowd is a great teeming mass of people who possess any skill you may need and who can do any job you may put to it is an easy trap to fall into. The crowd members are, indeed, a diverse group, but you usually can’t see all of the crowd at one time. From your perspective, or from the perspective of the crowdsourcing platform you use, you may not see the skills that you need or get the help that you want.

example.eps Two years ago, my neighbourhood was struck by an earthquake, the Great Millennial Hoover Park Earthquake, as the residents have come to call it. It was as bad as anything I’ve ever seen, but it was nothing compared with the kinds of quakes that have hit Japan, Haiti and Iran. It cracked some plaster, knocked some books off the shelves, broke a favourite bowl and tipped over recycling bins up and down the street.

Within a few minutes, someone organised a crowd reconnaissance of the neighbourhood. The organiser told us to tweet news reports of what we saw and use the tag #hooverquake. The first reports were of broken windows and cracked walls, but the stories quickly escalated. Collapsed buildings; fires in the street; children trapped in schools calling for their parents. None of these latter events were true. The local primary school had actually hurried the children onto the playground, where they stood screaming because they thought the earthquake was the most exciting thing that had ever happened to them.

These stories came from people with great imaginations who lived nowhere near Hoover Park. Some of them even had Twitter IDs that suggested they were from Florida or Ohio or North London. Still, readers were inclined to believe the stories. Some of the commercial media outlets picked up the stories and repeated them. None of them bothered to check whether the tweets came from part of the crowd that was actually in Hoover Park.

You don’t need an earthquake in your neighbourhood to appreciate the diversity of the crowd. If you scan Wikipedia, for example, you’ll see that it’s the work of quite a diverse crowd and that it hasn’t yet found a crowd that can give it a complete, balanced reference work. Some of the articles – those that deal with technology, for example – contain mathematical descriptions and almost require a Masters degree to understand. Articles about science fiction may contain incredible details that can’t possibly be of interest to most readers. But other articles – articles on social or historical topics that may have some general interest – may still be incomplete, and may contain vague generalities and statements that really don’t describe the subject. The Wikipedia crowd, at least the part of the crowd that contributes to the work, doesn’t yet have all the skills and knowledge it needs to complete the encyclopaedia.

warning_bomb.eps Don’t assume that you always have the crowd that you need. Be sure that the crowd members have the experience or skills they need to do your work or gather your data. You do this by qualifying your crowd.

remember.eps The quality of the work that you can get from the crowd depends on the kind of crowd that comes to your project. You can’t expect great technical work from a crowd of teenagers on Facebook, nor can you always get good marketing advice from a team of Java coders. Know your crowd, and be sure that you have a crowd with the right skills and the right experience. You can get a sense of your crowd in macrotasking, because you get to see information that gives the background of the workers. However, if you’re doing microtasking, crowdcontests or self-organised crowdsourcing, you may have to do something extra when you qualify your workers, in order to ensure that you have the right crowd. You can find out more about how to qualify the crowd for microtasking in Chapter 8.
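One simple way to qualify a microtasking crowd is a short screening test built from questions with known answers; only workers who pass see the real tasks. A hypothetical sketch (the gold questions and pass mark are invented for illustration):

```python
# Hypothetical gold-standard questions with known answers.
GOLD_ANSWERS = {"capital_of_france": "paris", "three_plus_four": "7"}

def passes_screening(worker_answers, pass_mark=2):
    """Count correct gold answers; admit the worker only at or above pass_mark."""
    correct = sum(
        worker_answers.get(question, "").strip().lower() == answer
        for question, answer in GOLD_ANSWERS.items()
    )
    return correct >= pass_mark

print(passes_screening({"capital_of_france": "Paris", "three_plus_four": "7"}))  # True
print(passes_screening({"capital_of_france": "london", "three_plus_four": "7"}))  # False
```

For real jobs, the gold questions should resemble the work itself, so that passing the screen actually predicts doing the tasks well.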

Failing to Do a Trial Run

The trial run is so important, especially in microtasking, that it should scarcely need saying. In crowdsourcing, you learn by doing: by creating a crowdsourcing job, writing a statement of work and instructions, posting the job, observing how it works and assessing the results. A trial run’s a great way to do all this. (You can find out more about learning from your results in Chapter 14.)

example.eps Brigid attempted to run a crowdfunding project on Facebook for a charity. She faced a tight deadline and decided that she didn’t have time to do a trial run. Rather than prepare her materials and test them with a small group, she posted them and started urging everyone she knew to contribute to the cause. Little happened. A few people pledged to the cause, but not enough. Eventually, a friend contacted her and said that she didn’t understand the purpose of the campaign. A review of the campaign showed several problems: the description was not as clear as it could have been, some links pointed to the wrong web pages, and a lovingly prepared video didn’t display properly. The deadline came, and Brigid had nothing to show for it.

warning_bomb.eps If you don’t do a trial run, you take the risk that your job will fail. By doing a trial run, you learn by doing and give yourself an opportunity to learn without paying for a full, complete and expensive job.

remember.eps In crowdsourcing, doing trial runs is easy. For microtasking, you just send a small number of tasks to the crowdsourcing platform. In macrotasking, you ask your worker to do a small job before you try the big job. In crowdcontests or self-organised crowds, you run a small contest first so that you can see whether you can attract the talent your project needs. In crowdfunding, you try to raise a small amount of money first.

Putting the Crowdsourcing Ahead of the Job

Getting excited about crowdsourcing is easily done. Crowdsourcing is a novel and powerful way of doing work. It can achieve things that other forms of work cannot. However, crowdsourcing isn’t the only way to work, and it may not be the best way to do a job. When you’re preparing a project, ask yourself ‘Is this the best way to do the job?’ and ‘Am I attempting to crowdsource an activity that would be done better another way?’

example.eps Min and Sasha were preparing a business plan for a start-up company that they wanted to create. Both of them had seen their friends use crowdsourcing in their companies and decided that they wanted to use it to create a business plan. They wrote a job description, posted it on a macrotasking site, selected a worker and started the process. Thirty days later, the crowdworker delivered the business plan. It was an impressive document, but it had little value as a business plan. The descriptions were general and could have been applied to any business. The financial projections were based on assumptions that weren’t explained, and the marketing section didn’t seem to say anything specific about the service that Min and Sasha wanted to produce. The investors in the company read the document and announced they were withholding funds until Min and Sasha created a better plan.

warning_bomb.eps Complete business plans are probably not good things to crowdsource, although perhaps some parts of them can be addressed by macrotasking. Good business plans require a variety of skills, much knowledge, an understanding of the founders’ skills and a vision for the organisation. Much of this can’t be gained from the crowd. In general, don’t ask the crowd to do things that you don’t know how to do, and don’t ask it to do things that require a detailed knowledge of you and what you’re doing.

remember.eps Don’t crowdsource just because it’s new, because it seems special or because it’ll impress your market, your boss, your neighbour, your girlfriend or you. Crowdsource because it’s the right way to do your particular job. Keep your attention focused on your goal and avoid becoming fascinated with crowdsourcing. Crowdsourcing is indeed fascinating and it has the ability to transform many kinds of activities and do things that can’t be done in other ways. However, it can’t transform anything if it’s unsuccessful, and it can’t be successful if you spend more time thinking about crowdsourcing than you spend thinking about your goal.

Losing Your Reputation

Not only do the members of the crowd have a reputation; you do too. Furthermore, the crowd talks. Sometimes you like what crowd members say and sometimes you don’t.

You’re going to find it hard to crowdsource if the crowd views you as incompetent, hostile or disorganised. There are many ways to lose your reputation on a crowdmarket. You can give bad descriptions, offer difficult jobs or set arbitrary deadlines. However, you usually get into trouble over issues that involve money and payments.

example.eps Purna manages the crowd for Tinitasks, a microtasking firm that offers two kinds of microtasking service: microtasking jobs that are prepared by Tinitasks’ staff and microtasking jobs that are prepared by its customers. Purna knows which jobs are which, but the crowd doesn’t. The crowd members think all jobs come from Tinitasks.

In the middle of the spring, Purna noted that fewer and fewer workers were accepting work from Tinitasks. Soon, the company was running at only 70 per cent capacity: it could attract only 70 per cent of the workers it needed to do its work. In reviewing the blogs and forums for the Tinitasks crowdmarkets, Purna concluded that the company was in danger of losing its reputation. The workers were saying that instructions issued by Tinitasks were difficult to understand and that the company was slow to pay workers. So, Purna quickly reviewed the customer jobs and found many with poorly written instructions. She also found one customer who had been placing large jobs with Tinitasks without ever reviewing the work or paying workers.

Purna needed three or four months to rebuild Tinitasks’ reputation in the market. She created a process that edited the text for all jobs, and got the vice president of marketing to bring the non-paying firm into line. She also started a small publicity campaign among the workers. ‘We’ve heard you and we’ve changed’ was the message. Even now, Purna knows that Tinitasks carries a little stain from the incident. Every now and then, someone will post on the worker blog ‘You know Tinitasks, they don’t pay.’ Mud sticks.

warning_bomb.eps The easiest way to ruin your reputation is to change the rules of your project with no warning or explanation. For example, if you suddenly cut the price that you offer for your tasks, you may get a reputation for being stingy. If you raise your prices too many times, the crowd resists taking your tasks: the workers assume that you’ll raise your prices again, so they may as well wait until you do. And if you change the rules in the middle of a job, even if you do so in an effort to improve the quality of the work, you get a reputation for being arbitrary or not knowing what you want.

remember.eps Two rules will generally secure your reputation. Firstly, treat the crowd as you would like to be treated. Secondly, be as consistent and transparent as you can be in all your decisions. (Should you find yourself under attack from a disgruntled member of the crowd, take a look at Chapter 14.)

Hiding from the Crowd

You can’t do crowdsourcing without a crowd, and you can’t get a crowd unless you make the effort to call the crowd together. In some forms of crowdsourcing, finding a crowd isn’t difficult. For example, if you’re macrotasking on an established crowdsourcing platform, you should have little trouble finding a crowd. However, if you’re running a crowdcontest from your own website or doing crowdfunding in any form, even from one of the major crowdfunding sites, you can’t just assume that the crowd will come or that pledges will flow to you; you need to make an effort to attract a crowd. If the crowd can’t find you, you can’t do crowdsourcing.

example.eps Hamish, a marketer with Benedict Foods, organised several crowdcontests to promote the Benedict brand. He wanted to produce a Benedict Foods cookbook and organised contests for recipes, for designing the site and for writing the text. The prizes for the contest were generous, and the contest description was enticing. However, few people submitted entries. After he concluded that the contest was unsuccessful and closed it, Hamish investigated the process. In publicising the event, the company had largely reached older customers who didn’t regularly use the Internet. It had failed to reach a crowd.

warning_bomb.eps When you crowdsource, be sure that you can draw a crowd. Just because your website is connected to billions of people on the web, don’t assume that enough of them will visit that site and be ready for work. Use a commercial crowdsourcing platform unless you have solid evidence that you can draw an adequate crowd at your site. And when you use a crowdsourcing platform, be sure that the platform has the kind of crowd you need.

remember.eps Crowdsourcing works best when you have a natural crowd on the web, a group of people who use all aspects of the Internet and are interested in your work. But even when you’re using an established crowdmarket for macrotasking or microtasking, you need to be prepared to take steps to draw the crowd to your work. Describe your work in words that are clear and attractive. Give the crowd a reason to rally around you. Make sure that your project is visible on the web. Have a place where the crowd can offer feedback. Publicise the results.

Assuming That All Crowdworkers Understand

You can make bigger blunders when you’re getting good results from the crowd than when you’re getting bad results. When you get bad results, you know that something’s wrong. You know that the instructions aren’t clear and that the crowd is misunderstanding you.

When you get good results, you’re tempted to believe that all’s well. Usually, everything is well, but occasionally it’s not. Sometimes the crowd is doing the right things by luck. In other words, the crowd is doing the right thing but not for the right reason. Something changes and the crowd no longer does what you want.

This kind of problem isn’t common, but it can occur when you try to reuse a crowdsourcing job. The job worked the first time, and it may have worked a second time. You try it a third time, however, and you watch things go wrong. The crowd does the wrong things and gets you the wrong results.

example.eps Elmira used crowdsourcing to help translate small texts into different languages. She had access to a translation program that did a fairly good job. However, she wanted simple, colloquial translations that read as if they were written by a native speaker. To get them, she put each text through the translation program and then posted both the original text and the translated text on a crowdsourcing market. She asked the crowd to put the text in ‘common speech’ without exactly saying what common speech might be. The process worked well on a couple of occasions. Then people started editing the English text rather than the translated text. They’d rewrite the English text as a very informal document filled with slang. In reviewing her instructions, Elmira recognised how the crowd could misinterpret them. She pulled the job, rewrote the instructions and then reposted the tasks. In the end, she was only a little surprised that she’d not seen the problem earlier.
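Elmira’s fix, making it unambiguous which text the worker should edit, can be built into the task itself. A hypothetical sketch (the field names and wording are illustrative, not any platform’s API):

```python
def make_post_edit_task(source_text, machine_translation, target_language):
    """Build a post-editing task that states explicitly which text to change."""
    return {
        "reference_text": source_text,        # shown for context only
        "text_to_edit": machine_translation,  # the only text the worker may change
        "instructions": (
            f"Edit ONLY the {target_language} text so that it reads like "
            f"natural, everyday {target_language}. The English text is a "
            "reference; do not change it."
        ),
    }

task = make_post_edit_task("Where is the station?", "Où se trouve la gare ?", "French")
print(task["text_to_edit"])  # Où se trouve la gare ?
```

Separating the reference text from the editable text removes the ambiguity that let Elmira’s crowd rewrite the wrong document.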

warning_bomb.eps Just because you understand what you want, don’t assume that the crowd knows what you want. Make sure that your instructions are clear. Ask the crowd to test them. If you don’t, you’ll find that at least some of the crowd, or maybe all of the crowd, will interpret the instructions in their own way and give you information that you can’t use.

remember.eps Always assume that you’ll have trouble communicating with at least certain parts of the crowd. They are many. You are one. Some members of the crowd may have backgrounds that are quite different from yours. They may tend to interpret your instructions in ways that you didn’t intend. No matter how many times you’ve used a job description or set of instructions, you may want to give it one more review before you use it again. (You can look at Chapter 11 for more information about writing instructions for the crowd and how to test those instructions.)

Having Too Much Faith in the Market

Having faith in the crowdmarket isn’t a blunder, but thinking that markets are the only way to manage people definitely is a blunder. Sometimes, you can be better off using a more conventional technique to get your job done.

example.eps Fred, who runs a small technology firm, fell in love with crowdsourcing. To him, crowdsourcing represented a return to the fundamental truths of economics – to the ideas of Adam Smith and the invisible hand that guides the actions of business people. He vowed to use crowdsourcing for every aspect of his business.

Fred began using macrotasking throughout his business: financial planning, bookkeeping, marketing and even sales. Eventually, he realised that some crowdsourcing jobs were wasting his time. In these jobs, he had to teach the crowdworker the history of his company, explain the context of the market and revisit decisions he’d made three or six or nine months before. Eventually, he concluded that some of his jobs were better done by long-term contractors or permanent employees than by crowdworkers.

Fred didn’t completely revert to his old way of doing business. He still does a lot of work through crowdsourcing, but he resists the idea of hiring long-term contractors. The idea doesn’t seem right to him. He’s so in love with the idea of a free market that he can’t accept that crowdmarkets may involve too much extra work to make them the best way to manage some workers or do some tasks.

warning_bomb.eps Don’t crowdsource because you love the idea. Crowdsource because it’s the best way to do your job. If you crowdsource just because you love the idea of markets, or because you’re in love with technology or you think it’s the wave of the future, you’re likely to give a badly designed task to the wrong workers and get an unusable result.

remember.eps You may sometimes meet business people, like Fred, who are so in love with the idea of free markets that they don’t see the shortcomings of crowdsourcing. Markets are indeed lovely things and they make crowdsourcing possible. You can love them all you like, but they sometimes involve costs that you’d rather not pay again and again. You have to train your new crowdworkers. You have to integrate them into an existing organisation. You have to tell them the history of your activities. Never assume that crowdsourcing is the best way to manage an activity just because it involves a market.
