Chapter 7. Dealing with Vandalism and Spam

Although vandalism and spam are constant aggravations, the ongoing efforts of thousands of editors—like you—do a surprisingly good job of minimizing these problems. This chapter explains in detail what you, a Wikipedia editor, can do in terms of spotting and fixing vandalism and spam.

For Wikipedia, the “encyclopedia that anyone can edit,” vandalism—the destruction of content or the addition of useless or malicious content—is a constant, ongoing issue. “Anyone” includes cranks, juveniles (of any age) who don’t have anything better to do, and those who hold a grudge against Wikipedia because of past blocks or bans. For readers, obvious vandalism casts doubt on the accuracy of Wikipedia articles. If the vandalism is subtle, readers can be deliberately misinformed. For editors, fighting vandalism reduces the amount of time available to improve articles.

Spam, at Wikipedia, refers to improper external links added to Wikipedia articles, which is why you often see the term linkspam. Spam is a smaller problem than vandalism because most readers of Wikipedia articles don’t follow external links. Still, as Wikipedia becomes more widely read, the temptation grows to add links in the hopes that someone will click them, generating traffic for the spamming Web site. (See the box below for more detail on the differences between vandalism and spam.)

Fighting vandalism and spam is a bit like doing detective work: In addition to figuring out who did what (Chapter 5), you investigate the extent of the problem, assess the possible underlying motives of the perpetrator (that affects things like warning levels), and then decide what to do (warn, request a block, and so on). It’s important work, and many editors specialize in it.

Lines of Defense

The Wikipedia community has evolved multiple lines of defense against vandalism and, to some extent, spam. They are, roughly in the order of how fast they kick in (bots being the fastest):

  • Bots. Some vandalism is so egregious that even a computer program can recognize it; a simple sketch of that idea appears after this list. Wikipedia allows bots to revert vandalism because in the rare cases where they make a mistake, the mistake is easy to revert.

  • Recent changes patrol. The RCP is a semi-organized group of editors who monitor changes to all the articles in Wikipedia, as the changes happen, to spot and revert vandalism immediately. Most RC patrollers use automated tools to handle the routine steps in vandal fighting.

  • Watchlists. Although the primary focus of monitoring is often content (and thus potential content disputes, as described in Chapter 10), watchlists are an excellent way for concerned editors to spot vandalism. (Watchlists and other methods of monitoring articles are described in Chapter 6.)

  • Readers. Readers, including editors who are just looking over an article, are in some sense the last line of defense. Most readers don’t know the proper way to remove vandalism (but you do, if you’ve read Chapter 6). Still, even a bungled attempt to remove vandalism improves the page, and with luck a more experienced editor will finish the job.

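To get a feel for how a bot (or a recent changes patroller’s tool) can spot the crudest vandalism, here’s a minimal Python sketch. It asks Wikipedia’s public MediaWiki API, using the real list=recentchanges query, for the latest article edits and flags any edit that wiped out most of a page’s text, a classic sign of page blanking. The 80 percent threshold and the use of the requests library are illustrative assumptions; real anti-vandalism bots apply far more sophisticated rules, and anything a script like this flags still needs a human to review.

    import requests

    API = "https://en.wikipedia.org/w/api.php"  # public MediaWiki API endpoint

    def fetch_recent_changes(limit=50):
        """Ask the MediaWiki API for the most recent edits to articles."""
        params = {
            "action": "query",
            "list": "recentchanges",
            "rcnamespace": 0,          # main (article) namespace only
            "rctype": "edit",
            "rcprop": "title|ids|sizes|user|comment",
            "rclimit": limit,
            "format": "json",
        }
        response = requests.get(API, params=params, timeout=10)
        response.raise_for_status()
        return response.json()["query"]["recentchanges"]

    def looks_like_blanking(change, threshold=0.8):
        """Flag edits that removed most of a page's text. This is a crude
        signal (it also catches legitimate edits, such as splitting an
        article), so a human should review anything it flags."""
        old, new = change["oldlen"], change["newlen"]
        return old > 500 and new < old * (1 - threshold)

    if __name__ == "__main__":
        for change in fetch_recent_changes():
            if looks_like_blanking(change):
                print(f'"{change["title"]}" shrank from {change["oldlen"]} to '
                      f'{change["newlen"]} bytes (edit by {change["user"]})')

Because a rule this blunt inevitably produces false positives, the script only prints candidates for a person to check; it doesn’t revert anything itself. That caution mirrors the point above: bots are trusted to revert only the most unmistakable vandalism, precisely because their mistakes must stay easy to undo.
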
When you read a randomly picked Wikipedia article, you rarely see vandalism. That’s a testament to the effectiveness of vandal fighting, despite evidence that the extent of vandalism is increasing (Figure 7-1).

Figure 7-1. As Wikipedia has gotten more popular, the percentage of edits that are reverted—mostly, but not always, because of vandalism—has risen. This graph shows problem edits and the edits that fixed them. Assuming, for the sake of simplicity, that there is a one-to-one ratio of problem edits to corrective edits, then a point on the 20% line would mean that one of every 10 edits was a problem, and one out of every 10 edits was done to fix such problems. [Graph courtesy of editor Dragons Flight (Robert A. Rohde), based on his large September 2007 sampling of article edits.]
