
HOW NEXTDOOR ADDRESSED RACIAL PROFILING ON ITS PLATFORM

by Phil Simon

On March 3, 2015, hyperlocal social network Nextdoor announced that it had raised $110 million in venture capital. The deal valued the company at more than $1 billion—revered, unicorn status. It had to be a giddy moment for CEO Nirav Tolia and cofounders David Wiesen, Prakash Janakiraman, and Sarah Leary. But just three weeks later, all of that celebrating must have seemed like a distant memory.

The news site Fusion ran an article explaining how Nextdoor “is becoming a home for racial profiling.”1 Reporter Pendarvis Harshaw detailed how presumably white members were using Nextdoor’s crime and safety forum to report “suspicious” activities by African Americans and Latinos. Jennifer Medina of the New York Times followed up, reporting that “as Nextdoor has grown, users have complained that it has become a magnet for racial profiling, leading African-American and Latino residents to be seen as suspects in their own neighborhoods.”2

As I discuss in my book Analytics: The Agile Way, how Nextdoor responded illustrates not only the importance of reacting quickly in a crisis but the usefulness of a data-driven, agile approach.

The Response

Agile teams benefit from different perspectives, skills, and expertise, so the cofounders assembled a small, diverse team to tackle the issue. Members included product head Maryam Mohit, communications director Kelsey Grady, a product manager, a designer, a data scientist, and later a software engineer.

Much of the relevant data was unstructured text, which, especially at first, doesn't lend itself to the kind of straightforward analysis that structured data does. That goes double when the issue is as thorny as racial profiling. Five employees were assigned to read through thousands of user posts.

The outcome was a three-pronged solution: diversity training for Nextdoor’s neighborhood operations team; an update to Nextdoor’s community guidelines and an accompanying blog post; and a redesign of the app. This last step proved to be the thorniest.

Nextdoor had long allowed people to flag inappropriate posts, by either content or location. For instance, commercial posts didn’t belong in noncommercial areas of the site. Nextdoor realized that a binary flag (flagged or not flagged) was no longer sufficient. Its first attempt at fixing the problem was simply to add a “report racial profiling” button. But many users didn’t understand the new feature. “Nextdoor members began reporting all kinds of unrelated slights as racial profiling. ‘Somebody reported her neighbor for writing mean things about pit bulls,’ Mohit recall[ed].”3

The team responded by developing six different variants of its app and testing them. Doing so helped the company answer key questions such as:

  • If the app alerted users about the potential for racial bias before they posted, would it change user behavior?
  • Characterizing a person isn’t easy. How does an application prompt its users for descriptions of others that are full and fair, rather than based exclusively on race?
  • In describing a suspicious person, how many attributes are enough? Which specific attributes are more important than others?

Using lean methods, the team conducted a series of A/B tests. Blessed with a sufficiently large user base, Nextdoor ran experiments to determine the right answers to these questions. For instance, consider 50,000 users split evenly into two cohorts (A and B) of 25,000 each. Each cohort would see one version of the Nextdoor app with slight but important differences in question wording, order, required fields, and the like.
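The mechanics of an experiment like this can be sketched in a few lines. The cohort sizes below match the example above, but everything else (function names, the hash-based bucketing, the synthetic data) is an illustrative assumption, not Nextdoor's actual code.

```python
import hashlib

def assign_cohort(user_id: str) -> str:
    """Deterministically bucket a user into cohort A or B via a hash of their ID,
    so the same user always sees the same app version."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "AB"[int(digest, 16) % 2]

def profiling_report_rate(posts: list) -> float:
    """Fraction of a cohort's posts that were reported as racial profiling."""
    if not posts:
        return 0.0
    return sum(1 for p in posts if p["reported_as_profiling"]) / len(posts)

# Synthetic data standing in for each cohort's posts under its app version.
cohort_a = [{"reported_as_profiling": i % 20 == 0} for i in range(25_000)]
cohort_b = [{"reported_as_profiling": i % 80 == 0} for i in range(25_000)]

rate_a = profiling_report_rate(cohort_a)  # rate under version A
rate_b = profiling_report_rate(cohort_b)  # rate under version B
```

Comparing `rate_a` and `rate_b` (with an appropriate significance test) is what lets a team conclude that one wording or field layout "works far better" than another.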

Over the course of three months, Nextdoor’s different permutations made clear that certain versions of the app worked far better than others. And by August 2015, the team was ready to launch a new posting protocol in its crime and safety section. Users who mentioned race when posting to “Crime & Safety” forums were prompted to provide additional information, such as hair, clothing, and shoes.
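The new posting protocol amounts to a simple gating rule: a Crime & Safety post that mentions race cannot be submitted until fuller descriptive details are supplied. A minimal sketch of such a rule follows; the term list, field names, and function names are hypothetical illustrations, not Nextdoor's implementation.

```python
# Toy illustration of the posting rule; these word lists are deliberately
# minimal and NOT a real content-moderation vocabulary.
RACE_TERMS = {"black", "white", "hispanic", "latino", "asian"}
REQUIRED_DETAILS = {"hair", "clothing", "shoes"}

def mentions_race(text: str) -> bool:
    """Naive check for whether a draft post refers to race."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not RACE_TERMS.isdisjoint(words)

def can_submit(post_text: str, details: dict) -> bool:
    """Allow a Crime & Safety post only if any mention of race is
    accompanied by the additional descriptors the form prompts for."""
    if not mentions_race(post_text):
        return True
    filled = {field for field, value in details.items() if value}
    return REQUIRED_DETAILS <= filled
```

A post with no racial descriptor passes through unchanged; one that mentions race is held back until the hair, clothing, and shoes fields are filled in, which is exactly the friction the redesigned form introduced.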

The Results

Requiring additional details and adding a bit of user friction did not eliminate posts by insensitive people or racist users with axes to grind. But by taking a data-driven, agile approach to design, the company reported that it had reduced racial profiling by 75%.

Nextdoor was able to stem the bleeding in a relatively short period of time. A different organization would have announced plans to “study the problem” as it continued unabated. Nextdoor took a different approach, and the results speak for themselves.

TAKEAWAYS

When multiple reports emerged with details of how presumably white members were using Nextdoor’s crime and safety forum to report “suspicious” activities by African Americans and Latinos, the company responded with the conventional approaches of more diversity training and updates to its community guidelines. But Nextdoor went one step further by also using agile to explore potential app feature solutions. The quick response this approach afforded Nextdoor provides a great example of agile as a solution to even the thorniest problems.

  Nextdoor already had a function for folks to flag inappropriate posts, but it realized the binary (flagged/not flagged) was no longer sufficient.

  Its team developed six variants of its app and conducted a series of A/B tests with users to answer key questions such as “How can an app prompt users for descriptions of others that are full and fair?” and “How many attributes are enough to describe a suspicious person?”

  After three months, Nextdoor had enough data to demonstrate that certain versions of the app worked better than others.

  Just five months after the initial news reports, Nextdoor was ready to launch a new posting protocol in its crime and safety section that reduced racial profiling by 75%.

NOTES

  1. Pendarvis Harshaw, “Nextdoor, the Social Network for Neighbors, Is Becoming a Home for Racial Profiling,” Splinter, March 24, 2015, https://splinternews.com/nextdoor-the-social-network-for-neighbors-is-becoming-1793846596.

  2. Jennifer Medina, “Website Meant to Connect Neighbors Hears Complaints of Racial Profiling,” New York Times, May 18, 2016, https://www.nytimes.com/2016/05/19/us/website-nextdoor-hears-racial-profiling-complaints.html.

  3. Jessi Hempel, “For Nextdoor, Eliminating Racism Is No Quick Fix,” Wired, February 16, 2017, https://www.wired.com/2017/02/for-nextdoor-eliminating-racism-is-no-quick-fix/.

Adapted from content posted on hbr.org, May 11, 2018 (product #H04BFP).
