How Nextdoor reduced racial bias by 75%
Thoughtful friction gave users time, and guidance, to avoid falling prey to their own biases
In neighborhoods across the country, a common pattern was emerging on the neighborhood platform Nextdoor: concerned residents posting about "suspicious black men" in their areas. These posts were often vague, describing nothing more than someone walking down the street. It was 2016, and anger was rising. Nextdoor faced a dilemma: how could they address these racially biased posts without censoring legitimate safety concerns? Grappling with that question led Nextdoor to a smart redesign of their platform, resulting in a 75% reduction in racial profiling.
Speaking in May this year at a Symposium on Comment Section Research & Design, co-hosted by the Plurality Institute and the Council on Tech and Social Cohesion, Gordon Strause, former head of Neighborhood Operations at Nextdoor, recounted the journey of making key design changes and what the team learned along the way.
Shocking biases
Activist groups, notably Neighbors for Racial Justice in Oakland, brought the issue to Nextdoor’s attention. Resident Monica Bien, quoted in the New York Times, said, “What I saw was just shocking to me.”
“There is this automatic fear or suspicion of anyone different, and it was validated by all these neighbors,” she continued. “It was like the bias was so insidious, and somehow the online community allows them to say what they have been thinking all along but not saying.”
Initially, Nextdoor's reaction was defensive. “We believed that our platform, designed to foster neighborly communication, was inherently positive,” recalls Strause. But the activists persisted. "They said, 'You talk about yourself as a community, as a platform for building community, but you're actually making us feel less safe in our neighborhoods.' That really stuck with us," Strause recounts.
Nextdoor began by gathering data to identify different types of racial profiling incidents. The team also did a deep dive on bias, inspired by Jennifer Eberhardt’s book Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do. The analysis revealed two main issues: traditional profiling and insufficient descriptions.
Prompting better descriptions
Traditional profiling involved users posting about someone acting suspiciously based solely on their race, without any specific suspicious behavior. More frequent, however, were cases of insufficient descriptions. “People were posting about actual suspicious behavior, like someone trying to open car doors, but the descriptions were so vague—like ‘young black man’—that they weren’t helpful and caused undue suspicion toward innocent individuals,” says Strause.
The core of Nextdoor's approach was to redesign their posting flow to add what some call ‘thoughtful friction’. Here’s how they did it (a code sketch of how such a flow might work follows the list):
Initial Pause: An interstitial step was introduced, prompting users to pause and reflect before posting. "A big part of the research on profiling is that when people go into quick mode and they're just sort of instinctual, they tend to behave in one way. Whereas if you actually have them pause and slow down and think a little bit, they will behave differently," Strause explains.
Behavior Focus: Users were first asked to describe the suspicious behavior without mentioning personal characteristics. This encouraged users to critically evaluate whether the behavior was genuinely suspicious.
Structured Descriptions: Describing individuals required clicking a button to open a structured form, mandating detailed information. "When you click that button, this field opens up that structures the description and forces you to provide a lot of useful information so that the description of the person you're describing is a specific person and not a general category of people," says Strause.
Awareness and Education: Information about profiling was included in the posting flow to raise user awareness. Links to community guidelines provided further context and education.
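To make these steps concrete, here is a minimal TypeScript sketch of what a posting flow with this kind of thoughtful friction might look like. It is an illustration only, not Nextdoor's actual code: all types, function names, and thresholds (confirmDialog, validateDescription, the two-detail minimum, and so on) are hypothetical.

```typescript
// Illustrative sketch only: all names and thresholds are hypothetical,
// not Nextdoor's actual implementation.

interface PersonDescription {
  clothing: string;                // e.g. "red hoodie, blue jeans"
  hair?: string;
  approximateAge?: string;
  distinguishingFeatures?: string; // e.g. "carrying a black backpack"
}

interface SuspiciousActivityPost {
  behavior: string;                // what the person was *doing*
  personDescription?: PersonDescription;
}

// Step 1: interstitial pause. The poster must read guidance about
// profiling before the form opens, shifting them out of "quick mode".
async function showInterstitial(): Promise<boolean> {
  return confirmDialog(
    "Focus on behavior, not appearance. Descriptions that identify " +
      "someone only by race will not be published."
  );
}

// Step 2: behavior first. Reject the post unless it names a concrete
// suspicious action, prompting the poster to ask whether the behavior
// was actually suspicious.
function validateBehavior(behavior: string): string[] {
  return behavior.trim().length < 20
    ? ["Please describe the specific behavior you observed."]
    : [];
}

// Step 3: structured description. If the poster chooses to describe a
// person, require enough detail to point at an individual rather than
// a demographic category.
function validateDescription(d: PersonDescription): string[] {
  const details = [d.clothing, d.hair, d.distinguishingFeatures].filter(
    (f) => f !== undefined && f.trim().length > 0
  );
  return details.length < 2
    ? [
        "Add at least two identifying details (clothing, hair, etc.) " +
          "so your description matches a specific person.",
      ]
    : [];
}

async function submitPost(post: SuspiciousActivityPost): Promise<void> {
  if (!(await showInterstitial())) return; // user backed out at the pause
  const errors = [
    ...validateBehavior(post.behavior),
    ...(post.personDescription
      ? validateDescription(post.personDescription)
      : []),
  ];
  if (errors.length > 0) {
    displayErrors(errors); // keep the poster in the flow, with guidance
    return;
  }
  await publish(post);
}

// UI and network stubs so the sketch runs standalone.
async function confirmDialog(message: string): Promise<boolean> {
  console.log(`[interstitial] ${message}`);
  return true; // pretend the user clicked "Continue"
}
function displayErrors(errors: string[]): void {
  errors.forEach((e) => console.warn(`[form] ${e}`));
}
async function publish(post: SuspiciousActivityPost): Promise<void> {
  console.log("Published:", JSON.stringify(post));
}
```

The key design choice this sketch tries to capture is that the friction is targeted: the structured form only appears when a person is being described, so routine posts stay easy while potentially biased ones are slowed down and guided.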
Implementing these changes led to a 75% reduction in profiling-related posts. This success highlighted several key lessons:
Listen and Understand: Engaging with affected communities and understanding the nuances of the issue was crucial. "These weren't people trying to do bad stuff. These were people trying to help their neighbors but they just didn't know how to do it. They weren't aware of their own biases or how their posts would collectively become something antisocial," Strause notes.
Seek Expertise: Collaborating with academic experts provided valuable insights and validated the approach. Many experts are eager to assist with real-world applications of their research.
Revisit Assumptions: Nextdoor had to challenge the initial assumption that ease of posting should be prioritized above all else. This is a big deal, as most social media platforms prioritize easy posting over higher-quality posting. "'We want to make it as easy as possible to post' was this sort of general rule of thumb, but we had to revisit that and think about what could make a better flow," says Strause.
Iterate and Improve: The solution evolved through multiple iterations, guided by both qualitative and quantitative feedback. Flexibility and a willingness to adapt were key.
Design Over Moderation: Effective design can preempt issues that would otherwise require moderation. "Design is way more important than moderation. Moderation is really a failure case; you want to stop that from happening in the first place," Strause emphasizes.
Nextdoor's experience underscores the profound impact that thoughtful tech design can have on social cohesion. By intentionally designing features that promote responsible behavior and awareness, technology can foster safer, more inclusive communities. The success in reducing racial profiling by 75% demonstrates that with the right approach, technology can indeed be a catalyst for positive change in our communities.
Click here to watch Gordon’s full video presentation, as well as 11 other presentations from the Symposium illustrating design approaches to improving comments sections.