This article is an excerpt from the Shortform book guide to "Biased" by Jennifer L. Eberhardt. Shortform has the world's best summaries and analyses of books you should be reading.
Like this article? Sign up for a free trial here.
How did racism develop on the Nextdoor app? What did the executives do to combat it? Did their efforts succeed, or were they just an empty gesture?
The Nextdoor app is essentially a digital neighborhood bulletin board designed to keep neighborhoods safe and share local news. However, racism on Nextdoor went unchecked, and many Black people were wrongfully flagged as suspicious.
Learn how the Nextdoor executives tackled the racism problem and whether it saved their app.
Racism in the Nextdoor App
In her book Biased, Jennifer Eberhardt argues that this unchecked bias is a particular problem for companies like Nextdoor, a social network specifically designed to connect people who live in the same physical neighborhoods. Nextdoor functions like a neighborhood bulletin board for the digital age, where neighbors can ask for babysitter recommendations or spread the word about a lost dog. Its “crime and safety” category, however, quickly became a hotbed for racial profiling. Communication moved so fast that users never stopped to think about why a person walking down the street or sitting in their car seemed so “suspicious”; they thought they were helping to keep their neighborhood safe, never realizing that their suspicions boiled down to racial profiling.
When Nextdoor’s founders became aware of the problem, they called on Eberhardt and other experts to weigh in on the best ways to reduce racism on Nextdoor. (Shortform note: Nextdoor executives were so impressed with what they learned about implicit bias that they now have a whole webpage dedicated to it.) The experts found that fear and speed are the two biggest factors contributing to racial profiling: People are most likely to let their biases go unchecked when they’re afraid and when they don’t take time to stop and think before acting. Therefore, to combat racism, Nextdoor needed to add just enough friction to the posting process to force users to stop and think about their biases, but not so much that users would become frustrated and abandon the site entirely.
To slow down the process of posting in the “crime and safety” category, Nextdoor implemented a checklist that users have to click through before posting. The checklist prompts users to get specific about what exactly makes someone “suspicious” by reminding them to focus on specific behaviors and give a detailed description of the person’s clothing. Crucially, the checklist also explicitly mentions race: It prompts users to “consider unintended consequences” if their description were to lead to an innocent person being stopped or arrested and reminds them not to “assume criminality” because of someone’s race. That’s important—according to Eberhardt, research shows that explicitly talking about race (instead of just alluding to it) leads people to act more fairly.
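To make the mechanism concrete, here’s a minimal sketch in TypeScript of how such a pre-posting friction gate might work. Everything in it (the `ChecklistItem` type, the `canSubmitPost` function, the prompt wording) is a hypothetical illustration based on the description above, not Nextdoor’s actual code.

```typescript
// A minimal, hypothetical sketch of a pre-posting friction checklist.
// None of these names or prompts come from Nextdoor's codebase; they
// only illustrate the "slow down and be specific" pattern.

interface ChecklistItem {
  id: string;
  prompt: string;
  acknowledged: boolean;
}

interface PostDraft {
  body: string;
  category: "crime_and_safety" | "general";
  checklist: ChecklistItem[];
}

// Prompts paraphrasing the ideas described in the text.
const frictionChecklist: ChecklistItem[] = [
  { id: "behavior", prompt: "Focus on the specific behavior that seems suspicious.", acknowledged: false },
  { id: "clothing", prompt: "Give a detailed description of the person's clothing.", acknowledged: false },
  { id: "race", prompt: "Don't assume criminality because of someone's race; consider unintended consequences.", acknowledged: false },
];

// The gate: a "crime and safety" post goes through only after the user
// has explicitly clicked through every checklist item.
function canSubmitPost(draft: PostDraft): boolean {
  if (draft.category !== "crime_and_safety") {
    return true; // no extra friction outside this category
  }
  return draft.checklist.every((item) => item.acknowledged);
}
```

Note the design choice: the gate blocks submission only in the “crime and safety” category, adding friction exactly where profiling happens and leaving the rest of the posting experience untouched.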
For technology companies, adding friction isn’t a natural instinct because the entire point of technology is typically to make daily activities faster and easier. However, in this case, slowing the process down worked: Racial profiling on the site dropped by 75%, with no significant reduction in the total number of users. Nextdoor even created international versions of the checklist that reflect the dominant racial, ethnic, and religious biases in each country, with similar positive results worldwide.
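The source doesn’t describe how those international versions are implemented, but one plausible way to model them (purely an assumption for illustration) is a locale-keyed lookup over the same checklist structure:

```typescript
// Hypothetical sketch of locale-specific checklist selection. The prompts
// are placeholders; per the text, each country's real version is tailored
// to the dominant racial, ethnic, and religious biases there.

type Checklist = { id: string; prompt: string }[];

const defaultChecklist: Checklist = [
  { id: "behavior", prompt: "Focus on specific behaviors, not appearance." },
];

const checklistsByLocale: Record<string, Checklist> = {
  "en-US": defaultChecklist,
  // Other locales would map to prompts written for local bias patterns.
};

function checklistFor(locale: string): Checklist {
  return checklistsByLocale[locale] ?? defaultChecklist; // safe fallback
}
```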
Nextdoor’s Racial Reckoning: Success Story or Empty Gesture?
Eberhardt speaks highly of Nextdoor’s progress in combating racial profiling, but other commentators have raised serious concerns about the company’s approach to racial issues. For example, moderators often remove posts advertising Black Lives Matter protests, and the company still actively recruits local police departments to join the app.
The conversation escalated in 2020, when Nextdoor’s official Twitter account posted in support of Black Lives Matter one week after the killing of George Floyd. The company faced immediate backlash from users who felt that, despite its efforts, the Nextdoor app was still riddled with racism. The conversation garnered so much public attention that even Congresswoman Alexandria Ocasio-Cortez publicly called on the company to take concrete action and “deal [with] their Karen problem” instead of just tweeting a hashtag. (“Karen” is slang for an entitled, middle-aged, and often openly racist white woman.)
To their credit, Nextdoor listened to these complaints and took further concrete steps to address the app’s race problem. Nextdoor’s CEO accepted responsibility for Black Lives Matter posts being deleted and promised to provide bias training for the local “neighborhood leads” who serve as moderators on the app (previously, “neighborhood leads” received no training or vetting). The company also created an antiracism resource page on their site and is working to diversify their mostly-white executive board.
Like many companies, Nextdoor underwent significant upheaval in response to the COVID-19 pandemic and the widespread protests against police brutality. It’s too soon to tell whether the changes the company has implemented will be enough to solve the problem, and no data on their impact is available yet. Whatever its next moves, the company will move forward with Dr. Eberhardt’s guidance: she’s now an official member of its Neighborhood Vitality Advisory Board.
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Jennifer L. Eberhardt's "Biased" at Shortform.
Here's what you'll find in our full Biased summary :
- How implicit bias forms in the brain
- Whether or not bias training actually works
- Why there has been a sudden resurgence in white nationalism