Bumble's Cracking Down on This Very Specific Dating App Discrimination Ploy

The company has found that some daters file false violation reports as a form of identity-based discrimination.

Erin Carson
[Image: The Bumble logo on a phone against a pink background. Making false reports on Bumble could get you banned. Sarah Tew/CNET]

Bumble moderators dismiss about 90% of violation reports directed at gender nonconforming members. That's because the dating app has discovered some daters will report a profile as a means of discrimination, not because that profile actually violates any rules.

Now, Bumble is cracking down on false reports targeting daters for their identities, the company said Thursday. Those who make intentionally false reports could find themselves booted off the platform. 

"Identity-based hate is an issue that negatively affects many communities, and is something that increasingly many gender nonconforming folks, like trans and nonbinary people, have faced in online dating," Bumble said in a statement. 

A 2021 Medium piece from MIT Media Lab summarized how dating apps can be fraught and even dangerous places for people in marginalized communities. From filters that let daters keep out certain ethnicities to privacy concerns for those on apps geared to the LGBTQ community, online dating can be yet another avenue for the expression of hatred toward a specific group.

Bumble's move comes as part of a broader policy announcement banning identity-based hate, spanning "race, ethnicity, national origin/nationality, immigration status, caste, sex, gender, gender identity or expression, sexual orientation, disability, serious health condition, or religion/belief."

"We want our members to connect safely and free from hate that targets them simply for who they are," Azmina Dhrodia, Bumble's safety policy lead, said in a statement. 

This week, Bumble also open-sourced Private Detector, an AI tool it built that intercepts unwanted dick pics, blurring them and warning daters before they get an eyeful of something they didn't ask for.