
The Dick-Pic-Fighting AI That Dating Apps Deserve

Bumble's image-detecting AI goes open source in the company's fight against cyberflashers.

Rae Hodge

Just as tech like OpenAI's DALL-E moves AI image-generation forward into wider public use, tools like Private Detector from Bumble illustrate how image-detection is progressing with AI and machine learning. Here, art generated by DALL-E depicts a female battle commander, eyes closed in a contented smile amid the protective swarm of her robotic bumblebee army.

DALL-E / CNET / Rae Hodge

Bumble can only protect you from dick pics on its own apps. But now its image-detecting AI, known as Private Detector, can give every app the power to shut down cyberflashers for you. 

First released in 2019 exclusively across Bumble's apps, Private Detector automatically blurs inappropriate images and gives you a heads-up about potential incoming lewdness, letting you view, block or even report the image. On Monday, Bumble released a revved-up version of Private Detector into the wilds of the internet, offering the tool free of charge to app makers everywhere through an open-source repository.

Read more: Bumble will use AI to protect you from unwanted dick pics

Private Detector achieved greater than 98% accuracy during both offline and online tests, the company said, and the latest iteration has been geared for efficiency and flexibility so a wider range of developers can use it. Private Detector's open-source package includes not just the source code for developers, but also a ready-to-use model that can be deployed as-is, extensive documentation and a white paper on the project.
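To give a rough sense of what "deployed as-is" can look like, here's a hypothetical Python/TensorFlow sketch of scoring a single incoming image with a saved binary classifier before deciding whether to blur it. This is not code from Bumble's repository; the model path, input size and decision threshold are illustrative assumptions.

```python
import tensorflow as tf

# Hypothetical sketch: load a deployed binary classifier and score one image.
# "path/to/saved_model", the 224px input size and the 0.5 threshold are
# illustrative assumptions, not details from Bumble's Private Detector repo.
model = tf.keras.models.load_model("path/to/saved_model")

def should_blur(image_path: str, image_size: int = 224, threshold: float = 0.5) -> bool:
    raw = tf.io.read_file(image_path)
    img = tf.image.decode_jpeg(raw, channels=3)
    img = tf.image.resize(img, (image_size, image_size))
    img = tf.expand_dims(img, 0)  # add a batch dimension: (1, H, W, 3)
    score = float(model(img)[0][0])  # sigmoid output in [0, 1]
    return score >= threshold
```

In an app flow like Bumble's, a `True` result would trigger the blur-and-warn screen, leaving the final view/block/report decision to the recipient.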

"Safety is at the heart of everything we do and we want to use our product and technology to help make the internet a safer place for women," Rachel Haas, Bumble's vice president of member safety, said in an email. "Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online." 

A Bumble graphic divided into four sections, each representing a layer of the image-detection process used by the open-source Private Detector tool.

Bumble said the company's decade of machine learning legwork allowed it to create a flexible new architecture for its Private Detector neural network that is both faster and more accurate than its 2019 iteration. New to Private Detector is an EfficientNetV2-based binary classifier that boosts the detector's training speed and efficiency while working in tandem with the tool's other layers for faster overall execution.
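For readers curious what an EfficientNetV2-based binary classifier looks like in practice, here's a minimal Python/TensorFlow sketch of the general approach: a pretrained EfficientNetV2 backbone topped with a single sigmoid output for the lewd/not-lewd decision. It's an illustration of the technique, not Bumble's actual implementation, whose details live in the open-source repository and white paper.

```python
import tensorflow as tf

def build_binary_classifier(image_size: int = 224) -> tf.keras.Model:
    # Pretrained EfficientNetV2 backbone with its ImageNet head removed;
    # global average pooling collapses the feature map to one vector.
    backbone = tf.keras.applications.EfficientNetV2B0(
        include_top=False,
        weights="imagenet",
        input_shape=(image_size, image_size, 3),
        pooling="avg",
    )
    inputs = tf.keras.Input(shape=(image_size, image_size, 3))
    features = backbone(inputs)
    # A single sigmoid unit turns the features into a binary probability.
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(features)
    model = tf.keras.Model(inputs, outputs)
    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

Starting from a pretrained backbone is what makes this kind of classifier cheap to train: only the small decision head (and optionally the later backbone layers) needs fine-tuning on task-specific images.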

Bumble

The Private Detector AI model isn't Bumble's only strategy for fighting online sexual harassment and cyberflashing. The company has repeatedly lobbied for legislation aimed at stymieing the world's least desirable camera users. In a 2018 survey, Bumble found that one in three women on the app had received unsolicited lewd photos, and that 96% of them were unhappy to see those pictures. Since then, Bumble has successfully lobbied to get anti-cyberflashing laws on the books in both Texas and Virginia and is currently pushing similar measures in four other states.

"Bumble was one of the first apps to address cyberflashing by giving the power to our community to consensually decide if they would like to see certain photos and creating a safety standard if not. We've been working to address cyberflashing and to help create more online accountability for years but this issue is bigger than just one company," said Bumble public policy head Payton Iheme.

"We cannot do this alone."

And the company may not have to. In offering Private Detector for free, Bumble may have just summoned a swarm of support to its cause.