
Marijuana or broccoli? Facebook illustrates AI's challenges with this example

From hate speech to misinformation, the social network has big problems.

Queenie Wong Former Senior Writer

Facebook CTO Mike Schroepfer says Facebook's AI can distinguish between images of marijuana (left) and broccoli tempura (right). 

Screenshot by Stephen Shankland/CNET

Facebook uses both human beings and artificial intelligence to combat some of its toughest problems, including hate speech, misinformation and election meddling. Now, the social network is doubling down on AI.

The tech giant has come under fire for a series of lapses, including its failure to pull down a live video of a terrorist attack in New Zealand that killed 50 people at two mosques. Content moderators who review posts shared by the social network's 2.3 billion users say they've suffered trauma from repeatedly looking at gruesome and violent content. But AI has also helped Facebook flag spam, fake accounts, nudity and other offensive content before a user reports it to the social network. Overall, AI has had mixed results.

Facebook CTO Mike Schroepfer on Wednesday acknowledged that AI hasn't been a cure-all for the social network's "complex problems," but he said the company was making progress. He made the remarks in a keynote at the company's F8 developer conference.

Schroepfer showed the audience photographs of marijuana and broccoli tempura, which look surprisingly similar. Facebook employees, he said, built a new algorithm that can detect differences in similar images, allowing a computer to distinguish which was which.

Read more: CBD: What it is, how it affects the body and who it might help

Schroepfer said similar techniques can be used to help machines recognize other images that might otherwise escape the social network's detection.

"If someone reports something like this," he said, "we can then fan out and look at billions of images in a very short period of time and find things that look similar."

Facebook, which doesn't allow the sale of recreational drugs on its platform, discovered that people tried to work around its system by hiding the drug in packaging or baking it into goods such as Rice Krispies treats. The social network can now flag those images by combining signals such as the text in a post, its comments and the identity of the user.
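The article names these signals only at a high level, so the following is a hypothetical Python sketch of how post text, comments and account history might be blended into a single score that triggers human review. The keyword list, weights and threshold are invented for illustration; Facebook's actual system would rely on learned classifiers rather than hand-tuned rules like these.

```python
# Hypothetical multi-signal scoring sketch, not Facebook's real model.
from dataclasses import dataclass

DRUG_TERMS = {"edibles", "thc", "gram", "ounce", "dm me"}  # illustrative list

@dataclass
class Post:
    text: str
    comments: list[str]
    poster_previously_flagged: bool

def keyword_score(text: str) -> float:
    """Fraction of known drug-sale terms that appear in the text."""
    lowered = text.lower()
    hits = sum(1 for term in DRUG_TERMS if term in lowered)
    return hits / len(DRUG_TERMS)

def combined_score(post: Post) -> float:
    """Weighted blend of text, comment and account signals."""
    text_signal = keyword_score(post.text)
    comment_signal = max((keyword_score(c) for c in post.comments), default=0.0)
    account_signal = 1.0 if post.poster_previously_flagged else 0.0
    return 0.5 * text_signal + 0.3 * comment_signal + 0.2 * account_signal

post = Post(
    text="Homemade Rice Krispies treats, dm me for prices",
    comments=["How many grams per treat?"],
    poster_previously_flagged=True,
)
if combined_score(post) > 0.25:   # threshold chosen arbitrarily for the demo
    print("flag for review")
```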

"This is an intensely adversarial game," Schroepfer said. "We build a new technique, we deploy it, people work hard to try to figure out ways around this."

Identifying the right images isn't the only AI challenge the company faces. When building a smart camera for its Portal video chat device, Facebook had to make sure the technology wasn't biased and worked well across different ages, genders and skin tones.

Facebook is also trying to train its computers to learn with less supervision so it can better tackle hate speech around elections.

But as the social network uses AI to moderate more content, it also has to address concerns about whether it's treating all groups fairly. Facebook, for example, has been accused of suppressing conservative speech, an allegation the company has denied. And people might disagree about what counts as hate speech or misinformation.

Facebook data scientist Isabel Kloumann said in an interview that when the company is determining what counts as hate speech, the identity of the person posting and who they're targeting can both be important factors. At the same time, Facebook has to balance safety concerns against whether it's treating groups of people equally.

"We don't have a silver bullet for this," she said. "But the fact that we're having this conversation is the most important thing."

Originally published May 1, 1:46 p.m. PT
Update, 5:19 p.m.: Adds comments from Facebook data scientist and more background.