
YouTube plans more people, better algorithms policing content

The number of employees reviewing content across Google will increase to more than 10,000 in 2018, says Susan Wojcicki.

YouTube CEO Susan Wojcicki says the video-sharing site will increase the number of employees reviewing content on the site to more than 10,000 in 2018.
Justin Sullivan/Getty Images

YouTube plans to add more human moderators and to increase its use of machine learning to cut down on content that violates the video-sharing site's policies.

YouTube CEO Susan Wojcicki said in a blog post published Monday evening that Google will increase the number of content moderators and other employees reviewing content and training algorithms to more than 10,000 in 2018. Wojcicki said the company is taking the lessons learned over the past year tackling violent extremist content and applying them to other "problematic" content.

"Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube," Wojcicki wrote in her blog post.

The changes come in the wake of an advertiser boycott of the Google-owned video site over videos of children that drew sexually inappropriate comments. YouTube killed hundreds of accounts, removed more than 150,000 videos from the platform and turned off comments on more than 625,000 videos targeted by alleged child predators.

The company will also focus on training its machine-learning algorithm to help human reviewers identify and terminate accounts and comments violating the site's rules. Machine learning, a key aspect of artificial intelligence, has helped the company remove 150,000 videos containing violent extremist content since June, a process that "would have taken 180,000 people working 40 hours a week" to complete, Wojcicki said.

The company's moderators are now removing five times more videos than they used to, thanks to machine learning, she said. Those algorithms flag 98 percent of the videos removed for extremist content.

Wojcicki also said YouTube will take a "new approach" to advertising on the site, deciding which channels and videos should be eligible for advertising.

"We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should," she wrote.

YouTube said last month it had removed ads from nearly 2 million videos and more than 50,000 channels that tried to pass themselves off as family-friendly but featured inappropriate content. The company also outlined new rules to make YouTube safer for kids.

Update, Dec. 5 at 11:15 a.m. PT: Clarifies that Google is increasing the number of people reviewing content, not just YouTube.
