
Facebook wants AI researchers to figure out privacy

The company will pay for 5,000 people to learn about "secure and private" artificial intelligence because, as its CEO says, "The future is private."

[Photo: Facebook headquarters in Menlo Park, California. Stephen Shankland/CNET]

Facebook is working with online learning site Udacity to try to enable AI research that doesn't hurt privacy.

At the social network's F8 conference on Wednesday, Facebook announced that it's offering scholarships to 5,000 people to take a new Udacity course called Secure and Private AI. This is the second phase of a Facebook program to help 300 people earn a Udacity "nanodegree." The idea is to teach students to apply techniques that AI heavyweights already use, such as differential privacy at Apple and federated learning at Google.
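To give a sense of what differential privacy means in practice: the core idea is to add calibrated random noise to an aggregate statistic so that no single person's data can be inferred from the result. The sketch below is illustrative only, and is not taken from the Udacity course; the function names and parameters are hypothetical.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, threshold, epsilon=1.0):
    """Count items above a threshold, adding Laplace noise calibrated to
    the count's sensitivity (1) and the privacy budget epsilon.
    Smaller epsilon means more noise and stronger privacy."""
    true_count = sum(1 for v in values if v > threshold)
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy count of people over 30 in a (made-up) dataset.
ages = [23, 35, 41, 29, 52, 38, 61, 27]
print(dp_count(ages, threshold=30, epsilon=0.5))
```

The noisy answer is still useful in aggregate, but an analyst can't tell whether any one individual's record was included, which is the property that lets sensitive data be studied without exposing it.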

Facebook is itself an AI giant, broadly applying the technology called deep learning and employing AI pioneer Yann LeCun, one of three recent winners of the prestigious Turing Award.

But when it comes to privacy, Facebook has a lot of work to do after mishandling sensitive data, weathering the Cambridge Analytica scandal and building a business that profits from knowing all about its 2.38 billion monthly users.

"I know that we don't exactly have the strongest reputation on privacy right now," Chief Executive Mark Zuckerberg said Tuesday at the first day of F8 in San Jose, California. Still, the company's new mantra is "The future is private."

The Udacity course -- a sequel to one on Facebook's PyTorch AI tool that attracted 18,000 students -- dovetails with that privacy focus.

"Without data scientists who know how to properly preserve privacy, private data is either left unused -- a critical loss in fields such as health care -- or is put at risk through data science techniques which lack the proper privacy protections," said Andrew Trask, a research scientist at Google-owned AI firm DeepMind and leader of private machine learning software project OpenMined. He helped design the course.