Illustration by Nick Golebiewski

The cure for Facebook's fake news infection? It might be these women

At Facebook, where men outnumber women almost two to one, the future of news is female.

Alex Hardiman speaks softly. Leading a team meeting in a conference room at Facebook's New York office, where a cardboard cutout of a NASA astronaut leans in a corner next to an arrangement of corporate motivational posters, Hardiman places all five fingertips on the wood table in front of her. She gives her feedback in numbered lists. Moving from one thought to the next, she lifts and lowers her hand like she's playing simple chords on a piano.

To put something in context, she prefaces sentences with "In a world..."

"In a world where people with different viewpoints and opinions cannot come together around a shared set of facts," she tells me in an interview, "that's a very dangerous place to be."

In her hushed, measured tone, it may sound like the most placid movie trailer ever.

But make no mistake, Hardiman and her colleagues are on the kind of high-stakes mission that's ripe for cinematic retelling. They're trying to wipe fake news from Facebook's massive social network, a critical source of information for 2.23 billion people, while also fostering a support system for more legitimate reporting. Their success or failure will affect the health of the news industry and the well-being of democracy worldwide.

And at Facebook, where men outnumber women nearly two to one, the commanders of this mission are women.


Sara Su, Antonia Woodford and Tessa Lyons (left to right) are misinformation product managers on Facebook's large News Feed team. 

James Martin/CNET

Facebook's two dedicated news groups -- Hardiman's news products team and a news partnerships team run by former CNN and NBC anchor Campbell Brown -- are led by women. A majority of the managers on both teams are women. And the phalanx of Facebook's News Feed employees that handles issues like disinformation and hoaxes has five product managers; three are women.

"They are fearless. They are fierce," says Hardiman of her female colleagues. "It's because, when you think about how to spend your time, for many of us, there's no greater thing that we can try to do than to solve these problems as best as we can."

OK, so what? It'd be easy to dismiss Hardiman's view as girl-power boosterism. Or cheerleading to divert attention from the jeers about Facebook's record on privacy, integrity and security these days. But a growing body of data shows that business teams led by women, and teams with greater diversity, are more successful than the norm.


Campbell Brown (left) and Alex Hardiman are the heads of Facebook's two teams dedicated to news on the massive social network. 

Sarah Tew/CNET

And today, society is re-examining Silicon Valley's norms -- disruptive, witlessly idealistic and, yes, male-dominated -- for the nightmares they've created. Twitter, Google's YouTube and others share in this reckoning with big tech's flaws, but no company encapsulates it more than Facebook.

If a departure from the norm is what Facebook needs, it makes sense to put the fate of news there into the hands of women. Over the course of three months, I interviewed nine women and sat in on three meetings of the people most closely grappling with Facebook's treatment of news. 

I still don't know how this movie ends.

Mission critical

Where Hardiman's fingers alight as if playing a keyboard, Campbell Brown punctuates statements by impressing the table with the side of her hand. It's like stamping "POINT MADE" in black ink on invisible notecards placed in front of her.

"Trying to do this at scale is hard," says Brown, who speaks with the kind of self-composed polish that persists in the off-camera mannerisms of TV journalists. But, she says, fixing Facebook's news problems isn't impossible. "Because [stamp] we have the resources [stamp]. It's a huge [stamp] priority for us, not only for Facebook but for our country." Stamp.

Over the last two years, the world's biggest social network has been besieged by controversy after controversy. CEO Mark Zuckerberg's notorious first reaction to the role of fake news on Facebook in the 2016 US presidential election was to call it a "pretty crazy idea." Since then, the Cambridge Analytica scandal corrupted public trust in how Facebook wields intimate data about our lives. The company has continued to be a favorite playground for those seeking to interfere in elections.

Chastened by mounting evidence that a mission to "connect the world" isn't necessarily benevolent, Zuckerberg broadened Facebook's mission last year with a new responsibility: building an "informed community." As part of that, Facebook significantly expanded the teams that tackle how your Feed treats news and how the company itself deals with the news industry.

Brown and Hardiman run those efforts. Brown's news partnerships team focuses on Facebook's relationships with news outlets. Hardiman's news products team develops site features for news content, like a red "breaking news" label on a story about an earthquake that just struck. They both contribute to the Facebook Journalism Project, a collection of programs providing tools for publishers and building news products in collaboration with them.   

Both of their New York-based teams coordinate with other factions of Facebook's workforce too, since problems with news fall on overlapping domains. News Feed's "integrity" team, based in Menlo Park, California, handles prototypical fake news like hoax articles and doctored images, for instance.

"Last year was very much about trying to reduce the amount of false news on the platform, reduce the bad," Brown says. This year, the news teams have begun to focus on "elevating the good," she says. They're focusing now on the programs and features to help legitimate news thrive on -- and off -- Facebook.

Brown herself acknowledges that her biggest fear is that they're moving too slowly.

"Mark always says move faster, but I worry we can't move fast enough, as fast as we need to," Brown says. 

Slow creep

Illustration by Nick Golebiewski

"Elevating the good" may seem less thorny than dealing with malevolent hoaxes, but, really, all of Facebook's interaction with news is fraught. As a journalist sitting in on weekly meetings of the news products team, I was heartened by how much it advocates for people like me, before being completely creeped out by just how much Facebook knows.

Under Hardiman's direction, the news products team tries to be genuine advocates for journalists inside Facebook. Hardiman repeatedly reminds team members about unintended ways new Facebook features may lose money for news organizations. Make sure the video team is keeping publishers in mind as they build a new format, she instructs them. 

But in a discussion of how Facebook could help journalists on the ground, I got twinges of Orwellian unease. Facebook isn't Big Brother, but it has nearly as much detailed insight into our lives. Almost all of it is detail that we -- its billions of users, journalists included -- voluntarily surrender. Even when that insight is deployed for good, it's creepy. 

Generally speaking, "elevating the good" includes identifying trusted, informative outlets and prioritizing them in News Feed. Hardiman's team also espouses "collaborative product design." It works directly with news organizations as they build, so the results actually suit publishers' needs.

The progress has been slow.

"The only area I've seen positive light -- and I don't know just because it's so early -- is local," says Jason Kint, the chief of a trade group representing digital publishers, Digital Content Next. (Disclosure: CBS Interactive, which publishes CNET, is one of Digital Content Next's 60-plus members.)

Kint means local news, which has been the recipient of Facebook's most substantive initiatives so far. Brown's partnerships team launched a Local News Subscriptions Accelerator in February. The $3.5 million, three-month pilot program helps metropolitan newspapers with digital subscriptions.

The accelerator lines up well with the paywalls that Hardiman's team is testing with a small group of publishers. In that test, a Facebook user who reads multiple stories from the same publisher in Instant Articles, a mobile layout for stories, hits a limit on free reading. The paywall links to the outlet's own site, where the reader can subscribe directly.
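In spirit, it's a standard metered paywall. Here's a minimal sketch of the idea in Python; the free-article limit, the in-memory counter and the publisher URL are illustrative assumptions, not details of Facebook's implementation.

```python
# Sketch of a metered paywall, per the description above. After a set number of
# free reads from one publisher, the reader is sent to that publisher's own site
# to subscribe. The limit, storage and URLs here are illustrative assumptions.

FREE_ARTICLES_PER_PUBLISHER = 10  # assumed meter; real limits vary by publisher

# A real system would track this in a database keyed by user; a dict stands in here.
read_counts: dict[tuple[str, str], int] = {}

def open_article(user_id: str, publisher: str, subscribe_url: str) -> str:
    """Return the article, or a redirect to the publisher's own subscription page."""
    key = (user_id, publisher)
    read_counts[key] = read_counts.get(key, 0) + 1
    if read_counts[key] > FREE_ARTICLES_PER_PUBLISHER:
        return f"redirect:{subscribe_url}"  # the reader subscribes directly with the outlet
    return "show_article"

# Example: the 11th story from the same outlet hits the paywall.
for _ in range(11):
    result = open_article("reader-1", "example-news", "https://example-news.test/subscribe")
print(result)  # -> redirect:https://example-news.test/subscribe
```

The key design point, and the reversal from Instant Articles' original rules, is that the reader's money and relationship go to the outlet, not to Facebook.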

Paywalls, at least, demonstrate Facebook has been chastened enough to change. Instant Articles banned paywalls initially. Zuckerberg disapproved of them, asking how tollbooths create a more open and connected world. Now Facebook is doubling down on the accelerator concept with a second $3.5 million program later this year to help nonprofit and local news organizations.


Mark Zuckerberg, Facebook's founder and CEO, initially dismissed the notion that fake news on Facebook had a role in the US presidential election as a "pretty crazy idea." 

James Martin/CNET

Hardiman says some of Facebook's efforts are beginning to show "some really good early signs." A test called Today In, which gathers local news and information into a dedicated place on the social network, has increased publishers' distribution by 8 percent on average.  

But the slow pace of progress is partly because collaborative design takes extra time and effort. It'd be faster for the team to stick to its own instincts and barrel through changes without any outside input, in more typical Facebook fashion.

But Hardiman has seen how that fell flat when she was on the other side. In 2015, she was The New York Times' vice president of new products when Facebook launched Instant Articles. "The key proposition was that it was really fast," she recalls. "I was like, 'OK, I just rebuilt our entire mobile website to be really fast.' You're not listening."

Weapon of choice

As for "reducing the bad," Facebook's fumbles with Alex Jones underscore the company's weakness in reining in fake news itself.

Jones' Infowars assumes the guise of a news organization to prolifically deliver conspiracy theories. It built a following of millions of people on Facebook but also had a track record of violating Facebook's rules on hate speech and harassment. Still, Facebook executives bumbled through inconsistent, confusing explanations for why they continued permitting Jones and Infowars to post there before terminating his pages this month.


Campbell Brown was a television news journalist before joining Facebook.

Sarah Tew/CNET

News Feed ranking is Facebook's weapon of choice against fake news. Facebook doggedly resists removing disinformation outright unless it also violates community standards like those against harassment or hate speech, a la Alex Jones' Infowars. Its preference is to "downrank" disinformation -- effectively burying it at the bottom of News Feed.

Downranking reduces the spread of false news, while still "staying true to what we are, which is a platform for expression and connection," says Tessa Lyons, one of the misinformation product managers on the News Feed team in Menlo Park. "Why don't we delete things just for being false? Because we think that would cross that threshold."
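To make the distinction concrete, here is a minimal sketch of what downranking means in principle. It is not Facebook's actual ranking system, which isn't public; the score fields, threshold and penalty factor are illustrative assumptions.

```python
# Illustrative sketch of downranking: content flagged as likely false isn't deleted,
# it's scored lower so it sinks toward the bottom of the feed. Facebook's real
# News Feed ranking is far more complex and not public; all values here are assumed.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    engagement_score: float     # hypothetical relevance/engagement signal
    misinfo_probability: float  # hypothetical 0-1 signal, e.g. from fact-checker ratings

MISINFO_THRESHOLD = 0.8  # assumed cutoff for treating a post as "rated false"
DEMOTION_FACTOR = 0.2    # assumed penalty: the post stays on the platform but sinks

def rank_feed(posts: list[Post]) -> list[Post]:
    def final_score(post: Post) -> float:
        score = post.engagement_score
        if post.misinfo_probability >= MISINFO_THRESHOLD:
            score *= DEMOTION_FACTOR  # demote rather than delete
        return score
    return sorted(posts, key=final_score, reverse=True)

feed = [
    Post("earthquake-report", engagement_score=0.7, misinfo_probability=0.05),
    Post("viral-hoax", engagement_score=0.9, misinfo_probability=0.95),
]
for post in rank_feed(feed):
    print(post.post_id)  # the hoax sorts below the legitimate story despite higher engagement
```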

This is a central conundrum for the Facebook news teams. For years, Facebook was loath to admit it was a media company, even as nearly half of US adults said they get their news from Facebook. So it also refused the responsibilities that come with media power, like having editorial standards.

"That control-without-liability thinking permeates so much of their rhetoric," says Brett Johnson, an assistant professor of journalism studies in the Missouri School of Journalism.

Facebook's identity as a neutral platform was great for business, even when outside groups exploited the network to sow division. People worked into a frenzy are an engaged audience, catnip for advertisers. But that neutrality is at odds with Facebook's latest kumbaya mission to bring the world closer together. How can Facebook reject the role of referee if it wants to keep people from brawling with each other over lies?

Brown and Hardiman would argue that Facebook isn't refusing its editorial role anymore.

The news teams decided their mission "meant actually having an opinion, taking responsibility for our content, and deciding that we were going to do a lot to actively prioritize quality journalism," Hardiman says.

Brown calls it "a big step for Facebook," this acknowledgment that Facebook must define quality news and promote it.

That sounds great. But outside skeptics haven't been impressed.

Move fast and break things

Between the Cambridge Analytica scandal and Facebook's struggles to contain fake news, the company continues to draw scrutiny from lawmakers and consumers -- and bitterness from publishers.

Illustration by Nick Golebiewski

"The level of animosity [toward Facebook] is the highest I've seen, more so than two years ago," Kint says. "It still feels like they're inching up to the starting line," he says.

The news teams are also being more circumspect about unintended consequences, Hardiman says. Facebook's bent toward democratizing information was great for diversity of voices, but News Feed's algorithm had the unintentional effect of giving top billing to sensationalist, hoaxy content, she says.

"How do we ... see different unintended consequences downstream five years from now that we wish we would have caught?" she says. "We're really trying to think about [News Feed] ranking in a very different way now."

In addition to the $7 million dedicated to the two accelerators, Facebook is contributing $1 million to NewsMatch, which fund-raises for nonprofit US newsrooms; $1.25 million in journalism scholarships; and $1 million to the News Literacy Project to expand a virtual classroom for middle- and high-school students. The company has expanded the news teams and invested in training programs, fellowships and other programs, but it hasn't publicly disclosed figures for other direct investment in journalism beyond that $10.25 million.  

Several million dollars to fund better journalism sounds noble, but it's still spare change for Facebook. Last year, the company spent almost the same amount -- $10 million -- on personal security details for Zuckerberg and Chief Operating Officer Sheryl Sandberg.

Different realities

So Facebook has an excess of catastrophes and a shortage of women. But by addressing its gender shortcomings in the area of news, it may stand a better chance of heading off more disasters.

Why? Data shows women in the workplace get more stuff done.

Studies have demonstrated that businesses with women leaders outperform peers. One study in 2015 found that Fortune 1,000 companies with women CEOs have triple the stock returns versus S&P 500 companies run predominantly by men. Another study found financial performance was more than a third higher at companies with the most women in senior management versus those with the least.


Catherine Ashcraft, a senior research scientist with the National Center for Women & Information Technology, says diverse teams have been shown to perform better on a number of measures -- innovation, problem solving, number of creative solutions, time on task. Part of the reason, she says, is that women tend to score higher on things like social sensitivity and turn taking. These traits aren't biologically baked-in; women are socialized for them, she says.

Gender parity, however, hasn't made huge strides at Facebook.

When the company issued its first diversity report, in 2014, its total workforce was 31 percent women. Now, five years after Sandberg's "Lean In" became a bestseller, Facebook has increased the share of women to 36 percent.

Illustration by Nick Golebiewski

"People like to brag about moving forward a percentage point at a time. I don't see that as successful movement at all," says Brenda Darden Wilkerson, the CEO of AnitaB.org, an organization that advocates for more women and diversity in the tech industry . If a Facebook product improved at that rate, the company would scrap it, she says.

In May, Facebook shuffled leadership in its biggest shakeup since it was founded. All but one of the 14 freshly empowered leaders were men. When I bring up that male-dominated snapshot, the women on Facebook's news teams grimace slightly.

"It's not the reality that I live in every day. The leaders that surround me are mostly women, whether it's Alex or Campbell or Fidji [Simo, Facebook's head of video]," says Mona Sarantakos, a product manager for news formats. Conversations with Sarantakos are almost always punctuated by her laugh, an outburst guffaw that can be heard through walls. This, however, she doesn't make light of.

"When I also saw that image," she says, "like everybody else, I was like, 'Oh, wow ... there is a lot of work to do.'"

Though not apparent on the outside, the internal reaction to the shakeup was similar to the one critics had on Twitter. Discussed on internal channels and at all-hands meetings, "there was an acknowledgement of what was lacking in that picture, and there was an immediate and proactive kind of statement of, 'We are aware that this is something we need to work on,'" says Mollie Vandor, product manager for news credibility.

In a world...

Like many of the women I spoke with, Vandor compares her experience on Facebook's news teams favorably to more male-dominated tech groups where she's worked before.

"Often in situations where you're the only woman or you're one of the small handful of women, you often end up doing a lot of the emotional labor for the broader group," she says. "The beauty of this team is that we share that, and it feels like the weight is less on any one of our shoulders."

One benefit of working with a lot of women, Vandor says, is that she doesn't have to actively monitor herself. It "frees up time and energy to focus on just doing my job," she says. She also says that under Hardiman, it's the first time she's worked for a product leader "that I can see myself in, and that I can aspire to be."

07-women-of-facebook-news

Mona Sarantakos, Meredith Carden and Mollie Vandor (left to right) handle initiatives like formats for breaking news, fact-checking features and tools that help Facebook users gauge the credibility of the news they see there. 

Sarah Tew/CNET

During team meetings, I noticed subtle ways that the presence and leadership of women may create differences from typical tech workplaces. When Hardiman praises a manager for a job well done, that woman in turn credits people junior to her. It reminded me of amplification, a strategy women in the Obama White House used when a female colleague's idea went unnoticed: Restate it and credit her for suggesting it. This felt like the inverse of amplification, what may happen when women don't need to fight to be heard.

Hardiman and Brown attest that upper management has their backs.

"A change from the past is that the shift to quality required a lot of really deep conversations with the executive team all the way up, and buy-in from Mark, Sheryl and the rest of the leaders of the company," Hardiman says.

But how much influence can any team have at Facebook, be it led by men or women, in a world where Mark Zuckerberg always writes the endings?

Hardiman, in her understated, measured way, offers a sentiment repeatedly, whether discussing how Facebook helps legitimate news or how it addresses its own gender shortfalls.

"We still, to be honest, have work to do."