Neal Mohan, YouTube's chief product officer, is staring at a picture I've pulled up on my phone. It's of my 5-year-old cousin, with brown curly hair and a wide smile. He's big into zombies, a fascination that might be unusual for kids his age, and for a while was obsessed with the Slender Man, an urban legend who stalks children. He found both on YouTube.
When I saw him a few weeks ago, he was grumbling to his aunt about being relegated to watching YouTube Kids, a version of the Google-owned video app for children under 13. He wanted to watch the full-on video streaming service, because that's where the zombie videos are.
I tell this story to Mohan, YouTube's de facto No. 2 executive after CEO Susan Wojcicki, as we're discussing recent troubles at YouTube involving children on the platform, including pedophilia rings, as well as YouTube Kids, which has struggled to gain traction. We're sitting in his office at YouTube headquarters in San Bruno, California, about 15 miles south of San Francisco. I tell him I know the onus is also on parents, but YouTube knows there are children on its site who shouldn't be there. I ask Mohan, 45, to imagine my cousin is in the room and to explain to him why he shouldn't spend so much time watching YouTube.
Mohan, who has three kids, including a 5-year-old daughter, doesn't address the zombies. Instead -- ever the product guy -- he extols the virtues of YouTube Kids by highlighting parental controls for setting limits on the type and amount of content children can watch.
"There's a limit," says Mohan. "Just like for every other type of content, there's a limit to it." He says YouTube Kids has 20 million weekly users -- minuscule next to the main app's 2 billion monthly users -- but calls it "an area of investment."
These kinds of questions aren't foreign to Mohan anymore. As the issues tech companies face evolve -- everything from having to fend off foreign interference in elections to addressing people's greater desire for privacy online to backlash from creators -- Mohan represents a model of a product leader whose role has also evolved. Now more than ever, he has to think about more than just new features, bells and whistles.
He's also factoring in the ill effects of YouTube's products, like how its automated recommendation engine could push extremist content or promote fake news during a crisis. That makes his job as much about safeguarding the world's largest video hosting platform as expanding it. Whenever YouTube executives talk about the work of securing its services, they refer to it as "responsibility," as if it has a capital R.
"Susan has asked him to step up and take a major role in defining the work around responsibility," says Jennifer Flannery O'Connor, director of product management at YouTube, and Mohan's former chief of staff. "It is now an ongoing problem that is a stable part of his day-to-day and week-to-week job."
Though he isn't the CEO, Mohan, an 11-year Google veteran, is playing a bigger and bigger role in figuring out what the sprawling video empire will become. As YouTube grows, its relationship to society and democracy is becoming more complicated. That puts Mohan in the spotlight, since he presides over our initial point of contact with YouTube, the technical bits and code that make up its app and homepage, even as its policies and guidelines make headlines.
"YouTube has an identity crisis," one former executive tells me. "That starts with product. It doesn't start with content."
On Thursday, Mohan unveiled new tools to help video creators make money on YouTube, during his keynote speech at VidCon, the annual celebration of online video culture in Anaheim, California. One new feature, according to a copy of the speech provided to CNET in advance, lets fans support their favorite creators by subscribing to their channels through different paid membership tiers. Another tool builds on a feature introduced two years ago called Super Chat, which lets people pay to have their comments stand out during live streams. Now creators will be able to offer viewers digital stickers they can buy during streams.
During a wide-ranging interview in late June ahead of VidCon, Mohan and I also discussed whether YouTube is a media company that should be held accountable to the rules and responsibilities that entails. (Mohan says it isn't.) And we talked about the grueling job of content moderators, who are responsible for blocking objectionable content in real time if automated filters don't catch it. Mohan said he's never experienced a full shift of content moderation, the kind of work done by the roughly 10,000 workers contracted by YouTube and Google around the world. But he told me he's committed to doing one.
Tech companies have historically been reluctant to accept responsibility when their platforms are abused. But as Silicon Valley comes to grips with a backlash from lawmakers, regulators and the public, the industry is becoming more proactive. At Twitter, head of product Kayvon Beykpour talks about the "health" of the conversation on the platform. Facebook hasn't had an official product chief since the departure of Chris Cox, who said in March he was leaving. But CEO Mark Zuckerberg says the company has a "broader view of responsibility" for dealing with the unintended consequences of its services.
Mohan still thinks his job, "first and foremost," is building out YouTube's services. That includes developing new features for products like YouTube Music, a Spotify competitor, and YouTube TV, a cable cord-cutter service. But he acknowledges his role must go beyond that. Mohan says part of managing YouTube is "finding a balance" between the site's open platform -- anyone can post a video on the site -- and its community guidelines that ban hate speech and abuse, a mission set forth by Wojcicki.
"I view [dealing with the scandals] as part of focusing on the products," he says. "Susan's laid out this vision for YouTube. And my job -- taking that direction and executing on that -- consists not just of all this product innovation, but addressing what I feel like we should be on the hook for as part of our responsibility as this global platform. And I think they go hand in hand."
YouTube declined to make Wojcicki available for an interview. But in a statement, she echoed Mohan's sentiments. "His leadership and problem-solving skills have helped us increase our focus on responsibility and protecting the YouTube community while preserving the magic of the open platform," Wojcicki said.
Or, as Hank Green, VidCon's co-founder, tells me: "YouTube is powerful and everyone has noticed. You're going to be held to account more now."
At VidCon, Mohan will turn the focus away from controversies and onto new features. The Southern California confab, which celebrates its 10th anniversary this year, is the world's largest gathering of online video personalities. On Thursday, YouTube is unveiling new features aimed at giving creators more ways to make money beyond ads, the primary moneymaker on the site.
One new tool is membership tiers. YouTube already lets people pay monthly fees to creators who have at least 30,000 subscribers. The idea is to support them like an arts patron or PBS donor would. The $4.99 fee gives subscribers access to things like unique badges, custom emoji, members-only posts and exclusive live streams. Now YouTube will let creators set up five different levels of membership at various price points from 99 cents to $49.99. The model is similar to rival platform Patreon, which also lets creators set up membership subscriptions.
Mohan also said Super Chat, the feature that lets people pay to have their comments highlighted during live streams, is active on 90,000 channels, with some creators making as much as $400 per minute. (YouTube won't say how many.) The company said it's the top revenue stream for nearly 20,000 channels. YouTube is taking the tool a step further with the introduction of Super Stickers, which will let fans purchase digital illustrations and emoji during live streams.
YouTube is also unveiling an educational product called Learning Playlists that groups together videos around certain topics and organizes them into chapters. It categorizes lessons by difficulty, from beginner to more advanced. The company is partnering with services like Khan Academy and TED-Ed for the feature. YouTube will also hide video recommendations from watch pages in Learning Playlists to encourage people to focus on the lessons they're currently watching.
Giving YouTube personalities more ways to make money is a notable move because YouTube has faced scrutiny for its ad business model, which has historically prioritized user engagement. Critics say the financial incentive incites video creators to be more outlandish, provocative or extreme, which leads to much of the toxic or fringe content on the platform.
The economics of the creator business made headlines last month, when YouTube was criticized for failing to shut down the channel of Steven Crowder, a conservative comedian who spewed racist and homophobic slurs at Carlos Maza, a progressive journalist who is Latino and gay. YouTube instead demonetized Crowder, an approach that deprives him of his portion of shared ad revenue. Crowder mocked the move as ineffective, saying he could still make money outside of YouTube by selling merchandise.
Mohan defends the practice of demonetization, calling it "an important lever." (YouTube declined to disclose how much it generates in sales overall, or for creators through ad revenue sharing.)
"I can't speak for the particular channel [Crowder's], but my experience is that monetization is an incentive for many creators on the platform," he says. "And revoking that privilege for a code of conduct reason or policy violation, generally, does have an impact."
Mohan also announced at VidCon that YouTube is updating its abuse policy, especially when it comes to creator-on-creator harassment. The company said the move wasn't spurred by the incident between Crowder and Maza. YouTube won't reveal any more details, but said the change is coming later this year.
"This work is just as critical to the future of the YouTube community as any product launch," Mohan said, according to the advance copy of the speech.
Even as YouTube introduces new tools for alternative revenue streams, Mohan, a veteran of Google's juggernaut ads operation, says there probably won't be any seismic shift in the business model anytime soon. "Ads are the primary way that creators generate money on the platform," he says. "I don't see that changing in the foreseeable future."
When YouTube's leaders talk about the platform's issues, they often use the same analogy: a growing city. When the video site was founded in 2005, it started as a small town, with a small population and simple rules. After it was acquired by Google a year later in an all-stock deal valued at $1.65 billion, the city began to grow. Now it's a sprawling metropolis with its own cultures and customs, but also its own crime and safety issues, panhandlers, graffiti and messed-up roads. It takes a bigger police force, hospitals and social services to keep the city -- with all its unique neighborhoods -- humming.
Wojcicki has used the analogy when she's spoken at conferences throughout the past year. Mohan used it during our more than hour-long interview in June.
It's fitting, then, that Mohan's mother has a Ph.D. in urban geography, studying architecture and how cities work. His father was a civil engineer who worked on big projects like nuclear power plants and airports. The elder Mohan got his Ph.D. in engineering at Purdue University after emigrating from India in 1973. Mohan was born on campus.
Eventually, Mohan's father wanted to take his engineering skills back to India, so the family moved and Mohan attended high school in Lucknow. He came back to the US to attend Stanford in 1992, earning a degree in electrical engineering. He got his MBA at the university in 2005.
Mohan joined Google through the company's acquisition of the ad tech firm DoubleClick in 2008, a $3.1 billion buyout that helped cement Google's dominance in digital advertising. (Mohan served as an executive there.) Wojcicki, who's credited with turning Google's ad business into a $100 billion a year behemoth, tapped him to become her deputy after she took over as YouTube's CEO in 2014.
Now Mohan is one of the most powerful product executives in the world. He meets with YouTube's most popular creators, like Lilly Singh and MatPat, and pals around with NBA All-Star Kevin Durant. At YouTube, Mohan has worked to expand the slate of product offerings. Under his watch, the company has added YouTube Premium, a paid subscription service with Hollywood-produced content and no ads; YouTube Music; and YouTube TV. Still, Mohan says he doesn't think YouTube is a media company because it's mostly an open platform for people to upload content. When pressed, he doesn't directly address what issues could come from regulators who might disagree.
Mohan says YouTube is making "big" investments in those subscription and streaming services. If you watched the NBA playoffs, it was apparent how serious the company is about spreading the word on YouTube TV. The service ran ads during big games and plastered its logo on the hardwood of NBA courts. YouTube declined to disclose how much it spent on marketing its new TV service.
Mohan, a diehard basketball fan with Golden State Warriors season tickets, always knew getting live sports on YouTube TV would be critical. "Neal has been really instrumental in making sure that YouTube had a great relationship with the sports leagues," says Christian Oestlien, vice president of product management for YouTube TV. "Because of where he sits, he can actually speak to where the world is headed, not just for creators, but also for traditional media."
A former YouTube employee who worked with Mohan said he's a "capable" leader, and as a longtime Googler, "well-positioned" when it comes to being a product chief. But YouTube's recent scandals -- and the damage they may have done to the brand -- could hurt his cause.
For the subscription services, "maybe they should consider changing the entire name to something else," and just call the free version YouTube, said the former employee. The paid services could then get a fresh start as a new brand. "YouTube is getting hammered" by bad press, that person said. "It's got to have an impact."
Mohan's office looks like the physical embodiment of the YouTube homepage. It's lined with red trim and red shelves in the same shade as the company's iconic logo. The couch beneath a massive window has four throw pillows that look like YouTube play buttons. A large ship's wheel hangs on a wall above a conference table, an apt piece of decor for one of YouTube's most senior leaders.
The room is vibrant and joyful. It's adorned with sports memorabilia, an '80s style YouTube-branded boom box and several awards Mohan has received during his tenure as a Google exec. It seems designed to celebrate everything good about YouTube. But one item hints at the platform's current struggles: a mug on Mohan's desk that reads, "Harmful in the context conveyance of false information."
It nods at YouTube's role in spreading disinformation. Along with Facebook and Twitter, YouTube faced blowback after the 2016 US presidential election for helping the Russians spread demonstrably fake news. After the school shooting in Parkland, Florida, last year, YouTube's trending feature prominently showcased a video falsely claiming one of the teen survivors, David Hogg, was a paid crisis actor.
But even in less outrageous cases, YouTube's tools can still unwittingly help spread false information. When the Notre Dame cathedral in Paris went up in flames in April, YouTube's systems did the right thing and only surfaced authoritative news coverage, from media sources like France 24 and NBC News. But below some of the live video streams, YouTube's algorithm accidentally placed a blurb with information about the 9/11 terrorist attacks. The platform's automated tools miscategorized the video. Ironically, the blurb feature, announced last year, was launched to help debunk fake news videos, like those from 9/11 truthers. After the Notre Dame fire, YouTube said its software made the "wrong call."
"Nobody here, including myself, was happy about that," Mohan says now. "We want it to work in the right way. But the technology is not always going to be 100% perfect. So the best we can do is try to correct that as quickly as possible."
Compared to neo-Nazi and conspiracy theory videos, which have also cropped up on YouTube, the Notre Dame misstep is relatively innocuous. But it illustrates how, at YouTube's scale -- more than 500 hours of video are uploaded every minute -- small mistakes can become amplified.
Google prioritized the site's growth from the very beginning, YouTube co-founder Steve Chen tells me. As the search giant worked to close the YouTube acquisition in 2006, then-CEO Eric Schmidt pulled him aside during a meeting.
"You guys get to completely run the ship," Chen recalls Schmidt telling him. "As long as we agree on this simple checkbox." The goal wasn't monetization or revenue. Instead it was about growing video views, uploads and number of users, Chen says. Schmidt, through his company Schmidt Futures, didn't return a request for comment.
Today, Mohan doesn't believe YouTube is too big. Instead, he argues that YouTube's size benefits society because it gives people a voice. "The amount of good that it does in terms of all of our users across the world I think profoundly outweighs some of the challenges that exists in terms of addressing the controversial content on the platform," he says. He adds he doesn't "think too much" about antitrust concerns, even as lawmakers and regulators in the US and Europe call to break up big tech companies.
YouTube's biggest woes recently have been controversial content on the platform. It's hard to please everybody, but Mohan insists the company doesn't make content decisions based on who might be offended -- especially as Republicans and President Trump lob accusations of anti-conservative bias at YouTube and Google, as well as Facebook and Twitter.
"When we're having a discussion [internally] about violent extremism on the platform, we're trying to make sure that we're doing our utmost to protect our users and to eradicate the platform from that type of content," Mohan says, "without any sort of nod to, 'Well, if we do this, this is how some constituency is going to react,' and blah, blah, blah."
Like its industry peers, YouTube has doubled down on human moderators to assist its automated tools in rooting out toxic content. The work can be punishing, with some of Facebook's contractors reportedly experiencing symptoms of post-traumatic stress disorder. Moderation decisions ultimately fall under Wojcicki's purview, but Mohan acknowledged it's important for leadership to know what the moderators go through.
"Unless you're in those shoes, you don't really understand," he says. He's never done a full shift of YouTube content moderation, but says he has "no issue" doing it. When asked in our interview if he'd commit to doing one, he says yes.
YouTube has also been criticized for its recommendation engine, which has been accused of leading viewers down rabbit holes into content that gets more and more extreme -- like videos of white supremacists or sexualized children. The feature, called the "Up Next" tool, has been blamed for turning YouTube into "the Great Radicalizer."
Green, the VidCon co-founder, says YouTube should be more open about its recommendation algorithm. He doesn't think YouTube should share it with the public because it could be gamed by creators and bad actors looking to ratchet up engagement or influence people's viewpoints. But he says the company could give the data to university researchers to study patterns about viewing habits so YouTube could learn how to be more responsible about recommendations.
"They have the data to understand all these problems," Green says. "If they're not studying it because they don't think they need to, or because they're afraid of what they'll find, that's problematic."
Mohan also declined to comment on whether YouTube could follow Facebook's example by deferring to a third-party content oversight committee -- a sort of outside Supreme Court -- that would make rulings on what videos and channels stay up or come down from the site.
After spending time with Mohan, it's clear he enjoys talking about products like YouTube Music a lot more than, say, the employee backlash against YouTube over LGBTQ rights. But he says he understands that obligation. It's especially urgent because, at YouTube's scale, every misstep could mean a creator getting bullied or a child surfing a site that isn't meant for them because the kids' alternative isn't compelling.
At VidCon, Mohan doubled down on that message of responsibility.
"Sometimes this work moves more slowly than you would like -- and, frankly, more slowly than I would like," he said, according to the speech. "But we are making good progress." ●