Google sees alleged child porn in man's email, alerts police

A Houston man is charged after police say Google tips them off to alleged child porn in his e-mail.

Chris Matyszczyk

[Image: The accused. KHOU screenshot by Chris Matyszczyk/CNET]

Does Google have the right to tip off police if it sees that you have allegedly illegal content in your email?

This question will concern some after news emerged that a Houston man, John Henry Skillern, was arrested by police for possession of child pornography.

Police told KHOU-TV that Google had spotted three allegedly pornographic images of children in Skillern's email and had tipped off the National Center for Missing and Exploited Children.

"He was trying to get around getting caught, he was trying to keep it inside his email. I can't see that information, I can't see that photo, but Google can," Det. David Nettles told KHOU.

Skillern, 41, who works at Denny's and is a registered sex offender, was arrested after police obtained a warrant. Police say they found more evidence of child pornography on Skillern's devices.

I have contacted Google to ask whether the police's declaration of the company's involvement was accurate. Moreover, I asked whether this was part of standard operating policy.

Google has never made a secret of the fact that it scans email content. Its Terms of Service was updated in April to explain that the company's "automated systems analyze your content (including emails)."

The company made it even clearer by explaining: "This analysis occurs as the content is sent, received, and when it is stored."

These changes came shortly after the company was sued in California over its email scanning being used to deliver ads to college students.

At the same time, an attempt by non-Gmail users to bring a class action suit against Google for scanning their messages without consent failed, as the judge ruled that the parties were too disparate to claim class action status.

While no one could be against protecting children from predators, those who already have concerns about privacy will wonder what other circumstances might cause Google to inform authorities of one kind or another.

I will update should I hear from the company.

Updated 9:13 p.m. PT: A Google spokeswoman told me that the company doesn't comment on individual accounts. However, Google has been open about using technology -- other sites use the technology too -- that tries to identify child pornography throughout the Web.

It co-funds the Internet Watch Foundation, which, as Google Chief Legal Officer David Drummond describes, does "critically important work that few of us could stomach -- proactively identifying child abuse images that Google can then remove from our search engine."

He added: "While computers can detect the color of naked flesh, only humans can effectively differentiate between innocent pictures of children and images of abuse. And even humans don't get it 100 percent right."

In this case, my understanding is that a significant red flag that influenced Google was Skillern's previous history as a registered sex offender.

One of the more difficult questions in this case, as in many others, is: What if Google is wrong about images being child porn? A recent case involving Instagram, in which a mom was accused of posting inappropriate images that seemed to many entirely innocent, showed the dangers inherent in using technology plus human oversight to catch the genuinely criminal.

There will always be the vexing question of whether Google should be the policeman at all.

Google's argument is that it doesn't want to reveal too much of what it does because it doesn't want criminals to know how they might be caught. It's a convenient argument, one that relies wholly on trusting Google -- and trusting Google has too often proved unwise.

For Google, it seems that the public good in attempting to eradicate child pornography takes precedence over what some might consider private communication. It's an argument with inherent dangers.