How Facebook fights child porn

It's an Internet-wide problem that Facebook, among many others, is proactively trying to combat.

A WND.com screenshot of a Facebook profile page depicting alleged child pornography activity. (WND.com; screenshot by Larry Magid/CNET)

It's hard not to be affected by an article titled "Kids Raped, Sodomized on Facebook Pages," the first of a four-part WND series about child porn and Facebook.

The article alleges that the blog "located dozens of child porn images after 'friending' many likely pedophiles and predators who trade thousands of pornographic photos on the social network."

Unlike legal "adult pornography," child porn depicts sexual exploitation of children, in some cases very young children. Child porn is illegal in the United States and many other countries. Anyone who knowingly produces, transmits, stores, or possesses child porn could face a long prison term and a lifetime on sex offender registries. Children depicted in these images and videos are victimized during the production and throughout their lives, as the images proliferate and continue to be viewed, often for many years.

While it's good that WND is bringing attention to the general problem of online child pornography, it's unfortunate that it is focusing only on Facebook. Child porn has been around for a very long time, and I can't think of any interactive service with user-supplied content that hasn't been used for this illegal activity.

I'm not new to this issue. Although I don't speak for the organization, I'm a longtime board member of the National Center for Missing and Exploited Children (NCMEC), and am founder of SafeKids.com and co-director of ConnectSafely.org, which receives financial support from Facebook. ConnectSafely is also on Facebook's Safety Advisory Board.

Reports to CyberTipline
I was on the podium in 1998, when senior federal officials and the National Center for Missing & Exploited Children announced the CyberTipline to serve as the official place for individuals, Internet service providers, and interactive services (such as Facebook) to report suspected child porn.

Since then, more than 1.4 million tips have been reported. More than 6,000 tips were reported during the first week of this month alone. The Child Victim Identification Program, which was launched in 2002, has reviewed more than 68 million images and videos and, in collaboration with law enforcement, has identified more than 4,000 child victims over the past decade.

Most of the CyberTipline reports come from United States-based electronic service providers that are registered with NCMEC, but some come from the general public, law enforcement agencies, hotlines, and authorities in other countries. Of course, not all reported cases are of actual child porn. Some depict young adults who appear to be underage.

National Center for Missing & Exploited Children President Ernie Allen (National Center for Missing & Exploited Children)

Although I have never run into child porn on Facebook, I take WND at its word that its investigative staff was able to locate some. After all, Facebook is the world's largest photo-sharing network and, with nearly a billion members, it's inevitable that some people will find ways to use the service to share illicit photos. But, as NCMEC President Ernie Allen said in an interview, "Facebook has been very aggressive in looking for it and responding to it."

Facebook also works directly with law enforcement agencies. Los Angeles Police Lieutenant Andrea Grossman, who is regional commander of the Los Angeles Internet Crimes Against Children (ICAC) unit, said Facebook has been "extremely good at reporting and removing child porn from their site." She said her department has "had no problems with them responding to us, nor them contacting us when they've found material."

Jerry Strickland, communications director for the Texas Attorney General's office, said "We've been very active with Facebook at many levels, trying to determine whether or not crimes have occurred and children have been victimized, and trying to figure out more ways we can collaborate, with the ultimate goal of being able to stop criminal activity before it starts."

Facebook is the first company to use PhotoDNA, a technology that, according to Allen, was "developed for NCMEC by Microsoft and Dartmouth College to match images of the 'worst of the worst' content so that they can find it and remove it based on violations of their terms of use."

Allen called Facebook's approach "proactive": it reports the content to NCMEC so that it can be provided to the appropriate law enforcement agencies.

Allen said "this is a large, complex problem across the Internet. It is proliferating, and it is too early to say how successful anybody is in remedying it." He said reports to NCMEC "have tripled in the past two years (119,000 in 2009 to 326,000 in 2011)," and that "Facebook is a major reporter, but certainly not the only company doing it."

Allen said "images and videos (reviewed by NCMEC) have increased from 1 million in 2005 to 17.3 million in 2011, and we had one week in February 2012 in which our analysts reviewed 1 million images in a single week. So clearly, the numbers are continuing to grow." He said "this is a major problem across the Internet, but Facebook is aggressively looking for illegal content and reporting it to NCMEC. It is what we want every company to do."

How PhotoDNA works and why it's not a panacea
As a NCMEC board member, I had an early and in-depth look at PhotoDNA. It works by comparing images that are very similar but not necessarily identical.

Prior to using this technology, NCMEC's analysts would use a "hash filter" that could identify exact matches to known child porn images, but the filter broke down if even a single pixel in the image was changed.
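The fragility of an exact-match filter is easy to demonstrate. Here is a minimal sketch in Python, using SHA-256 as a stand-in digest (the actual hash function such filters used is an assumption here):

```python
import hashlib

# Exact "hash filter": two files match only if their digests are identical.
original = bytes(range(256)) * 4       # stand-in for a known image's bytes
altered = bytearray(original)
altered[0] ^= 1                        # flip a single bit ("one pixel" changed)

h_known = hashlib.sha256(original).hexdigest()
h_seen = hashlib.sha256(bytes(altered)).hexdigest()

print(h_known == h_seen)  # False: one changed bit defeats the exact match
```

A one-bit change produces a completely different digest, which is exactly why exact matching fails against even trivially edited copies.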

With PhotoDNA, it's possible to catch an image even if it's been resized, cropped, or slightly altered. It's a great tool for analyzing large quantities of images and flagging probable matches, which are then reviewed by human analysts. There are, however, limitations. To begin with, the image has to be similar to a known one (which is quite common); a brand-new image has nothing to match against. Also, it works only with still images, not with video. Allen said companies are working to extend the technology to video.
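PhotoDNA's actual algorithm is not public, but the kind of robust matching described above can be illustrated with a toy "average hash" (an illustration only, not PhotoDNA): hash a small grayscale grid by marking which cells are brighter than the mean, then compare hashes by Hamming distance, so a small edit moves the hash only slightly:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits; a small distance suggests a probable match."""
    return sum(x != y for x, y in zip(a, b))

image = [(i * 37) % 256 for i in range(64)]      # stand-in 8x8 grayscale image
brightened = [min(p + 10, 255) for p in image]   # slight global edit
unrelated = [255 - p for p in image]             # a very different image

h = average_hash(image)
print(hamming(h, average_hash(brightened)))  # small: still a probable match
print(hamming(h, average_hash(unrelated)))   # large: clearly not a match
```

Real systems use far more robust signatures than this sketch, and, as described above, probable matches are always confirmed by human analysts rather than acted on automatically.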

You and your children are unlikely to encounter child porn
Child pornography is a horrible crime, and the children depicted are victimized for life, but it is highly unlikely that you or your children will ever come across it on Facebook by accident. I spend a lot of time on Facebook and have never seen child porn. Also, as co-author of A Parent's Guide to Facebook, I spent time using an alias registered as a teenage girl to document how Facebook looks to minors, and "she," too, never came across any child porn or predator activity.

That's not to say that it can't be found by an investigative reporter, but it's rare for someone to stumble on it.

At ConnectSafely, we operate a forum where users report issues and problems they encounter on Facebook and other social-networking services, and of the thousands of reports we've handled in the past five years, our community manager recalls only one case of suspected child porn that has been reported to us by a Facebook user.

Click on the Options link at the bottom of a photo to report it.

We can all help
In addition to technology, Facebook also relies on reports from members. If you ever come across an image, video, or anything else on Facebook or any other service that violates the company's terms of service, you should report it immediately to the company.

For posts (including posts with photos or video), you click on the X to the right of the post. For photos, there is an Options link where you can "Report this Photo." If the post, photo, or image has anything to do with sexual exploitation of children, including child pornography or a suspected case of an adult seeking a sexual relationship with a minor, you should also report it to the CyberTipline online or by calling 800-843-5678.

Disclosure: I'm co-director of ConnectSafely.org, a nonprofit Internet safety organization that receives financial support from Facebook and other technology companies.

About the author

Larry Magid is a technology journalist and an Internet safety advocate. He's been writing and speaking about Internet safety since he wrote the Internet safety guide "Child Safety on the Information Highway" in 1994. He is co-director of ConnectSafely.org, founder of SafeKids.com and SafeTeens.com, and a board member of the National Center for Missing & Exploited Children. Larry's technology analysis and commentary can be heard on CBS News and CBS affiliates, and read on CBSNews.com. He also writes a personal-tech column for the San Jose Mercury News.

