Are fake videos next?

There's been a rash of fake photos on the Web. Now, a Dartmouth professor crafts tools to spot doctored video and audio.

Michael Kanellos, Staff Writer, CNET News.com

Dartmouth Professor Hany Farid already devised software tools to detect when someone has tampered with digital photos. His next challenge: determining whether video or audio files have been retouched.

"I thought, 'This is going to be so much easier,' but it turns out to be much harder," Farid said. "In a minute (of) video, you are talking about thousands of images. Just the sheer mass of data that you have to contend with is challenging. You have memory and run-time issues that you don't have with (still) images."

Hany Farid, Dartmouth professor

The Dartmouth Image Science Group is also releasing a series of tools that will enable law enforcement officials, scientists and media outlets to detect photo fraud more easily, he said.

Faster processors, enhanced editing software and a worldwide audience have made fake and retouched photographs into a major phenomenon.

And the fakery hasn't been limited to small-time pranksters. Reuters, an international news wire service, caught heat by publishing a Beirut battle photo that contained an extra plume of smoke for dramatic effect. (Farid's software helped reveal that enhancement.) During the 2004 presidential election, fake photos disparaging both the John Kerry and George Bush camps made the rounds on the Internet. Child pornographers also employ photo retouching to skirt felony laws.

Photo trickery also has a supporting role in the controversial movie "Death of a President," which opened Sunday at the Toronto Film Festival. Criticism of the fictional film has centered in part on a publicity still that purports to show President Bush being shot--digital techniques were used to superimpose the president's head on an actor's body--along with digital melding of movie actors into actual footage of the president and his staff.

Although media fraud has centered mostly on photos, there's no reason it won't migrate on a larger scale to digital audio and video streams.

"Audio is not that difficult to tamper with. Our auditory system is fairly forgiving," Farid said. "Video is very hard to tamper with. The tools to tamper with video are not as sophisticated as those for photos, but we might as well get a jump on it."

His work with video and audio files, so far, is fairly preliminary. Farid and graduate student Weihong Wang have published a paper on video forensics, and they have three more papers in the pipeline. It may take two years or so before software emerges that can conduct forensic tests on video.

Devil in the details
Software for detecting fraud in video or audio will likely be similar to the kind employed to smoke out photo hoaxes. Roughly speaking, the software will look for unnatural anomalies in the digital transcript. Video, for instance, is interlaced: Individual images contain only half the horizontal lines that make up a picture, and the succeeding frame contains the missing lines. When the frames run rapidly together, the brain perceives a cohesive whole.
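
As a rough illustration of what that interlaced structure gives a forensic tool to work with, the sketch below--a minimal NumPy example, not Farid's code--splits a grayscale frame into its two fields and scores how well one field can be predicted from the other. Edits that disturb the interlacing tend to disturb that relationship.

```python
# Illustrative sketch only: split an interlaced frame into its two fields and
# measure how far the odd field strays from an estimate interpolated out of
# the even field. In untouched footage this statistic stays fairly stable
# from frame to frame.
import numpy as np

def field_consistency(frame: np.ndarray) -> float:
    """frame: 2-D grayscale image (rows x cols)."""
    even = frame[0::2, :].astype(float)   # even scan lines (one field)
    odd = frame[1::2, :].astype(float)    # odd scan lines (the other field)
    rows = min(len(even) - 1, len(odd))
    # Estimate each odd line as the average of the even lines above and below it.
    interpolated = (even[:rows, :] + even[1:rows + 1, :]) / 2.0
    return float(np.mean(np.abs(odd[:rows, :] - interpolated)))
```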

Software that highlights hiccups in the interlacing pattern could reveal edits. Different tools could conceivably be created to ferret out inexplicable light patterns, chromatic anomalies, duplication of scenes or images, or even inconsistencies with the underlying metadata. (Was a night-shot function used? Is it consistent with the image? Was the original data subsequently altered?)
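
For still photos, the analogous metadata check is easy to sketch. The snippet below--which assumes the Pillow library and is not the Dartmouth tool--flags two simple inconsistencies: an EXIF Software tag naming an editor, and a capture timestamp that disagrees with the file's recorded modification time.

```python
# A minimal metadata sanity check, assuming Pillow is installed. This only
# asks whether the file's own metadata admits it was altered after capture.
from PIL import Image

EXIF_SOFTWARE = 0x0131           # "Software" tag, often rewritten by editors
EXIF_DATETIME = 0x0132           # modification time recorded in EXIF
EXIF_DATETIME_ORIGINAL = 0x9003  # capture time (lives in the Exif sub-IFD)
EXIF_IFD_POINTER = 0x8769

def metadata_flags(path: str) -> list[str]:
    flags = []
    exif = Image.open(path).getexif()
    software = exif.get(EXIF_SOFTWARE, "")
    if software and "photoshop" in str(software).lower():
        flags.append(f"Software tag names an editor: {software!r}")
    modified = exif.get(EXIF_DATETIME)
    original = exif.get_ifd(EXIF_IFD_POINTER).get(EXIF_DATETIME_ORIGINAL)
    if modified and original and modified != original:
        flags.append(f"file was resaved after capture ({original} -> {modified})")
    return flags
```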

Scene discontinuity--that is, small but inexplicable jumps--in video streams may also prove handy in detecting fraud, but so far it's been difficult to quantify continuity from one scene to another.
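
To make the difficulty concrete, a naive continuity score might look like the sketch below--an illustrative NumPy example, not the group's method. It flags frames whose difference from the previous frame is far above the clip's own average, but an ordinary scene cut trips it just as readily as a spliced-in frame, which is exactly the quantification problem.

```python
# Illustrative only: mean absolute difference between consecutive frames,
# with "suspicious" defined relative to the clip's own statistics.
import numpy as np

def jump_scores(frames: list[np.ndarray]) -> list[float]:
    """Mean absolute difference between each pair of consecutive frames."""
    return [
        float(np.mean(np.abs(b.astype(float) - a.astype(float))))
        for a, b in zip(frames, frames[1:])
    ]

def suspicious_jumps(frames: list[np.ndarray], sigma: float = 3.0) -> list[int]:
    """Frame indices where the jump score is far above the clip's average."""
    scores = np.array(jump_scores(frames))
    threshold = scores.mean() + sigma * scores.std()
    return [i + 1 for i, s in enumerate(scores) if s > threshold]
```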

Similarly, unexpected patterns in background noise and duplication detection could be employed in examining audio transcripts.
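
On the audio side, one crude version of the background-noise idea is to track the noise floor over time and look for abrupt shifts where two recordings may have been joined. The sketch below is a toy illustration under that assumption; real forensic analysis models the noise far more carefully.

```python
# Toy check: estimate the noise floor in fixed windows of an audio signal
# (1-D NumPy array of samples) and flag large jumps between adjacent windows.
import numpy as np

def noise_floor_track(samples: np.ndarray, window: int = 4096) -> np.ndarray:
    """Noise-floor proxy per window: a low percentile of the sample magnitudes."""
    n = len(samples) // window
    chunks = samples[: n * window].reshape(n, window).astype(float)
    return np.percentile(np.abs(chunks), 10, axis=1)

def floor_shifts(samples: np.ndarray, ratio: float = 3.0) -> list[int]:
    """Window indices where the estimated noise floor jumps by more than `ratio`x."""
    floors = noise_floor_track(samples) + 1e-12  # avoid divide-by-zero
    changes = floors[1:] / floors[:-1]
    return [i + 1 for i, c in enumerate(changes) if c > ratio or c < 1.0 / ratio]
```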

Probability plays a significant role in fraud detection, but so does an underlying understanding of the hardware. Digital still cameras from different manufacturers, and often different cameras from the same manufacturer, use different JPEG quantization tables, Farid said. These tables determine the rate at which a camera will drop data in compressing a photograph. Farid's group has come up with software for examining the quantization tables among different cameras.

Adobe Photoshop, meanwhile, has its own distinct quantization table. As a result, the software can tell if a photo has been run through Photoshop or came from a source other than claimed.

"I can't tell you the serial number of the camera, but I can tell you this did not come from a Canon PowerShot. It came from a Nikon," he said. "You can also tell if it came through Photoshop. It won't tell you what happened to the image, but it tells you it did not directly come out of the camera."

In a recent court case, the police submitted photos that were taken with a security camera. An analysis of a photo submitted for evidence revealed that it had been run through Photoshop, according to Farid, who served as an expert witness. Farid said that it did not appear that the police tampered with the images in an objectionable way--often photos get cropped slightly with Photoshop--but the incident underscored how fraud could enter into a court case.

In a civil case involving personal injuries, an analysis of the pictures submitted by the plaintiff revealed that the photos were all produced under different quantization tables. "That's weird," he said.

Examining those photos
In the meantime, the Dartmouth group is porting its photo forensic tools to Java. This should enable a larger number of organizations to exploit them. So far, six of the tools have been ported to Java while the group is finalizing the port on two others: one that detects anomalous lighting and one that looks for unusual color distortions. The software porting work should be complete toward the end of the year.

Starting in 2007, the group will likely begin to train police agencies and selected media outlets to use the software. The FBI forensics lab in Quantico may help conduct the training sessions, which will likely last a few days. The tools run on ImageJ, a freely distributed application.

"You really have to understand the algorithms to understand the code. If you are going to run the JPEG quantization table, you really have to know what a JPEG quantization table is," he said. "In the hands of someone who doesn't understand the algorithms, it can be dangerous, because they could make incorrect inferences."

Distribution remains a problem. Broadly disseminating the technology could help crack down on photo fraud; on the other hand, it could also help potential fraudsters spoof the safeguards. Most likely, distribution will be limited: Photo editors, but not freelance photographers, at mainstream media outlets may get the software.

"You do diminish the power the software if you make it completely, widely available," he said.

Safeguards to prevent copying will also likely be employed. Farid, however, emphasizes that neither he nor Dartmouth is seeking royalties or patents on the software.