Spy reportedly used AI-generated photo to connect with targets on LinkedIn

A fake account had links to politically connected figures in Washington, the Associated Press reports.

Steven Musil Night Editor / News

Katie Jones is part of "a vast army of phantom profiles" on professional job networking site LinkedIn, according to the Associated Press.

Associated Press

The use of artificial intelligence for facial recognition has raised privacy concerns among many. But now a related application of AI is raising espionage fears. A spy reportedly used the technology to create a fake profile photo on LinkedIn to attract would-be targets, the Associated Press reported Thursday. The photo appeared on the LinkedIn account of one Katie Jones, a 30-something redhead.

Among her 52 connections were links to Washington political figures, including a deputy assistant secretary of state, a senior aide to a senator and a prominent economist being considered for the Federal Reserve, according to the AP.

The news outlet found that despite her claims of working for years as a "Russia and Eurasia fellow" at the Center for Strategic and International Studies, the DC-based think tank had no record of her employment. Similarly, the University of Michigan could find no record of her claimed degree in Russian studies.

That's because the woman described in the profile doesn't exist. According to the AP, Jones is part of "a vast army of phantom profiles" hiding on the professional job networking site.

AI is one of the hottest trends in the tech world, with companies like Google, Apple and Amazon using the tech to let your phone recognize real-world objects, help you better run your smart home or find a photo in your camera roll.

But new uses can be troubling. AI-generated deepfakes let people create videos that appear to show someone saying and doing things they never did. And two years ago, researchers at Nvidia created a neural network algorithm that can separate aspects of an image and learn to generate new images. The graphics chip maker published a study in 2017 detailing how it used two competing neural networks to create photo-realistic images of fake celebrities.

The photo of Jones appears to have been created using programs called generative adversarial networks, or GANs. The technique has become popular in the past year, spawning a website called This Person Does Not Exist, which uses a GAN to create a facial image from scratch.

Because the program creates a unique image, it can't be traced using a reverse image search -- a common technique for spotting fake profiles and other internet scams.
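To illustrate the adversarial idea behind GANs: a generator learns to mimic real data while a discriminator learns to tell real samples from generated ones, and each improves by competing against the other. Face generators like the one behind This Person Does Not Exist use large deep networks; the toy sketch below uses one-parameter models on 1-D numbers purely to show the training loop. All names and values here are illustrative, not any real system's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

REAL_MEAN = 4.0   # the "real data" distribution: N(4, 1)
theta = 0.0       # generator parameter: G(z) = z + theta
w, b = 0.0, 0.0   # discriminator: D(x) = sigmoid(w*x + b)
lr, batch = 0.05, 64

for step in range(5000):
    real = rng.normal(REAL_MEAN, 1.0, batch)
    fake = rng.normal(0.0, 1.0, batch) + theta

    # Discriminator step: push D(real) toward 1, D(fake) toward 0,
    # using the gradients of the standard logistic loss.
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_b = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    b -= lr * grad_b

    # Generator step: push D(fake) toward 1 (non-saturating loss),
    # i.e. make fakes that fool the current discriminator.
    d_fake = sigmoid(w * fake + b)
    grad_theta = np.mean(-(1 - d_fake) * w)
    theta -= lr * grad_theta

# After training, theta should have drifted toward REAL_MEAN, so
# generated samples resemble real ones.
print(round(theta, 2))
```

The same tug-of-war, scaled up to deep convolutional networks trained on millions of photos, is what produces faces that never existed and that no reverse image search can trace.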

The Jones account activity is typical of espionage efforts on LinkedIn, experts told the AP. William Evanina, director of the US National Counterintelligence and Security Center, said foreign spies often use fake accounts on the networking site to get close to American targets. China, he said, has engaged in "mass scale" spying on LinkedIn.

"Instead of dispatching spies to some parking garage in the US to recruit a target, it's more efficient to sit behind a computer in Shanghai and send out friend requests to 30,000 targets," he said in a written statement.

The Jones account has since been deleted, and LinkedIn said it doesn't tolerate phony accounts on its social network.

"A fake profile is a clear violation of our terms of service," Paul Rockwell, head of Trust & Safety at LinkedIn, said in a statement. "When they are uncovered, we take swift action to remove them.

"Our members come to LinkedIn to have respectful and constructive conversations with real people, and we're focused on ensuring they have a safe environment to do just that."