Security scrutiny for Facebook apps

A voluntary program will give some third-party apps a badge to show they can be trusted. But will it be enough?

After booting applications from Facebook this summer for violating user privacy, the social-networking company is gearing up to vet apps for trustworthiness as part of a voluntary validation program.

The validation badge will give Facebook members a gauge to use in deciding whether to add a particular app. Experts praise Facebook's effort but say apps posing security risks will persist regardless, partly because of the network's popularity.


Facebook gives a tremendous level of access to its APIs, which has enabled developers to create more than 24,000 apps for the platform since spring of 2007, many of which are routinely used by millions of people every day.

The company has erred on the side of permissiveness in its development environment so as to foster fast growth, and it has succeeded. As a result of the app deluge, it has had to kick some applications off the site and suspend others until they fixed the problem.

Facebook representatives declined to say how many or what proportion of the total apps have been suspended, but said it was a small number and that most were not intentionally malicious. "In some cases it was an improper security of data and longer retention than was allowed on their servers," said Chris Kelly, chief privacy officer at Facebook.

The barriers to entry on the site are low for developers. Pretty much anybody with a Facebook account can submit an application; they just need a valid e-mail address and a certain number of friends on the network in order to release an app, Kelly said, adding: "We do want to make it reasonably easy for the kid in the dorm room to create an app and have it spread virally throughout the network."

Development is also subject to oversight, he said. Each app is assigned an application key that is used to track its requests for access to user data, and developers must follow terms of service that set technical rules. For example, the API forbids access to contact information and only allows apps to retrieve data from users who have agreed to add the app to their profiles, Kelly said.

Applications have access to a user's full profile and friend list. However, users can change their settings to not provide access to particular data and then make exceptions for specific apps as desired. Developers can keep user data on their servers for only 24 hours and are encouraged to do real-time data requests instead, he said. They can, however, keep the randomly generated user ID indefinitely, but it can't be tied back to a specific person, a Facebook representative said.
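The retention rule described above — cache user data for at most 24 hours, prefer real-time requests, keep only the opaque user ID longer — amounts to a time-limited cache. The sketch below is purely illustrative and is not Facebook's API; the fetch callback and class name are hypothetical, standing in for whatever real-time data request a developer would make.

```python
import time

RETENTION_SECONDS = 24 * 60 * 60  # the 24-hour cap on cached user data


class UserDataCache:
    """Caches per-user profile data, discarding anything older than 24 hours."""

    def __init__(self, fetch_fn):
        self._fetch = fetch_fn   # hypothetical real-time call into the platform API
        self._store = {}         # user_id -> (fetched_at, data)

    def get(self, user_id, now=None):
        now = time.time() if now is None else now
        entry = self._store.get(user_id)
        if entry is not None and now - entry[0] < RETENTION_SECONDS:
            return entry[1]      # still inside the allowed retention window
        data = self._fetch(user_id)   # otherwise make a fresh data request
        self._store[user_id] = (now, data)
        return data

    def purge_expired(self, now=None):
        """Drop everything past the retention limit; only the user ID itself may persist."""
        now = time.time() if now is None else now
        self._store = {uid: (t, d) for uid, (t, d) in self._store.items()
                       if now - t < RETENTION_SECONDS}
```

In practice a developer would call `get()` on every page render; anything older than a day triggers a new request rather than being served from local storage.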

Researchers who recently created a botnet through a Facebook app suggested that social networks should be more restrictive about the interactions they allow between their sites and the rest of the Internet, and that they should restrict the use of JavaScript.

Upholding tech culture
Facebook does impose some restrictions on what the apps can display through JavaScript to prevent the spread of malware, Kelly said. He dismissed the rest of the advice, though, saying it was "counter to an innovative technical culture."

"The mere fact that something bad can be done with the technology doesn't mean you should hobble the technology," he said.

Another potential security risk comes from Facebook allowing developers to host the apps on third-party servers. "If there's a security defect on that server the data could be exposed," said Nitesh Dhanjani, senior manager and leader of application security services at Ernst & Young.

But hosting all the apps itself isn't the answer, either, Dhanjani said. That "would diminish the value of some of the apps," he said. "Some of these apps are really, really cool. They feed your data to a Flash object that gives you a visual representation of your friend list and things like that."

Facebook tries to limit any threats that might come from outside its network by requiring developers to use the Facebook Markup Language for data that will be displayed on the site.

"We think we have good security measures in place such that we don't necessarily need to say that everything has to be run on our servers in order to deliver a good and consistent user experience," Kelly said.

Apple may take a look at all the iPhone apps before they are released, but that's not a reasonable expectation for Facebook, Dhanjani said. "There are hundreds of Facebook apps every day and they won't be able to audit all of them," he said. "From a practical perspective they can't vet them all."

Instead, Facebook plans to launch within weeks a Facebook Verification Program that will subject submitted apps to a more thorough review of things like whether the data they are requesting matches the purpose of the app and how securely they are managing the data. "Their privacy policies need to be as strict or stricter than Facebook's," Kelly said.

This doesn't mean the unchecked apps can't be trusted; it just means that users can add the verified apps with a higher level of assurance than the apps that haven't been checked, he said.

Facebook is also taking steps to stop the spread of spam and worms through postings on people's Walls. The company released a new security feature on Friday.

"I have to give credit to Facebook for trying, but I think it's a very difficult problem to solve and make everybody happy at the same time," Dhanjani said.

One Web application security expert points the finger at the browsers, saying they should protect Facebook users by restricting dubious activities based on policies set out by the social network.

"I would place the blame or responsibility on the browser vendors," said Jeremiah Grossman, chief technology officer at White Hat Security. "Facebook should be able to describe to the browser what these apps are allowed to do programmatically. But it can't because the browsers have no facilities for that."

Mozilla has a prototype project called Site Security Policy that could resolve this problem, he noted. "This is something that is really, really needed," he said.
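Mozilla's Site Security Policy prototype later evolved into the Content Security Policy standard, which works roughly the way Grossman describes: the site sends the browser a declarative policy, and the browser refuses anything outside it. As an illustration only — the syntax below is from the later CSP standard, not the 2008 prototype, and the app domain is hypothetical — a policy header restricting where script may run looks like this:

```http
Content-Security-Policy: default-src 'self'; script-src 'self' apps.example.com
```

A browser honoring that header would execute scripts only from the site itself and the one listed app host, blocking injected script from anywhere else.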

No question, Facebook wouldn't be what it is today if it had taken a hard line with developers.

"Any (extreme) steps to lock down the development environment would cut down the number of apps and may devalue the platform," Robert Phan, chief technology officer at RemoTV, which is working to extend its video programming channels to Facebook, said at DemoFall this week.

"There's always the chance that an unscrupulous developer will attempt to abuse the system," said Justin Smith, editor of the Inside Facebook blog. "By allowing developers access to user information Facebook has enabled creation of entirely new classes of Web applications, many of which never existed before."
