When will a buck-stops-here culture finally reach the software industry?
Not soon enough, if Lenovo's Superfish fiasco is any indication.
From September to January, Lenovo shipped more than two dozen laptop models with Superfish software that inserted its own ads in Web search results. (It's widely estimated that means millions of computers, though the company hasn't gone into detail.) More than that, Superfish exposed the laptops and their Internet traffic to hackers in a way security experts have described as egregious and easily exploitable.
Lenovo's chief technology executive claimed the company was just trying to improve the user experience. "Our teams did not understand the significant security problem that [Superfish] presented," Peter Hortensius said Tuesday. "We're desperate to understand why we missed that." The company on Friday issued a statement pledging to reform its ways.
Superfish, too, pleaded ignorance. Founder and CEO Adi Pinhas blamed a small Israeli startup called Komodia. It's Komodia's software that allowed Superfish to decode Internet traffic and insert ads. Komodia did not respond to a request for comment, but in a 2009 blog post, CEO Barak Weichselbaum described working on a security program designed to hijack secure Internet traffic.
Lenovo's Superfish debacle highlights a growing problem in the software world: As more software components are outsourced, consumers are placed at greater risk than ever before. Software used by billions of consumers and businesses almost always relies on components made by development companies far removed from the final product, each trusting the other to do their due diligence. Few do, however, and that's putting you at risk, experts say.
Imagine that software is a jigsaw puzzle, with everyone from at-home hobbyists to multinational conglomerates supplying pieces they trust to be well-made, said Herb Lin, a senior researcher for cyber policy at Stanford University's Hoover Institute. The problem is they are "too trusting" of their partners, he said.
"Testing is known to not be sufficient," said Lin. "The usual way of vetting software is that I give you a specification and you give me back a program," he said. "I then test it to see if it meets those particular specs. But it's only a part of a program, and it hasn't been tested with all [the other] components." The smaller software component may even work perfectly until it's built into a larger program or app, at which point a flaw can be introduced.
Third-party software havoc
Holes made by third-party software that are ripe for exploitation by hackers go far beyond Lenovo. Security researchers last year discovered major vulnerabilities in two widely used open-source software tools, dubbing the flaws Heartbleed and Shellshock. Although they were accidentally introduced, they had survived for years because companies trusted that the small teams of volunteers developing the software had thoroughly checked it.
There's also the intentional security hole the National Security Agency is accused of inserting into a tool made by security firm RSA that scrambles user data to protect it. It's highly unlikely companies would have paid RSA to protect their data had they known about the weakness, and RSA has denied that it knew about it.
To be sure, Lenovo's Hortensius said his company has taken steps to ensure that few users will still run into Superfish. But it was only after security experts began howling about Superfish's behavior that some security programs -- from Microsoft, Symantec and McAfee -- detected and removed the software.
A matter of trust
To keep its position atop the global PC market, Lenovo has vowed to stop including unnecessary additional programs in its PCs.
"The events of last week reinforce the principle that customer experience, security and privacy must be our top priorities," Lenovo said in its statement Friday. "With this in mind, we will significantly reduce preloaded applications. Our goal is clear: to become the leader in providing cleaner, safer PCs."
Whether that's enough to regain consumer trust is another issue, and one that every company suffering a supply chain screwup must overcome. "For all these vulnerabilities we're seeing, somewhere in the pipeline the trust got broken," said Robert Olson, a professor at the State University of New York at Fredonia who specializes in information security and ethics.
Smaller companies, especially new startups, may not have the resources or the corporate culture to ensure that they're properly checking how their partners have built their software. "Superfish's policies may not be your policies," said Justin Troutman, a security and privacy researcher and book author. "If you don't look into the code, you're blindly accepting risks."
In theory, trust is also an issue between software makers. The burden of proving a program is safe must be borne by each subsequent vendor: Alex must convince Brenda, who must convince Christy, that their software is safe. "It's a social negotiation, not a technical issue," Lin said. But, he added, the cost of making sure there aren't security flaws in software makes the program significantly more expensive to produce. "The cost per line of avionics software [for controlling aircraft], stuff that really has to work right, is 10 to 100 times more than ordinary code," he said.
What's the solution? Not necessarily regulation
Whatever the process at Lenovo was that broke down or didn't exist and let Superfish through, there's no doubt the goal of the software was to infiltrate a user's Web traffic and change what that person sees. That's a problem for Dan Kaminsky, the security researcher who discovered that malicious software from Sony BMG had infected more than half a million computers in 2005. "Imagine a supermarket inserting their own sweetener into Diet Coke," he said. "That's not normal. That's weird, the kind of thing you should be sued for."
More regulation would seem to be the obvious answer to solving the problem, and it's not unheard of in the security software world. The PCI Security Council, a standards body for governing how software handles financial data, has a certification process to make sure credit card numbers and transactions are handled with the proper precautions, said Avivah Litan, a security analyst with Gartner. "They look at the code, the quality assurance it goes through, who's managing the [security] keys. It's probably a good idea to extend it to other tech sectors," she said, but noted there are risks associated with doing so.
"In financial services, there have been so many breaches caused by third parties that regulators have put rules in, but it's really slowed down innovation and procurement," Litan said.
Even if broad regulations demanding higher standards for software security are devised, they may not be effective, for two reasons: many free-to-use, open-source components offer a lot of features to developers at no cost, and software security rarely works with a one-size-fits-all approach. "It's difficult to define what 'secure' means broadly across all software," said Jason Schmitt, who runs Hewlett-Packard's Fortify division, which focuses on software security. "What you have to define is the process of making it secure."
Bloatware is here to stay
Preloaded software isn't going to go away, no matter how loudly consumers and security experts howl. In addition to padding their bottom lines by bundling Windows with third-party software such as Adobe Reader, McAfee's antivirus software and the Bing Toolbar, manufacturers are convinced the software benefits consumers. Lenovo's Hortensius pointed to a system update tool his company adds to each new machine that updates drivers, small pieces of software that tell hardware components like printers how to interact with the computer. "We try to improve the user experience with every piece of software we load," Hortensius said.
Sometimes, supply chain mistakes can cost an executive his or her job. Andrew Lack, the Sony BMG CEO at the time of the company's malware-on-disc problem, lost his job over the incident. Lenovo hasn't signaled yet whether there will be an executive shuffle over the Superfish incident, but a Lenovo laptop owner has already filed a lawsuit.
"This signals to participants in the supply chain that if they intentionally put in software that makes machines vulnerable, they're going to be taken to task for it," Kaminsky said.