Former White House cybersecurity czar Howard A. Schmidt says programmers need training--and they need to be held accountable for flaws.
Many people know that developers are often under intense pressure to deliver more features on time and under budget. Few developers get the time to review their code for potential security vulnerabilities. When they do get the time, they often lack secure-coding training and the automated tools that catch the hundreds of common coding flaws attackers exploit.
So what can software vendors do? In a sense, a big part of the answer is relatively old-fashioned: developers need to be accountable to their employers and provided with incentives, better tools and proper training.
Just as in most jobs on the planet, employees get regular performance evaluations, typically receiving raises and promotions for good performance, and pointers for growth and improvement as needed. In the software industry, developers who produce securely coded software should have that considered as part of their performance reviews. Taken out of context, holding individual developers accountable for the security of their code may sound draconian, but in reality it is little more than saying that developers are responsible for delivering robust, innovative and secure code to their employers.
I recently bought a sports jacket, and in one of its pockets I found an "inspected by" tag. Software developers could be similarly connected to their final work through a quality assurance process that issued a version of such a tag. In fact, employers should consider a system of financial rewards for developers who write secure code as a way to offer positive incentives. Many developers take considerable pride in the quality of their code, and they should be compensated for its quality and security.
Software vendors also stand to benefit from a financial rewards system because security flaws are typically easier and less costly to fix early in the software development life cycle when developers are initially writing the code. By contrast, plugging vulnerabilities later in the development process or after a product ships is a frustrating and expensive undertaking. Further, clear incentive systems help management communicate the company's security values and offer benchmarks for success.
Some have suggested that the way to reduce software vulnerabilities is for customers to sue vendors or take other legal action. I have always been, and remain, against liability actions of any sort so long as market forces continue to improve software. Unfortunately, introducing vendor liability to solve security flaws hurts everybody, including employees, shareholders and customers, because it raises costs and stifles innovation.
After all, when companies face large punitive judgments, they often cut salaries, raise prices or even reduce staff. That is not good for anyone.
Needless to say, employers need to provide workers with the tools and training they need to succeed. Developers certainly understand this concept, as there are literally thousands of developer tools on the market. Despite this proliferation, until recently application-focused security tools were ponderous and of limited effectiveness: their feature sets were narrow and they served up numerous false positives.
By contrast, today there is an entire category of source-code analysis products that offer automated security testing capable of checking software code against databases containing thousands of common coding flaws. These products arm developers with powerful tools to fix major problems before they ship, saving considerable time, money and receding hairlines.
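The core idea behind these analyzers can be sketched in a few lines: match source code against a catalog of known-dangerous constructs and report each hit with its location. The tiny rule set below is illustrative only, a far cry from any vendor's actual flaw database, but it shows the shape of the technique:

```python
import re

# Illustrative rules: a few classic C API misuses that real analyzers
# flag. Commercial products check against thousands of such patterns,
# plus data-flow analysis this sketch does not attempt.
RULES = {
    r"\bgets\s*\(": "gets() has no bounds check; use fgets()",
    r"\bstrcpy\s*\(": "strcpy() can overflow; prefer a bounded copy",
    r"\bsprintf\s*\(": "sprintf() can overflow; prefer snprintf()",
}

def scan(source: str):
    """Return (line_number, warning) pairs for each rule match."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, warning in RULES.items():
            if re.search(pattern, line):
                findings.append((lineno, warning))
    return findings

c_code = "char buf[8];\ngets(buf);\nstrcpy(buf, user_input);\n"
for lineno, warning in scan(c_code):
    print(f"line {lineno}: {warning}")
```

Even this toy scanner illustrates why such tools pay off early: the warning arrives while the developer is still looking at the offending line, not after the product ships.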
In addition to helping developers, this new generation of security tools from vendors such as Fortify Software (disclosure: I'm on Fortify's board), Ounce Labs and others offers adaptable security metrics that can be used for a number of management purposes. Managers can track the number of high-, medium- and low-level security vulnerabilities in an individual developer's code as part of an incentive system designed to reduce flaws and train developers to avoid making the same errors in the future. Though the focus here is source code, Web applications and internally developed applications deserve the same scrutiny.
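Turning raw scanner findings into the per-developer severity counts described above is a simple aggregation. The sketch below assumes findings arrive as (developer, severity) pairs; the names and data are invented for illustration:

```python
from collections import Counter

# Hypothetical scanner output as (developer, severity) pairs;
# a real tool would attribute findings via version-control history.
findings = [
    ("alice", "high"), ("alice", "low"),
    ("bob", "medium"), ("bob", "medium"), ("bob", "low"),
]

# Tally vulnerabilities per developer and severity level.
per_dev = {}
for dev, severity in findings:
    per_dev.setdefault(dev, Counter())[severity] += 1

for dev, counts in sorted(per_dev.items()):
    summary = ", ".join(f"{counts[s]} {s}" for s in ("high", "medium", "low"))
    print(f"{dev}: {summary}")
```

Tracked over successive releases, such tallies give management the benchmarks for success mentioned above: a downward trend in high-severity findings is a measurable security win.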
Lastly, developers simply need more training in secure coding. Developers have learned their craft in many ways--in tech schools, through self-instruction or in computer-science classes. Until recently, even people who received formal education in development were rarely taught common best practices in secure coding. Books by Gary McGraw, Michael Howard and David LeBlanc are some of the best resources on the topic.
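What secure-coding training covers is often as simple as contrasting a vulnerable idiom with its safe counterpart. A standard classroom example is SQL injection; the sketch below uses Python's sqlite3 module with an invented table and payload to show why parameterized queries matter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "x' OR '1'='1"  # a classic injection payload

# Vulnerable: string formatting lets the input rewrite the query,
# so the payload matches every row in the table.
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Secure: a parameterized query treats the input strictly as data,
# so the payload matches nothing.
secure = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('alice',)] -- the injection succeeded
print(secure)      # [] -- the input was treated as a literal string
```

A developer who has seen this contrast once is far less likely to build queries by string concatenation again, which is exactly the kind of habit secure-coding training is meant to instill.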
In the end, what security requires is the same attention any business goal needs. Employers should expect their employees to take pride in and own a certain level of responsibility for their work. And employees should expect their employers to provide the tools and training they need to get the job done. With these expectations established and goals agreed on, perhaps the software industry can do a better job of strengthening the security of its products by reducing software vulnerabilities.