The open-source programmer who means business

Linux expert and MBA candidate Alan Cox talks about the GPL, patent politics and the drive to the desktop.

Ingrid Marson
Alan Cox is so well-regarded in the open-source software community that he can pull in a crowd of eager techies to discuss theoretical software stability on a Sunday afternoon, as he did at last year's FOSDEM conference in Brussels.

Cox wrote much of the original networking subsystem in Linux more than a decade ago, and he has maintained and contributed code toward various kernel releases. Now employed by Linux vendor Red Hat, he is a leading figure in the open-source software community. He has frequently spoken out against measures that he feels jeopardize freedom, such as software patents and the Digital Millennium Copyright Act.

Last week, after Cox's talk at a Trusted Computing conference in London, ZDNet UK spoke to him about a wide range of topics, including the next version of the General Public License, software patents, the kernel development process and Linux on the desktop.

Q: The first public discussion draft of GPL 3 (General Public License version 3) was released a couple of weeks ago. What are your initial thoughts on it?
Cox: The majority of it looks very sensible, such as letting copyright information be displayed in an "about" box, rather than relying on command line instructions (as is the case in GPL 2). Some of the more contentious stuff has sensibly been made optional.

One of the other nice things is the work to make the GPL compatible with other licenses. That's really important--it will allow people to share more code.

What do you think about the new provision in the GPL 3 draft that opposes digital rights management?
Cox: From the kernel perspective, it doesn't really matter. DRM is generally used by applications, so it's more a question for things like the (GNU) C library. (Editor's note: Shortly after ZDNet UK spoke to Alan Cox, Linux founder Linus Torvalds spoke out against GPL 3, saying that he won't convert Linux to the new version, as he objects to the proposed digital rights management provisions.)

Last year, Sony BMG was criticized after it was discovered that some CDs automatically installed copy-restriction software that it had hidden using a rootkit-like technology. In your talk at the Trusted Computing conference, you said that the potential problem with DRM was highlighted by the recent Sony debacle and that there is going to be "an almighty power struggle" between the content industry and users. Where do you think the balance of power is at the moment?
Cox: I'm not sure where the balance of power is. There is a lot of evidence that it's on the music and computer industry's side. But I think Sony has learned its lesson, and it's been quite an expensive lesson.

There needs to be a clear understanding of what's allowed--a computer is private property, but we don't know what this means legally. I think some of it's going to come down to government competition regulation--how you may or may not use DRM, in particular if you're in a monopoly position.

Last year, the software patent directive was rejected by European Parliament. The debate around such patents has now reopened, with the European Commission's launch of a public consultation into how the patent system should be changed. As one of the people who campaigned against software patents the first time around, how do you feel about the fact that they're back on the agenda?
Cox: I'm astounded. On one hand, we have Microsoft being threatened with multimillion-pound fines by the EU, and on the other hand, they're being offered software patents--Microsoft is one of the big influencers of this issue.

It is worrying that they're back on the agenda. It's a sign of more fundamental problems in the EU. The democratic process (of the European Parliament) is being devolved (to the unelected European Commission). It's what people call policy laundering: "It's a good idea, but we'll never get it past the electorate, so let's slip it through and then pass it on to the individual governments."

If that's the case, what can people do to campaign against software patents?
Cox: The first thing is to write to MEPs (Members of European Parliament). It's not even necessarily about the content; it's about demonstrating the sheer number of people who care about this issue. What we did last time wasn't about how finely crafted the letters were--the FFII (Foundation for a Free Information Infrastructure) got 300,000 signatures. The Commission can ignore this, but Parliament has to get re-elected.

It will be very hard, though. The fact that there are almost no lobbying laws in the EU is a very big problem. In other places lobbyists are accountable.

The Open Source Development Labs (OSDL) has launched a patent library, to aggregate information on patents that have been pledged to the open-source community. How important do you think such initiatives will be?
Cox: That work is going to be very important, but at the end of the day, software should not be patentable. There is a challenging area where you have hardware and software together, but it is the hardware bit that should be patentable.

A number of technology companies, including IBM and Microsoft, have called for the reform of the U.S. patent system. How much hope does this give you?
Cox: Things are slowly turning in the right direction, but it's really only baby steps. Companies are being forced to admit that maybe there is a problem, but no one has said how to fix it.

On to other topics. As a longtime Linux kernel developer, what changes have you seen in the kernel development process over recent years, as the operating system has become more commercialized?
Cox: Well, there are more patches posted Monday to Friday, rather than weekends. But the biggest change has not been commercialization--it's been quality.

In the early days, people were building Linux. It now does everything it's required to do, so all the changes are about faster, cleaner and better ways of doing things. Nowadays, someone will ask, "How do I get it to run 2 percent faster?" or will add support for a new device.

It's the sum of things that all users want from it, which is really good. If you had said at the start that you wanted an operating system that runs on mainframes, PCs and Palm Pilots, people would have said that wasn't possible. Now, every time we get a change that breaks something, we have a cycle of making things work for all platforms.

The kernel is very modular, so one area rarely affects another. But it does get harder to improve Linux as it gets better. Wikipedia will face the same issue: At the moment, people are adding new things, so any contribution is a positive improvement. But over time, random changes could make it worse.

Many kernel developers work for companies nowadays. For example, lead kernel maintainers Linus Torvalds and Andrew Morton work for OSDL, while you work for Red Hat. How many independent kernel developers are there nowadays?
Cox: Probably not that many. There are some students who do work on the kernel. One thing that drives students to work on the kernel is that it offers good job prospects. If you're a good kernel developer, you'll soon get e-mails from large companies offering you a job.

What I think is interesting about the kernel development process, unlike some other projects such as Debian, is that there is no formal process for becoming a developer. Isn't it risky that anyone can get involved and change the code?
Cox: There is a lot of control and review. Every bit of code has been read by several people. We don't have a formal process for training developers, but there are things that have been done, such as the "kernel newbies" project, which is a way for people to learn how things get done--or the "kernel janitors" project, where people work on small things, such as clean-ups and reviewing code.

If some random person makes a change to the kernel, we will get somebody to review it. We get a lot of people who make just one change, and we never hear from them again. For example, they install Linux and discover their USB stick doesn't work, so they fix that.

Having a formal process would be a negative thing, as it would stop people from making such contributions. The people who make one-line contributions are clearly very good developers; they're just not kernel developers.

In August 2003, you took a one-year sabbatical to work on an M.B.A. Why did you decide to do an M.B.A.? Is it finished now?
Cox: Engineers look at sales and marketing people and wonder what they do. When I became more senior in the company (Red Hat), I needed to talk more to salespeople and had to understand what they were doing.

I worked part-time on the master's over a year, and have now finished it. I've only just got the results for the research part, which investigated Linux on the desktop, and I'll be publishing them fairly soon.

So, what were your findings?
Cox: It's starting to happen. People are deploying it, particularly in environments where computers are only being used for basic word processing. Thin-client Linux is being deployed a lot, for example, at call centers and hotels. Large companies are in some ways finding it easier to switch; smaller companies have fewer technical people and tend to run more applications on one machine.

The French tax agency plans to deploy OpenOffice.org software on 80,000 PCs, but hasn't yet decided whether it will migrate to Linux afterwards. How important do you think OpenOffice is in promoting the use of Linux on the desktop?
Cox: A lot of the people I talk to who have been doing migrations to Linux started by using OpenOffice on Windows. For some people that's their only migration--OpenOffice saves them a fortune. It's a big first step and a very important application for Linux on the desktop.