Debunking Linux kernel myths, Greg Kroah-Hartman style

Think you know how the Linux kernel is developed? Think again

Greg Kroah-Hartman

I'm not sure how I missed this presentation when it was first delivered in 2006, but I'm grateful to Digg for resurfacing it. In it, Greg Kroah-Hartman, a Linux kernel developer employed by Novell, identifies and debunks a range of myths about Linux kernel development. It makes for excellent, insightful reading.

Among other salient points, I particularly liked Greg's swatting down of the myth that Linux lacks support for disparate devices. The exact opposite is true, as Greg points out:

[W]e support more things than anyone else. And more than anyone else ever has in the past. Linux has a very long list of things that we have supported before anyone else ever did....

No other "major" operating system even comes remotely close in platform support for what we have in Linux. Linux now runs in everything from a cellphone, to a radio controlled helicopter, your desktop, a server on the internet, on up to a huge 73% of the TOP500 largest supercomputers in the world.

And remember, almost every different driver that we support, runs on every one of those different platforms. This is something that no one else has ever done in the history of computing. It's just amazing at how flexible and how powerful Linux is this way.

Greg goes on to address suggestions that the Linux kernel adopt a stable, fixed API, as well as myths about how the code is actually developed and whether it's possible for an outside developer to get code committed to the kernel.

Comparing Linux's API policy with that of Windows, for example, Greg writes:

Now Windows has also rewritten their USB stack at least 3 times, with Vista, it might be 4 times, I haven't taken a look at it yet. But each time they did a rework, and added new functions and fixed up older ones, they had to keep the old api functions around, as they have taken the stance that they can not break backward compatibility due to their stable API viewpoint. They also don't have access to the code in all of the different drivers, so they can't fix them up. So now the Windows core has all 3 sets of API functions in it, as they can't delete things. That means they maintain the old functions, and have to keep them in memory all the time, and it takes up engineering time to handle all of this extra complexity. That's their business decision to do this, and that's fine, but with Linux, we didn't make that decision, and it helps us remain a lot smaller, more stable, and more secure.
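To make that contrast concrete, here is a minimal sketch in C. This is my illustration, not code from Greg's talk or from either operating system: the submit_urb variants are hypothetical names invented for the example, though usb_submit_urb at the end is the real Linux entry point for submitting a USB request.

    /* api_sketch.c -- hypothetical illustration, not real Windows or
       Linux source. Compiles with: cc -c api_sketch.c */

    struct urb;                  /* opaque USB request block */
    typedef unsigned int gfp_t;  /* stand-in for the kernel's gfp_t */

    /*
     * Stable-API model: each USB stack rework adds a new entry point,
     * but the old ones must be kept around (and resident in memory)
     * forever, because binary drivers outside the tree still call them.
     */
    int submit_urb(struct urb *urb);                     /* v1 */
    int submit_urb_ex(struct urb *urb, unsigned flags);  /* v2 rework */
    int submit_urb_ex2(struct urb *urb, unsigned flags,
                       void *context);                   /* v3 rework */

    /*
     * In-tree model: one current function. When the interface has to
     * change, the developer changes it and fixes every caller in the
     * same patch, so no stale variants accumulate.
     */
    int usb_submit_urb(struct urb *urb, gfp_t mem_flags);

The difference compounds with every rework: the first model grows a new API surface each time, while the second remains a single function whose callers can all be fixed in one tree.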

There's merit in both approaches, of course; Greg's point is simply that the value of the Linux approach often goes unnoticed. Well worth a read.

