Starting with the upcoming version 7 of Java, Oracle will deliver the Mac OS X version of the software. Also: past Java friction between Apple and Oracle.
Stephen Shankland, principal writer
After years running Java for Mac OS X as an in-house project, Apple is handing control to Oracle, the companies announced Friday.
With Oracle's acquisition of Sun Microsystems in January, Java stewardship moved to the Redwood Shores, Calif.-based software giant, which sells Java server software among other products. Java is widely used on servers and common on mobile phones, but it never met its potential on personal computers as a tool to let developers span different varieties of desktop computers with the same program.
In the Oracle handoff, Apple will transfer its Java work to OpenJDK, the open-source project under which Java is developed. Apple will maintain its current version of Java Standard Edition 6 for Mac OS X 10.6, aka Snow Leopard, and 10.7, aka Lion, but Oracle will release Java SE 7 for the Mac, the companies said.
The move is no surprise. With the release of an updated Java SE 6 package in October, Apple deprecated use of its Java--in other words, told programmers they should make alternate plans if they relied on it.
What's more interesting about the Apple-Oracle-Java situation is the history supplied by James Gosling, father of the technology, who has been revealing interesting nuggets ever since turning down the transfer to Oracle and the prospect of working for Larry, Prince of Darkness, as Gosling refers to chief executive Larry Ellison.
Apple embraced Java when it was in a weaker position and, like IBM, Hewlett-Packard, and others, took over responsibility for providing the virtual machine software that would let Java programs run on its computers.
"In the early days, they were insistent on doing the port themselves. They put terrific energy into it. They did a good job," Gosling said in a blog post in October. "But then, as OS X took hold and Apple was able to convince developers to target their non-portable/proprietary environment, Apple's fundamental control-freak tendency took over and they put less and less energy into Java."
The juicier part of the tale, though, concerns the difficulties that arose around discussions about Apple unshouldering its Java burden. It turns out that the company employed application programming interfaces (APIs) not available to others, Gosling said.
"The biggest obstacle was their use of secret APIs. Yes, OS X has piles of secret APIs...The big area (that I'm aware of) where these are used is in graphics rendering," Gosling said.
In one specific case, he said, the Java graphics specification had "careful wording" to allow Apple's approach to graphics. Apple required antialiased rendering--an ages-old graphics technique that uses intermediate-colored pixels to smooth away the otherwise jagged edges that result when curves or diagonal lines are drawn with square pixels. Java could handle either aliased or antialiased rendering, and Apple's approach didn't sit well with one Java developer.
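The aliased-versus-antialiased distinction is something any Java developer can toggle through the standard Java 2D `RenderingHints` API. As a sketch of the difference (the image size and line coordinates here are arbitrary choices for illustration), drawing the same diagonal line both ways shows that aliased output uses only the foreground and background colors, while antialiased output adds intermediate shades along the edge:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.util.HashSet;
import java.util.Set;

public class AliasingDemo {
    // Draw a diagonal line with or without antialiasing and
    // count how many distinct pixel colors appear in the result.
    static int distinctColors(boolean antialias) {
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 64, 64);
        g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                antialias ? RenderingHints.VALUE_ANTIALIAS_ON
                          : RenderingHints.VALUE_ANTIALIAS_OFF);
        g.setColor(Color.BLACK);
        g.drawLine(0, 0, 63, 40); // a diagonal line, so square pixels produce jagged edges
        g.dispose();

        Set<Integer> colors = new HashSet<>();
        for (int y = 0; y < 64; y++)
            for (int x = 0; x < 64; x++)
                colors.add(img.getRGB(x, y));
        return colors.size();
    }

    public static void main(String[] args) {
        // Aliased rendering yields only black and white pixels;
        // antialiased rendering blends in intermediate grays.
        System.out.println("aliased: " + distinctColors(false) + " colors");
        System.out.println("antialiased: " + distinctColors(true) + " colors");
    }
}
```

An application written to "work in both cases," as Gosling put it, would avoid assuming exact pixel colors and tolerate whichever mode the platform's renderer actually used.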
"Most authors fixed their apps so that they worked in both cases," Gosling said, meaning aliased or antialiased. "But one developer took a serious 'f**k you' attitude on this issue and forced Apple to implement aliased rendering--which they kept secret because it was such an awful thing to have to do. The 'one developer?' Oracle, of course."