
Apple focuses on opening up camera to developers

Opening up its iOS 8 camera API to developers means better camera apps -- and hopefully better photos -- for all iPhone and iPad photographers.

Lori Grunin

The 645 Pro Mk II camera app. Because apps like this have to route around some aspects of the operating system, they might have stability issues and nonstandard interfaces. Screenshot by Nate Ralph/CNET

One of the biggest holes in the iPhone camera's feature set has been the inability to go beyond point-and-shoot. You can apply filters up the wazoo, but Apple has traditionally not allowed developers access to the camera's preshot settings. A handful of apps, such as 645 Pro Mk II, provide some manual-like controls, but building them requires extra work from developers, who have to route around parts of the operating system; that can lead to less-stable operation. But no longer. With iOS 8, introduced at its Worldwide Developers Conference (WWDC), Apple announced that its application programming interface (API) will let programmers control at least some of the camera's basic functions.


How extensive the API calls will be remains to be seen, and I doubt that Apple will pass through raw data the way some Nokia Lumia models can, or the way Google plans to for Android. That said, even letting basic controls such as white balance and exposure be built into other developers' apps will ultimately help improve photo quality.
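
To give a rough idea of what that could look like: if Apple routes these controls through AVFoundation's AVCaptureDevice class, a camera app might lock white balance and exposure with something like the Swift sketch below. Until the documentation is public, treat the exact calls as my guess, not Apple's confirmed API.

```swift
import AVFoundation

// Speculative sketch: locking white balance and exposure on an
// AVCaptureDevice so frames stay consistent while an app applies
// its own processing. The precise iOS 8 surface is unconfirmed.
func lockColorAndExposure(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()

        // Freeze white balance at its current gains so colors
        // don't shift from frame to frame.
        if device.isWhiteBalanceModeSupported(.locked) {
            device.whiteBalanceMode = .locked
        }

        // Freeze exposure so brightness doesn't drift mid-capture.
        if device.isExposureModeSupported(.locked) {
            device.exposureMode = .locked
        }

        device.unlockForConfiguration()
    } catch {
        print("Couldn't configure the camera: \(error)")
    }
}
```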

For instance, apps that now apply basic adjustments like brightness, contrast, and white balance have to work on the already highly processed image files handed off to them by the operating system; if they can operate earlier in the image pipeline, there's the possibility of less degradation. It also opens the way for new types of effects: if an app can control the shutter speed, it might be able to create more interesting kinds of blur, or apply effects to a series of burst images. And for video, being able to set the shutter speed manually gives you more control over the look of the final footage.
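
As one example, dialing in a slow shutter speed for blur effects might go something like this, again assuming AVFoundation carries the new controls. The function name and the quarter-second value are my own illustration; a real app would clamp the duration to whatever range the device actually reports.

```swift
import AVFoundation
import CoreMedia

// Speculative sketch: setting a slow shutter speed for motion-blur
// effects, assuming iOS 8 exposes a custom exposure mode.
func setShutterSpeed(_ seconds: Double, on device: AVCaptureDevice) {
    guard device.isExposureModeSupported(.custom) else { return }
    do {
        try device.lockForConfiguration()

        // Clamp the requested duration to the active format's limits.
        let requested = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
        let duration = max(min(requested, device.activeFormat.maxExposureDuration),
                           device.activeFormat.minExposureDuration)

        // Change only the shutter; leave ISO where the camera has it.
        device.setExposureModeCustom(duration: duration,
                                     iso: AVCaptureDevice.currentISO,
                                     completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Couldn't set the shutter speed: \(error)")
    }
}

// A quarter-second exposure, for dramatic blur:
// setShutterSpeed(0.25, on: device)
```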

Right now it's all speculative, of course, as I haven't had a chance to look at the API documentation and see exactly what calls will be available. But even the basics are a step in the right direction.