One of the biggest holes in the iPhone camera's feature set has been the inability to go beyond point and shoot. You can apply filters up the wazoo, but Apple has traditionally not allowed developers access to the camera's preshot settings. A handful of apps provide some manual-like controls, such as 645 Pro Mk II, but building them requires extra work from the developer to provide those features, along with routing around parts of the operating system, which can lead to less stable operation. But no longer. With the introduction of iOS 8 at its Worldwide Developers Conference (WWDC), Apple announced that its application programming interface (API) will now let programmers control at least some of the camera's basic functions.
How extensive the API calls will be remains to be seen, and I doubt that Apple will pass through raw data the way some Nokia Lumia models can, or the way Google plans for Android. That said, even allowing basic abilities such as control over white balance and exposure to be built into other developers' apps will ultimately help improve photo quality.
For instance, apps that now apply basic adjustments like brightness, contrast, and white balance have to work on the already heavily processed image files handed off to them by the operating system. By operating earlier in the image pipeline, they could introduce less degradation. It also opens the way for new types of effects: if an app can control the shutter speed, it might be able to create more interesting blur effects, or apply effects across a series of burst images. And for video, being able to manually set the shutter speed gives you more control over the look of the final footage.
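To see why operating earlier in the pipeline can mean less degradation, here is a minimal sketch (not iOS code; the numbers and gain value are hypothetical stand-ins) comparing a white-balance-style gain applied to high-precision data before 8-bit quantization versus the same gain applied to an image that has already been quantized, as happens when an app only receives the finished file:

```python
import numpy as np

# Hypothetical illustration: applying a gain to high-precision sensor-like
# data before 8-bit quantization loses less information than applying the
# same gain to values that were already rounded to 8 bits.

rng = np.random.default_rng(0)
linear = rng.uniform(0.0, 0.5, size=10_000)  # stand-in for raw sensor values
gain = 1.8                                   # stand-in white-balance gain

# Early adjustment: gain applied in floating point, quantized once.
early = np.round(np.clip(linear * gain, 0, 1) * 255)

# Late adjustment: quantize first (as when the OS hands off a processed
# image), then apply the gain and quantize a second time.
quantized = np.round(np.clip(linear, 0, 1) * 255)
late = np.round(np.clip(quantized / 255 * gain, 0, 1) * 255)

# Reference: the ideal full-precision result.
ideal = np.clip(linear * gain, 0, 1) * 255

err_early = np.abs(early - ideal).mean()
err_late = np.abs(late - ideal).mean()
print(f"mean error, gain before quantization: {err_early:.3f}")
print(f"mean error, gain after quantization:  {err_late:.3f}")
```

The late adjustment accumulates two rounds of rounding error, the first of which is amplified by the gain; the early adjustment rounds only once. Real camera pipelines involve far more (demosaicing, noise reduction, compression), so the gap in practice can be larger than this toy comparison suggests.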
Right now it's all speculative, of course, as I haven't had a chance to look at the API documentation and see exactly what calls will be available. But even the basics are a step in the right direction.