If you're as intrigued by multitouch as I am, you've probably been following it pretty closely and are at least impressed by its potential.
But what could it do better? According to Apple, taking the "touch" out of multitouch would be a good first step.
According to an article on AppleInsider, Apple has filed a 30-page patent application that covers integrating proximity sensors into its multitouch technology on devices larger than the iPhone.
Multitouch sensors combined with proximity sensors would let users interact with an interface without actually having to touch the screen. Now, this seems a tad ridiculous to me. Is anyone really too lazy to move their finger an extra inch? Yeah, they are, but that doesn't make it a good idea.
Apple sees some different applications for the technology. According to the company, users would have the capability to turn off the entire touch-screen panel, or just portions of it. In addition, users would be able to power down one or more of the computer's systems, dimming or brightening the screen as they see fit.
Awesome, huh? Alas, no. OK, I may be missing something, but why would you need a proximity sensor to do any of this? You could just move your finger another inch and accomplish the same thing. The only unique feature Apple cited in the filing was the idea that you could highlight virtual buttons on a display without touching them, which could prepare a button to actually be pushed. Again, how is that useful?
I may be shortsighted (probably am), but the only advantage I can see in this technology is that you wouldn't have to worry about scratching or smudging your screen anymore.
In the filing, Apple reportedly states that the proximity sensors could be made of infrared (IR) transmitters and receivers. It speculates that a grid of IR receivers could be placed on the panel behind the touch screen, like the IR receivers found in the latest iMacs and MacBooks.
Each sensor would be able to detect the presence or absence of an object in its vicinity. The data from multiple receivers could then be combined to determine the position of an object above the panel.
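To get a feel for how combining readings from a grid of receivers could pinpoint a hovering finger, here's a minimal sketch. The filing doesn't describe the math, so everything here is my assumption: I treat each receiver as reporting a reflected-IR intensity at a known grid coordinate and take a weighted centroid of those readings as the position estimate.

```python
def estimate_position(readings):
    """Estimate (x, y) of an object above the panel from IR readings.

    readings: dict mapping (x, y) receiver grid coordinates to a
    reflected-IR intensity (0.0 means nothing detected there).
    Returns None when no receiver senses anything.
    """
    total = sum(readings.values())
    if total == 0:
        return None  # no object near the panel
    # Weighted centroid: receivers with stronger readings pull the
    # estimate toward their own coordinates.
    x = sum(cx * v for (cx, _), v in readings.items()) / total
    y = sum(cy * v for (_, cy), v in readings.items()) / total
    return (x, y)

# Example: a fingertip hovering between receivers (1, 1) and (1, 2)
# on a 3x3 grid, closer to (1, 1).
grid = {(i, j): 0.0 for i in range(3) for j in range(3)}
grid[(1, 1)] = 0.9
grid[(1, 2)] = 0.3
pos = estimate_position(grid)  # lands between the two active receivers
```

A real implementation would also need filtering to reject ambient IR and a threshold for "close enough to count", but the centroid idea is the core of turning many yes/no-ish sensors into one position.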
AppleInsider quoted the filing as saying: "The transmitters and receivers can be positioned in a single layer, or on different layers. In some embodiments, the proximity panel is provided in combination with a display. The display can be, for example, a liquid crystal display (LCD) or an organic light emitting diode display (OLED display). Other types of displays can also be used. The IR transmitters and receivers can be positioned at the same layer as the electronic elements of the display (e.g., the LEDs of an OLED display or the pixel cells of an LCD display). Alternatively, the IR transmitters and receivers can be placed at different layers."
OK, so now I know how it works. I'm still not sold on how useful it is to interact with the panel from an inch above it, unless the sensors can detect movement from much farther away than an inch.
Maybe this is just some Apple engineering geek's fantasy about starting us down the road that eventually leads to a Minority Report-like interface. OK, I've used that link before, but I'm still impressed by it.