It appears the touch technology in Apple's iPhone has a decent leg-up on the competition. A number of other touch-enabled mobile devices have surfaced recently (the Droid and the Nexus One), and like the iPhone they have a large front screen to accept gesture input. While each has been touted as having "touch" technology, and all have been sold on features such as the network, operating system, and app availability, the touch interfaces themselves have not been given much attention.
Perhaps people assumed that a touch panel is a relatively uniform and simple technology, and therefore haven't bothered to compare panels between devices. In fact, the accuracy and precision of the touch panel may make all the difference in how a device is perceived. For a desktop analogy, the difference between a dirty trackball mouse and a smooth optical mouse is night and day.
Recently, as reported by AppleInsider, the folks over at MOTO labs put together a touchscreen stress test to see how accurately and precisely these devices track input. The test took a simple approach: open a drawing program and slowly trace a cross-hatch pattern with a single finger, repeating the trace at different pressures. The results show that Apple's touch implementation is far more accurate than the competition's; however, there is a fair amount of nonlinearity in the tracking at the edge of the display, especially at higher pressures.
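To make "accurate and precise" concrete: a test like this boils down to measuring how far the reported touch points stray from the line the finger actually traced. Here's a minimal sketch of that metric, with made-up sample data (the coordinates below are hypothetical, not MOTO's measurements):

```python
import statistics

# Hypothetical points reported by a touch panel while a finger is dragged
# along what should be a perfectly horizontal line at y = 200.
reported = [(10, 200), (60, 201), (110, 199), (160, 203), (210, 196)]

# Deviation of each reported point from the intended line. A low mean
# error indicates accuracy; a small spread indicates precision.
errors = [abs(y - 200) for _, y in reported]
mean_error = statistics.mean(errors)
worst_error = max(errors)
print(f"mean deviation: {mean_error:.1f} px, worst: {worst_error} px")
# → mean deviation: 1.8 px, worst: 4 px
```

Run this over many lines across the screen and the per-region deviations map out exactly the kind of edge nonlinearity the test exposed.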
The response warping at the edge of the screen may come down to how Apple's drivers average the perceived position of the gesture: the touch "footprint" (fingerprint?) shrinks and becomes lopsided as the finger slides off the display. Despite this drawback, Apple clearly has a superior touch implementation, at least for single-finger touches. I wonder how the devices would compare in tests of multi-finger input.
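The footprint-averaging explanation can be illustrated with a toy model. This is not Apple's actual driver logic, just a sketch of the general centroid-based approach: if the reported position is the centroid of the sensed contact cells, then clipping part of the footprint at the screen edge skews the centroid inward.

```python
def centroid(points):
    """Estimate the touch location as the centroid of sensed contact cells."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A finger footprint modeled as a 5x5 block of contact cells,
# centered at x = 1, partly hanging off the panel edge at x = 0.
full = [(x, y) for x in range(-1, 4) for y in range(5)]

# The sensor only sees the cells that lie on the panel (x >= 0).
on_panel = [(x, y) for (x, y) in full if x >= 0]

print(centroid(full))      # (1.0, 2.0) — the true finger center
print(centroid(on_panel))  # (1.5, 2.0) — pulled inward, away from the edge
```

The reported position drifts away from the edge as more of the footprint is clipped, which is consistent with the warped tracking the test showed near the display borders.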