Keeping your gloves on while using your smartphone is a new capability this year. Here's how extra-tuned-in touch screens work.
For all of your smartphone-owning life, you've been told you can't use your expensive device while wearing gloves, no matter how low the mercury plummets. You weren't really sure why, you just knew it wasn't going to happen.
And so rocketed sales of fingerless gloves, conductive gloves, and even conductive thread for those brave (or thrifty!) enough to hack their own touch-screen hand coverings.
Then something wonderful happened that obliterated this sad certainty of specialty winterwear dependence forever, and that was Nokia.
Starting with the Nokia Lumia 920 and Nokia Lumia 820, the industry got a smartphone touch screen that could register taps and gestures made with fingernails and many gloves (though I wouldn't try heavy-duty whompers).
Nokia didn't just make the supersensitive screen setting an option for its highest-end phones, either. The budget-friendly Lumia 720 and Lumia 520 incorporate this touch-sensitivity option as well.
For a while, Nokia was alone in the supertouchy screen game, until Huawei introduced its Ascend P2 and Samsung followed suit with its Galaxy S4 flagship Android phone.
It's no coincidence that all these phones emerged with the same capabilities as most of Nokia's Lumia lineup; they use the same touch-screen supplier, Synaptics, a Santa Clara, Calif., company whose technology drives the supersensitive train.
Have you seen a diagram of a mobile phone display? It's a lot more than the cover glass you're worried about shattering when you drop your phone.
There are layers that stack up to form the whole package, from the coated cover glass on top through filters, substrate glass, and screen material, like the LCD or OLED sheets that actually turn pixels on and off to create the picture you see on the screen.
It also helps to have a basic understanding of how a touch screen works in the first place. There's a lot of electrical engineering involved, but the gist of it is that electrodes in the screen assembly help create and hold an electric field around the screen.
When you touch your phone's face, your fleshy finger -- a conductor in its own right -- disturbs that electric field at the point where you come in contact with the screen.
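You can picture that disturbance as a grid of capacitance readings, with a touch showing up as a spike against the idle baseline. Here's a toy sketch of the idea in Python -- the grid, baseline value, and threshold are all invented for illustration, not how any real touch controller is built:

```python
# Toy sketch: a touch shows up as a local disturbance in a grid of
# capacitance readings. BASELINE and TOUCH_THRESHOLD are made-up numbers.

BASELINE = 100          # idle reading at each sensor node
TOUCH_THRESHOLD = 20    # minimum change we treat as a touch

def find_touch(readings):
    """Return (row, col) of the strongest disturbance, or None."""
    best, best_pos = 0, None
    for r, row in enumerate(readings):
        for c, value in enumerate(row):
            delta = abs(value - BASELINE)
            if delta > TOUCH_THRESHOLD and delta > best:
                best, best_pos = delta, (r, c)
    return best_pos

# A bare fingertip produces a strong local change at row 1, column 2:
grid = [
    [100, 101,  99, 100],
    [100, 102, 135, 100],
    [100, 100, 101, 100],
]
print(find_touch(grid))  # -> (1, 2)
```

The interesting design point, as we'll see below, is how low that threshold can safely go: a gloved finger disturbs the field far less than bare skin does.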
In the case of multitouch actions, like pinching and zooming, the screen plots coordinates for multiple points of contact. Synaptics' touch technology recognizes up to 10 points of contact at a time, even though you usually use one or two.
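For the two-finger case, a gesture boils down to comparing those plotted coordinates across frames. Here's a hypothetical pinch/spread classifier in that spirit -- the coordinates and the 10 percent tolerance are invented for the sketch, not anyone's shipping algorithm:

```python
import math

def distance(p, q):
    """Straight-line distance between two (x, y) contact points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_pinch(before, after):
    """Toy two-finger gesture classifier: compare finger spacing
    across two frames. 'before' and 'after' are pairs of (x, y) points.
    The 10% dead zone is an invented tolerance."""
    d0 = distance(*before)
    d1 = distance(*after)
    if d1 > d0 * 1.1:
        return "spread (zoom in)"
    if d1 < d0 * 0.9:
        return "pinch (zoom out)"
    return "no gesture"

# Two fingers moving apart reads as a spread:
print(classify_pinch(((100, 100), (200, 200)),
                     ((80, 80), (220, 220))))  # -> spread (zoom in)
```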
The touch sensors -- which detect your taps -- don't reside alone. There's also the touch controller chip, which zips off your electric signals and coordinates to a more powerful processor that then kicks off a task. So for example, you touch the screen on this icon here, and an instant later, you've opened an app.
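That hand-off -- controller reports a coordinate, host decides what lives there -- can be sketched as a toy dispatcher. The icon grid and 100-pixel cell size below are made up for illustration:

```python
# Hypothetical host-side dispatch: the touch controller reports (x, y),
# and the host maps it to whatever icon occupies that cell of the home
# screen. The grid contents and cell size are invented for this sketch.

ICON_GRID = {
    (0, 0): "Phone",
    (0, 1): "Messages",
    (1, 0): "Camera",
}

def on_touch_report(x, y, icon_size=100):
    """Map a reported coordinate to an icon cell and 'launch' its app."""
    cell = (y // icon_size, x // icon_size)
    app = ICON_GRID.get(cell)
    if app:
        return f"launching {app}"
    return "no target"

print(on_touch_report(150, 40))  # -> launching Messages
```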
The same basic touch principles still apply when you're talking about operating the phone with gloves or fingernails in place of your bare fingertip. What's changed, says Synaptics technology strategist Andrew Hsu, comes down to processing power in the touch controller chip.
Here's a common problem: how do you know what's a finger and what isn't?
While human skin is conductive, the signal it contributes to the electric field is already faint, and anything that isn't a bare finger pressing directly onto the phone face -- a glove, a fingernail, a stylus -- produces a signal that's fainter still, making it that much harder to identify.
More-sophisticated processors can handle the kinds of complex algorithms that can identify your finger, nail, gloved hand, or stylus as it descends upon the touch screen. But they also need to be smart enough to know when to turn on the super-sensing jets and when to back off, so that a simple motion won't get things going at unwanted times.
What's more, smartphones using this extra-sensitive technology can distinguish among the different types of touch, and can also filter out the kind of false-positive noise created by other things floating around your environment, like a stray leaf, a strong gust of wind, or another person's index finger hovering above your phone, ready to chip in on a game.
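To make that concrete, here's an illustrative (and deliberately crude) classifier that sorts contacts by signal strength and contact area. Real controllers run far more elaborate algorithms than this; every threshold below is invented for the sketch:

```python
# Illustrative only: sort a contact by signal strength and contact area.
# These thresholds are invented; real controllers use much richer models.

def classify_contact(signal, area_mm2):
    if signal < 5:
        return "noise/hover (ignore)"    # e.g., a hovering finger or a leaf
    if area_mm2 < 3:
        return "stylus or fingernail"    # tiny contact patch
    if signal < 20:
        return "gloved finger"           # signal attenuated by the glove
    return "bare finger"

# A finger-sized contact with a weak signal reads as a glove:
print(classify_contact(signal=12, area_mm2=8))  # -> gloved finger
```

The trade-off the article describes lives in that first threshold: lower it and gloves work, but leaves and hovering fingers start to register too.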
Noise is, in fact, a huge part of the problem that companies like Synaptics have been working to surmount, and silencing that electrical noise is one of the touch controller chip's most constant and grueling jobs on a super-sensing screen.
Synaptics worked closely with Nokia, Samsung, and Huawei, Hsu said, to get a deep understanding of the structures that generate competing noise from within the phone itself.
What's an example of such noise that can mess with touch-screen control? How about the display itself. Electrodes in the display layer (the LCD or AMOLED material) fire up pixels to shine through your phone's glass topper, arranging the image you see on the screen. That action also creates electrical noise that the touch sensors pick up while they're already busy working out whether you're touching the phone with a nail, a glove, or a fingertip.
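One simple way to think about suppressing that kind of interference is baseline subtraction: track a slowly adapting estimate of the background and only treat sudden departures from it as touches. This is a toy sketch of the concept, not Synaptics' method; the smoothing factor is invented:

```python
# Toy noise suppression: subtract a slowly adapting baseline so gradual
# interference (like display noise drift) cancels out, while a sudden
# touch spike still stands out. Alpha is an invented smoothing factor.

def make_filter(alpha=0.9):
    baseline = None
    def filtered(raw):
        nonlocal baseline
        if baseline is None:
            baseline = raw          # first sample seeds the baseline
        delta = raw - baseline      # what remains after noise removal
        baseline = alpha * baseline + (1 - alpha) * raw
        return delta
    return filtered

f = make_filter()
readings = [100, 100.5, 101, 140, 101.5]  # slow drift, then a touch spike
deltas = [f(r) for r in readings]
# The spike at index 3 stands out; the slow drift stays near zero.
```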
Although supersensitive screens don't require this, there's a certain way to place the touch sensor electrodes that can help automatically cut down on noise and boost electric signal performance.
In a design called in-cell implementation (PDF), the touch electrodes integrate right into the LCD display material (AMOLED phones typically use a different on-cell configuration).
The benefit here is that the same electrodes lighting up the display also detect those changes in voltage (the electric field) when you press on the screen. Since the two functions share the same electrodes, the display can't confuse the touch sensors by blasting out signal at the same moment they're sensing a touch.
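The time-sharing idea can be cartooned as a schedule: the shared electrodes alternate between display duty and touch-sensing duty, so the two jobs never overlap in time. This sketch is purely conceptual -- real panels interleave these phases at much finer granularity:

```python
# Conceptual sketch: with in-cell touch, the shared electrodes alternate
# between driving the display and sensing touch, so display noise and
# touch sensing never happen in the same time slot.

def frame_schedule(n_slots=6):
    """Return an alternating display/touch duty cycle for one frame."""
    return ["display" if i % 2 == 0 else "touch-sense"
            for i in range(n_slots)]

print(frame_schedule(4))  # -> ['display', 'touch-sense', 'display', 'touch-sense']
```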
It's still early days when it comes to supersensitive screens, and I'm certain we'll see even more-sophisticated capabilities come down the line.
In the meantime, with three phone-makers using display tech that lets you keep your gloves on, you can count on others soon snatching up the trend.
Smartphones Unlocked is a monthly column that dives deep into the inner workings of your trusty smartphone.