Life after touch - how will the Apple patent impact innovation?
I’m no patent expert, but it’s clear after a little research that patent laws were put into place for two reasons: 1) to encourage secretive inventors to stop stashing their cool ideas under a mattress somewhere and make them public, and 2) to rock the boat.
Apple has never been accused of keeping new ideas under wraps, but by securing their new patent for “multifunction” touch technology like pinch, rotation, and swipe, they have certainly rocked the boat.
We won’t know how or if the boat will be righted until a few million dollars are spent on lawsuits, but those in the mobile and consumer electronics industries seem either to be ignoring the issue (using the lawsuit reasoning stated above) or to have the knee-jerk reaction that Apple is ruining it for everyone – that the company is reverting to Pre-Open-Source, Big-Meany Corporate status.
And yet, isn’t Apple doing us a favor by rocking the boat? The reason behind the existence of patents is sound – to spur innovation and excite competition, the argument being that if there was no payoff for new products, services, or technologies there would be less incentive to push for change and improvement. Instead of ignoring the issue or getting angry about it, companies ought to be putting their energy and resources into coming up with something new. If Apple owns “touch,” what’s next?
There was an interesting email exchange bouncing around the frog design studio in Austin the other day that seemed to entertain a world beyond touch – or at least an admission that touch was in some ways limiting. “My emotional connection to a device is through its content,” not its touch screen, wrote one emailer. As the thinking developed, there was talk of combining capacitive strips with touch UI or taking the gestural technology behind the Wii Remote as an example of “indirect touch,” albeit an imperfect one. “Its gestures are rough approximations and I hate using it to enter text,” was the response.
Touch and the possibilities of the touch interface still seem so new to most that it takes courage to think beyond the now and the wow. Purposefully directing your focus away from the popular culture is risky in a business sense because the MO of business is to capitalize on what everyone wants now. It also verges on the anti-social. When your friends and colleagues are just now getting iPhones and you’re already geeking out about their limits at a cocktail party, it’s hard not to come off as a bore. Ah, the price of early adoption….
If trying to figure out how to trump touch technology is anti-social, one guy who probably never leaves his laboratory is Adam Greenfield, author of Everyware: The Dawning Age of Ubiquitous Computing. After a read through the book, one wonders what the difference is between touch screens and stone tablets. They both seem archaic up against the notion of haptic interfaces (how what you feel can be enacted virtually and vice versa), the possibilities surrounding RFID tags, and, yes, voice recognition. These aren’t new technologies, but could they be developed further? Could they be integrated into everyday life so that they “dissolve into behavior,” as Greenfield puts it? Imagine never having to use a keyboard again, or for that matter, ever having to pull out a “phone” or walk over to a computer monitor. Information and content come and go through “interfaces” that have disappeared into how we act and move within our surroundings – wall paint, video tattoos, motion sensors, and chip implants.
The technology is available, or nearly so. I remember in the late ’90s when people were experimenting with surgically implanted RFID chips that would communicate with home technologies. Walk into your house and your favorite music would start to play, the oven would start to preheat for dinner, and the television would switch to a pre-programmed news channel. Now there’s talk of RFID implants to prevent credit card identity theft. I don’t know about you, but that gives me the chills. Being indebted to a credit card company already feels like they have their hooks in me. I don’t need a physical reminder.
And of course, herein lies the designer’s dilemma. How do you move beyond human habits so drastically without alienating people and the technology along with it? Most would say you do it gradually. Don’t let the technology get too far ahead of human aptitude. Then again, today’s phones look and act nothing at all like they did 10 years ago, so gradual is relative. What are the possibilities in the next 10 years? Thanks to Apple, they may not have anything to do with touch.