
The computer that can smell, hear and touch

IBM has predicted that in the next five years, computers will be able to mimic all of the human senses in many different ways.

Lexy Savvides, Principal Video Producer

As part of its annual "5 in 5" report, IBM forecasts the big trends to look forward to in the coming years. While a computer that can smell may conjure up memories of ill-fated concepts like smell-o-vision, there is more to it than mere novelty value.

Here are the five senses, and how IBM envisages they will be applied in the computers of the future:

Touch

We are used to concepts like vibrating game controllers and haptic feedback on phones. IBM predicts that our handsets will be able to receive and deliver information about a real-world experience, such as the texture of a fabric or the freshness of produce.

A series of vibrations emitted by the phone could replicate the sensation of texture across a multitude of surfaces, by referring back to a Product Information Management (PIM) system that acts like a dictionary. The PIM receives visual information, which can then be referenced against products and their associated data.
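
To make the idea concrete, here is a rough sketch in Python of how a PIM lookup like this might work. The materials, categories and vibration timings below are invented purely for illustration.

```python
# Hypothetical PIM lookup: a recognised material is matched against a
# dictionary of products, and a vibration pattern (pulse durations in
# milliseconds) is returned for the phone's haptics to play back.
PIM = {
    "silk":   {"category": "fabric",  "vibration_ms": [5, 10, 5, 10]},
    "denim":  {"category": "fabric",  "vibration_ms": [40, 20, 40, 20]},
    "banana": {"category": "produce", "vibration_ms": [15, 30, 15]},
}

def texture_pattern(material: str) -> list[int]:
    """Return the haptic pattern for a material, or a flat default buzz."""
    entry = PIM.get(material)
    return entry["vibration_ms"] if entry else [25]

print(texture_pattern("silk"))   # [5, 10, 5, 10] -- short, light pulses
print(texture_pattern("denim"))  # [40, 20, 40, 20] -- coarser buzz
```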

Sight

At the moment, we are constantly surrounded by visual information. Imagine if a computer could interpret an image in a similar way to how the human eye does. That's another of IBM's predictions, and it has ramifications not only for everyday tasks like cataloguing photos, but also for doctors who are looking to diagnose conditions.

One of the challenges of getting computers to "see" is that traditional programming can't replicate something as complex as sight. But by taking a cognitive approach, and showing a computer thousands of examples of a particular scene, the computer can start to detect patterns that matter, whether it's in a scanned photograph uploaded to the web, or some video footage taken with a camera phone.
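
As a toy illustration of that cognitive approach, the short Python sketch below trains a standard classifier on a bundled set of labelled images (scikit-learn's digits dataset standing in for real photographs), letting the computer find the patterns itself rather than hand-coding any rules.

```python
# "Show it thousands of examples": train a classifier on labelled images
# and let it learn the patterns that matter, instead of programming rules.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

digits = load_digits()  # ~1,800 labelled 8x8 greyscale images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=2000)
model.fit(X_train, y_train)  # the learning step: patterns from examples

print(f"accuracy on unseen images: {model.score(X_test, y_test):.2%}")
```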

Hearing

In IBM's example, a smartphone app could understand the nuances in a baby's sounds. It's not so much about replicating what the human ear can already do, but about interpreting the sound, correlating it with a repository of other information, and providing feedback on what the baby actually needs.
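
Here is a hypothetical sketch of what that interpretation step might look like: each recorded cry is reduced to a couple of features and matched against a small labelled repository with a nearest-neighbour rule. The features, numbers and labels are all invented for illustration.

```python
import math

# Invented repository of cry "profiles" and what each one means.
repository = {
    # (mean_pitch_hz, cries_per_minute): interpreted need
    (450.0, 60.0): "hungry",
    (520.0, 90.0): "in pain",
    (400.0, 30.0): "tired",
}

def interpret_cry(mean_pitch_hz: float, cries_per_minute: float) -> str:
    """Return the label of the closest known cry profile."""
    def distance(profile):
        pitch, rate = profile
        return math.hypot(pitch - mean_pitch_hz, rate - cries_per_minute)
    closest = min(repository, key=distance)
    return repository[closest]

print(interpret_cry(440.0, 55.0))  # -> "hungry"
```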

Taste

Not sure whether you should tuck into that bowl of delicious-looking soup because of your diet? Soon, computers could assist you in making healthy-eating choices that optimise flavour while still maintaining a healthy serve of nutrients.

IBM's culinary-creation system would have access to large databases of recipes from online, governmental and specialised sources. This repository allows the system to learn what we consider to be good food. For example, from 50 recipes of quiche, the system could infer that a "good" combination of ingredients for any variation of quiche would include eggs, at least one vegetable and three spices.
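
The inference itself can be surprisingly simple. The sketch below tallies ingredients across a handful of stand-in quiche recipes and keeps the ones that appear in most of them; the recipe data is hypothetical, a stand-in for the 50 recipes in IBM's example.

```python
from collections import Counter

# A few illustrative quiche recipes, each as a set of ingredients.
recipes = [
    {"eggs", "cream", "spinach", "nutmeg", "pepper", "salt"},
    {"eggs", "milk", "leek", "nutmeg", "pepper", "salt"},
    {"eggs", "cream", "mushroom", "thyme", "pepper", "salt"},
    # ...a real system would learn from many more
]

counts = Counter(ing for recipe in recipes for ing in recipe)
threshold = 0.8 * len(recipes)  # appears in at least 80% of recipes
core = [ing for ing, n in counts.items() if n >= threshold]

print(core)  # e.g. ['eggs', 'pepper', 'salt'] -- the inferred essentials
```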

Smell

A series of miniature sensors could be embedded into phones, allowing them to analyse a range of different smells. Current examples include breathalysers, which take a sample of breath to work out a blood alcohol reading. Future applications could include the medical diagnosis of conditions like liver and kidney disorders.
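
Below is a rough sketch of how readings from such sensors might be matched against known chemical signatures. The sensor names, signature values and conditions are invented for illustration only; this is not a real diagnostic method.

```python
# Invented reference signatures: each condition maps to an expected
# relative response across three hypothetical gas sensors.
SIGNATURES = {
    # condition: (ammonia, acetone, ethanol)
    "kidney disorder": (0.9, 0.2, 0.1),
    "liver disorder":  (0.3, 0.8, 0.1),
    "alcohol":         (0.1, 0.2, 0.9),
}

def closest_match(ammonia: float, acetone: float, ethanol: float) -> str:
    """Return the condition whose signature best fits the reading."""
    reading = (ammonia, acetone, ethanol)
    def error(condition):
        sig = SIGNATURES[condition]
        return sum((a - b) ** 2 for a, b in zip(reading, sig))
    return min(SIGNATURES, key=error)

print(closest_match(0.85, 0.25, 0.05))  # -> "kidney disorder"
```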

IBM is also running an online poll, encouraging people to vote for the cognitive ability they predict will appear first.