Human Senses in Computers: The Next Big Thing from IBM

IBM unveiled the seventh annual IBM 5 in 5 (#ibm5in5) on December 17th, 2012: a list of innovations that have the potential to change the way people work, live and interact during the next five years. This year, IBM’s 5 in 5 focuses on the five basic human senses. Yes, you heard right: computers will emulate the five senses of touch, sight, hearing, taste and smell. Human senses in computers surely is “The Next Big Thing” in the computing world. IBM thinks cognitive computers that can adapt to their surroundings will be a large part of our future.


  • Touch: You will be able to reach out and touch through your phone
  • Sight: A pixel will be worth a thousand words
  • Hearing: Computers will hear what matters
  • Taste: Digital taste buds will help you to eat healthier
  • Smell: Computers will have a sense of smell

In the era of cognitive computing, systems learn instead of passively relying on programming. As a result, emerging technologies will continue to push the boundaries of human limitations to enhance and augment our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more. No need to call for Superman when we have real super senses at hand.

Within the next five years, your mobile device will let you touch what you’re shopping for online. It will distinguish fabrics, textures, and weaves so that you can feel a sweater, jacket, or upholstery – right through the screen. So when a shopper touches what the webpage says is a silk shirt, the screen will emit vibration patterns that the skin interprets as the feel of silk.


IBM says the development of a “Product Information Management” (PIM) database system that acts as a dictionary, matching vibration patterns to physical objects, will allow texture information to be easily associated with specific items. This will be useful to retailers, to farmers – who will be able to determine the health of their crops by comparing their texture to that of a healthy plant – and to doctors, who can literally get a feel for an injury to help with a diagnosis.
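IBM has not published the PIM system’s internals, but the “dictionary” idea – look up a catalogued texture, get back the vibration pattern to play – can be sketched in a few lines. All names and numbers below are illustrative placeholders, not IBM’s actual data model:

```python
# Hypothetical texture "dictionary": each catalogued texture maps to a
# vibration pattern (frequency in Hz, amplitude 0-1, pulse length in ms).
# Values are made up for illustration only.
TEXTURE_PATTERNS = {
    "silk":  {"frequency_hz": 250, "amplitude": 0.2, "pulse_ms": 10},
    "denim": {"frequency_hz": 80,  "amplitude": 0.7, "pulse_ms": 40},
    "wool":  {"frequency_hz": 120, "amplitude": 0.5, "pulse_ms": 25},
}

def vibration_for(texture: str) -> dict:
    """Return the haptic profile for a catalogued texture."""
    try:
        return TEXTURE_PATTERNS[texture]
    except KeyError:
        raise ValueError(f"no haptic profile for texture {texture!r}")

print(vibration_for("silk"))  # the pattern a phone would play for a silk shirt
```

The point of the dictionary structure is that a retailer only has to tag a product with a texture name; the device resolves the name to a playable pattern at touch time.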

IBM Research thinks that computers will not only be able to look at images, but help us understand the 500 billion photos we’re taking every year (that’s about 78 photos for each person on the planet). The cognitive computing technology will allow computers to examine thousands of images and try to understand and recognize patterns and distinct features to determine the content.


Consider sunset scenes, for example: the computer might recognize certain color distributions that are common to such images, while for a downtown cityscape it might learn that certain distributions of edges are what set it apart.
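The color-distribution idea can be illustrated with a toy classifier that builds a coarse color histogram per image and picks the best-matching reference scene. This is a sketch of the concept only; real systems use far richer features (edges, textures) and learned models, and every image and bin size here is invented:

```python
# Toy scene classifier based on color distributions.
# Images are represented as plain lists of (r, g, b) pixel tuples.
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize each channel into `bins` buckets and count occurrences."""
    step = 256 // bins
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def similarity(h1, h2):
    """Histogram intersection: higher means more similar distributions."""
    return sum(min(h1[k], h2[k]) for k in h1.keys() & h2.keys())

def classify(pixels, references):
    """Return the label of the reference histogram that best matches."""
    h = color_histogram(pixels)
    return max(references, key=lambda label: similarity(h, references[label]))

# Invented references: sunsets skew warm orange, cityscapes skew grey.
sunset_ref = color_histogram([(255, 140, 0)] * 90 + [(70, 70, 90)] * 10)
city_ref   = color_histogram([(90, 90, 100)] * 90 + [(255, 140, 0)] * 10)

test_image = [(250, 130, 10)] * 80 + [(80, 80, 95)] * 20  # mostly warm pixels
print(classify(test_image, {"sunset": sunset_ref, "city": city_ref}))  # → sunset
```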

In the medical field, this could help doctors see diseases before they take hold. Take dermatology. Patients often have visible symptoms of skin cancer by the time they see a doctor. With many images of a patient from scans over time, a computer could look for patterns and identify situations where something may be pre-cancerous, well before melanomas become visible.


Imagine knowing the meaning behind your child’s cry, or maybe even your pet dog’s bark, through an app on your smartphone. In the next five years, you will be able to do just that thanks to algorithms embedded in cognitive systems that will understand any sound.


IBM research is also aiming to give us superhuman hearing by translating ultrasonic frequencies into audio that we can hear. This could potentially give humans the ability to talk to the animals, such as dolphins and dogs.

These systems will be able to interpret sounds in the environment too. What does a tree under stress during a storm sound like? Will it collapse into the road? Sensors feeding that information to a city data center would know, and could alert ground crews before the collapse.

Forget to hit “mute” while on that conference call at work? Your phone will know how to cancel out background noise – even if that “noise” is you carrying on a separate conversation with another colleague!


Imagine a system that analyzes food down to its atomic structure and combines this information with psychophysical data and models on which chemicals produce “perceptions of pleasantness, familiarity and enjoyment.”


IBM says such technology won’t just create meals that tickle our taste buds, but also ones that are healthy and meet nutritional requirements. Such a system could create nutritional school cafeteria lunches that students actually want to eat or allow those with limited ingredients, such as those in the developing world, to create meals that optimize flavor.

Many communities in sub-Saharan Africa have access to only a few base ingredients for any given meal. But limited resources should not eliminate the enjoyment of food. A creative computer can optimize flavor profiles within these constraints, creating a variety of never-before-imagined meals that please the palate, encourage consumption, and help prevent malnutrition.
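Optimizing flavor within an ingredient constraint is, at heart, a combinatorial search. Here is a minimal sketch: given a handful of available ingredients and a (completely made-up) table of pairwise flavor-compatibility scores, brute-force the best-scoring combination. IBM’s actual system models food chemistry and psychophysics, not a toy score table:

```python
# Toy flavor optimizer: pick the best meal from limited ingredients.
from itertools import combinations

# Made-up pairwise compatibility scores (higher = tastier together).
FLAVOR_SCORE = {
    ("cassava", "peanut"): 9,
    ("cassava", "tomato"): 6,
    ("maize", "peanut"): 7,
    ("maize", "tomato"): 5,
    ("peanut", "tomato"): 2,
    ("cassava", "maize"): 3,
}

def pair_score(a, b):
    return FLAVOR_SCORE.get((a, b)) or FLAVOR_SCORE.get((b, a), 0)

def best_meal(available, size=3):
    """Brute-force the `size`-ingredient combo with the highest summed pair score."""
    def score(combo):
        return sum(pair_score(a, b) for a, b in combinations(combo, 2))
    return max(combinations(sorted(available), size), key=score)

print(best_meal(["cassava", "maize", "peanut", "tomato"]))
# → ('cassava', 'maize', 'peanut'): score 3 + 9 + 7 = 19 beats all other triples
```

Brute force is fine for a pantry of four items; a real planner over thousands of ingredients and nutritional constraints would need smarter search.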


Within the next five years, your mobile device will likely be able to tell you you’re getting a cold before your very first sneeze. Tiny sensors that ‘smell’ can be integrated into cell phones and other mobile devices, feeding biomarker information to a computer system that can analyze the data.

Similar to how a breathalyzer can detect alcohol from a breath sample, sensors can be designed to collect other specific data from the biomarkers. Potential applications could include identifying liver and kidney disorders, diabetes and tuberculosis, among others.
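The breathalyzer analogy suggests a simple screening pattern: compare each measured biomarker concentration against a threshold and flag anything elevated. The sketch below uses placeholder marker names, thresholds, and condition links purely for illustration; none of these are clinical values:

```python
# Illustrative biomarker screen. Thresholds (parts per billion) and the
# marker-to-condition links are simplified placeholders, not medical data.
BIOMARKER_THRESHOLDS_PPB = {
    "acetone": (1800, "possible diabetes indicator"),
    "ammonia": (2900, "possible kidney disorder indicator"),
    "nitric_oxide": (50, "possible airway inflammation indicator"),
}

def screen_breath_sample(readings_ppb):
    """Return advisory strings for every marker above its threshold."""
    flags = []
    for marker, level in readings_ppb.items():
        threshold, advisory = BIOMARKER_THRESHOLDS_PPB.get(marker, (None, None))
        if threshold is not None and level > threshold:
            flags.append(f"{marker}: {advisory}")
    return flags

print(screen_breath_sample({"acetone": 2100, "ammonia": 400}))
# → ['acetone: possible diabetes indicator']
```

A production system would of course combine multiple markers over time rather than thresholding single readings, but the core idea – chemistry in, advisory out – is the same.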

Where in the past, physicians relied on visual clues and patient descriptions to form a diagnosis, just imagine how helpful it will be to have the patient’s own body chemistry provide the clues needed to form a more complete picture.

What’s your take on cognitive computing? Is IBM on to something with PCs that can taste, smell, touch, hear and see? How would you use the technology? Share your thoughts in the comments.

Source: IBM, Gizmag