Cognitive computing to become real in 5 years

2012-12-18     itersnews
(iTers News) - Cognitive computing is shaping up as the next big thing, promising to change the way people interact with machines.

As the name suggests, according to IBM, the cognitive computer will see, hear, touch, taste, and even smell.

IBM highlights five key building-block technologies that will underlie cognitive computing in its seventh annual IBM 5 in 5 – a list of breakthrough technologies that hold the promise of changing the way people work, live and interact during the next five years.

At the core of the five game-changing breakthroughs is a next generation of sophisticated, ground-breaking sensing technologies that will collectively allow computers of the future to feel, see, smell, taste, and even hear much as people do.

Under the hood, of course, we need a series of very powerful chips that can convert all these natural sensations into streams of digital bits, process them, and then output the results back into the physical world.

Computers will feel touch as you do

We are already familiar with haptic technology – tactile feedback that conveys information about the real, physical world.

For example, the most basic and popular form of haptic technology in wide use is the vibration signal when your smartphone gets an incoming call in silent mode.

Or, when you play an online car racing game on your game console, every time you bump into other virtual cars or bounce over rugged off-road terrain, you feel the jolt of the collision in both hands holding the game machine or remote controller.

A series of tiny piezoelectric vibration motors, a microcontroller, and sophisticated software algorithms work together to produce the effect.

IBM scientists plan to take this technology one step further, letting you feel the fabric of a silk dress on your smartphone when you shop online for a wedding dress. Or you and your kids could take a virtual online tour of a zoo on your tablet PC, seeing and touching a lion's mane as if it were real.

IBM said that scientists at IBM are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric, as a shopper brushes her finger over the image of the item on a device screen.

Utilizing the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material, IBM said in the statement.
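The encoding IBM describes – a unique vibration "signature" per material – can be sketched roughly as follows. The fabric names and pulse values below are hypothetical placeholders chosen only to illustrate the idea, not IBM's actual patterns.

```python
# Each material maps to a distinct vibration pattern: a list of
# (duration_ms, intensity) pulses, where intensity is a fraction of the
# vibration motor's maximum amplitude. Values are invented for illustration.
VIBRATION_PATTERNS = {
    "silk":   [(20, 0.2), (20, 0.1)] * 4,           # short, fast, gentle
    "linen":  [(60, 0.5), (40, 0.3)] * 3,           # medium, coarser
    "cotton": [(120, 0.8), (80, 0.6), (120, 0.8)],  # longer, stronger
}

def pattern_for(fabric: str) -> list[tuple[int, float]]:
    """Return the vibration pulses that simulate touching `fabric`."""
    try:
        return VIBRATION_PATTERNS[fabric]
    except KeyError:
        raise ValueError(f"no touch signature recorded for {fabric!r}")

def total_duration_ms(pattern: list[tuple[int, float]]) -> int:
    """Total length of the haptic playback for one pattern."""
    return sum(duration for duration, _intensity in pattern)
```

Because each fabric's pattern differs in pulse length and strength, a phone driving its motor from such a table could render silk as a quick flutter and cotton as a slow, heavy buzz.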

Peek into pixel information

Pixels are the very basic elements that form still and video images.

IBM scientists are working on technology that can analyze and translate millions of pixels to detect changes in color, texture, and pattern, and then read them in a meaningful context. Applications of this pixel-probing technology range from healthcare to retail and agriculture.

In the next five years, according to IBM's scenario, systems will not only be able to look at and recognize the contents of images and visual data, they will turn the pixels into meaning, beginning to make sense of them much as a human views and interprets a photograph.

In the future, "brain-like" capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media. This will have a profound impact for industries such as healthcare, retail and agriculture.

For example, these capabilities can make sense out of massive volumes of medical information such as MRIs, CT scans, X-Rays and ultrasound images to capture information tailored to particular anatomy or pathologies.

What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images -- such as differentiating healthy from diseased tissue -- and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy.
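The kind of feature analysis described above – extracting color, texture and edge information and flagging regions that deviate from healthy tissue – can be illustrated with a toy sketch. This is not IBM's method; the thresholds, patch size and synthetic "scan" are all made up for the example.

```python
# Illustrative sketch: extract simple per-patch features (mean intensity and
# mean horizontal-gradient magnitude) from a grayscale image, and flag
# patches whose features fall outside an assumed "healthy tissue" baseline.

def patch_features(img, r0, c0, size):
    """Mean intensity and mean horizontal-gradient magnitude of one patch."""
    vals, grads = [], []
    for r in range(r0, r0 + size):
        for c in range(c0, c0 + size):
            vals.append(img[r][c])
            if c + 1 < c0 + size:
                grads.append(abs(img[r][c + 1] - img[r][c]))
    return sum(vals) / len(vals), sum(grads) / len(grads)

def flag_suspect_patches(img, size=2, intensity_limit=150, edge_limit=40):
    """Return (row, col) origins of patches exceeding either limit."""
    suspects = []
    for r0 in range(0, len(img), size):
        for c0 in range(0, len(img[0]), size):
            mean, edge = patch_features(img, r0, c0, size)
            if mean > intensity_limit or edge > edge_limit:
                suspects.append((r0, c0))
    return suspects

# Toy 4x4 "scan": a bright, high-contrast region in the lower-right corner
# stands in for abnormal tissue; the rest is uniform background.
scan = [
    [10, 12, 11, 13],
    [11, 10, 12, 12],
    [12, 11, 200, 90],
    [10, 12, 180, 250],
]
```

A real system would learn such baselines from labeled images rather than hard-coding them, and would correlate flagged regions with patient records as the article describes.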

Computers will hear the faintest sounds of natural movement

Today, IBM scientists are beginning to capture underwater noise levels in Galway Bay, Ireland to understand the sounds and vibrations of wave energy conversion machines, and the impact on sea life, by using underwater sensors that capture sound waves and transmit them to a receiving system to be analyzed.

Within five years, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies.

It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.

Raw sounds will be detected by sensors, much like the human brain. A system that receives this data will take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognize patterns.
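The classification step described above – recognizing a sound by its frequency content – can be sketched with a naive DFT. The sample rate, cutoff and the two-class "rumble vs. crack" rule are assumptions for illustration, not the design of IBM's system.

```python
import math

# Toy frequency-based sound classifier: a naive DFT finds the dominant
# frequency of a sampled signal, and a hypothetical rule maps low
# frequencies to "rumble" (e.g. ground movement before a landslide) and
# high frequencies to "crack" (e.g. snapping wood before a tree falls).

SAMPLE_RATE = 1000  # samples per second (assumed)

def dominant_frequency(signal):
    """Return the frequency (Hz) of the strongest bin, via a naive DFT."""
    n = len(signal)
    best_bin, best_power = 0, 0.0
    for k in range(1, n // 2):  # skip DC, stop at Nyquist
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_bin, best_power = k, power
    return best_bin * SAMPLE_RATE / n

def classify(signal, cutoff_hz=100):
    """Crude two-class rule: low frequencies read as rumble, high as crack."""
    return "rumble" if dominant_frequency(signal) < cutoff_hz else "crack"

# Synthesize 0.2 s of a 20 Hz rumble and a 300 Hz crack for demonstration.
rumble = [math.sin(2 * math.pi * 20 * t / SAMPLE_RATE) for t in range(200)]
crack  = [math.sin(2 * math.pi * 300 * t / SAMPLE_RATE) for t in range(200)]
```

A deployed system would use an FFT over many frequency bands and a learned classifier, and would fuse the result with the other "modalities" the article mentions.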

For example, "baby talk" will be understood as a language, telling parents or doctors what infants are trying to communicate. Sounds can be a trigger for interpreting a baby's behavior or needs. By being taught what baby sounds mean – whether fussing indicates a baby is hungry, hot, tired or in pain – a sophisticated speech recognition system would correlate sounds and babbles with other sensory or physiological information such as heart rate, pulse and temperature.

Digital taste buds will help you eat smarter

What if we could make healthy foods taste delicious using a different kind of computing system that is built for creativity?

IBM researchers are developing a computing system that actually experiences flavor, to be used with chefs to create the most tasty and novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham.

A system like this can also be used to help us eat healthier, creating novel flavor combinations that will make us crave a vegetable casserole instead of potato chips.

The computer will be able to use algorithms to determine the precise chemical structure of food and why people like certain tastes.

These algorithms will examine how chemicals interact with each other, the molecular complexity of flavor compounds and their bonding structure, and use that information, together with models of perception to predict the taste appeal of flavors.
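One simple way to illustrate the pairing idea – ingredients that share many flavor compounds tend to pair well – is an overlap score over compound sets. The compound lists below are invented placeholders, not real food chemistry, and the scoring rule is only a sketch of the approach.

```python
# Hypothetical flavor-compound table; every entry is a made-up placeholder.
FLAVOR_COMPOUNDS = {
    "roasted chestnut": {"furfural", "pyrazine-a", "maltol", "vanillin"},
    "cooked beetroot":  {"geosmin", "pyrazine-a", "maltol"},
    "fresh caviar":     {"trimethylamine", "bromophenol"},
    "dry-cured ham":    {"furfural", "pyrazine-a", "vanillin", "hexanal"},
}

def pairing_score(a: str, b: str) -> float:
    """Jaccard overlap of shared flavor compounds between two ingredients."""
    ca, cb = FLAVOR_COMPOUNDS[a], FLAVOR_COMPOUNDS[b]
    return len(ca & cb) / len(ca | cb)

def best_pairings(base: str) -> list[str]:
    """Rank the other ingredients by shared-compound overlap with `base`."""
    others = [i for i in FLAVOR_COMPOUNDS if i != base]
    return sorted(others, key=lambda i: pairing_score(base, i), reverse=True)
```

With this toy table, roasted chestnut ranks dry-cured ham above cooked beetroot and caviar last, echoing the pairings the article cites; the real system would also weigh models of human flavor perception, not overlap alone.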

Not only will it make healthy foods more palatable -- it will also surprise us with unusual pairings of foods actually designed to maximize our experience of taste and flavor.

In the case of people with special dietary needs, such as individuals with diabetes, it could develop flavors and recipes that keep blood sugar regulated while still satisfying a sweet tooth.

Computers will have a sense of smell

During the next five years, tiny sensors embedded in your computer or cell phone will detect if you're coming down with a cold or other illness.

By analyzing odors, biomarkers and thousands of molecules in someone's breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not.

Today IBM scientists are already sensing environmental conditions and gases to preserve works of art.

This innovation is beginning to be applied to tackle clinical hygiene, one of the biggest challenges in healthcare today.

For example, antibiotic-resistant bacteria such as Methicillin-resistant Staphylococcus aureus (MRSA), which in 2005 was associated with almost 19,000 hospital stay-related deaths in the United States, is commonly found on the skin and can be easily transmitted wherever people are in close contact.

One way of fighting MRSA exposure in healthcare institutions is by ensuring medical staff follow clinical hygiene guidelines.

In the next five years, IBM technology will "smell" surfaces for disinfectants to determine whether rooms have been sanitized.

Using novel wireless "mesh" networks, sensors will gather and measure data on various chemicals, continuously learning and adapting to new smells over time.
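The sanitization check described here can be sketched as a threshold rule over mesh-sensor readings. The room names, units and threshold below are invented for illustration; a real system would calibrate per chemical and learn normal levels over time.

```python
# Sketch, not a real IBM system: each node in a hypothetical sensor mesh
# reports a disinfectant-vapor concentration for its room, and a simple
# threshold rule flags rooms that do not appear to have been sanitized.

SANITIZED_THRESHOLD_PPM = 5.0  # assumed minimum disinfectant concentration

def unsanitized_rooms(readings: dict[str, list[float]]) -> list[str]:
    """Return rooms whose average sensor reading falls below the threshold."""
    flagged = []
    for room, ppm_values in readings.items():
        if sum(ppm_values) / len(ppm_values) < SANITIZED_THRESHOLD_PPM:
            flagged.append(room)
    return sorted(flagged)

# Example readings from three hypothetical hospital rooms.
mesh_readings = {
    "ward-1": [6.2, 5.8, 7.1],   # recently disinfected
    "ward-2": [1.1, 0.9, 1.4],   # missed by the cleaning round
    "icu":    [5.1, 4.9, 5.3],
}
```

Averaging several nodes per room, as here, makes the check robust to a single noisy sensor; flagged rooms could then trigger the hygiene-compliance alerts the article envisions.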

Due to advances in sensor and communication technologies in combination with deep learning systems, sensors can measure data in places never thought possible.

For example, computer systems can be used in agriculture to "smell" or analyze the soil condition of crops. In urban environments, this technology will be used to monitor issues with refuse, sanitation and pollution – helping city agencies spot potential problems before they get out of hand.

(Credit : IBM)