IBM is already working on an application that will let you feel the texture of a material through the vibrations of your phone. By varying the strength of the vibration, it will be possible, for example, to tell a cotton shirt from a silk one. Vibration could even "trick" the brain into believing we are holding an object that is not actually there.
As computers gain a better understanding of images, they will begin to see more than a simple collection of pixels. In some pictures, color matters most (landscapes or beaches, for example), while in others, such as images of buildings, the edges of objects carry more information. Once computers can "learn" these differences and "understand" images, their use in medicine, and especially in diagnostics, will be invaluable. For example, comparing pictures of a patient's earlier and current condition could one day help diagnose skin cancer at an early stage.
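The distinction drawn above, color-dominant scenes versus edge-dominant ones, can be illustrated with a toy heuristic. This is only a sketch under stated assumptions: the threshold, function names, and the decision rule are all hypothetical, not IBM's actual method.

```python
import numpy as np

def edge_density(gray):
    # Fraction of pixels whose horizontal or vertical intensity
    # gradient exceeds a (hypothetical) threshold of 0.2.
    gx = np.abs(np.diff(gray, axis=1))
    gy = np.abs(np.diff(gray, axis=0))
    edges = (gx[:-1, :] > 0.2) | (gy[:, :-1] > 0.2)
    return edges.mean()

def color_spread(rgb):
    # How widely the colors vary: per-channel standard deviation,
    # averaged. High for colorful scenes such as beaches.
    return rgb.reshape(-1, 3).std(axis=0).mean()

def dominant_cue(rgb):
    # Toy rule: classify the image by whichever cue is stronger,
    # color variety or edge structure.
    gray = rgb.mean(axis=2)
    return "edges" if edge_density(gray) > color_spread(rgb) else "color"
```

On a smooth color gradient (a beach-like scene) the color spread dominates, while on a checkerboard of sharp transitions (a building-like scene) the edge density dominates. Real systems learn such feature weightings from data rather than using a fixed rule.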
If Microsoft's speech translation looked amazing, IBM's prediction is even more stunning: a computer will be able to understand the "language" of babies and translate for parents why a baby is crying and what it needs. In addition, with sound sensors placed in the soil, a computer will be able to predict whether a flood or landslide is about to occur.
By breaking down the molecules a dish is composed of and analyzing how its flavor is released, a computer will be able to tell you whether you will like a dish before you even try it. Similarly, it could "generate" a dish that is both healthy and delicious.
As with taste, a smell can be identified from the molecules the computer "breathes in". If such sensors were built into homes or smartphones, the devices could "smell" that you are sick before any symptoms of the disease appear.