We have robots that can walk, see, talk and hear, and manipulate objects in their robotic hands. There's even a robot that can smell. But what about a sense of touch? This is easier said than done. Some of the current approaches have their limitations, but we're developing a new technique that can overcome some of those problems.

For humans, touch plays a vital role when we move our bodies. Touch, combined with sight, is crucial for tasks such as picking up objects—hard or soft, light or heavy, warm or cold—without damaging them. In the field of robotic manipulation, in which a robot hand or gripper has to pick up an object, adding the sense of touch could remove uncertainties in dealing with soft, fragile and deformable objects.

Quantifying touch in engineering terms requires not only precise knowledge of the amount of external force applied to a touch sensor, but also the force's exact position, its angle, and how it will interact with the object being manipulated.

Then there is the question of how many of these sensors a robot would need. Developing a robot skin that could contain hundreds or even thousands of touch...
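
To make that concrete, here is a minimal sketch of what a single touch reading might need to carry, assuming a hypothetical sensor that reports a 3D force vector and a contact position on its pad (the names and units are illustrative, not from any particular device):

```python
import math
from dataclasses import dataclass


@dataclass
class TouchReading:
    """One reading from a single, hypothetical touch sensor."""
    fx: float    # force components in newtons, in the sensor's own frame
    fy: float
    fz: float    # fz is taken as the component normal to the sensor surface
    x_mm: float  # where on the sensor pad the contact occurred
    y_mm: float

    def magnitude(self) -> float:
        """Total applied force, in newtons."""
        return math.sqrt(self.fx**2 + self.fy**2 + self.fz**2)

    def incidence_angle_deg(self) -> float:
        """Angle between the applied force and the surface normal."""
        shear = math.hypot(self.fx, self.fy)
        return math.degrees(math.atan2(shear, self.fz))


# Example: a roughly 1.2 N press, slightly off-axis, 3 mm from the pad centre.
reading = TouchReading(fx=0.2, fy=0.1, fz=1.15, x_mm=3.0, y_mm=0.5)
print(f"{reading.magnitude():.2f} N at {reading.incidence_angle_deg():.1f} degrees")
```

Even this simplified picture shows why scaling up is hard: every one of the hundreds or thousands of sensing points in a robot skin would have to report this kind of information, and the robot would have to process all of it fast enough to react while handling an object.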