Robots now see the world with an ease that once belonged only to science fiction. They can recognise objects, navigate cluttered spaces and sort thousands of parcels an hour. But ask a robot to touch something gently, safely or meaningfully, and the limits appear instantly.
As a researcher in soft robotics working on artificial skin and sensorised bodies, I’ve found that trying to give robots a sense of touch forces us to confront just how astonishingly sophisticated human touch really is. My work began with the seemingly simple question of how robots might sense the world through their bodies. Develop tactile sensors, fully cover a machine with them, process the signals and, at first glance, you should get something like touch.
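To make that naive pipeline concrete, here is a minimal sketch in Python of what "processing the signals" often amounts to at first: polling a grid of pressure sensors into a single snapshot. The grid size, the read_taxel() function and the units are hypothetical, not drawn from any particular hardware.

```python
import numpy as np

# A hypothetical 16x16 grid of pressure-sensing elements
# ("taxels") covering a patch of robot skin. read_taxel() stands
# in for whatever hardware driver a real sensor array provides.
GRID_ROWS, GRID_COLS = 16, 16

def read_taxel(row: int, col: int) -> float:
    """Placeholder for a real hardware read; returns pressure in kPa."""
    return 0.0  # stub: no contact

def read_pressure_map() -> np.ndarray:
    """Sample every taxel once: one static frame of 'touch'."""
    frame = np.zeros((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            frame[r, c] = read_taxel(r, c)
    return frame

pressure_map = read_pressure_map()
print(f"peak pressure: {pressure_map.max():.1f} kPa")
```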
Except that human touch is nothing like a simple pressure map. Our skin contains several distinct types of mechanoreceptor, each tuned to different stimuli such as vibration, stretch or texture. Our spatial resolution is remarkably fine and, crucially, touch is active: we press, slide and adjust constantly, turning raw sensation into perception through dynamic interaction.
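As a rough illustration of that multi-channel idea (fast-adapting receptors such as Pacinian corpuscles respond mainly to vibration and change, while slow-adapting ones such as Merkel cells encode sustained pressure), here is a hedged sketch that splits one simulated taxel signal into two such channels. The filter choices are illustrative stand-ins, not physiological models.

```python
import numpy as np

def split_channels(signal: np.ndarray, dt: float = 0.001):
    """Crude analogue of two mechanoreceptor classes: a
    slow-adapting channel (sustained pressure, approximated by a
    ~50 ms moving average) and a fast-adapting channel (vibration
    and change, approximated by the temporal derivative)."""
    window = max(1, int(0.05 / dt))
    slow = np.convolve(signal, np.ones(window) / window, mode="same")
    fast = np.gradient(signal, dt)
    return slow, fast

# One second from a single taxel: a steady 5 kPa press with a
# 40 Hz buzz on top, sampled at 1 kHz.
t = np.arange(0.0, 1.0, 0.001)
signal = 5.0 + 0.5 * np.sin(2 * np.pi * 40.0 * t)
slow, fast = split_channels(signal)  # pressure vs vibration views
```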
Engineers can sometimes mimic a fingertip-scale version of this, but reproducing it across an entire soft body, and giving a robot the ability to interpret this rich sensory flow, is a challenge of a completely different order. Working on artificial skin also quickly reveals another insight: much of what we call “intelligence” doesn’t live solely in the brain. Biology offers striking examples – most famously, the octopus.
Octopuses distribute most of their neurons throughout their limbs. Studies of their motor behaviour show an octopus arm can generate and adapt movement patterns locally based on sensory input, with limited input from the brain. Their soft, compliant bodies contribute directly to how they act in the world.
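In robotics terms, this maps onto decentralised control. The toy sketch below, with entirely invented names and gains, gives each segment of a soft arm its own reflex loop over local contact readings, while the central controller contributes only a coarse, low-bandwidth goal.

```python
class ArmSegment:
    """Toy model of one segment of a soft arm that reacts to
    local contact on its own, without waiting for a central
    controller to process the reading."""

    def __init__(self, name: str):
        self.name = name
        self.curl = 0.0  # local actuation command, 0..1

    def sense_contact(self) -> float:
        """Placeholder for a local pressure reading, 0..1."""
        return 0.0  # stub: no contact

    def local_reflex(self, goal_curl: float) -> None:
        # Blend a coarse goal from the "brain" with a local
        # reflex that curls harder wherever contact is felt;
        # the weighting keeps most of the decision local.
        self.curl = 0.3 * goal_curl + 0.7 * self.sense_contact()

arm = [ArmSegment(f"segment_{i}") for i in range(8)]
goal = 0.2  # single low-bandwidth command from the centre
for segment in arm:
    segment.local_reflex(goal)  # each segment adapts by itself
```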
And this kind of distributed, embodied intelligence, where behaviour emerges from the interplay of body, material and environment, is increasingly influential in robotics.
Touch also happens to be the first sense that humans develop in the womb. Developmental neuroscience shows tactile sensitivity emerging from around eight weeks of gestation, then spreading across the body during the second trimester.
Long before sight or hearing function reliably, the foetus explores its surroundings through touch. This is thought to help shape how infants begin forming an understanding of weight, resistance and support – the basic physics of the world. This touch-first ordering matters for robotics too.
For decades, robots have relied heavily on cameras and lidars (a sensing method that uses pulses of light to measure distance) while avoiding physical contact. But we cannot expect machines to achieve human-level competence in the physical world if they rarely experience it through touch. Simulation can teach a robot useful behaviour, but without real physical exploration, it risks merely deploying intelligence rather than developing it.
To learn in the way humans do, robots need bodies that feel. One approach my group is exploring is giving robots a degree of “local intelligence” in their sensorised bodies. Humans benefit from the compliance of soft tissues: skin deforms in ways that increase grip, enhance friction and filter sensory signals before they even reach the brain. This is a form of intelligence embedded directly in the anatomy.
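One way to see "intelligence embedded directly in the anatomy" is that a compliant layer acts as a mechanical filter before any computation happens. Here is a toy sketch that treats skin as a spring-damper layer; the parameters are illustrative, not measured values.

```python
import numpy as np

def skin_response(force: np.ndarray, dt: float = 0.001,
                  stiffness: float = 200.0, damping: float = 5.0,
                  mass: float = 0.01) -> np.ndarray:
    """Simulate a 1D spring-damper 'skin' under an applied force.
    The transmitted deformation is a smoothed version of the
    input: low-pass filtering done by the material itself,
    before any sensor or controller is involved."""
    x = v = 0.0
    deformation = np.zeros_like(force)
    for i, f in enumerate(force):
        a = (f - stiffness * x - damping * v) / mass
        v += a * dt          # semi-implicit Euler step
        x += v * dt
        deformation[i] = x
    return deformation

# A sharp 10 ms impact; the compliant layer spreads it in time.
t = np.arange(0.0, 0.5, 0.001)
impact = np.where(t < 0.01, 1.0, 0.0)
smoothed = skin_response(impact)
```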