Updated June 2nd, 2022 at 20:21 IST

University of Glasgow experts develop smart skin to provide robots with human-like sensitivity

Robots might soon gain a human-like sense of touch, as University of Glasgow researchers have developed a smart artificial skin that is sensitive to touch.

Reported by: Harsh Vardhan
Image: University of Glasgow

In a bid to create a new generation of smart robots with human-like sensitivity, a team of researchers from the University of Glasgow has developed what they call computational electronic skin (e-skin), a prototype artificial skin built around a new type of processing system. In their official report, the researchers note that this system is based on "synaptic transistors, which mimic the brain’s neural pathways", enabling the robot to learn to feel pain.

The University of Glasgow also shared a video explaining how the artificial e-skin works. Notably, the robot hand in the explainer video showed a remarkable ability to learn to react to external stimuli.

The idea of touch-sensitive artificial skin

Scientists have spent decades working on touch-sensitive artificial skin for robots, and the most widely explored method is to spread an array of contact or pressure sensors across the skin's surface. When these sensors come into contact with an object, they send their data to a central computer, which processes the information and then issues a response. Routing everything through that central processor, however, introduces a delay between contact and reaction, which ultimately reduces the skin's effectiveness in real-world tasks.
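To see why that centralised approach is slow, here is a minimal sketch in Python, written purely for illustration and not drawn from the Glasgow team's work: every sensor is polled and all raw readings are shipped to one processor before any response can be issued. The sensor count, processing cost and function names are invented for the example.

```python
# Illustrative sketch only (not the published system): a conventional
# touch-skin pipeline in which every sensor's raw reading goes to a
# central processor on each cycle. All figures here are hypothetical.
import random
import time

NUM_SENSORS = 168                   # hypothetical array size
PROCESS_TIME_PER_READING = 0.0001   # assumed per-reading processing cost (s)

def read_sensor(i):
    """Stand-in for reading one pressure sensor (returns an arbitrary value)."""
    return random.random()

def centralised_cycle():
    # 1. Poll every sensor, whether or not anything touched it.
    raw_readings = [read_sensor(i) for i in range(NUM_SENSORS)]
    # 2. Send all raw data to the central computer and process it there.
    #    The response can only be issued after the whole array is handled,
    #    which is where the latency described above comes from.
    time.sleep(PROCESS_TIME_PER_READING * len(raw_readings))
    return max(raw_readings)  # e.g. the strongest contact drives the response

if __name__ == "__main__":
    start = time.perf_counter()
    strongest = centralised_cycle()
    print(f"response after {time.perf_counter() - start:.4f}s, peak={strongest:.2f}")
```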

To overcome this limitation, the Glasgow team drew inspiration from how the human peripheral nervous system interprets signals from the skin, with the aim of reducing both latency and power consumption. When our skin is exposed to a stimulus, the peripheral nervous system begins processing it at the point of contact, reducing the signal to only the vital information before it is sent to the brain. The scientists say this kind of 'localised learning' cuts the amount of sensory data that has to travel onwards: with less data to carry, the communication channels are used efficiently and the brain receives the sensation of touch almost immediately.
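The localised idea can be sketched in the same illustrative way: in the hypothetical example below, each sensing site reduces its own reading to an event and forwards only readings that cross a threshold, so far less data reaches the central "brain". The threshold value and sensor count are assumptions made for the sketch, not details from the paper.

```python
# Illustrative sketch of the 'localised learning' idea described above, not
# the published implementation: each sensing site processes its own reading
# and only forwards events that cross a threshold.
import random

NUM_SENSORS = 168
EVENT_THRESHOLD = 0.8   # hypothetical: only strong contacts are forwarded

def local_node(reading, threshold=EVENT_THRESHOLD):
    """Process the reading at the point of contact; return an event or None."""
    return reading if reading >= threshold else None

def sense_and_forward():
    readings = [random.random() for _ in range(NUM_SENSORS)]
    # Only the 'vital information' travels up the communication channel.
    return [r for r in (local_node(x) for x in readings) if r is not None]

if __name__ == "__main__":
    events = sense_and_forward()
    print(f"{len(events)} of {NUM_SENSORS} readings forwarded to the 'brain'")
```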

With the idea in hand, the Bendable Electronics and Sensing Technologies (BEST) Group, led by Professor Ravinder Dahiya, set out to build a model that mimics the way sensory neurons work in the human body. The researchers printed a grid of 168 synaptic transistors made from zinc-oxide nanowires directly onto a flexible plastic surface, then connected them to the skin sensor on the palm of a fully articulated, human-shaped robot hand. When the robotic hand was touched (as in the video), it responded by drawing back, and its reaction scaled with the intensity of the stimulus: the harder the touch, the stronger the response.
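That graded reflex, where a harder touch produces a stronger withdrawal, can be illustrated with a toy model. The threshold and scaling values below are invented for the example and do not come from the team's synaptic-transistor hardware.

```python
# Toy model of a graded withdrawal reflex (harder touch, stronger response).
# The 'pain' threshold and gain are hypothetical illustration values.
def withdrawal_response(pressure, pain_threshold=0.5, gain=2.0):
    """Return a withdrawal magnitude proportional to how far the stimulus
    exceeds the threshold; below the threshold, the hand does not move."""
    if pressure <= pain_threshold:
        return 0.0
    return gain * (pressure - pain_threshold)

if __name__ == "__main__":
    for p in (0.3, 0.6, 0.9):
        print(f"pressure={p:.1f} -> pull back by {withdrawal_response(p):.2f}")
```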

"We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli", Professor Dahiya said in a statement.


Published June 2nd, 2022 at 20:21 IST