(Image Credit: Xinge Yu, City University of Hong Kong)

Robots with Feelings: New Robotic Skin Reproduces the Human Experiences of Touch and Pain

Chinese researchers have developed a robotic e-skin that brings robots one step closer to humans by mimicking our ability to touch, and even sense “pain” when encountering potentially dangerous surfaces.

As companies like Tesla push robots toward a fully human level of capability, recreating the sense of touch is essential not only for understanding the environment but also for navigating it safely. The team behind the robotics advancement revealed their work in a recent paper published in the Proceedings of the National Academy of Sciences.

The Importance of Pain

While pain may be among the least desirable human experiences, it plays an essential role in self-preservation. The spinal cord acts as a relay system to the brain, sending reflexive messages to our muscles in response to pain stimuli. For example, if we touch something hot, we withdraw our hand without thinking, thereby preventing a more severe burn. Similarly, if we step on a sharp object, we lift our foot to avoid a deep wound. The signals involved in these actions are rapid, with the brain becoming aware of what has occurred only after the movement has begun.

Saving that precious processing time, before the brain decodes sensory data into conscious understanding, can make the difference between a minor abrasion and a serious injury. However, robots typically lack a swift, automatic system for processing external stimuli. Instead, sensors collect data, which is sent to a central processing unit (CPU).

The CPU compares the data against its program and generates an appropriate response, which is then transmitted over the robot’s data network to an actuator that decodes it and executes the selected movement. While this may occur at an impressive speed, even a slight delay in action due to processing time can cause greater damage to the robot.
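The latency penalty of that centralized loop can be sketched as a simple comparison. All of the timing constants and function names below are hypothetical illustrations, not values or interfaces from the paper:

```python
# Hypothetical timings (milliseconds) for illustration only;
# real values depend on the robot's hardware and network.
SENSOR_READ_MS = 2    # time to sample the contact sensor
NETWORK_HOP_MS = 5    # time to move data one hop across the robot's network
CPU_DECIDE_MS = 10    # time for the CPU to select a response

def centralized_response_ms() -> int:
    """Sensor -> network -> CPU -> network -> actuator round trip."""
    return SENSOR_READ_MS + NETWORK_HOP_MS + CPU_DECIDE_MS + NETWORK_HOP_MS

def reflex_response_ms() -> int:
    """Stimulus handled locally in the skin, bypassing the CPU entirely."""
    return SENSOR_READ_MS

print(centralized_response_ms(), "ms vs", reflex_response_ms(), "ms")
```

Whatever the exact numbers, every extra hop through the network and the CPU adds delay that a local, reflex-like response avoids.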

Challenging Environments for Robots

Automation, until now, has primarily been confined to highly controlled environments, specifically designed to safely accommodate robotic machinery, such as factory floors and laboratories.

Presently, advances in both mechanical robotics and artificial intelligence are seeking to change this. Companies such as Tesla, with its humanoid Optimus robot, are attempting to integrate robots into everyday environments to perform a variety of human tasks. Unfortunately, homes, hospitals, and workplaces are designed for humans, who can navigate with considerably more intuitive ease than pre-programmed machines.

To enable robots to match humans’ instinctive environmental responses as they move into our imperfect and sometimes hazardous world, Chinese scientists have developed a robotic e-skin (NRE-skin) that provides robots not only with a “sense” of touch, but also the ability to “feel” pain.

Previous attempts to provide robots with sensor skins have been much simpler, wrapping the robot in a sensor system that sends signals to a CPU for processing and response. By contrast, the NRE-skin processes the information obtained when a robot comes into contact with an object and identifies potentially dangerous contact (i.e., pain) within the skin itself, thereby reducing the time required for sending and receiving information.

Modular, neuromorphic electronic skin capable of active pain and injury perception in robotic applications. Credit: Xinge Yu, City University of Hong Kong

Robotic NRE-Skin

The Chinese researchers developed their NRE-skin as a four-layer system. Like our own epidermis, the top layer features a protective coating that shields the delicate underlying components from the environment. Beneath that layer, the skin performs its functions, with layers of sensors and circuits designed to mimic human nerves. Even when nothing is touching the robot, the skin sends an “all clear” null result signal every 75-150 seconds, informing the CPU that the system is still operating correctly. If the skin is cut or damaged severely enough, the lack of signal alerts the robot that damage has occurred in that area.
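The damage-detection logic works like a heartbeat monitor: silence, not a signal, is the alarm. A minimal sketch, using the article's 150-second upper bound but an otherwise hypothetical interface:

```python
# The skin sends an "all clear" null signal every 75-150 seconds (per the
# article). If the maximum interval elapses with no signal, assume damage.
MAX_NULL_SIGNAL_INTERVAL_S = 150

def patch_is_damaged(last_signal_s: float, now_s: float) -> bool:
    """Flag a skin patch as damaged when its periodic null signal
    fails to arrive within the maximum expected interval."""
    return (now_s - last_signal_s) > MAX_NULL_SIGNAL_INTERVAL_S

print(patch_is_damaged(0.0, 100.0))  # within the window: patch assumed intact
print(patch_is_damaged(0.0, 200.0))  # signal overdue: patch assumed damaged
```

The design choice is notable: an actively damaged sensor cannot be relied on to report its own failure, so the absence of a routine signal carries the message instead.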

Most importantly, the skin registers touch with signals called “spikes.” These spikes occur in two forms, depending on the severity of the situation. Regular touch sends a spike to the CPU, which processes the data to understand the environment. When the skin detects an extreme event, it instead sends a spike directly to the robot’s actuators to produce an automatic response, moving the robot away from potential harm.
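The two spike paths described above can be sketched as a simple routing decision. The 0.8 threshold and the function below are hypothetical illustrations; the article does not publish the skin's actual decision criteria:

```python
# Hypothetical normalized contact intensity above which a spike counts
# as "pain" and bypasses the CPU.
PAIN_THRESHOLD = 0.8

def route_spike(intensity: float) -> str:
    """Ordinary touch spikes go to the CPU for interpretation; extreme
    ("pain") spikes go straight to the actuators for a reflex."""
    if intensity >= PAIN_THRESHOLD:
        return "actuator"  # reflexive withdrawal, bypassing the CPU
    return "cpu"           # normal environmental sensing

print(route_spike(0.3))   # light contact routed to the CPU
print(route_spike(0.95))  # dangerous contact routed to the actuators
```

This mirrors the spinal reflex described earlier: the slow, interpretive path and the fast, protective path are physically separate.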

The team designed the skin not only to warn of real-world dangers but also to accept that harm will eventually occur in an uncontrolled environment. The skin is produced in swappable magnetic patches. While it cannot “heal” in the sense that a living creature does, it can quickly be mended by changing a patch without having to repair the entire skin covering.

Currently, the primary issue is that multiple points of contact can lead to confusion within the system. To overcome this, the next step for researchers will be to enhance the skin’s sensitivity and enable it to distinguish among the many sensations experienced while moving through a range of environments.

The paper, “A Neuromorphic Robotic Electronic Skin with Active Pain and Injury Perception,” appeared in the Proceedings of the National Academy of Sciences on December 22, 2025.

Ryan Whalen covers science and technology for The Debrief. He holds an MA in History and a Master of Library and Information Science with a certificate in Data Science. He can be contacted at ryan@thedebrief.org, and follow him on Twitter @mdntwvlf.