“Does it hurt when you get shot?” asks Edward Furlong, looking at a leather jacket with bullet holes. “I sense injuries. The data could be called pain,” answers Arnold Schwarzenegger. These telling lines from the sci-fi blockbuster Terminator 2: Judgment Day anticipate ‘machines with a mind’.
Pain is relative, and whether it makes sense to hardwire machines to become sentient is a moot point.
Dr Ben Seymour of the University of Cambridge says, “Pain is the pinnacle of consciousness, of course not a pleasant one.” The university has released a short documentary, “Pain in the Machine”, to explore the concept further.
Coding robots with ‘human feelings’ has wider applications. Such bots can help people on the autism spectrum develop social skills. Social robots are available to help veterans cope with post-traumatic stress disorder (PTSD) and to serve as companions for older adults. In Japan, many people lean on robots to keep depression at bay.
Dr Nikhil Agarwal, CEO – IIT Kanpur (FIRST, AIIDE and C3i Hub), says the definition of pain is not the same for humans and machines. “In a machine, pain is related to the activity of the machine, for example: if there is hardware and it is being used for a long time, which results in its wear and tear and requires replacement. In terms of software, if there is a bug in the programme, a virus, or a malicious code, this causes pain that needs to be cured.”
The human-like appearance of robots can also put us off, a phenomenon known as the uncanny valley.
“Do machines feel pain?” is a very philosophical question, says Anuj Gupta, Head of AI, Vahan. “Some robots react when hit. Does it mean they ‘feel’ pain? No. Their reaction is a combination of sensors and software. It is like a toy that reacts to one’s hand gestures. Currently, machines can’t feel anything. They can be programmed to trick humans by simulating human emotions, including pain.”
A few years ago, scientists from Nanyang Technological University, Singapore, developed ‘mini-brains’ to help robots recognise pain and activate self-repair.
The approach embeds AI into the sensor nodes, which are connected to multiple small, less powerful processing units that act like ‘mini-brains’ distributed across the robotic skin. When this system is combined with a self-healing ion-gel material, damaged robots can recover their mechanical functions without human intervention.
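The division of labour behind this idea can be sketched in a few lines of code: each patch of skin processes its own readings locally and reports only events, rather than streaming raw data to one central processor. The sketch below is purely illustrative and is not the NTU team’s implementation; the class names, thresholds, and event format are all assumptions.

```python
class SensorNode:
    """A small 'mini-brain' attached to one patch of robotic skin."""

    def __init__(self, node_id, pain_threshold=0.7):
        self.node_id = node_id
        self.pain_threshold = pain_threshold

    def process(self, pressure):
        """Classify a normalised pressure reading (0.0-1.0) locally."""
        if pressure >= self.pain_threshold:
            return (self.node_id, "damage")  # 'painful' contact: flag for self-repair
        if pressure > 0.0:
            return (self.node_id, "touch")   # ordinary contact
        return None                          # nothing worth reporting


def poll(nodes, readings):
    """The central controller sees only events, never raw sensor streams."""
    return [event for node, reading in zip(nodes, readings)
            if (event := node.process(reading)) is not None]


nodes = [SensorNode(i) for i in range(3)]
print(poll(nodes, [0.0, 0.2, 0.9]))  # [(1, 'touch'), (2, 'damage')]
```

Pushing the classification down to the nodes is what lets such a system scale: adding more skin patches adds more local processors rather than more load on one central brain.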
Explaining the ‘mini-brains’, co-author of the study, Associate Professor Arindam Basu, from the School of Electrical & Electronic Engineering of the university, says, “If robots have to work with humans, there is a concern if they would interact safely. To ensure a safe environment, scientists worldwide have been finding ways to bring a sense of awareness to robots, including feeling pain, reacting to it, and withstanding harsh operating conditions. However, the complexity of putting together the multitude of sensors required and the resultant fragility of such a system is a major barrier for widespread adoption.”
Mikhail Lebedev, Academic Supervisor at HSE University’s Centre for Bioelectric Interfaces, says, “Robots can even simulate sensations of pain: some forms of physical contact feel normal, while other contact causes pain. This contact drastically changes the robot’s behaviour. It starts to avoid pain and develop new behaviour patterns, i.e. it learns, like a child who has been burned by something hot for the first time.”
Affetto, a child-like android developed by researchers at Osaka University, can distinguish between a light touch and a hard hit. The team behind the robot says this will help robots understand and empathise with humans.
Affetto is fitted with a “pain nervous system” powered by AI and custom artificial-skin technology, allowing it to react to sensations with a variety of facial expressions.
Minoru Asada, the lead researcher on the project, said, “Engineers and material scientists have developed a new tactile sensor and attached it to Affetto, who has a realistic face and body skeleton covered in artificial skin.”
Affetto can discriminate between soft and hard touches from the detected signals, and attaching skin sensors is helping the robot avoid any touch that causes ‘pain’. Social robots are also being programmed to show empathetic reactions to pain in others through a mirroring mechanism similar to the one humans experience.
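A crude way to picture how a signal-driven ‘pain’ response like Affetto’s might work is a simple threshold rule that maps touch intensity to an expression and an avoidance decision. The function below is an illustrative sketch, not the project’s actual code; the intensity scale, threshold value, and expression labels are all invented for the example.

```python
def react_to_touch(intensity, pain_threshold=0.6):
    """Map a normalised touch signal (0.0-1.0) to (expression, avoid)."""
    if intensity >= pain_threshold:
        return "wince", True      # 'painful' contact: grimace and withdraw
    if intensity > 0.1:
        return "smile", False     # gentle contact: positive expression
    return "neutral", False       # barely any contact: no reaction


print(react_to_touch(0.05))  # ('neutral', False)
print(react_to_touch(0.3))   # ('smile', False)
print(react_to_touch(0.8))   # ('wince', True)
```

The real system replaces such a fixed threshold with learned classification of sensor signals, but the basic loop is the same: sense, classify, then express and avoid.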