Sensation in an artificial hand: Taking another step toward our posthuman future
Published 9:36 am Thursday, November 3, 2016
By BRIAN COONEY
Guest columnist
The summary in Science Daily of a recent breakthrough in neurotechnology was arresting: “Imagine being in an accident that leaves you unable to feel any sensation in your arms and fingers. Now imagine regaining that sensation, a decade later, through a mind-controlled robotic arm that is directly connected to your brain.”
The patient, Nathan Copeland, is a quadriplegic who lost all motor and sensory function in his arms and legs in an accident 12 years ago. Sensory input from his limbs can’t reach his brain because of damage to his spinal cord. He volunteered for a brain-computer interface (BCI) experiment at the University of Pittsburgh Medical Center (UPMC).
A team of researchers led by Robert Gaunt implanted arrays of microelectrodes in Copeland’s brain in two areas involved in sensory input from, and motor control of, the hand. (A microelectrode is small enough to penetrate the membrane of a single neuron, and it can either record or induce the cell’s characteristic electrochemical activity.)
The robotic arm was not attached to Copeland’s body except through the wiring of the BCI. When the experimenter manipulated the fingers of the robotic arm, a blindfolded Copeland, given only a schematic drawing of a human hand to point to, was able to identify which fingers or pairs of fingers were being touched and pressed. He said the sensations felt “natural.”
Gaunt’s group was building on the work of another UPMC team that had enabled a human subject to control the movements of a prosthetic arm via BCI. That team had developed algorithms by which a computer connected to the implanted microelectrodes could translate patterns of brain activity into electronic signals that moved the fingers and other parts of a robotic hand and arm. In December 2014, the team reported that a quadriplegic patient, Jan Scheuermann, who had “complete loss of upper limb motor control,” was able to move “the almost human hand of a robot arm with just her thoughts to pick up big and small boxes, a ball, an oddly shaped rock, and fat and skinny tubes.”
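To give a concrete sense of what such a decoding algorithm does, here is a minimal sketch in Python. It is not the UPMC team’s actual method, which was far more elaborate; the channel count, the noise level and the data below are all invented for illustration. The sketch simply fits a linear map from simulated neural firing rates to intended hand velocity, the basic idea behind many BCI decoders, and then uses that map to turn a new pattern of brain activity into a movement command.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: 96 recording channels observed over
# 500 time bins while the subject intends 3-D hand movements.
n_channels, n_bins = 96, 500
tuning = rng.normal(size=(n_channels, 3))    # each cell's hidden movement "preference"
velocity = rng.normal(size=(n_bins, 3))      # intended hand velocity
rates = velocity @ tuning.T + rng.normal(scale=0.5, size=(n_bins, n_channels))

# Fit the decoder: solve, by least squares, for a matrix W such that
# rates @ W approximates the intended velocity.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new pattern of brain activity into a command for the robot arm.
new_rates = rng.normal(size=(1, n_channels))
command = new_rates @ W
print("decoded velocity command:", command.round(3))

In experiments like these, decoders are typically calibrated on brain activity recorded while the subject watches or imagines arm movements; the principle, mapping neural activity to motion, is the same.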
The precision Scheuermann showed in moving the robot arm was remarkable because her sensory feedback was limited to vision. When we move our arms and hands to grip an object, we normally feel what’s going on; we have what are called “somatic” sensations. We see what we’re doing, but we also get cutaneous sensations such as touch, pressure, temperature and pain, as well as proprioceptive sensations from muscles, tendons and joints that tell us the relative positions of adjacent body parts and the strength of the effort we’re making.
That is why, according to the Gaunt team’s report, Scheuermann’s “prosthetic limb movements were often slower than able-bodied movements … as might be expected when somatosensation is absent and vision is the sole source of sensory feedback.” The Gaunt team’s own, more limited goal was to enable its subject, Copeland, to experience touch and pressure in the robotic hand Scheuermann had moved. It was not yet trying to elicit proprioceptive sensation.
What the Gaunt team achieved was at once very limited and momentous in its implications. It was limited in three ways:
1. The robotic arm was not a functioning prosthesis attached to the subject’s body;
2. The range of somatic sensations was limited to touch and pressure; and
3. The sensations were not experienced as feedback from the subject’s own control of the robotic arm.
To appreciate what was momentous in the Gaunt team’s experiment, we need to analyze the difference between visual awareness of our own bodies and our experience of them through somatic sensation. This will help us understand what Jennifer Collinger, a member of the UPMC team working with Jan Scheuermann, said in a personal communication to Sliman J. Bensmaia at the University of Chicago in 2015: “Thought-controlled neuroprostheses without somatosensory feedback are experienced by patients as disembodied robots, despite the fact that they can control them by mere thought.”
When I look at my hand in front of me, what do I see in it that makes me experience it as mine? The complete perception of my hand includes cutaneous and proprioceptive sensation. If I had voluntary control of my hand but only visual sensation of it, would I experience the hand as part of myself, or would it be just a thing I could move at will, something “disembodied”?
If I accidentally press my finger on the point of a pin, I could say that it hurt my finger or that it hurt me, since I hurt where my finger hurts. That finger, like any other area of skin that a pin or needle penetrates, is part of me. Cutaneous sensations, by mapping onto the visual image of my body, are a major component of the experience of embodiment.
Somatic sensations belong to the category of feelings: types of experience that have in common a sense of self. Somatic sensations are localized feelings, linked to specific areas of our bodies; other feelings, such as sadness or joy, have little or no bodily location. Vision, on the other hand, is not a feeling. It is only the combination of seeing and feeling my body that yields the experience of embodiment.
These recent achievements in medical technology aim to restore what people have lost: arms or legs, or sensorimotor function in those limbs after spinal damage. But they are also moving us into a bionic future in which we will be able not just to replace what is broken but to create bodies with enhanced capabilities. The transition will feel natural, since somatosensory feedback will let us experience these bodies as ourselves. The implications of this posthuman phase are dizzying.