UChicago neuroscientists expand possibilities for realistic prosthetic limbs

In a 2019 study, a man who had his left hand amputated 13 years ago was able to perform several one-handed and two-handed tasks using a neuroprosthetic hand. Prof. Sliman Bensmaia worked with researchers from the University of Utah to develop the hand, which provided sensory feedback through electrodes connected to the nerves in his forearm. (Credit: George et al., Science Robotics, July 2019)

Recent rapid advances in neuroprosthetics—robotic prosthetic limbs that connect directly to the nervous system—suggest that it’s only a matter of time before bionic arms are commonplace.

But the more scientists learn about how the brain directs intricate movements of the arm and hand, the more daunting the challenge of creating this transhuman future seems. University of Chicago neuroscientist Sliman Bensmaia, whose research on how the brain processes sensations of touch has been incorporated into many of these projects, says each advance reveals more about what these prosthetics can and cannot achieve.

“Our understanding of what’s possible and what’s not has gotten more sophisticated,” he said.

Controlling the arm vs. the hand

In experimental settings, tetraplegic subjects—who suffer partial or total paralysis in all four limbs—have been able to control robotic arms with their thoughts through electrodes implanted in the brain, and amputees can grasp objects with prosthetic hands wired directly to nerves and muscles of the arm. Increasingly, these neuroprostheses restore not just the ability to move the limb, but also the ability to feel through the robotic fingertips.

One challenge of creating such technology is proprioception, or the sense of where our body and limbs are located in space. A robotic limb needs to restore proprioceptive feedback in order to replicate the natural feel and dexterity of a native limb. This sensation is critical to our ability to move our bodies; without it, simple activities of daily living like picking up a cup or turning a door handle become much more challenging and demand much more attention.

Most of the research on the neural mechanisms that give rise to proprioception has focused on the upper arm. Little is known about how the brain tracks the movements and posture of the hand, but emerging research shows that the process is quite different from the way it tracks the arm.

In a 2019 study published in Neuron, Bensmaia and his frequent collaborator Nicho Hatsopoulos—a fellow UChicago neuroscientist who studies how the brain controls movement—recorded the hand movements of monkeys as they grasped various objects. As the monkeys reached out for objects, they positioned their fingers to match the shape of each object before they grasped it. In areas of the brain that process movement and the sense of touch, neurons tracked the posture of the hand during these grasping movements. But during reaching movements, neurons in these structures tracked the speed of the arm, not its posture.

“The subtext here is that the hand is really different than the arm,” said Bensmaia, the James and Karen Frank Family Professor of Organismal Biology and Anatomy. “To move your arm, the brain basically sends your muscles a velocity signal and represents the movements of the arm. But what the brain seems to be telling the hand is to adopt a certain posture and to then track that posture without regard to the movement itself.”
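
That distinction between posture coding and velocity coding is typically assessed with linear encoding models. As a rough illustration only, here is a minimal Python sketch on simulated data (not the study’s actual analysis; every variable here is invented) that fits two regressions to a neuron’s firing rate, one driven by joint angles and one by their rates of change, and compares how well each explains the activity:

```python
# Hypothetical sketch: comparing posture- vs. velocity-based linear
# encoding models for a single simulated neuron. Illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_joints = 2000, 30

# Simulated joint-angle trajectories (posture) and their derivatives (velocity)
posture = np.cumsum(rng.normal(size=(n_samples, n_joints)), axis=0) * 0.01
velocity = np.gradient(posture, axis=0)

# Simulate a "hand-like" neuron whose firing rate depends on posture
weights = rng.normal(size=n_joints)
rate = posture @ weights + rng.normal(scale=0.1, size=n_samples)

# Fit each encoding model and compare cross-validated R^2
r2_posture = cross_val_score(LinearRegression(), posture, rate, cv=5).mean()
r2_velocity = cross_val_score(LinearRegression(), velocity, rate, cv=5).mean()
print(f"posture model R^2:  {r2_posture:.2f}")   # high for this posture-coding cell
print(f"velocity model R^2: {r2_velocity:.2f}")  # near zero (or negative)
```

In this framing, a hand-area neuron would look like the simulated cell above, while an arm-area neuron would be better fit by the velocity model.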

Decoding complex movements of the hand

The human hand and fingers can perform incredibly intricate movements, from holding tools and grasping food to typing and playing the piano. This kind of manual dexterity is a major reason for our success as a species today—and becomes more complicated the closer you look.

The hand and finger joints have about 30 distinct dimensions of movement, meaning it would take 30 different knobs to control a hand, in part because some joints can move in several different ways. According to one school of thought in neuroscience, the brain simplifies this complexity by controlling coordinated movements of multiple joints – so-called synergies – thereby compressing hand movements into about six dimensions. The idea is that, rather than all 30 dimensions, only a small subset of possible hand movements is under volitional control.
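
One common way to quantify these synergies is principal component analysis (PCA) on recorded joint angles. The sketch below is a hedged illustration in Python on simulated data, where the six-dimensional structure is built in by construction; it simply shows how the variance-explained curve makes the synergy argument:

```python
# Illustrative sketch of the "synergy" idea: run PCA on a matrix of
# joint angles and ask how much variance a few components capture.
# Data are simulated; a real analysis would use motion-capture recordings.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_postures, n_joints, n_synergies = 5000, 30, 6

# Simulate postures that mostly live in a 6-dimensional subspace,
# plus a little independent motion in every joint
synergies = rng.normal(size=(n_synergies, n_joints))
activations = rng.normal(size=(n_postures, n_synergies))
angles = activations @ synergies + 0.1 * rng.normal(size=(n_postures, n_joints))

pca = PCA().fit(angles)
explained = np.cumsum(pca.explained_variance_ratio_)
print(f"variance explained by 6 PCs:  {explained[5]:.1%}")   # nearly all of it
print(f"variance explained by 28 PCs: {explained[27]:.1%}")
```

If the brain really restricted the hand to six synergies, the remaining components would be little more than noise; the study described next asks whether they are.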

In another study, published earlier this year in Nature Communications, Yuke Yan, a graduate student in Bensmaia’s lab, tested the complexity of these movements by asking eight human subjects to perform two manual tasks: grasping objects and signing letters in American Sign Language (only the three volunteers familiar with ASL performed the signing task). The researchers attached infrared markers to the joints and along the fingers so that cameras mounted around the room could track the precise movements.

In collaboration with Sara Solla from Northwestern University, the research team developed an algorithm to identify which object was about to be grasped from the hand’s posture just before contact, which it did nearly perfectly. Even after removing 28 of the 30 dimensions of hand posture, the researchers could still identify the objects or ASL letters at above-chance rates. In other words, even the last couple of postural dimensions are under volitional control rather than mere noise: all 30 dimensions have a purpose, with each contributing to the precise hand posture in an object-specific way.
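
A minimal sketch of that style of analysis, assuming a simple linear classifier on simulated postures (the dataset, the classifier, and the choice to project out the top 28 principal components are all illustrative assumptions, not the paper’s exact methods):

```python
# Hedged sketch: classify which object is being grasped from pre-contact
# hand posture, then repeat using only the two lowest-variance posture
# dimensions to see whether they still carry object information.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_objects, n_trials, n_joints = 25, 40, 30

# Each object gets a characteristic posture spread across all 30 dimensions
prototypes = rng.normal(size=(n_objects, n_joints))
labels = np.repeat(np.arange(n_objects), n_trials)
postures = prototypes[labels] + 0.3 * rng.normal(size=(labels.size, n_joints))

clf = LinearDiscriminantAnalysis()
full_acc = cross_val_score(clf, postures, labels, cv=5).mean()

# Project onto all 30 PCs, then keep only the last 2 (lowest variance)
residual = PCA().fit_transform(postures)[:, 28:]
resid_acc = cross_val_score(clf, residual, labels, cv=5).mean()

print(f"accuracy, all 30 dims: {full_acc:.2f}")
print(f"accuracy, last 2 dims: {resid_acc:.2f}  (chance = {1/n_objects:.2f})")
```

Because the simulated prototypes use all 30 dimensions, even the last two principal components separate the objects better than chance, which is the signature the researchers found in the real data.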

To Bensmaia, that means that the brain isn’t taking shortcuts while you’re picking up a coffee mug or sending a text—it’s deliberately controlling the hand in its exquisite complexity.

“Think about vision. You’re constantly dealing with hundreds of dimensions of visual space, effortlessly,” he said. “You have 100 billion neurons in your brain to deal with this kind of complexity, so 30 dimensions of hand movement is not that complex in the end. It’s a rethinking about how the brain controls the hand.”

Reproducing natural feelings of touch and movement

One of the main lines of Bensmaia’s research is to understand the neural mechanisms that help produce the sense of touch—and then to use what his team has learned to restore the sense of touch via bionic hands. Bensmaia is a principal architect of the so-called “biomimetic” approach to building neuroprosthetics, which seeks to reproduce the biological, natural feelings of touch by mimicking the same patterns of activity in the nerve or in the brain that are produced when able-bodied individuals interact with objects.

In a 2019 study published in Science Robotics, he worked with a team from the University of Utah to outfit a man whose left hand had been amputated 13 years earlier with a neuroprosthetic hand that provided sensory feedback through the remaining nerves in his forearm.

The research team showed that when sensory feedback was enabled, the patient could control the precision of his grip well enough to handle fragile objects. He could also distinguish between small and large objects, and between hard and soft ones. Importantly, when that sensory feedback was biomimetic – that is, when it attempted to mimic the patterns of nerve activity produced in an intact arm – the patient identified objects in his grasp significantly faster than when the feedback was delivered at arbitrary intensities. He was also better able to grasp and transport fragile objects with biomimetic artificial touch.

“When you grasp things, you’re not responding to pressure per se, you’re responding to changes in pressure,” Bensmaia said. “Now we’ve built a model of that into this bionic hand. Basically, speaking the natural language of the nervous system improves the utility of these prosthetics.”
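
That idea lends itself to a toy model: drive stimulation with the rate of change of pressure rather than its level alone. The Python sketch below is an assumption-laden illustration of the intuition, not the actual encoder used in the Science Robotics study; the function name, gains, and mapping are all invented:

```python
# Toy "biomimetic" encoder: stimulation tracks changes in pressure,
# so contact onset and release produce strong transients while a
# steady hold produces only a weak sustained signal. Illustrative only.
import numpy as np

def biomimetic_stimulation(pressure, dt=0.001, k_level=0.2, k_change=1.0):
    """Map a fingertip pressure trace to stimulation intensity,
    weighting the rate of change of pressure over its raw level."""
    d_pressure = np.gradient(pressure, dt)
    stim = k_level * pressure + k_change * np.abs(d_pressure)
    return np.clip(stim, 0.0, None)

# Toy pressure trace: grasp an object, hold it, then release
t = np.arange(0.0, 1.0, 0.001)
pressure = np.clip(np.minimum(t - 0.2, 0.6 - t) * 10, 0.0, 1.0)

stim = biomimetic_stimulation(pressure)
# stim peaks during the ramps (contact onset near 0.2-0.3 s, release
# near 0.5-0.6 s) and stays low during the steady hold in between.
```

The dominant k_change term echoes the quote: the signal emphasizes changes in pressure rather than pressure per se, whereas an encoder based on raw pressure alone would drive stimulation hardest during the steady hold.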

While every study seems to peel back another layer of complexity to the brain and nervous system, Bensmaia remains undaunted by the challenge of understanding these sensory processes.

“The field of neuroprosthetics is riddled with a lot of ‘gee-whiz’ results that do not stand the test of time,” he said. “What we do in our lab, and what I’m most proud of, is that we approach these questions using a rigorous scientific approach. We dissect each result to make sure it’s robust using a variety of experimental approaches, so we feel confident about it in the end.”