Brain-computer interface (BCI) technology is currently at the center of many medical studies as a possible tool for controlling prosthetic devices with your mind. A BCI works by placing electrodes on the surface of the brain, where they pick up neural signals that are translated into commands that move the prosthetic. To calibrate the prosthetic, participants watch animations of arm and hand movements and imagine performing them, which helps their brains send the right signals to the electrodes connected to the prosthetic.

Brain-Computer Interface at UPMC

Since 2011, University of Pittsburgh researchers have been studying brain-computer interface (BCI) technology through clinical trials. So far, their research has allowed trial participants to control a prosthetic arm to reach for, grasp, and place objects.

First Trial: High-Five

Tim Hemmes, who was paralyzed in a motorcycle accident, was the first participant to reach out using the robotic arm and hand designed by the Johns Hopkins University Applied Physics Laboratory. An electrocorticography (ECoG) grid the size of a large postage stamp was placed on the surface of Tim’s brain by UPMC neurosurgeon Elizabeth Tyler-Kabara, MD, PhD. Co-principal investigators Michael Boninger, MD, Wei Wang, MD, PhD, and Andrew Schwartz, PhD, watched and recorded data as Tim lifted his arm and gave a high-five. The project received a Popular Mechanics Breakthrough Award in 2012, and the findings were published in February 2013.

Second Trial: Seven Dimensions

In 2012, Dr. Tyler-Kabara placed two small, square electrode grids on Jan Scheuermann’s brain in the area that controls right arm and hand movement. Co-investigators, including Jennifer Collinger, PhD, Dr. Schwartz, and Dr. Boninger, found that Jan could reach in and out, left and right, and up and down. Eventually, she was also able to flex the wrist back and forth, move it from side to side, and rotate it clockwise and counterclockwise, as well as grip objects. All of these movements add up to what is known as 7D control.

One Small Nibble, One Giant Bite

With control of all seven dimensions, Jan was able to pick up blocks, tubes of different sizes, a ball, and a stone, and set them down on a nearby tray. She also moved cones from one base to another a foot away to restack them. Like Tim, Jan could reach out to give a high-five, and she even fed herself chocolate with the robotic arm.

Second Trial, Part Two: Ten Degrees

In the second part of the trial, Jan built on her 7D control and expanded her movements to 10D. She successfully replaced the simple pincer grip with the following movements:

  • Finger abduction — fingers spread out
  • Scoop — last fingers curl in
  • Thumb opposition — thumb moves outward from the palm
  • Pinch of the thumb, index, and middle fingers

These findings were published in the Journal of Neural Engineering in December 2014.

For more information on brain-computer interface technology, please visit the UPMC website.

About Physical Medicine and Rehabilitation

At UPMC, we strive to improve your function after injury or illness. We help people recover from functional, pain-related, and neurological conditions, with both inpatient and outpatient care available. We are dedicated to providing you with exceptional clinical care and to developing new technologies and treatments that help you achieve mobility and maintain independence. Find a provider near you.