
What Is Brain-Computer Interface (BCI) Technology?

Brain-computer interface (BCI) technology is currently at the center of many medical studies as a possible tool for controlling prosthetic devices with the mind. BCI works by placing electrodes on the surface of the brain so that neural signals can be used to communicate with the prosthetic, allowing it to move. To calibrate the prosthetic, participants watch animations of arm and hand movements, and imagine making them, to help their brains send signals to the electrodes connected to the prosthetic.

Brain Computer Interface at UPMC

Since 2011, University of Pittsburgh researchers have been studying brain-computer interface (BCI) technology through clinical trials. So far, their research has allowed trial participants to control a prosthetic arm to reach for, grasp, and place objects.

First Trial: High-Five

Tim Hemmes, who was paralyzed in a motorcycle accident, was the first participant to reach out using the robotic arm and hand designed by the Johns Hopkins University Applied Physics Laboratory. An electrocorticography (ECoG) grid the size of a large postage stamp was placed on the surface of Tim’s brain by UPMC neurosurgeon Elizabeth Tyler-Kabara, MD, PhD. Co-principal investigators Michael Boninger, MD, Wei Wang, MD, PhD, and Andrew Schwartz, PhD, watched and recorded data as Tim lifted the arm and gave a high-five. The project was chosen for a Popular Mechanics Breakthrough Award in 2012, and the findings were published in February 2013.

Second Trial: Seven Dimensions

In 2012, Dr. Tyler-Kabara placed two small, square electrode grids on Jan Scheuermann’s brain in the area that controls right arm and hand movement. Co-investigators, including Jennifer Collinger, PhD, Dr. Schwartz, and Dr. Boninger, found that Jan could reach in and out, left and right, and up and down. She was also eventually able to flex the wrist back and forth, move it from side to side, and rotate it clockwise and counterclockwise, as well as grip objects. Together, these movements add up to what is known as 7D control.

With control of all seven dimensions, Jan was able to pick up blocks, tubes of different sizes, a ball, and a stone, and set them down on a nearby tray. She also lifted cones from one base and restacked them on another a foot away. Jan could reach out to give a high-five as well, and she even fed herself chocolate with the robotic arm.

Second Trial, Part Two: Ten Degrees

Building on 7D control, Jan increased her range of movement to 10D in the second part of the trial. She successfully replaced the simple pincer grip with the following movements:

  • Finger abduction — fingers spread out
  • Scoop — last fingers curl in
  • Thumb opposition — thumb moves outward from the palm
  • Pinch of the thumb, index, and middle fingers

These findings were published in the Journal of Neural Engineering in December 2014.

For more information on brain-computer interface technology, please visit the UPMC website.