This robotic arm can be controlled with the mind: Here's how
23 Jun 2019
For years, scientists have been developing brain-computer interfaces aimed at converting neural signals (brain activity) into understandable information or actions.
The technology is still in the development stage, but a group of researchers has taken a major step toward making it more capable than ever.
They have developed a BCI-based robotic arm that can directly take commands from the brain.
Here's all about it.
BCIs have evolved a lot over the years
BCI
From simply detecting neural signals to actually converting them to useful information, we have come a long way with BCIs.
The developments have led many to believe that the tech could simplify the lives of those who can't speak or move.
In fact, Facebook is already working on a BCI to help people navigate through AR, while Samsung is building one to control TVs.
However, reading brain signals has always been a problem
Issue
Despite achieving promising results, researchers have had major problems while working with BCIs.
Essentially, in order to convert neural signals into meaningful action, researchers first have to detect those signals, which requires either invasive surgical implants or EEG headcaps.
Here's the thing: surgical implants are rarely used because they pose a risk to life, while headcaps don't capture highly accurate data.
Headcaps limit the precision of the resulting actions
Fact
The headcap-based technique is non-invasive, but it captures brain signals with so much noise that the decoded information is unclear and the resulting action is imprecise.
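To make the noise problem concrete, here is a minimal, hypothetical sketch of the kind of band-pass filtering commonly applied to raw EEG before any decoding; it is not the CMU team's actual pipeline, and the channel count, sampling rate, and frequency band are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative assumptions: 64-channel EEG sampled at 250 Hz, 2 seconds of data
fs = 250
raw_eeg = np.random.randn(64, 2 * fs)  # stand-in for a noisy headcap recording

def bandpass(data, low, high, fs, order=4):
    """Zero-phase band-pass filter applied along each EEG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

# Keep the 8-30 Hz band often associated with movement-related brain activity
filtered = bandpass(raw_eeg, 8.0, 30.0, fs)
print(filtered.shape)  # (64, 500)
```

Even after this kind of cleanup, a great deal of noise remains, which is why decoding from headcap signals has historically been imprecise.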
Mind-controlled robotic arm comes as a solution
Solution
To tackle the issue of signal resolution, researchers from Carnegie Mellon University developed a BCI-linked robotic arm that strikes a middle ground.
They employed novel machine learning and sensing techniques to capture neural signals from outside the skull and used them to control the robotic device with the smoothness and precision one would expect from a brain implant.
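For a rough sense of what "decoding" means here, the sketch below trains a simple linear model to map EEG-derived features to a 2-D movement command. This is only a conceptual stand-in, assuming synthetic data and a basic ridge regressor; the CMU system uses its own, more sophisticated machine-learning pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Illustrative assumption: each row is a feature vector extracted from one EEG
# time window; each target is the 2-D velocity the participant intended.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((1000, 128))   # 1000 windows x 128 features
y_train = rng.standard_normal((1000, 2))     # matching (vx, vy) targets

# A simple linear decoder, standing in for the real (unspecified) pipeline.
decoder = Ridge(alpha=1.0).fit(X_train, y_train)

# At run time, each new EEG window is turned into a velocity command
# that could then be sent to a cursor or robotic arm controller.
new_window = rng.standard_normal((1, 128))
vx, vy = decoder.predict(new_window)[0]
print(f"command velocity: ({vx:.3f}, {vy:.3f})")
```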
Robotic arm followed a cursor by reading brain signals
Test
In a recent test, the researchers used their non-invasive BCI on a few participants and asked them to follow a cursor on a computer screen.
As the cursor moved, the participants tracked it, and the BCI read their brain signals and translated them into commands for the robot.
The machine followed the cursor's movement smoothly, pointing at it in real time.
Future of BCI-assisted technologies
Future
Now, the researchers plan to test this machine in clinical settings.
They hope that their system could aid paralyzed or physically disabled people by giving them an advanced way to interact with their environment.
"This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones," lead researcher Bin He concluded.