Current state-of-the-art Brain-Computer Interfaces (BCIs) detect movement imaginations and translate them into real movements by means of a neuroprosthesis. However, they can only detect which limb is the subject of the movement imagination, e.g. the left or the right hand, and use that as a control signal for the neuroprosthesis. For example, if the BCI detects a left hand movement imagination, this can serve as the control signal for the neuroprosthesis to open the right hand. Although this allows some movement functionality to be restored and can already improve the quality of life of persons with a spinal cord injury (SCI), it does not feel natural to the user.
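The detection step described above is typically a classification problem: discriminate left-hand from right-hand imagery and map the detected class to a neuroprosthesis action. A minimal sketch of this idea, using synthetic band-power features and a Fisher linear discriminant (the data, channel choice, and class-to-action mapping are illustrative assumptions, not the system described here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mu/beta band-power features over two motor-cortex channels
# [C3, C4]. Assumption: left-hand imagery suppresses contralateral C4
# power, right-hand imagery suppresses C3 power.
n = 100
left = rng.normal(loc=[1.0, 0.4], scale=0.2, size=(n, 2))
right = rng.normal(loc=[0.4, 1.0], scale=0.2, size=(n, 2))

X = np.vstack([left, right])
y = np.array([0] * n + [1] * n)  # 0 = left imagery, 1 = right imagery

# Fisher's linear discriminant: w = Sigma^-1 (mu1 - mu0)
mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
cov = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
w = np.linalg.solve(cov, mu1 - mu0)
b = -w @ (mu0 + mu1) / 2

pred = (X @ w + b > 0).astype(int)
acc = (pred == y).mean()

# The detected class is then mapped to a fixed neuroprosthesis action,
# e.g. "left imagery" -> "open right hand" (an artificial association
# the user has to learn).
action = {0: "open right hand", 1: "close right hand"}[pred[0]]
print(f"training accuracy: {acc:.2f}")
print(f"trial 0 -> {action}")
```

The fixed class-to-action dictionary at the end is exactly the kind of artificial association that makes current control feel unnatural.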
A first improvement would be to detect movement imaginations of the same limb, such as hand open/close or arm rotation. This would increase the number of control signals and allow for a more natural control of the neuroprosthesis, which is exactly what the MoreGrasp project aims for. Furthermore, the MoreGrasp project will go one step further and also investigate movement direction decoding, i.e., decoding the direction of an imagined arm movement from brain signals. This could take BCIs to a completely new level, because the actual imagined movement is decoded instead of only the brain state associated with movement imagination. The more we know about the imagined movement, the more naturally we can control the neuroprosthesis. Users of a neuroprosthesis would probably no longer need to learn artificial associations between movement imagination and neuroprosthesis movement; instead, they could directly and intuitively imagine the movement they actually want to happen. This allows for natural neuroprosthesis control and furthermore extends its control possibilities.
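Unlike class detection, direction decoding is naturally a regression problem: estimate a continuous direction vector from the measured signals. A minimal sketch under the common assumption of a linear relation between low-frequency EEG amplitudes and the two-dimensional movement direction (the data and the ridge-regression decoder are illustrative assumptions, not the MoreGrasp method):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic low-frequency EEG amplitudes (n_trials x n_channels).
# Assumption: directions arise from an unknown linear channel-to-
# direction map plus noise.
n_trials, n_channels = 300, 8
true_W = rng.normal(size=(n_channels, 2))
X = rng.normal(size=(n_trials, n_channels))
Y = X @ true_W + 0.1 * rng.normal(size=(n_trials, 2))  # (x, y) directions

# Ridge-regularized least squares: W = (X'X + lam*I)^-1 X'Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

Y_hat = X @ W
# Per-axis correlation between true and decoded direction components,
# a common figure of merit for continuous decoders.
corr = [np.corrcoef(Y[:, i], Y_hat[:, i])[0, 1] for i in range(2)]
print("decoding correlations (x, y):", np.round(corr, 2))
```

The decoded vector could drive the neuroprosthesis continuously, which is what makes this approach feel more direct than selecting among a few discrete classes.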