
Brain/Computer Interface

19 January 2018
Ian Slater discusses the cutting-edge research being undertaken into the creation of a brain/computer interface, specifically for upper-limb control.

I recently had the honour to listen to Ian Burkhart speak at the ACRM conference in Atlanta, GA.


In conjunction with the Ohio State University Wexner Medical Center and the Battelle Memorial Institute, Ian has been engaged in a neural bridging study for some years now. He has a Blackrock Utah Array implanted in his motor cortex. For anyone who wants to see what one of these things looks like, please click here.


He's a brave man!


The signals from his brain are linked in real-time to muscle activation allowing some functional wrist and finger movement. Prior to hearing Ian speak, however, I hadn't fully appreciated all the challenges facing this particular area of research.


Battelle are a non-profit research and innovation company and this particular study is approved by the US FDA. In outline, the system works like this:


1. Chip in brain;

2. Algorithm to decode what Ian is thinking;

3. Re-code his thoughts into electrical impulses.
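The three stages above can be sketched in code. This is purely illustrative (the channel count matches the 96-electrode Utah array, but every function name, weight and current value here is a made-up stand-in, not the study's actual algorithms):

```python
import numpy as np

def decode_intent(neural_features, weights):
    """Stage 2: map a vector of neural features from the implanted array
    to intended muscle activations in the range 0..1."""
    return np.clip(neural_features @ weights, 0.0, 1.0)

def encode_stimulation(activations, max_current_ma=20.0):
    """Stage 3: re-code decoded intent into per-electrode stimulation
    currents for the FES system (max current is an assumed figure)."""
    return activations * max_current_ma

# Toy run: 96 recording channels, 4 target muscle groups.
rng = np.random.default_rng(0)
weights = rng.normal(size=(96, 4))     # a trained decoder would live here
features = rng.normal(size=96)         # stage 1: one window of array data
currents = encode_stimulation(decode_intent(features, weights))
```

The real decoding step is vastly more complex, but the shape of the pipeline (record, decode, re-encode) is the same.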


There is, however, day-to-day variability in brain activity, so the researchers have to almost start afresh each day. Machine learning (or AI) is helping in this respect, but simply reusing the decoding algorithms from one day to the next doesn't work. The algorithms have to be refined so that everything runs in real time and the 'lag' (between thought and movement) is reduced as much as possible. Total latency of the system as currently developed is around 800 milliseconds: impressive…
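A toy illustration of why yesterday's decoder fails today. This uses a simple linear least-squares decoder on simulated data (the study's actual algorithms are far more sophisticated); the point is only that once the neural mapping drifts, stale weights fit poorly and a short recalibration block fixes it:

```python
import numpy as np

def calibrate(features, targets):
    """Refit the decoder from a short block of calibration data,
    as would happen at the start of each session."""
    w, *_ = np.linalg.lstsq(features, targets, rcond=None)
    return w

rng = np.random.default_rng(1)

# Simulate two "days": same task, but the underlying neural mapping drifts.
true_w_day1 = rng.normal(size=(96, 1))
true_w_day2 = true_w_day1 + 0.5 * rng.normal(size=(96, 1))

X = rng.normal(size=(200, 96))   # today's calibration trials
y_today = X @ true_w_day2        # today's observed intent

w_stale = true_w_day1            # yesterday's decoder, reused as-is
w_fresh = calibrate(X, y_today)  # recalibrated this morning

err_stale = np.mean((X @ w_stale - y_today) ** 2)
err_fresh = np.mean((X @ w_fresh - y_today) ** 2)
```

The recalibrated decoder's error is far smaller; machine learning efforts in this area aim to shrink that morning calibration step.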

It seems logical when one thinks about it, but the combination of grip and movement produces 'noise' in the brain. During hand/arm movement there is brain activity for the (in Ian's case) intact shoulder movement as well as the activity which the array is trying to decode for the hand. That means the decoder has to be 'trained' to ignore the noise and focus just on hand grip…
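One simple way such training can work, sketched on simulated data (an assumption about the approach, not a description of Battelle's method): include trials where only the shoulder moves, labelled as zero grip, so the fitted decoder learns to put near-zero weight on the shoulder-driven channels.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy array: channels 0-47 respond to grip, channels 48-95 to shoulder movement.
grip_pattern = rng.normal(size=48)
shoulder_pattern = rng.normal(size=48)

def trial(grip, shoulder):
    """Simulate one 96-channel feature vector plus a little sensor noise."""
    x = np.concatenate([grip * grip_pattern, shoulder * shoulder_pattern])
    return x + 0.05 * rng.normal(size=96)

grip_levels = rng.uniform(0.2, 1.0, 150)
X_grip = np.array([trial(g, rng.uniform(0.0, 1.0)) for g in grip_levels])

# The key step: shoulder-only trials, explicitly labelled as zero grip.
X_noise = np.array([trial(0.0, rng.uniform(0.5, 1.0)) for _ in range(150)])

X = np.vstack([X_grip, X_noise])
y = np.concatenate([grip_levels, np.zeros(150)])

w, *_ = np.linalg.lstsq(X, y, rcond=None)

# A pure shoulder movement now decodes to (near) zero grip.
pred = trial(0.0, 1.0) @ w
```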


There is also a need to develop new FES technology. At the moment Ian has the FES electrodes wrapped around his forearm but development is ongoing in relation to a new system with the sensors in a 'sleeve' combined with a glove. They also want to use accelerometers in the glove to allow the system to determine whether the palm is facing up or down.
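The palm-up/palm-down idea can be illustrated with a few lines: when the hand is roughly still, a glove accelerometer mostly measures gravity, so the sign of one axis tells you which way the palm faces. The axis convention and threshold below are assumptions for the sketch (here z points out of the palm):

```python
def palm_orientation(accel_z, threshold=4.0):
    """Classify palm orientation from the gravity component (m/s^2) on the
    glove accelerometer's z-axis. Assumed convention: z points out of the
    palm, so gravity reads ~ +9.8 when the palm faces up."""
    if accel_z > threshold:
        return "palm up"
    if accel_z < -threshold:
        return "palm down"
    return "uncertain"   # hand tilted or moving; more sensor fusion needed
```

A real system would fuse all three axes (and probably a gyroscope) to cope with movement, but this is the core of how the glove could tell the FES controller which grip pattern to select.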


Without even talking about wireless systems and the extent of computing power required, it is clear we are some way off a breakthrough. Speaking personally, though, without the bravery of pioneers like Ian I think the world would be a poorer place (who can deny a man a bit of Guitar Hero?!)


The nerve bypass: how to move a paralysed hand - YouTube