Gert-Jan Oskam, aged 40, broke his neck in a biking accident in 2011.
Researchers used a “digital bridge” to reestablish connections between the brain and legs, enabling Oskam to stand and walk naturally.
“A few months ago, I was able, for the first time after 10 years, to stand up and have a beer with my friends,” said Oskam.
The study, published in Nature, says, “We restored this communication with a digital bridge between the brain and spinal cord that enabled an individual with chronic tetraplegia to stand and walk naturally in community settings.”
Researchers have been working on brain-spine interfaces for several years, including a 2016 project that enabled a paralyzed monkey to move its legs and another that restored sensation to the hand of a man with a spinal cord injury.
This is the most comprehensive project yet, highlighting AI’s ever-growing role in novel medical applications.
Developing the brain-spine interface
The “digital bridge” is a brain-spine interface that reads neuronal activity from the brain, converts it into electric signals, and sends them across the spinal injury to healthy neurons on the other side.
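To make the read-decode-stimulate cycle concrete, here is a purely illustrative sketch of one pass through such a bridge. All names, and the trivial averaging "decoder", are hypothetical stand-ins for this article, not the study's actual method:

```python
# Conceptual sketch of one cycle of a "digital bridge" (hypothetical names):
# ECoG features read from a brain implant are decoded into a movement command,
# which would be relayed as stimulation parameters to electrodes below the injury.
from dataclasses import dataclass

@dataclass
class StimulationCommand:
    joint: str        # target muscle group, e.g. "left_hip"
    amplitude: float  # stimulation intensity, arbitrary units

def decode(ecog_features: list[float]) -> StimulationCommand:
    # Placeholder decoder: averages the features into a single intensity.
    # A real system uses trained machine-learning models instead.
    strength = sum(ecog_features) / len(ecog_features)
    return StimulationCommand(joint="left_hip", amplitude=max(0.0, strength))

def bridge_step(ecog_features: list[float]) -> StimulationCommand:
    # One cycle: read -> decode -> (stimulation side stubbed out here).
    return decode(ecog_features)

cmd = bridge_step([0.2, 0.5, 0.1, 0.4])
```

In the actual system this loop runs continuously, so the participant's ongoing brain activity drives stimulation in real time.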
Most spinal cord injuries don't directly damage neurons; instead, they disrupt the descending pathways that connect the brain to the spine and peripheral nervous system.
The role of AI
To reconnect the brain to the legs, electrodes were implanted over Oskam's brain to record electrocorticographic (ECoG) activity.
Walking uses muscles at the hip, knee, and ankle, so the interface has to attribute brain activity to the correct muscle groups in both the right and left legs.
Previous projects found brain activity difficult to analyze, making it challenging to infer the intended movement behind each signal.
This is where AI helps: the researchers devised a method of filtering and decoding brain activity using machine learning (ML).
Researchers employed algorithms to perform two functions:
- The first model predicts the probability of an intention to move a specific joint.
- The second model predicts the amplitude and direction of that movement.
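The two-model idea can be sketched on synthetic data. This is a hedged illustration only, assuming simple linear models (a logistic "gating" classifier plus a least-squares regressor); the study's actual decoder and its ECoG features are far more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for ECoG features: 200 samples x 8 features (hypothetical).
X = rng.normal(size=(200, 8))
w_true = rng.normal(size=8)
scores = X @ w_true
intent = (scores + rng.normal(scale=0.1, size=200)) > 0   # did the user intend to move?
amplitude = np.where(intent, scores, 0.0)                 # how strongly, when they did

# Model 1 (gating): logistic regression fit by gradient descent,
# predicting the probability of an intention to move the joint.
w = np.zeros(8)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (p - intent) / len(X)

# Model 2 (regression): least-squares fit on intent-positive samples,
# predicting the amplitude/direction of the movement.
Xi = X[intent]
coef, *_ = np.linalg.lstsq(Xi, amplitude[intent], rcond=None)

# Decoding a new sample: stimulate only when the gate says "move".
x_new = X[0]
p_move = 1 / (1 + np.exp(-(x_new @ w)))
command = float(x_new @ coef) if p_move > 0.5 else 0.0
```

Splitting "whether to move" from "how to move" lets the gating model suppress spurious stimulation while the regressor shapes the movement itself.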
Once the interface was fitted, Oskam participated in a training program that required him to read visual cues through an interface.
The program instructed him on what movements to focus on, helping calibrate the AI model to “decode” his thoughts and stimulate the correct muscles.
The researchers conducted a series of tests: six- and ten-minute walks, during which Oskam covered 100 m; standing tests; stair climbing; and walking over rough and uneven terrain. His walking ability improved markedly and continued to improve with each training session.
Additionally, the interface had a lasting effect on Oskam's walking ability even when switched off. After 40 training sessions, he could walk more confidently with his walking aids, highlighting the potential of such devices for long-term rehabilitation.
AI has supported similar applications in MedTech, such as brain-computer interfaces that turn thoughts into speech. These devices could restore speech for those who suffered a brain injury or neurodegenerative disease such as amyotrophic lateral sclerosis (ALS).
In time, these devices will become easier to develop and install and could even work without invasive surgery.