Imagine a prosthetic arm with the sensory capabilities of a human arm, or a robotic ankle that mimics the healthy ankle’s response to changing activity. Unfortunately, we aren’t quite there yet, but with increasingly efficient machine learning and artificial intelligence, that vision is now closer than ever before.

There are interesting developments in prosthetic technology:

Sensory feedback

Modern bionics currently utilise EMG (electromyography) signals as input for the prosthetic; these signals are digitised and processed to produce user-directed movement. While this is already an impressive capability, significant obstacles remain: EMG signals are noisy and vary between users and over time, which makes precise movement in real time difficult.
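To give a flavour of what “digitised and processed” might involve, here is a minimal, hypothetical sketch in Python: a window of digitised EMG samples is reduced to simple time-domain features and mapped to a movement command. The feature choices, thresholds, and labels are illustrative assumptions, not taken from any real device.

```python
import numpy as np

def extract_features(emg_window: np.ndarray) -> np.ndarray:
    """Reduce a window of digitised EMG samples to simple time-domain features."""
    mav = np.mean(np.abs(emg_window))          # mean absolute value
    rms = np.sqrt(np.mean(emg_window ** 2))    # root mean square
    zero_crossings = np.sum(np.diff(np.sign(emg_window)) != 0)
    return np.array([mav, rms, zero_crossings])

def classify_movement(features: np.ndarray) -> str:
    """Map features to a movement command (thresholds are illustrative only)."""
    mav, rms, _ = features
    if mav < 0.05:
        return "rest"
    return "grip" if rms > 0.3 else "open"

# Simulate 200 ms of EMG sampled at 1 kHz.
rng = np.random.default_rng(0)
window = 0.4 * rng.standard_normal(200)
print(classify_movement(extract_features(window)))
```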

Artificial intelligence and machine learning are revolutionising prosthetic technology in this area. Sensors and other sources within the limb gather information from the outside world, such as a potential hazard or the surface being walked on, enabling the prosthesis to make real-time adjustments and respond to the external environment more intuitively.
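As a rough illustration of this kind of real-time adjustment, the sketch below smoothly retunes a single stiffness parameter as the detected walking surface changes. The surface labels, stiffness values, and smoothing factor are all invented for the example.

```python
# Hypothetical sketch: adapt ankle stiffness to the detected walking surface.
SURFACE_STIFFNESS = {"pavement": 0.8, "grass": 0.6, "gravel": 0.5, "stairs": 1.0}

def update_stiffness(current: float, detected_surface: str, alpha: float = 0.2) -> float:
    """Move stiffness smoothly toward the target for the detected surface."""
    target = SURFACE_STIFFNESS.get(detected_surface, current)
    return current + alpha * (target - current)

stiffness = 0.8
for surface in ["pavement", "grass", "grass", "stairs"]:
    stiffness = update_stiffness(stiffness, surface)
    print(f"{surface}: stiffness -> {stiffness:.2f}")
```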

These devices can learn to mimic natural movements, helping users perform everyday tasks more easily, and can provide sensory feedback, allowing the user to feel sensations such as pressure and temperature and improving their ability to interact with the world around them. ‘Smart skin’ is an example of this type of AI, with potential applications for upper-limb prosthetic devices.

One leading prototype combines touch sensitivity with an onboard learning system that enables the artificial skin to react appropriately to stimuli.

Translating intention 

One high-profile example of this type of technology is the Esper Hand, which has intuitive self-learning technology that can predict the intended movements of the user. The hand uses brain-computer interfaces (BCIs) to interpret the user’s neural signals and translate them into movements. It learns to correlate patterns of nerve signals with specific hand movements, so the more the wearer uses it, the faster and more accurate the hand becomes. The result is a remarkable degree of dexterity.
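Esper’s actual algorithms are proprietary, but the general pattern, a classifier that improves incrementally as more (signal, movement) examples arrive, can be sketched with off-the-shelf tools. In this simulated example the signal features and movement labels are made up; only the online-learning mechanism is the point.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Simulated signal features for three hand movements (illustrative only).
rng = np.random.default_rng(1)
CENTRES = {"open": [0.1, 0.9], "close": [0.9, 0.1], "pinch": [0.9, 0.9]}
LABELS = list(CENTRES)

def sample(label: str) -> np.ndarray:
    return np.array(CENTRES[label]) + 0.05 * rng.standard_normal(2)

clf = SGDClassifier(loss="log_loss")
# Each use of the hand yields one more (features, intended movement) pair,
# and the model is updated incrementally rather than retrained from scratch.
for step in range(300):
    label = LABELS[step % 3]
    clf.partial_fit(sample(label).reshape(1, -1), [label], classes=LABELS)

print(clf.predict(sample("pinch").reshape(1, -1)))  # likely: ['pinch']
```

Because the model is updated one example at a time, it can, in principle, keep adapting on the device itself as the wearer uses it.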

The Utah Bionic Leg builds intent detection into a lower-limb prosthesis. In the Utah Leg’s more powerful AI system, additional sensors gather input from the muscles in the residual limb and correlate those signals to the user’s intent. That extra layer of data supports even more natural, intuitive motion than a standard microprocessor knee.
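Again, the Utah Leg’s internals are not public in this detail; the hypothetical sketch below simply illustrates the idea of an “extra layer of data”, fusing residual-limb muscle features with kinematic readings into one feature vector before inferring intent. The specific features and decision rule are assumptions.

```python
import numpy as np

def fuse_features(emg_features: np.ndarray, kinematics: np.ndarray) -> np.ndarray:
    """Concatenate muscle-signal features with joint-angle/velocity readings."""
    return np.concatenate([emg_features, kinematics])

def classify_intent(features: np.ndarray) -> str:
    """Illustrative rule: high muscle activity plus knee velocity suggests stair ascent."""
    muscle_activity, knee_angle, knee_velocity = features
    if muscle_activity > 0.5 and knee_velocity > 1.0:
        return "stair ascent"
    return "level walking"

emg = np.array([0.7])          # e.g. mean absolute value of residual-limb EMG
kin = np.array([35.0, 1.4])    # e.g. knee angle (deg), angular velocity (rad/s)
print(classify_intent(fuse_features(emg, kin)))  # -> "stair ascent"
```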

Personalisation

Prosthetic devices equipped with AI capability also hold the potential to integrate more seamlessly with the user, adapting as their habits and bodies change. The most pertinent examples of this type of AI technology are smart sockets. These devices are equipped with sensors that detect volume changes in the residual limb over time, then automatically adjust the socket to maintain a secure, comfortable fit.

All use some form of AI that makes them responsive to the user’s tendencies, enabling the socket to “learn” whether they prefer a tighter or looser fit, or to anticipate individual behaviour patterns.
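A hypothetical sketch of that behaviour might look like the following: pressure sensors track fit as limb volume changes, while a learned per-user offset captures whether the wearer tends to prefer a tighter or looser fit. The controller gains, units, and sensor model are all illustrative.

```python
# Hypothetical sketch of a smart-socket controller: pressure sensors track fit,
# and a learned per-user offset nudges the target toward their preference.
class SmartSocket:
    def __init__(self, target_pressure: float = 1.0):
        self.target = target_pressure    # nominal fit (arbitrary units)
        self.preference = 0.0            # learned per-user offset
        self.fit = target_pressure

    def record_user_adjustment(self, delta: float) -> None:
        """Each manual tighten/loosen nudges the learned preference."""
        self.preference += 0.3 * delta

    def update(self, measured_pressure: float) -> float:
        """Adjust toward the preferred target as limb volume changes."""
        error = (self.target + self.preference) - measured_pressure
        self.fit += 0.5 * error
        return self.fit

socket = SmartSocket()
socket.record_user_adjustment(-0.1)   # user prefers a slightly looser fit
for pressure in [1.0, 0.9, 0.85]:     # limb volume shrinking over the day
    print(f"measured {pressure:.2f} -> fit setting {socket.update(pressure):.2f}")
```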

Predictive maintenance

AI is expected to monitor the prosthetic’s performance and predict when maintenance or repairs are needed, reducing downtime and improving reliability. This can help users avoid unexpected breakdowns and ensure that their prosthetic is always functioning at its best.
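One simple way such monitoring could work, sketched below under invented assumptions, is to flag servicing when recent telemetry (here, a simulated motor-current trace) drifts well outside its historical norm. Real systems would use richer data and models.

```python
import numpy as np

def flag_maintenance(motor_current: np.ndarray, window: int = 20, k: float = 3.0) -> bool:
    """Flag servicing when recent readings drift well outside the historical norm."""
    baseline, recent = motor_current[:-window], motor_current[-window:]
    mean, std = baseline.mean(), baseline.std()
    return bool(abs(recent.mean() - mean) > k * std / np.sqrt(window))

rng = np.random.default_rng(2)
healthy = 1.0 + 0.05 * rng.standard_normal(200)                   # normal operation
wearing = np.concatenate([healthy, 1.3 + 0.05 * rng.standard_normal(20)])

print(flag_maintenance(healthy))   # False: behaviour within the norm
print(flag_maintenance(wearing))   # True: drift suggests servicing is due
```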

Among those competing in this space are Ottobock, makers of the bebionic hand, and ReWalk, who produce a powered walking assistance system. Icelandic firm Össur also produces mind-controlled bionic prostheses for lower-limb amputees.

Advancements in artificial intelligence have opened new doors for the field of prosthetics and rehabilitation, enabling more precise, efficient, and personalised care for individuals with limb loss or dysfunction. Despite their potential, smart prosthetics are still a long way from becoming a reality for most people who need them, in large part due to the relatively small user population and high costs. We watch with interest in the meantime.

This blog was co-authored by Associate Jennifer Walsh.