PROSTHETIC technologies have advanced by leaps and bounds in the years since Livingston-based Touch Bionics became the first company to develop an electric prosthetic hand with five independently powered fingers.

From the simple and somewhat crude hooks of yesteryear, the technology has moved on to mind-controlled robotics, but now researchers have developed a prosthetic hand that is able to “see” objects in front of it.

The Newcastle University team – whose findings were published in the Journal of Neural Engineering – fitted the hand with a Logitech webcam so it can “see” objects, together with software that assesses each object and selects an appropriate grasp.

Co-author Kianoush Nazarpour, senior lecturer in Biomedical Engineering, said: “Using computer vision, we have developed a bionic hand which can respond automatically.

“In fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.”

Current prosthetics are controlled through the electrical signals that the brain sends to the muscles of the residual limb, which the device picks up at the skin.

The team says the problem is that these systems do not respond quickly enough, whereas its camera-guided hand is faster and more intuitive to use.

They used neural network software to train the hand to recognise a variety of objects and the sort of grip each requires – distinguishing, for example, between a stick, a TV remote and a mug.

It can also recognise the type of grasp required for objects it has not previously encountered, using the camera to “see” them and automatically picking the most appropriate grasp.
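To make that concrete, here is a minimal, illustrative sketch – not the Newcastle team’s actual code – of how a small convolutional neural network could be trained to map webcam images to a handful of grasp types. The grasp names, image size and training data below are assumptions for illustration only.

```python
# Illustrative sketch: a small CNN that classifies an image into one of
# four grasp types. All names (GRASPS, the fake data) are assumptions
# for illustration, not the published system.
import torch
import torch.nn as nn

GRASPS = ["palmar wrist neutral", "palmar wrist pronated",
          "tripod", "pinch"]  # assumed grasp categories

class GraspNet(nn.Module):
    def __init__(self, n_classes=len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # After two 2x pools, a 64x64 input becomes 32 channels of 16x16.
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):  # x: (batch, 3, 64, 64) webcam crops
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = GraspNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data; a real
# system would iterate over many labelled images of objects.
images = torch.randn(8, 3, 64, 64)            # fake batch of frames
labels = torch.randint(0, len(GRASPS), (8,))  # fake grasp labels
opt.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
opt.step()
print(f"training loss: {loss.item():.3f}")
```

In practice such a network would be trained on many labelled photographs of everyday objects, which is what lets an unseen object with a familiar shape still map to a sensible grasp.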

The new technology has been tried with two amputee volunteers who had previously used split-hook prosthetics.

Now the researchers are looking to offer the hand to patients at Newcastle’s Freeman Hospital to enable them to develop it further – with the ultimate aim of creating a bionic hand that can sense pressure and temperature.

Nazarpour said: “It’s a stepping stone towards our ultimate goal, but importantly, it’s cheap and it can be implemented soon because it doesn’t require new prosthetics – we can just adapt the ones we have.”

The researchers say advanced prosthetic hands can dramatically improve the quality of a user’s life by enabling them to carry out normal, day-to-day activities with ease.

“Current commercial prosthetic hands are typically controlled via myoelectric signals – that is, the electrical activity of muscles recorded from the skin surface of the stump,” they said. “Despite considerable technical advances and improvements in the mechanical features, e.g. size and weight, of the prosthetic hands, the control of these systems is still limited to one or two degrees of freedom.”
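As a simplified illustration of how that myoelectric control typically works – using synthetic data and made-up parameter values, not the signal chain of any particular device – the muscle signal is rectified and smoothed into an “envelope”, which is then compared against a threshold to drive the hand:

```python
# Simplified illustration of one-channel myoelectric control:
# rectify the EMG signal, low-pass filter it into an envelope, and
# command the hand to close while the envelope exceeds a threshold.
# The signal and threshold values here are invented for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                    # sample rate, Hz
t = np.arange(0, 2, 1 / fs)
# Synthetic EMG: a noise burst (muscle contraction) in the middle.
emg = 0.05 * np.random.randn(t.size)
emg[800:1200] += 0.5 * np.random.randn(400)

rectified = np.abs(emg)                      # full-wave rectification
b, a = butter(4, 5 / (fs / 2))               # 5 Hz low-pass filter
envelope = filtfilt(b, a, rectified)

THRESHOLD = 0.1                              # illustrative value
command = np.where(envelope > THRESHOLD, "close", "open")
print("hand commanded to close between samples:",
      np.flatnonzero(command == "close")[[0, -1]])
```

Because each envelope-and-threshold channel yields essentially one command, this style of control is what limits commercial hands to the “one or two degrees of freedom” the researchers describe.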

The team said that, in addition, the process of switching a prosthetic hand into a grip mode appropriate for the target object, such as a pinch, can be cumbersome or can require an ad-hoc solution, such as a mobile application or an “electrocutaneous menu” – evoking a tactile sensation with an electric current passed through the skin.

They added: “We set out to translate advances in deep learning from robotics and computer vision research to the control of hand prostheses.

“Benefiting from the flexibility that a deep learning structure offers, we developed an inexpensive vision-based system suitable for use in artificial hands.

“This solution can identify the appropriate grasp type for objects according to a learned abstract representation of the object rather than the explicitly-measured dimensions ... this approach is conceptually different from object recognition in which object details matter.”
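Put together, the run-time side of such a system might look like the following sketch: grab a frame from the hand’s webcam, classify it into a grasp type, and switch the hand accordingly. The model file and the send_grasp_command interface are hypothetical stand-ins, not the published system.

```python
# Illustrative inference loop: grab a frame from the webcam, classify
# it into a grasp type, and pass the result to the prosthesis
# controller. "graspnet.pt" and send_grasp_command are hypothetical.
import cv2
import torch

GRASPS = ["palmar wrist neutral", "palmar wrist pronated",
          "tripod", "pinch"]  # same assumed categories as above

model = torch.jit.load("graspnet.pt")  # hypothetical trained model file
model.eval()

def send_grasp_command(grasp: str) -> None:
    """Stand-in for the interface to the prosthetic hand."""
    print(f"switching hand to grasp: {grasp}")

cap = cv2.VideoCapture(0)              # the hand's webcam
ok, frame = cap.read()
if ok:
    frame = cv2.resize(frame, (64, 64))
    # OpenCV gives a BGR HxWxC uint8 array; convert to a float CHW tensor.
    x = torch.from_numpy(frame[:, :, ::-1].copy()).permute(2, 0, 1)
    x = x.float().unsqueeze(0) / 255.0
    with torch.no_grad():
        pred = model(x).argmax(dim=1).item()
    send_grasp_command(GRASPS[pred])
cap.release()
```

Note that the classifier outputs a grasp type rather than the object’s identity or dimensions, which matches the researchers’ point that their approach differs conceptually from object recognition.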