Bionic hand behaves like a real one
Led by biomedical engineers at Newcastle University and funded by the Engineering and Physical Sciences Research Council (EPSRC), the bionic hand is fitted with a camera which instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand.
Bypassing the usual processes which require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand ‘sees’ and reacts in one fluid movement.
A small number of amputees have already trialled the new technology and now the Newcastle University team are working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the ‘hands with eyes’ to patients at Newcastle’s Freeman Hospital.
Publishing their findings today in the Journal of Neural Engineering, co-author on the study Dr Kianoush Nazarpour, a Senior Lecturer in Biomedical Engineering at Newcastle University, explains:
“Using computer vision, we have developed a bionic hand which can respond automatically — in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.

“Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg, so prosthetics seem slow and cumbersome in comparison. Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking.”

“We would show the computer a picture of, for example, a stick,” explains Miss Ghazaei, who carried out the work as part of her PhD in the School of Electrical and Electronic Engineering at Newcastle University. “But not just one picture: many images of the same stick from different angles and orientations, even in different light and against different backgrounds, and eventually the computer learns what grasp it needs to pick that stick up.

“So the computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up. It is this which enables it to accurately assess and pick up an object which it has never seen before — a huge step forward in the development of bionic limbs.”
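The idea Miss Ghazaei describes — grouping objects by the grasp they require rather than by what they are — can be sketched in a deliberately simplified form. The actual system uses a deep network trained on many photographs; the toy below substitutes a nearest-centroid classifier over two invented shape features (elongation and size), purely to illustrate the grouping principle. All feature values and object descriptions here are hypothetical.

```python
import numpy as np

# Simplified stand-in for the grasp classifier described in the article.
# Real system: a deep network over camera images. Here: nearest-centroid
# over two hand-picked features, (elongation, size). All numbers invented.

GRASPS = ["palm wrist neutral", "palm wrist pronated", "tripod", "pinch"]

# Toy training data: (elongation, size) feature pairs with grasp labels.
TRAIN = [
    ((1.1, 0.8), "palm wrist neutral"),   # cup-like: roughly round, large
    ((3.0, 0.6), "palm wrist pronated"),  # remote-like: elongated, flat
    ((1.0, 0.3), "tripod"),               # ball-like: round, small
    ((2.5, 0.1), "pinch"),                # pen-like: thin, very small
]

def centroids(train):
    """Mean feature vector for each grasp class."""
    out = {}
    for g in GRASPS:
        feats = [np.array(f) for f, lbl in train if lbl == g]
        out[g] = np.mean(feats, axis=0)
    return out

def predict(features, cents):
    """Assign the grasp whose class centroid is nearest in feature space."""
    f = np.array(features)
    return min(cents, key=lambda g: np.linalg.norm(f - cents[g]))

cents = centroids(TRAIN)
# A new, never-seen object close to the cup-like region of feature space:
print(predict((1.05, 0.75), cents))  # prints "palm wrist neutral"
```

The point mirrored from the quote is that the label space is grasps, not object identities: a never-before-seen object still lands near some grasp centroid, so the hand can act on it.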
Grouping objects by size, shape and orientation, according to the type of grasp that would be needed to pick them up, the team programmed the hand to perform four different ‘grasps’: palm wrist neutral (such as when you pick up a cup); palm wrist pronated (such as picking up the TV remote); tripod (thumb and two fingers); and pinch (thumb and first finger).
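The four programmed grasps form a small, fixed vocabulary, which can be written down as a simple enumeration. The class and example mappings below are only an illustration built from the examples in the article, not the team's actual software interface.

```python
from enum import Enum

class Grasp(Enum):
    """The four grasp types the hand was programmed to perform."""
    PALM_WRIST_NEUTRAL = "palm wrist neutral"    # e.g. picking up a cup
    PALM_WRIST_PRONATED = "palm wrist pronated"  # e.g. picking up a TV remote
    TRIPOD = "tripod"                            # thumb and two fingers
    PINCH = "pinch"                              # thumb and first finger

# Illustrative associations taken from the article's own examples.
EXAMPLES = {
    "cup": Grasp.PALM_WRIST_NEUTRAL,
    "TV remote": Grasp.PALM_WRIST_PRONATED,
}

print(EXAMPLES["cup"].value)  # prints "palm wrist neutral"
```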