
Brainstorming Ideas

Contents

Brainstorming Ideas

Topology

Sensors and feedback

Interface

Visual

Postures

Documents

Initially needed information

Other Ideas

Grip Lock

A real hand holding an object is controlled by two groups of muscles: the hand muscles control the grip, and the arm muscles move the whole arm. These two groups work independently, so a person can move the arm while holding an object.

A prosthetic hand is controlled by the arm muscles alone, so the wearer cannot hold an object and move the arm at the same time: the arm movement might be interpreted as a hand gesture and the object might be dropped.

The grip lock concept resolves this problem by switching the target of the arm muscle signals.

Thus the process of grabbing and moving an object is:

1. Use the arm muscles to form the grip and grab the object.
2. Engage the grip lock, so the hand keeps its current posture.
3. Move the arm freely; muscle activity no longer changes the grip.
4. Disengage the grip lock and release the object.

Implementing the grip lock functionality requires a dedicated lock/unlock signal, generated either by the arm muscles or by other means.
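A minimal sketch of the gating logic is shown below. The names (`GripLock`, `filter`, `toggle`) and the representation of a grip command as an array of joint targets are illustrative assumptions, not part of the Virtual Prosthetics API; the point is only that, while locked, muscle-derived commands are ignored and the last posture is held.

```ts
// Grip-lock sketch (names are illustrative, not part of any existing API).
// While locked, incoming muscle-derived grip commands are ignored and the
// hand keeps its last commanded posture.

type GripCommand = number[]; // e.g. one target angle per finger joint

class GripLock {
  private locked = false;
  private heldPosture: GripCommand = [];

  // Toggle the lock; called when the dedicated lock/unlock signal is detected.
  toggle(): void {
    this.locked = !this.locked;
  }

  // Pass a command through only when unlocked; otherwise repeat the held posture.
  filter(command: GripCommand): GripCommand {
    if (!this.locked) {
      this.heldPosture = command;
    }
    return this.heldPosture;
  }
}

// Usage: grab an object, lock, move the arm, then unlock and release.
const lock = new GripLock();
console.log(lock.filter([0.8, 0.8, 0.8])); // grip formed from muscle signals
lock.toggle();                             // lock/unlock signal: engage lock
console.log(lock.filter([0.1, 0.9, 0.3])); // arm-movement artefacts are ignored
lock.toggle();                             // disengage lock
console.log(lock.filter([0.0, 0.0, 0.0])); // hand opens, object released
```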

Grasp taxonomy

There is a taxonomy of grasps that describes many more gestures. It would be nice if all of these gestures were also supported. Currently some of them are impossible for the virtual hand, because the thumb needs an additional rotation axis at its base to allow abducted and adducted positions.
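To make the kinematic requirement concrete, here is a hedged sketch of a thumb base described with the extra abduction/adduction axis. The axis names, angle limits and posture values are illustrative placeholders, not the actual Virtual Prosthetics data model.

```ts
// Hypothetical joint description for a thumb with an extra axis at its base.
// Without the abduction/adduction axis, grasps that need an abducted or
// adducted thumb cannot be reached.

interface RotationAxis {
  name: string;
  minDeg: number; // joint limit in degrees (illustrative values)
  maxDeg: number;
}

const thumbBaseAxes: RotationAxis[] = [
  { name: 'flexion/extension',   minDeg: -10, maxDeg: 60 }, // existing axis
  { name: 'abduction/adduction', minDeg: -30, maxDeg: 60 }, // the missing axis
];

// Two illustrative postures expressed as angles around the base axes.
const adductedThumb = { 'flexion/extension': 20, 'abduction/adduction': -20 };
const abductedThumb = { 'flexion/extension': 20, 'abduction/adduction': 50 };

console.log(thumbBaseAxes, adductedThumb, abductedThumb);
```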

A few pointers:

Shoulder control

Most smartphones have sufficiently precise motion sensing via their built-in gyroscopes. A smartphone attached to the shoulder can be used as a motion sensor for transradial, transhumeral and shoulder disarticulation amputees. For forequarter amputees the smartphone can be attached to the other arm.

This idea is based on the observation that the shoulder has at least two degrees of freedom and that people have better control over its motions (compared to capturing signals via surface electromyography). Additionally, shoulder control is more consistent across different people, and it allows more complex motion patterns, comparable to writing simple characters.

Another advantage of this approach is the availability of the technology: it is assumed that most people already have smartphones, so there is no need to purchase new hardware. Attachment to the shoulder could be via a Velcro surface, a shoulder belt or just a pocket on the top or front of the shoulder.

Finally, the smartphone CPU can be used to process the shoulder motion and to control the prosthetic hand.
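As a rough sketch of the sensing side, a web page running on the phone could read the standard browser `DeviceOrientationEvent` and map two shoulder degrees of freedom to two control channels. The `sendToHand` function and the angle ranges are placeholders; how the commands actually reach the prosthesis (Bluetooth, Wi-Fi, serial) is not specified here.

```ts
// Read the phone's orientation in a web page and map two shoulder degrees of
// freedom to two control channels. `sendToHand` is a hypothetical placeholder
// for whatever link carries commands to the prosthetic hand.

function sendToHand(channelA: number, channelB: number): void {
  console.log(`channel A: ${channelA.toFixed(2)}, channel B: ${channelB.toFixed(2)}`);
}

// Normalise a value from [minDeg, maxDeg] into [0, 1], clamped at the ends.
function normalize(deg: number, minDeg: number, maxDeg: number): number {
  return Math.min(1, Math.max(0, (deg - minDeg) / (maxDeg - minDeg)));
}

window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
  // beta = front/back tilt, gamma = left/right tilt (degrees); the exact
  // mapping to shoulder motions depends on how the phone is worn.
  const pitch = event.beta ?? 0;
  const roll = event.gamma ?? 0;

  // Assumed working ranges of shoulder motion; these would be calibrated per user.
  sendToHand(normalize(pitch, -20, 40), normalize(roll, -30, 30));
});
```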

References

Grasp reference

A Sketchfab model of human hands making the sign language gestures for all letters and digits is available as Male Hands Alphabet Numbers. It can be used as a reference to verify whether all these gestures can be made with the virtual hand.

Additionally, Japanese Sign Language has a larger set of fingerspelling gestures, shown in Practice JSL Fingerspelling or in the Japanese manual syllabary.
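The verification itself could be as simple as checking each reference posture against the virtual hand's joint limits. The sketch below assumes postures and limits are keyed by joint name; all names and numeric limits are made-up placeholders.

```ts
// Feasibility check sketch: a gesture is reproducible if every joint angle it
// needs lies within the virtual hand's joint limits. All joint names and
// limits here are illustrative placeholders.

type Posture = Record<string, number>;          // joint name -> angle in degrees
type Limits = Record<string, [number, number]>; // joint name -> [min, max]

function unreachableJoints(posture: Posture, limits: Limits): string[] {
  // Returns the joints that violate their limits (empty list = reachable).
  return Object.entries(posture)
    .filter(([joint, angle]) => {
      const range = limits[joint];
      return !range || angle < range[0] || angle > range[1];
    })
    .map(([joint]) => joint);
}

// Example: a letter that needs an abducted thumb fails if that axis is missing.
const handLimits: Limits = { 'thumb.flexion': [-10, 60], 'index.flexion': [0, 90] };
const letterPosture: Posture = { 'thumb.abduction': 45, 'index.flexion': 80 };

console.log(unreachableJoints(letterPosture, handLimits)); // -> ['thumb.abduction']
```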

Detailed hand

There is a detailed Sketchfab 3D model of a robotic/prosthetic hand, jointed hands BJD. Its license is CC-BY, so it is possible to download it and test whether Virtual Prosthetics can use this model.
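A first test could be to load a downloaded copy of the model into a bare three.js scene and inspect its named nodes. This sketch assumes the model has been exported or converted to glTF and uses a placeholder file path; wiring it into the Virtual Prosthetics API would be a separate step.

```ts
// Load a downloaded copy of the model into a bare three.js scene to see
// whether its meshes and jointed structure come through intact.
// Assumes a glTF export; the file path is a placeholder.

import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
scene.add(new THREE.AmbientLight(0xffffff, 1));

const loader = new GLTFLoader();
loader.load(
  'models/jointed-hands-bjd.glb', // placeholder path to the downloaded model
  (gltf) => {
    scene.add(gltf.scene);
    // List the named nodes; jointed models expose their articulation here.
    gltf.scene.traverse((node) => console.log(node.name, node.type));
  },
  undefined,
  (error) => console.error('Failed to load model:', error),
);
```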