Quantification of finger grasps during activities of daily life using convolutional neural networks: A pilot study
Quantifying finger kinematics can improve our understanding of finger function, facilitate the design of efficient prosthetic devices, help identify movement disorders, and support the assessment of rehabilitation interventions. This pilot study quantifies the grasps described in grasp taxonomies during selected Activities of Daily Living (ADL). A single participant held a series of standard objects using specific grasps, and the resulting kinematic data were used to train a Convolutional Neural Network (CNN) for each of the four fingers individually. The experiment also recorded hand manipulation of objects during ADL. Each set of ADL finger kinematic data was then classified by the trained CNNs, which identified and quantified the grasps required to accomplish each task. Certain grasps appeared more often depending on the finger studied, suggesting that, despite their physiological interdependencies, the fingers retain a degree of autonomy in performing dexterous tasks. The most frequently identified grasps agreed with previously reported findings, but the results also highlight that an individual may have specific dexterity needs that vary with profession and age. The proposed method can be used to identify and quantify the key grasps for finger/hand prostheses, enabling solutions that are more efficient and practical for users' day-to-day tasks.
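To make the pipeline concrete, below is a minimal sketch of a per-finger grasp classifier in PyTorch. The abstract does not specify the authors' architecture, so the network shape, window length, joint-angle channel count, and number of grasp classes here are all assumptions chosen for illustration; the final step shows how classified ADL windows could be tallied to quantify grasp frequency, mirroring the study's counting step.

```python
# Hypothetical sketch, not the authors' published model: a 1-D CNN that maps a
# window of finger joint-angle samples to one of the taxonomy grasps, with one
# such model trained per finger. All sizes below are illustrative assumptions.
import torch
import torch.nn as nn

N_JOINT_ANGLES = 3   # e.g. MCP, PIP, DIP flexion angles for one finger (assumed)
WINDOW_LEN = 100     # kinematic samples per classification window (assumed)
N_GRASPS = 6         # number of taxonomy grasps considered (assumed)

class FingerGraspCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_JOINT_ANGLES, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),          # WINDOW_LEN -> WINDOW_LEN / 2
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),          # -> WINDOW_LEN / 4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (WINDOW_LEN // 4), N_GRASPS),
        )

    def forward(self, x):             # x: (batch, N_JOINT_ANGLES, WINDOW_LEN)
        return self.classifier(self.features(x))

# Classify a batch of ADL windows (dummy data here) and count how often each
# grasp is predicted, i.e. quantify grasp usage for this finger.
model = FingerGraspCNN()
adl_windows = torch.randn(8, N_JOINT_ANGLES, WINDOW_LEN)
predicted_grasps = model(adl_windows).argmax(dim=1)
grasp_counts = torch.bincount(predicted_grasps, minlength=N_GRASPS)
print(grasp_counts)                   # frequency of each grasp across the ADL data
```

In practice the model would first be trained on the grasp-labeled object-holding recordings (e.g. with `nn.CrossEntropyLoss` and `torch.optim.Adam`) before being applied to the ADL windows.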
Funding
- A sensorimotor PROsthesis for the upper LIMB (PROLIMB): Engineering and Physical Sciences Research Council
- DTP 2016-2017 University of Warwick: Engineering and Physical Sciences Research Council
Author affiliation
College of Science & Engineering, Engineering

Version
- VoR (Version of Record)