Published December 9, 2022 | Submitted
Report | Open Access

Compositional coding of individual finger movements in human posterior parietal cortex and motor cortex enables ten-finger decoding

Abstract

Objective. Enable neural control of individual prosthetic fingers for participants with upper-limb paralysis.

Approach. Two tetraplegic participants were each implanted with a 96-channel array in the left posterior parietal cortex (PPC). One participant was additionally implanted with a 96-channel array near the hand knob of the left motor cortex (MC). Across tens of sessions, we recorded neural activity while the participants attempted to move individual fingers of the right hand. Offline, we classified attempted finger movements from neural firing rates using linear discriminant analysis (LDA) with cross-validation. The participants then used the neural classifier online to control individual fingers of a brain-machine interface (BMI). Finally, we characterized the neural representational geometry during individual finger movements of both hands.

Main Results. The two participants achieved 86% and 92% online accuracy during BMI control of the contralateral fingers (chance = 17%). Offline, a linear decoder achieved ten-finger decoding accuracies of 70% and 66% from the two participants' respective PPC recordings, and 75% from MC recordings (chance = 10%). A compositional code linked corresponding finger movements of the contralateral and ipsilateral hands.

Significance. This is the first study to decode both contralateral and ipsilateral finger movements from PPC. Online BMI control of contralateral fingers exceeded the performance of previous finger BMIs. PPC and MC signals can be used to control individual prosthetic fingers, which may contribute to a hand-restoration strategy for people with tetraplegia.
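For context, the following is a minimal sketch of the offline decoding step named in the abstract: ten-way classification of attempted finger movements from firing rates using LDA with cross-validation. It assumes Python with scikit-learn and uses synthetic stand-in data; the array size (96 channels) and class count (ten fingers) follow the abstract, but nothing here is the authors' actual pipeline.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_fingers = 500, 96, 10  # 96-channel array, ten fingers

# y: attempted-finger label per trial; X: one firing-rate vector per trial.
y = rng.integers(0, n_fingers, size=n_trials)
X = rng.poisson(lam=5.0, size=(n_trials, n_channels)).astype(float)
X[np.arange(n_trials), y] += 3.0  # inject synthetic finger tuning, one channel per finger

# Cross-validated LDA classification, as in the offline analysis; chance = 1/10.
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=10)
print(f"cross-validated accuracy: {scores.mean():.2f} (chance = {1 / n_fingers:.2f})")

In the online setting described in the abstract, a classifier trained this way would instead be applied to streaming firing-rate bins to drive the prosthetic fingers in real time.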

Additional Information

The copyright holder for this preprint is the author/funder, who has granted medRxiv a license to display the preprint in perpetuity. It is made available under a CC BY-NC-ND 4.0 International license. We thank participant N and participant J for making this research possible. We also thank Kelsie Pejsa and Viktor Scherbatyuk for administrative and technical assistance, and Spencer Kellis for assistance with the robot hand. This study was funded by NIH National Eye Institute grants UG1EY032039 and R01EY015545, the T&C Chen Brain-machine Interface Center (C.G., T.A., K.K., J.G., R.A.A.), an Amazon AI4Science Fellowship (C.G.), and the Boswell Foundation (R.A.A.). The authors have declared no competing interest.

Files

Submitted - 2022.12.07.22283227v1.full.pdf (4.2 MB)
md5:8c0e3302bbcd5ab7c51d8cb6711f7060

Additional details

Created: August 20, 2023
Modified: December 22, 2023