Published June 2023
Journal Article | Open Access

Decoding and geometry of ten finger movements in human posterior parietal cortex and motor cortex

Abstract

Objective. Enable neural control of individual prosthetic fingers for participants with upper-limb paralysis.

Approach. Two tetraplegic participants were each implanted with a 96-channel array in the left posterior parietal cortex (PPC). One of the participants was additionally implanted with a 96-channel array near the hand knob of the left motor cortex (MC). Across tens of sessions, we recorded neural activity while the participants attempted to move individual fingers of the right hand. Offline, we classified attempted finger movements from neural firing rates using linear discriminant analysis with cross-validation. The participants then used the neural classifier online to control individual fingers of a brain–machine interface (BMI). Finally, we characterized the neural representational geometry during individual finger movements of both hands.

Main results. The two participants achieved 86% and 92% online accuracy during BMI control of the contralateral fingers (chance = 17%). Offline, a linear decoder achieved ten-finger decoding accuracies of 70% and 66% using the participants' respective PPC recordings and 75% using MC recordings (chance = 10%). In MC and in one PPC array, a factorized code linked corresponding finger movements of the contralateral and ipsilateral hands.

Significance. This is the first study to decode both contralateral and ipsilateral finger movements from PPC. Online BMI control of contralateral fingers exceeded that of previous finger BMIs. PPC and MC signals can be used to control individual prosthetic fingers, which may contribute to a hand restoration strategy for people with tetraplegia.
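The offline decoding step named in the abstract (cross-validated linear discriminant analysis on neural firing rates) can be sketched in a few lines. The sketch below is illustrative only: the synthetic data, the feature shapes, and the 5-fold split are assumptions for the example, not details taken from the paper.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_fingers = 300, 96, 10  # 96-channel array, ten finger classes

# Stand-in features: one vector of binned firing rates per trial.
# In a real analysis these would come from the recorded spike data.
firing_rates = rng.normal(size=(n_trials, n_channels))
labels = rng.integers(0, n_fingers, size=n_trials)  # attempted-finger label per trial

clf = LinearDiscriminantAnalysis()
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(clf, firing_rates, labels, cv=cv).mean()
print(f"cross-validated accuracy: {accuracy:.2f} (chance = {1/n_fingers:.2f})")

With random stand-in data the accuracy sits near the 10% chance level; the 66%-75% ten-finger accuracies reported in the abstract reflect structure in the real recordings.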

Additional Information

© 2023 The Author(s). Published by IOP Publishing Ltd. Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 license. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

We thank participant NS and participant JJ for making this research possible. We also thank Kelsie Pejsa and Viktor Scherbatyuk for administrative and technical assistance, and Spencer Kellis for assistance with the robot hand.

This work was supported by an Amazon AI4Science Fellowship, National Eye Institute Awards UG1EY032039 and R01EY015545, the T&C Chen Brain-Machine Interface Center, and the James G. Boswell Foundation.

Data availability statement. The data that support the findings of this study are openly available at the following URL: https://dandiarchive.org/dandiset/000252.
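For readers who want to work with the archived data, here is a minimal sketch of listing the dandiset's files with the DANDI Python client (pip install dandi). The dandiset ID comes from the URL above; treat the exact calls as an assumption to verify against the DANDI documentation.

from dandi.dandiapi import DandiAPIClient

# Connect to the public DANDI archive and enumerate the assets in
# dandiset 000252 (the ID from the data availability statement above).
with DandiAPIClient() as client:
    dandiset = client.get_dandiset("000252")
    for asset in dandiset.get_assets():
        print(asset.path)

Individual files, or the whole dandiset, can then be fetched with the client's download helpers or through the archive's web interface.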

Files

Published - Guan_2023_J._Neural_Eng._20_036020.pdf (2.1 MB)
md5:7afe5bc6489fa5567ed86669284d71ce

Additional details

Created: August 22, 2023
Modified: December 22, 2023