Journal Article | Open Access | Published August 16, 2021

Dimensionality reduction for classification of object weight from electromyography

Abstract

Electromyography (EMG) is a simple, non-invasive, and cost-effective technology for measuring muscle activity. However, multi-muscle EMG is also a noisy, complex, and high-dimensional signal. It has nevertheless been widely used in a host of human-machine-interface applications (electric wheelchairs, virtual computer mice, prostheses, robotic fingers, etc.) and, in particular, to measure the reach-and-grasp motions of the human hand. Here, we developed an automated pipeline to predict object weight in a reach-grasp-lift task from an open dataset, relying only on EMG data. In doing so, we shifted the focus from manual feature engineering to automated feature extraction by using pre-processed EMG signals and thus letting the algorithms select the features. We further compared intrinsic EMG features derived from several dimensionality-reduction methods and then ran several classification algorithms on these low-dimensional representations. We found that the Laplacian Eigenmap algorithm generally outperformed other dimensionality-reduction methods. What is more, optimal classification accuracy was achieved using a combination of Laplacian Eigenmaps and k-Nearest Neighbors (88% F1 score for 3-way classification). Our results, using EMG alone, are comparable to those reported in the literature by researchers who used EMG and EEG together. A running-window analysis further suggests that our method captures information in the EMG signal quickly and remains stable throughout the time that subjects grasp and move the object.
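The pipeline the abstract describes — embed pre-processed signals with Laplacian Eigenmaps, then classify the low-dimensional representation with k-Nearest Neighbors — can be sketched as follows. This is a minimal illustration using scikit-learn's SpectralEmbedding (its Laplacian Eigenmaps implementation) on synthetic data standing in for multi-channel EMG features; it is not the authors' code, and a real pipeline would load the WAY_EEG_GAL recordings and tune the hyperparameters (embedding dimension, neighborhood size, k) by cross-validation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.manifold import SpectralEmbedding
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for pre-processed, high-dimensional EMG features:
# 300 trials, 50 features, 3 weight classes (light / medium / heavy).
X, y = make_classification(n_samples=300, n_features=50, n_informative=10,
                           n_classes=3, random_state=0)

# Laplacian Eigenmaps: nonlinear embedding into a few intrinsic dimensions.
# (SpectralEmbedding has no transform() for unseen samples, so the whole
# set is embedded at once here -- a simplification of a deployed pipeline.)
emb = SpectralEmbedding(n_components=5, n_neighbors=10, random_state=0)
X_low = emb.fit_transform(X)

# 3-way classification with kNN on the low-dimensional representation,
# scored with macro-averaged F1 as in the abstract.
knn = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(knn, X_low, y, cv=5, scoring="f1_macro")
print("mean macro-F1: %.2f" % scores.mean())
```

The same scaffold allows swapping in other dimensionality-reduction methods (e.g. PCA or Isomap) and classifiers to reproduce the kind of comparison the paper reports.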

Additional Information

© 2021 Lashgari, Maoz. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Received: February 26, 2020; Accepted: July 27, 2021; Published: August 16, 2021.

This publication was made possible in part through the support of an Investigator Sponsored Research grant from the Boston Scientific Corporation, as well as through the support of the John Templeton Foundation and the Fetzer Institute. The opinions expressed in this publication are those of the author(s) and do not necessarily reflect the views of the John Templeton Foundation or the Fetzer Institute.

Data Availability Statement: The WAY_EEG_GAL dataset is freely available and has become somewhat of a benchmark for testing techniques to decode sensation, intention, and action from surface EMG and scalp EEG in humans performing a reach-and-grasp task (https://doi.org/10.6084/m9.figshare.c.988376). We know of no conflicts of interest associated with this publication.

Author Contributions: Conceptualization: Uri Maoz. Formal analysis: Elnaz Lashgari. Methodology: Elnaz Lashgari. Supervision: Uri Maoz. Validation: Elnaz Lashgari, Uri Maoz. Visualization: Elnaz Lashgari. Writing – original draft: Elnaz Lashgari. Writing – review & editing: Elnaz Lashgari, Uri Maoz.

Attached Files

Published - journal.pone.0255926.pdf

Submitted - 2021.03.26.437230v1.full.pdf

Supplemental Material - journal.pone.0255926.s001.docx


Additional details

Created: August 20, 2023
Modified: October 20, 2023