Published March 2023 | Accepted Version
Journal Article | Open Access

Capturing fine-grained details for video-based automation of suturing skills assessment

Abstract

Objectives: Manually collected suturing technical skill scores are strong predictors of continence recovery after robotic radical prostatectomy. Herein, we automate suturing technical skill scoring through computer vision (CV) methods as a scalable way to provide feedback.

Methods: Twenty-two surgeons completed a suturing exercise three times on the Mimic™ Flex VR simulator. Instrument kinematic data (XYZ coordinates and pose of each instrument) were captured at 30 Hz. After standardized training, three human raters manually segmented the suturing task videos into four sub-stitch phases (Needle handling, Needle targeting, Needle driving, Needle withdrawal) and labeled the corresponding technical skill domains (Needle positioning, Needle entry, Needle driving, and Needle withdrawal). The CV framework extracted RGB features and optical flow frames using a pre-trained AlexNet. Additional CV strategies, including auxiliary supervision (using kinematic data during training only) and attention mechanisms, were implemented to improve performance.

Results: This study included data from 15 expert surgeons (median caseload 300 [IQR 165–750]) and 7 training surgeons (median caseload 0 [IQR 0–8]). In all, 226 virtual sutures were captured. Automated assessment of Needle positioning performed best with the simplest approach (1 s video; AUC 0.749). The remaining skill domains improved when auxiliary supervision and attention mechanisms were deployed separately (AUC 0.604–0.794). Combining all techniques produced the best performance, particularly for Needle driving and Needle withdrawal (AUC 0.959 and 0.879, respectively).

Conclusions: This study demonstrated the best performance of automated suturing technical skill assessment to date using advanced CV techniques. Future work will determine whether a "human in the loop" is necessary to verify surgeon evaluations.
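The abstract does not specify the form of the attention mechanism used to pool per-frame video features. As an illustrative sketch only (the function names, feature dimensions, and learned weight vector below are assumptions, not the authors' implementation), a simple temporal attention layer can collapse per-frame CNN features, e.g. 30 frames for a 1 s clip at 30 Hz, into a single clip descriptor for skill classification:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(frame_features, w):
    """Collapse per-frame features (T, D) into one clip descriptor (D,).

    Each frame is scored against a learned weight vector w (D,); the
    softmax of the scores gives attention weights, and the descriptor is
    the attention-weighted average of the frame features.
    """
    scores = frame_features @ w      # (T,) one relevance score per frame
    alpha = softmax(scores)          # attention weights, sum to 1
    return alpha @ frame_features    # (D,) weighted mean over time

# Toy example: 30 frames (1 s at 30 Hz) of 8-dim features.
# In the paper's setting these would come from a pre-trained AlexNet.
rng = np.random.default_rng(0)
feats = rng.normal(size=(30, 8))
w = rng.normal(size=8)               # hypothetical learned weights
clip = attention_pool(feats, w)      # single descriptor for the clip
```

In a full pipeline this pooled descriptor would feed a classifier for each skill domain; during training, auxiliary supervision could add a second head that regresses the kinematic signals so the shared features also encode instrument motion.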

Additional Information

© 2023 Springer Nature.

Acknowledgments: We thank Daniel Sanford, Balint Der, Ryan Hakim, Runzhuo Ma, and Taseen Haque for data collection and grading of technical skill scores through video review. Mimic Technologies, Inc. provided access to the raw kinematic instrument data for each exercise. Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, California.

Funding: Research reported in this publication was supported in part by the National Cancer Institute under Award No. R01CA251579-01A1.

Ethics approval: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Our study complied with protocols put forth by the University of Southern California's IRB. Informed consent was obtained from all individual participants included in the study.

Statements and Declarations: Andrew J. Hung has financial disclosures with Intuitive Surgical, Inc.

Attached Files

Accepted Version - nihms-1870690.pdf

Files

nihms-1870690.pdf (876.4 kB)
md5:ad33b941efc37c31a28f53b8aada00bd

Additional details

Created: August 22, 2023
Modified: October 24, 2023