PURPOSE: The use of motion sensors is emerging as a means of measuring surgical performance. Motion sensors are typically used for calculating performance metrics and assessing skill. The aim of this study was to identify the surgical gestures and tools used during an open surgery suturing simulation based on motion sensor data.

METHODS: Twenty-five participants performed a suturing task on a variable tissue simulator. Electromagnetic motion sensors were used to measure their performance. The current study compares GRU and LSTM networks, which are known to perform well on other kinematic datasets, as well as MS-TCN++, which was developed for video data and was adapted in this work for motion sensor data. Finally, we extended all architectures for multi-tasking.

RESULTS: In the gesture recognition task, MS-TCN++ has the highest performance, with an accuracy of [Formula: see text], F1-Macro of [Formula: see text], edit distance of [Formula: see text], and F1@10 of [Formula: see text]. In the tool usage recognition task for the right hand, MS-TCN++ performs best on most metrics, with an accuracy of [Formula: see text], F1-Macro of [Formula: see text], F1@10 of [Formula: see text], and F1@25 of [Formula: see text]. The multi-task GRU performs best on all metrics in the left-hand case, with an accuracy of [Formula: see text], edit distance of [Formula: see text], F1-Macro of [Formula: see text], F1@10 of [Formula: see text], and F1@25 of [Formula: see text].

CONCLUSION: In this study, using motion sensor data, we automatically identified the surgical gestures and the tools used during an open surgery suturing simulation. Our methods may be used for computing more detailed performance metrics and assisting in automatic workflow analysis. MS-TCN++ performed better in gesture recognition as well as right-hand tool recognition, while the multi-task GRU provided better results in the left-hand case. It should be noted that our multi-task GRU network is significantly smaller and achieved competitive results on the remaining tasks as well.
Using open surgery simulation kinematic data for tool and gesture recognition.
Authors: Goldbraikh Adam, Volk Tomer, Pugh Carla M, Laufer Shlomi
| Journal: | International Journal of Computer Assisted Radiology and Surgery | Impact factor: | 2.300 |
| Year: | 2022 | Issue/pages: | 2022 Jun;17(6):965-979 |
| DOI: | 10.1007/s11548-022-02615-1 | ||
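The abstract compares recurrent baselines (GRU, LSTM) with MS-TCN++ and extends all of them for multi-tasking. As a rough illustration of the multi-task recurrent idea, below is a minimal NumPy sketch: a single GRU cell whose hidden state is shared by two linear heads, one emitting per-frame gesture logits and one emitting per-frame tool-usage logits. The layer sizes, class and parameter names, and untrained random weights are all illustrative assumptions; the paper's actual architecture and training setup are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def init(shape):
    # Small random weights; this is an untrained sketch, not the paper's model.
    return rng.normal(0.0, 0.1, shape)

class MultiTaskGRU:
    """Single-layer GRU trunk shared by two classification heads
    (gesture recognition and tool-usage recognition)."""

    def __init__(self, n_in, n_hidden, n_gestures, n_tools):
        # Standard GRU gate parameters (update gate z, reset gate r, candidate h~).
        self.Wz, self.Uz, self.bz = init((n_hidden, n_in)), init((n_hidden, n_hidden)), np.zeros(n_hidden)
        self.Wr, self.Ur, self.br = init((n_hidden, n_in)), init((n_hidden, n_hidden)), np.zeros(n_hidden)
        self.Wh, self.Uh, self.bh = init((n_hidden, n_in)), init((n_hidden, n_hidden)), np.zeros(n_hidden)
        # Shared trunk, separate linear heads: one per task.
        self.Wg = init((n_gestures, n_hidden))
        self.Wt = init((n_tools, n_hidden))

    def step(self, x, h):
        sig = lambda a: 1.0 / (1.0 + np.exp(-a))
        z = sig(self.Wz @ x + self.Uz @ h + self.bz)            # update gate
        r = sig(self.Wr @ x + self.Ur @ h + self.br)            # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h) + self.bh)
        return (1.0 - z) * h + z * h_tilde                      # new hidden state

    def forward(self, seq):
        """seq: (T, n_in) kinematic samples -> (T, n_gestures), (T, n_tools) logits."""
        h = np.zeros(self.bz.shape[0])
        gesture_logits, tool_logits = [], []
        for x in seq:
            h = self.step(x, h)
            gesture_logits.append(self.Wg @ h)
            tool_logits.append(self.Wt @ h)
        return np.stack(gesture_logits), np.stack(tool_logits)

# Hypothetical dimensions: 12 sensor channels, 32 hidden units,
# 6 gesture classes, 4 tool classes, 100 time steps.
model = MultiTaskGRU(n_in=12, n_hidden=32, n_gestures=6, n_tools=4)
gestures, tools = model.forward(rng.normal(size=(100, 12)))
```

In a multi-task setup like this, both heads would be trained jointly (e.g. by summing per-task cross-entropy losses), which is one way a single small network can cover both recognition tasks at once.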
