Heidari, R.,
Khameneh, E.A.,
Rastaghi, A.,
Dindarloo, M.R.,
Ardakani, M.M.N.,
Ahmadi, M.J.,
Mohammadzadeh, M.,
Riazi-Esfahani, H.,
Lashay, A.,
Motaharifar, M.
Publication Date: 2026
Scientific Reports (ISSN 2045-2322), 16(1)
Deep vitrectomy, a complex ophthalmic procedure, demands precise instrument control and remains challenging to master. This study evaluated the ARASH:ASiST robotic system as a platform for real-time, quantitative assessment of deep vitrectomy in a pre-clinical setting using cadaveric human eyes. The system records force and motion data without disrupting the surgical workflow. Cadaveric eyes were prepared under strict safety protocols, and a mannequin head with a customised eye holder was designed, refined through repeated testing, and approved by surgeons to imitate real-world conditions as closely as possible. Four surgeons with different levels of experience, defined by their years of practice, performed the “drunk walk” manoeuvre while intraoperative force, positional, and temporal data were captured. Despite the inherent limitations of human cadaveric eyes, the aim was to reproduce the drunk walk method as a precise, predefined trajectory for posterior vitreous detachment induction. In total, 14 surgical trials were recorded: 2 from one expert surgeon (right-handed), 4 from two fellows (right-handed), and 8 from one intermediate surgeon, who performed four procedures with each hand. Quantitative metrics, including Normalised Jerk, Force Range, Peak Force Magnitude, and Force RMS, were used to compare individual performances. Group-level comparisons among expert, fellow, and intermediate surgeons were conducted using ANOVA and Kruskal–Wallis tests. Although the differences were not statistically significant, likely owing to the limited sample size, descriptive statistics and effect sizes indicated potential trends in performance metrics across experience levels, highlighting the platform’s feasibility for capturing such variations. During data acquisition, the robotic system’s remote centre of motion was accurately calibrated with laser-based technology, and its encoders and force sensors were precisely calibrated as well. The system’s real-time graphical interface provided immediate feedback and enabled detailed postoperative analysis. Usability testing confirmed its practicality and non-intrusiveness. Overall, these ex vivo results demonstrate the feasibility of the ARASH:ASiST system as an objective platform for the evaluation of deep vitrectomy. These findings support further validation of this prototype in synthetic and clinical environments. © The Author(s) 2025.
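
The abstract names the performance metrics but not their formulas. The following Python sketch shows one common way such kinematic and force metrics could be computed from sampled tool-tip data; the dimensionless-jerk normalisation, sampling rate, and variable names are assumptions for illustration, not the paper's implementation.

# Minimal sketch (not the authors' code): plausible definitions of the reported
# metrics, assuming uniformly sampled 3-axis tool-tip position (m) and force (N).
import numpy as np

def normalised_jerk(position, dt):
    """Dimensionless jerk: squared-jerk integral scaled by duration^5 / path_length^2 (assumed form)."""
    velocity = np.gradient(position, dt, axis=0)
    accel = np.gradient(velocity, dt, axis=0)
    jerk = np.gradient(accel, dt, axis=0)
    duration = dt * (len(position) - 1)
    path_length = np.sum(np.linalg.norm(np.diff(position, axis=0), axis=1))
    jerk_sq_integral = np.sum(jerk**2) * dt
    return np.sqrt(0.5 * jerk_sq_integral * duration**5 / path_length**2)

def force_metrics(force):
    """Force Range, Peak Force Magnitude, and Force RMS from a 3-axis force trace."""
    magnitude = np.linalg.norm(force, axis=1)
    return {
        "force_range": magnitude.max() - magnitude.min(),
        "peak_force": magnitude.max(),
        "force_rms": np.sqrt(np.mean(magnitude**2)),
    }

# Example with synthetic data sampled at 1 kHz (placeholder values only).
dt = 1e-3
t = np.arange(0, 5, dt)
position = np.column_stack([np.sin(t), np.cos(t), 0.1 * t]) * 1e-3   # metres
force = np.column_stack([0.02 * np.sin(3 * t), 0.01 * np.cos(2 * t), 0.03 + 0.005 * t])  # newtons

print("Normalised Jerk:", normalised_jerk(position, dt))
print(force_metrics(force))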
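
Likewise, the group-level comparison described in the abstract can be illustrated with the standard SciPy tests; the grouping and the numeric values below are hypothetical placeholders, not the study's data.

# Sketch of the group-level comparison named in the abstract (ANOVA and
# Kruskal-Wallis). All values are illustrative placeholders.
from scipy import stats

# Per-trial Force RMS values grouped by experience level (hypothetical numbers).
expert = [0.031, 0.029]
fellow = [0.042, 0.038, 0.044, 0.040]
intermediate = [0.055, 0.049, 0.052, 0.057, 0.050, 0.053, 0.048, 0.054]

f_stat, p_anova = stats.f_oneway(expert, fellow, intermediate)    # parametric one-way ANOVA
h_stat, p_kruskal = stats.kruskal(expert, fellow, intermediate)   # non-parametric alternative

print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kruskal:.3f}")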