Standardising Surgical Movements for Smarter AI
Sat May 16 2026
Recent work has shown that machines can better understand what happens during an operation if they look at the smallest intentional actions, such as how a tool touches tissue. These tiny units are called gestures, and they give a clearer picture than broad labels such as “cut” or “close.” When AI systems analyze these gestures, they can link them to how skilled the surgeon is and even predict patient outcomes.
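To see why gesture-level labels carry more information than phase labels, here is a minimal sketch of what an annotation schema might look like. All names here (the `GestureAnnotation` fields, the gesture strings, the tool names) are illustrative assumptions, not terms from the article or the SAGES taxonomy.

```python
from dataclasses import dataclass

# Hypothetical annotation record: one fine-grained gesture in a video.
@dataclass
class GestureAnnotation:
    gesture: str        # e.g. "needle_pass" (illustrative label)
    start_sec: float    # start time in the operative video
    end_sec: float      # end time in the operative video
    tool: str           # instrument performing the gesture

# A single broad phase label hides what actually happened...
phase_label = "suturing"

# ...while gesture-level labels expose each tool-tissue interaction.
gestures = [
    GestureAnnotation("position_needle", 12.0, 14.5, "needle_driver"),
    GestureAnnotation("needle_pass", 14.5, 17.2, "needle_driver"),
    GestureAnnotation("pull_suture", 17.2, 20.0, "grasper"),
]
```

An AI model trained on the second representation can reason about the order, duration, and smoothness of individual actions, which is what makes skill assessment and outcome prediction possible.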
But the medical community has not agreed on a common language for describing these gestures. Different hospitals and research groups use their own terms, which makes it hard to share data or compare AI models. This lack of standardization stalls progress toward reliable, generalizable surgical AI tools.
A group of experts from the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) tackled this problem. Using a Delphi method, a structured survey technique in which experts vote anonymously over multiple rounds, they gathered opinions from many surgeons and researchers to reach consensus on a unified set of gesture definitions. The outcome is a taxonomy that everyone can use, making datasets compatible and models easier to reproduce.
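The core of a Delphi process is simple arithmetic: each proposed definition is voted on, and items that fall short of an agreement threshold are revised and re-surveyed in the next round. The sketch below illustrates that loop. The 80% threshold and the vote format are common Delphi conventions assumed here for illustration; the article does not specify the panel's actual criteria.

```python
from collections import Counter

CONSENSUS_THRESHOLD = 0.80  # assumed fraction of panellists who must agree

def has_consensus(votes: list[str]) -> tuple[bool, str]:
    """Return whether the most common vote meets the threshold, and what it was."""
    label, count = Counter(votes).most_common(1)[0]
    return count / len(votes) >= CONSENSUS_THRESHOLD, label

# Round 1: experts vote on a proposed gesture definition.
round_1 = ["accept", "accept", "revise", "accept", "reject", "accept"]
print(has_consensus(round_1))  # (False, 'accept') -> revise and re-survey

# Round 2: the revised definition goes back to the panel.
round_2 = ["accept", "accept", "accept", "revise", "accept", "accept"]
print(has_consensus(round_2))  # (True, 'accept') -> definition adopted
```

Iterating until every item clears the threshold is what turns scattered individual opinions into a single taxonomy the whole field can adopt.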
The new standard will help researchers train AI that works across different hospitals, operating rooms, and even surgical specialties. It also creates a foundation for future studies that want to measure how learning new gestures improves surgeon performance.
By agreeing on what counts as a gesture, the surgical AI community can now move from scattered vocabularies to a common framework. This step is essential for turning promising research into real tools that improve safety and outcomes in the operating room.
https://localnews.ai/article/standardising-surgical-movements-for-smarter-ai-c595721c