by Jill Rosen - A robot trained on videos of surgeries performed a lengthy phase of a gallbladder removal without human help. The robot operated for the first time on a lifelike patient, and during the operation, responded to and learned from voice commands from the team—like a novice surgeon working with a mentor.
The robot performed unflappably across trials, matching the expertise of a skilled human surgeon even during unexpected scenarios typical of real-life medical emergencies.
The federally funded work, led by Johns Hopkins University researchers, is a transformative advancement in surgical robotics, where robots can perform with both mechanical precision and human-like adaptability and understanding.
"This advancement moves us from robots that can execute specific surgical tasks to robots that truly understand surgical procedures," said medical roboticist Axel Krieger. "This is a critical distinction that brings us significantly closer to clinically viable autonomous surgical systems that can work in the messy, unpredictable reality of actual patient care."
The findings are published today in Science Robotics.
In 2022, Krieger's Smart Tissue Autonomous Robot, STAR, performed the first autonomous robotic surgery on a live animal—a laparoscopic surgery on a pig. But that robot required specially marked tissue, operated in a highly controlled environment, and followed a rigid, predetermined surgical plan. Krieger said it was like teaching a robot to drive along a carefully mapped route.
But his new system, he says, "is like teaching a robot to navigate any road, in any condition, responding intelligently to whatever it encounters."
The hierarchical surgical robot transformer, SRT-H, truly performs surgery, adapting to individual anatomical features in real time, making decisions on the fly, and self-correcting when things don't go as expected.
Built with the same machine learning architecture that powers ChatGPT, SRT-H is also interactive, able to respond to spoken commands ("grab the gallbladder head") and corrections ("move the left arm a bit to the left"). The robot learns from this feedback.
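The article's description of SRT-H suggests a two-level design: a high-level policy that issues natural-language subtasks, and a low-level policy that turns each subtask into motion, with spoken corrections able to override a planned step. The following is a minimal Python sketch of that control loop under stated assumptions; every class, method, and string here is an illustrative placeholder, not the authors' actual implementation.

```python
# Hypothetical sketch of a hierarchical, language-conditioned control loop
# in the spirit of SRT-H. Names and structure are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class HighLevelPolicy:
    """Issues the next natural-language subtask (e.g. "grab the gallbladder head")."""
    plan: list
    step: int = 0

    def next_instruction(self, correction=None):
        # A spoken correction takes priority over the planned subtask,
        # so the planned step is retried on the next iteration.
        if correction is not None:
            return correction
        instruction = self.plan[self.step % len(self.plan)]
        self.step += 1
        return instruction


class LowLevelPolicy:
    """Maps a language instruction to a (stubbed) motion command."""
    def act(self, instruction: str) -> str:
        return f"execute trajectory for: {instruction}"


def surgical_loop(plan, corrections):
    """Run one control step per entry in `corrections` (None = no correction)."""
    high = HighLevelPolicy(plan)
    low = LowLevelPolicy()
    return [low.act(high.next_instruction(c)) for c in corrections]
```

For example, with a two-step plan and one mid-procedure correction, the loop executes the first planned subtask, then the correction, then resumes the plan:

```python
log = surgical_loop(
    plan=["clip the duct", "cut the duct"],
    corrections=[None, "move the left arm a bit to the left", None],
)
# log[1] carries the correction; log[2] resumes with "cut the duct".
```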
"This work represents a major leap from prior efforts because it tackles some of the fundamental barriers to deploying autonomous surgical robots in the real world," said lead author Ji Woong "Brian" Kim, a former postdoctoral researcher at Johns Hopkins who's now with Stanford University. "Our work shows that AI models can be made reliable enough for surgical autonomy—something that once felt far-off but is now demonstrably viable."
Continue reading at hub.jhu.edu.