Computerworld

Ultrasound key to autonomous robotic surgery, researchers say

Ultrasound imagery processed with machine learning could allow robots to slice and stitch without a human at the controls

Surgery carried out with the help of robots is already well established, with machines like the Da Vinci Surgical System being used at a number of hospitals in Australia.

Operations see human surgeons use hand grips to control the robot system – which is kitted out with arms capable of fluid movement, high-definition cameras, X-ray and CT scanners and custom surgical tools – to slice and stitch with incredible precision.

The Da Vinci machines at Macquarie University Hospital, Royal Prince Alfred and Concord Repatriation Campus have already helped hundreds of patients, particularly in prostate cancer operations. They reduce the invasiveness of procedures, meaning faster recovery times for patients.

“For the last 100 prostate cancer operations we have performed across the Royal Prince Alfred and Concord Repatriation Campus, we found that robot surgery meant less blood loss, shorter hospital stays and less opioid usage compared to open surgery,” explained urological and robotic surgeon Associate Professor Ruban Thanigasalam from the University of Sydney last week.

The ultimate ambition for the robots, however, is that one day they will be able to carry out non-invasive procedures without the need for a human surgeon at the controls.

“The next question is whether we can go hands-off so robots operate by themselves,” said cardiothoracic surgeon and director of the University of Sydney’s Hybrid Theatre, Professor Paul Bannon.

Unparalleled situational awareness

That dream is edging ever closer, with QUT’s School of Electrical Engineering and Computer Science busy developing a ‘new class’ of autonomous surgical robots.

In a paper published this week in Medical Image Analysis, researchers reviewed current surgery bots, finding an “apparent lack of volumetric imaging technology that would assign real time situational awareness to medical robots”.

At this stage, fully autonomous systems are “not widespread because they require…very high levels of automation, especially in image processing, which are not yet available” the researchers say.

To function autonomously, robots would need to create a “detailed dynamic map of the surgical site” the researchers add.

Solving that issue would allow for robots with “unparalleled situational awareness”, particularly in situations with a reduced field of view, as encountered in minimally invasive or keyhole surgeries.

The researchers make the case for surgical robots which can ‘see’ inside a patient using real-time ultrasound imagery (the Da Vinci, by comparison, uses a 3D high-definition video feed), processed with advanced machine learning algorithms.

“Tendons and ligaments, despite their many structural similarities, and several other soft tissues can be distinguished in ultrasound images,” said Dr Davide Fontanarosa, a co-author of the paper from QUT’s Faculty of Health.

“Ultrasound imaging is portable, completely harmless to patients and avoids side-effects such as claustrophobia but requires lengthy training to be able to interpret and use it,” he added.

By applying machine learning techniques, the robots could autonomously and instantly identify different tissue types and respond accordingly, paving the way for them to operate without a human at the controls.
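The kind of tissue classification described above can be sketched in miniature. The toy example below labels a simulated ultrasound patch by comparing simple intensity statistics against per-tissue reference values; the tissue names, centroid numbers and thresholds are all invented for illustration and bear no relation to the models or clinical values in the researchers’ paper, which relies on far richer machine learning techniques.

```python
# Illustrative sketch only: a nearest-centroid classifier over simple
# intensity statistics of a simulated ultrasound patch. All reference
# values below are hypothetical, not clinical data.
from statistics import mean, pstdev

# Hypothetical per-tissue reference statistics: (mean echo intensity,
# texture spread). Invented for illustration.
TISSUE_CENTROIDS = {
    "tendon":   (200.0, 30.0),   # bright, fairly uniform echo
    "ligament": (170.0, 25.0),
    "muscle":   (90.0, 40.0),    # darker, more varied echo
}

def features(patch):
    """Reduce a patch (list of pixel intensities 0-255) to (mean, spread)."""
    return (mean(patch), pstdev(patch))

def classify(patch):
    """Return the tissue label whose centroid is nearest in feature space."""
    m, s = features(patch)
    return min(
        TISSUE_CENTROIDS,
        key=lambda t: (TISSUE_CENTROIDS[t][0] - m) ** 2
                      + (TISSUE_CENTROIDS[t][1] - s) ** 2,
    )

# A bright, uniform patch lands nearest the "tendon" centroid in this toy model.
print(classify([195, 205, 210, 190, 200, 198]))
```

A real autonomous system would run a model like this (vastly scaled up) over every frame of the live ultrasound feed, building the “detailed dynamic map of the surgical site” the researchers describe.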

“This imaging modality has real time capabilities, is compatible with operating theatres and is harmless for the patient, all unique characteristics making it an important component in future advancements of robotic systems, possibly fulfilling the requirements for autonomous systems implementation,” the authors wrote.