Surgeons cautiously embrace medical metaverse

At the Future Surgery Show in London, it was clear that surgeons were cautiously embracing a medical metaverse to improve collaboration and medical outcomes. In many ways, the surgical industry has been a leader in embracing cutting-edge technologies like surgical robots, augmented reality, and improved patient modeling. It was equally clear that these pieces are still growing in siloed pockets that are just starting to come together.

“This is the year of robots,” declared Professor Shafi Ahmed, Chief Medical Officer at Medical Realities.

Established medical device leaders like Johnson & Johnson and Medtronic are introducing serious competition for early pioneers like Intuitive Surgical, which demonstrated the first Da Vinci robot in 1997. Medtronic recently secured European approval for its Hugo line of robots, and Johnson & Johnson has been promoting its new Ottava system. Both companies have also partnered with NVIDIA as part of the Clara line of tools for building out the medical omniverse.

In some ways, this feels like the equivalent of Ford and GM finally jumping into the ring with Tesla: a validation that surgical automation is the next big thing, in the same way that assisted-driving electric cars are the future of transportation.

But getting there will require not just better tools, but also considerable effort to transform data workflows and governance. Reliably capturing, organizing, and sharing medical data presents numerous cultural and institutional challenges that are more complicated than building a Google Street View-style map of human bodies.

Incremental progress

Ahmed, who is also a practicing surgeon at Barts Health NHS Trust in the U.K., has been a bit of a pioneer in this field, having introduced the world to the surgical metaverse in 2016, when over 55,000 people watched his 360-degree live surgical broadcast. The rest of the industry is just catching up with him.

A glance around the show floor revealed that most of the cutting-edge advances on display were practical steps toward this aspiration. One vendor touted the clarity that 4K imaging brings to the operating room. Braun showed off a new 3D display that lets surgeons look ahead with better ergonomics during extended operations rather than hunching over a microscope. And Epiqar was highlighting a Zoom-like service for the operating theater with privacy and compliance built in.

In short, these kinds of incremental advances are likely to provide the most immediate value for the bulk of surgeons. One big issue is that surgeons are still sorting out the privacy and compliance questions that come with recording their work to improve the surgical arts. The latest cameras make it easier to capture detailed footage of how a surgeon successfully performed a challenging operation. And surgical tool vendors want to show off how their latest innovation made a big difference.

But the actual data is still a gray area. Hospitals allow surgeons to store and share it as long as it does not contain any identifying information. This gives surgeons some freedom to show off their best moves and learn from their peers, but it also limits how this information can be used to improve patient outcomes in the long run. Daniel Goldberg, CEO of Epiqar, said, “This video is not integrated into the patient record, but it should be.”

The next significant advances in surgical automation could benefit from a deeper integration between surgical video and a patient’s medical record. This could help train AI capabilities that guide surgeons, much in the same way that Tesla’s constantly watching cameras could lead to more capable self-driving features.
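To make the data-governance idea concrete, here is a minimal, hypothetical sketch in Python, with invented field names rather than any vendor's actual API, of the rule described above: direct identifiers are stripped from a video's metadata before it is shared for teaching, while a salted pseudonym keeps open the option of re-linking the footage to the patient record under an approved process.

import hashlib
from dataclasses import dataclass

@dataclass
class VideoMetadata:
    patient_name: str
    patient_id: str          # hospital medical record number
    procedure: str
    surgeon: str
    duration_minutes: int

def pseudonymize(patient_id: str, salt: str) -> str:
    # Derive a stable, non-reversible token from the record number.
    return hashlib.sha256((salt + patient_id).encode()).hexdigest()[:16]

def deidentify(meta: VideoMetadata, salt: str) -> dict:
    # Return only the fields considered safe to share, plus a re-linkable pseudonym.
    return {
        "procedure": meta.procedure,
        "surgeon": meta.surgeon,
        "duration_minutes": meta.duration_minutes,
        "patient_token": pseudonymize(meta.patient_id, salt),
    }

if __name__ == "__main__":
    record = VideoMetadata("Jane Doe", "MRN-00421",
                           "laparoscopic cholecystectomy", "A. Surgeon", 95)
    print(deidentify(record, salt="hospital-secret-salt"))

In this sketch, only the pseudonymous token travels with the shared footage; re-linking it to the full patient record would remain a separate, governed step.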

Waze for surgeons

In the short run, Ahmed believes surgeons are likely to see the most benefits from adding navigation capabilities to the surgical theater. A Google map of the body could help navigate surgical instruments to the right place or alert surgeons when they should consider removing an extra bit of suspect tissue. This could help even expert surgeons work more quickly and accurately, much like Google Maps helps people avoid traffic jams even when they know the route by heart.

But over time, these systems will improve, particularly when they can gather more data, not just video but also data about how the surgical instruments themselves are used. For the most part, existing surgical robots are still directly operated by a human surgeon. They are valuable because they can provide more control than might be possible when manually wielding a scalpel and forceps.

They also capture data about how the surgeons navigate various procedures. This data is already woven into a Da Vinci surgical simulator, which helps surgeons master new skills or practice before cutting someone open.

Down the road, these could support surgeons in more collaborative ways, much like driver assistance features can automatically brake when required. In the immediate future, these systems will play more of a role in augmenting surgeons rather than replacing them.

In the interim, surgeons, hospitals, and medical device makers will need to improve how they capture, manage, label, and use the data. Ahmed observed, “The hype is that AI will change the world. And now in 2022, AI can provide some good value, but it still does not do the data well.”

