A University of Oklahoma researcher and a surgeon at OU Health in Oklahoma City shared a vision for surgery: use AI to visualize 3D computed tomography (CT) data during an operation. The mission was to improve every surgery.
"Compared to a pilot flying an airplane, or even an ordinary Google Maps user on his way to work, surgeons have their instruments hanging on the wall behind them," said Mohammad Abdul Mukit, an MS student in electrical and computer engineering at the University of Oklahoma and a graduate researcher. His research focuses on computer vision, augmented reality (AR), and artificial intelligence (AI) applications in medical surgery.
Surgeons today must engage in rigorous surgical planning, memorize the specifics of each unique case, and know every step needed to ensure the safest possible surgery. They then carry out complex procedures lasting several hours, with no guidance or positioning devices and no head-mounted displays to assist them.
"They have to feel their way to their goal and hope that everything goes as planned," Mukit said. "Through our research, we aim to change this process forever. We are making 'Google Maps for surgeries.'"
To make this vision a reality, Mukit and OU Health plastic and reconstructive surgeon Dr. Christian El Amm have been collaborating since 2019. The journey, however, began in 2018, when El Amm started working with energy technology company Baker Hughes.
Baker Hughes specializes in using augmented reality/mixed reality and CT scanning to create 3D reconstructions of rock samples. For geologists and oil and gas companies, these visualizations are extremely useful because they help them plan and perform drilling work efficiently.
This technology caught El Amm's attention. He envisioned that, combined with artificial intelligence, it could let him visualize 3D CT data during surgery. It could also show him the steps he had planned, without his ever losing sight of the patient.
However, several key challenges had to be overcome to prepare a prototype augmented reality system for use in surgery.
This was a complex engineering project completed in a very short time. After finishing the prototype, the team better understood the limitations of such a setup and the need for a better system. Surgeons cannot touch a computer to operate the system without contaminating their hands, so the team realized that a new, more convenient and seamless system was necessary.
"I started building a better system from scratch in 2019, as soon as the formal partnership with Baker Hughes ended," Mukit said. "Since then, we have moved most of the core tasks onto the head-mounted display itself."
"We developed 'marker-free tracking,' which aligns CT or other images using artificial intelligence instead of the cumbersome physical markers that would otherwise provide guidance," he added. "Then we eliminated the need for any manual camera calibration."
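The article does not describe how the team's marker-free tracking actually works. As a hedged illustration only, the geometric core of aligning a CT scan to a patient without physical markers often comes down to rigid point-set registration: given a handful of matching anatomical landmarks (the landmark data below is invented; a real system would detect such points with a learned model), the classic Kabsch algorithm recovers the rotation and translation that map CT coordinates into camera space.

```python
# Minimal sketch of rigid point-set registration (Kabsch algorithm).
# This is NOT the OU team's implementation -- just one standard way to
# align CT-space landmarks to the same landmarks seen by a camera.
import numpy as np

def rigid_register(ct_points: np.ndarray, camera_points: np.ndarray):
    """Find rotation R and translation t such that R @ p + t maps each
    CT landmark p onto its matching camera-space landmark (least squares)."""
    cp = ct_points.mean(axis=0)           # centroid of CT landmarks
    cq = camera_points.mean(axis=0)       # centroid of camera landmarks
    H = (ct_points - cp).T @ (camera_points - cq)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical usage: five CT landmarks and their camera-space matches.
P = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])  # rotation about z
t_true = np.array([1.0, 2.0, 3.0])
Q = P @ R_true.T + t_true                # simulated camera observations
R, t = rigid_register(P, Q)              # recovers R_true, t_true
```

Once R and t are known, every voxel of the CT volume can be transformed into the headset's coordinate frame and rendered over the patient; "marker-free" means the corresponding points come from AI-detected anatomy rather than glued-on fiducials.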
Finally, they added voice commands. Step by step, these additions enriched the system with useful features and led to unique innovations.
El Amm has begun using the device during surgery to improve the safety and effectiveness of complex reconstructions. Many of his patients come to him for craniofacial rehabilitation after injury or deformity.
So far, he has used the device on several occasions, including to reconstruct a patient's ear.
In another operation, which required an 18-stage reconstruction of the face, the device overlaid the patient's CT scan onto the actual bones.
Source: healthcareitnews.com