REAL-TIME SURGICAL NAVIGATION IN ORAL MAXILLOFACIAL SURGERY USING AUGMENTED AI AND DEEP CNNs

Authors

  • Dr. Balasaheb Balkhande, Vasantdada Patil Pratishthan’s College of Engineering & Visual Arts, Sion, Mumbai, India.

DOI:

https://doi.org/10.29121/shodhkosh.v5.i1.2024.5922

Keywords:

Oral Maxillofacial Surgery, Surgical Navigation, Augmented Reality, Deep Learning, Convolutional Neural Networks, Image Segmentation, Real-Time Visualization, Computer-Assisted Surgery

Abstract [English]

Real-time intraoperative navigation in oral and maxillofacial surgery (OMS) is essential for accuracy and patient safety. In this paper, we present an artificial intelligence (AI) framework based on deep convolutional neural networks (CNNs) that automatically performs segmentation, registration, and visualization of intricate maxillofacial structures. The platform combines a deep learning model with an augmented reality (AR) interface that overlays key anatomy on the operative field as the procedure is performed. Results show that the proposed method achieves high segmentation accuracy and robust instrument tracking, enhancing intraoperative decision-making and improving surgical outcomes. This work demonstrates how deep CNNs and AR technologies can be combined to enable intelligent, real-time surgical navigation in OMS.
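
The paper itself does not include code, but the segmentation stage of the pipeline the abstract describes (a CNN producing per-voxel anatomy labels that an AR interface can later register and render over the operative field) can be sketched. The following is a minimal, hypothetical PyTorch example, not the authors' architecture: the toy 3D encoder-decoder, layer sizes, input resolution, and 0.5 threshold are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a toy 3D encoder-decoder CNN that
# maps a CBCT sub-volume to a binary anatomy mask, i.e. the kind of per-voxel
# output a downstream registration and AR overlay stage could consume.
import torch
import torch.nn as nn


class TinyUNet3D(nn.Module):
    """Very small 3D encoder-decoder for voxel-wise segmentation (illustrative)."""

    def __init__(self, in_ch=1, out_ch=1, base=8):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv3d(in_ch, base, 3, padding=1), nn.ReLU(inplace=True),
            # strided convolution halves each spatial dimension
            nn.Conv3d(base, base * 2, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.dec = nn.Sequential(
            # transposed convolution restores the original resolution
            nn.ConvTranspose3d(base * 2, base, 2, stride=2), nn.ReLU(inplace=True),
            nn.Conv3d(base, out_ch, 1),  # per-voxel logits
        )

    def forward(self, x):
        return self.dec(self.enc(x))


if __name__ == "__main__":
    model = TinyUNet3D().eval()
    volume = torch.randn(1, 1, 64, 64, 64)        # stand-in for a CBCT sub-volume
    with torch.no_grad():
        mask = torch.sigmoid(model(volume)) > 0.5  # binary segmentation mask
    print(mask.shape, mask.float().mean().item())  # mask shape, foreground fraction
```

In a full navigation system, this stage would be trained on annotated CBCT scans, and the predicted mask would then be registered to the intraoperative coordinate frame before being rendered through the AR interface.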


Published

2024-01-31

How to Cite

Balasaheb Balkhande. (2024). REAL-TIME SURGICAL NAVIGATION IN ORAL MAXILLOFACIAL SURGERY USING AUGMENTED AI AND DEEP CNNs. ShodhKosh: Journal of Visual and Performing Arts, 5(1), 2717–2721. https://doi.org/10.29121/shodhkosh.v5.i1.2024.5922