
Specimen preparation

Whole cadaveric heads were prepared with bilateral curvilinear post-auricular incisions and elevation of a soft tissue flap to expose the zygomatic root, posterior external auditory canal, and mastoid tip. Eight 2 mm bone wells were drilled outside of the surgical field to act as fiducial references for eventual image-guidance calibration within the experimental arm. Placement sites included the zygomatic root, bony external auditory canal, and mastoid tip.

Using a prototype intraoperative cone-beam computed tomography (CBCT) scanner (Powermobil, Siemens, Germany), scans of the cadaveric heads were obtained with an isotropic voxel size of 0.78 mm12. Scans were evaluated for abnormal anatomy or evidence of previous surgery. Both the O-OSI and BB-FMT devices were imaged for surgical modelling, creating a virtual rendering of each hearing device for projection of the overlay during the procedure. Materialise Mimics Medical 19.0 (Materialise NV, Belgium) was used to identify optimal placement of the devices, with virtual heads rendered from the CT imaging using pre-set bony segmentation sequencing.

Implants were imported into Materialise Mimics as optimized triangulated surface meshes that moved independently from the bone. The experimental design is outlined in Fig. 1. Each surgeon's pre-operative planning included placement of four O-OSI devices and four BB-FMT devices in two separate sessions. Bone depth and avoidance of critical structures, such as the sigmoid sinus, were major factors. O-OSIs were placed within the mastoid, and clearance around the implant was ensured to avoid inadvertent contact with underlying bone. The three possible placements of the BB-FMTs were the mastoid, retrosigmoid, and middle fossa areas. Each surgeon underwent a brief 10-min session with the surgical manuals to review optimal surgical technique for both implants. Each planning session lasted five minutes, allowing surgeons to specify exact placement.

Study protocol (CBCT cone-beam computed tomography, O-OSI Osia osseointegrated steady-state implant, BB-FMT BoneBridge floating mass transducer).

Implantation followed a standardized protocol beginning with the control arm followed by the experimental AR arm (Fig. 1). In the control arm, surgeons used Materialise Mimics' built-in measurement tool for intraoperative reference during implant placement, whereas in the experimental arm, device placement was projected onto the surgical field using GTx-Eyes (Guided Therapeutics, TECHNA Institute, Canada) via a PicoPro projector (Cellon Inc., South Korea)7,11. The AR setup is demonstrated in Fig. 2 and shown in the supplementary video.

Integrated augmented reality surgical navigation system. (A) The projector and surgical instruments were tracked with the optical tracker in reference to the registered fiducials on the cadaveric head. Optical tracking markers attached to the projector allow for real-time adjustments to the image projection. The surgical navigation platform displays a pre-operatively placed implant. (B) Experimental AR projection arm setup; surgeons were encouraged to align the projector to their perspective to reduce parallax.

Following implant placement, CT scans of the cadaveric heads were obtained to capture the implanted locations for eventual 3D coordinate measurement analysis. Each surgeon performed four O-OSI placements followed by four BB-FMT placements.

The integrated AR surgical navigation system consists of a PicoPro projector (Cellon Inc., South Korea), a Polaris Spectra stereoscopic infrared optical tracker (NDI, Canada), a USB 2.0-megapixel camera (ICAN, China), and a standard computer. A 3D-printed PicoPro projector enclosure enabled the attachment of four tracking markers, which provide real-time three-dimensional tracking information (Fig. 2). GTx-Eyes (Guided Therapeutics, TECHNA Institute, Canada) is a surgical navigation platform built on open-source, cross-platform libraries including IGSTK, ITK, and VTK11,13,14,15,16. The AR system has a demonstrated projection accuracy of 0.55 ± 0.33 mm and has been widely adopted in Otolaryngologic and Orthopedic oncologic operations17,18,19,20. Recently, the software has evolved to include AR integration7,9.

The AR system requires two calibrations: (1) camera to instrument tracker, and (2) camera to projector, both of which are outlined by Chan et al.9,11. The result allows the tracked tool to be linked with the projector's spatial parameters, accommodating both translational and rotational movements.

The camera and tracking tool calibration defines the relationship between the camera's centre and the tracking tool coordinates by creating a homogeneous transformation matrix, \({}^{Tracker}T_{Cam}\), consisting of a 3×3 rotation matrix \(R\) and a 3×1 translation vector \(t\). The rotational parameter was represented with Euler angles \((R_x, R_y, R_z)\). This calibration process requires photographing a known checkerboard pattern from various perspectives using the camera affixed to the projector's case. The instrument tracker's position and orientation are recorded to compute the spatial transformation. The grid dimensions from each photograph are compared with the actual dimensions (30 mm × 30 mm squares in a 9×7 array) using an open-source Matlab camera calibration tool21. This calibration serves as the extrinsic parameter of the camera.
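As an illustration of this step, the sketch below performs checkerboard-based camera calibration. The study used the open-source Matlab toolbox cited above; OpenCV is substituted here purely for demonstration, the image folder is hypothetical, and the 9×7 array of squares is assumed to yield an 8×6 grid of inner corners.

```python
# Hedged sketch of checkerboard camera calibration (OpenCV stands in for
# the Matlab toolbox cited in the text; folder name and corner count are
# assumptions, not the study's actual setup).
import glob
import cv2
import numpy as np

PATTERN = (8, 6)     # inner corners, assuming a 9x7 array of squares
SQUARE_MM = 30.0     # 30 mm x 30 mm squares, per the text

# Known 3D corner positions on the planar board (Z = 0), in millimetres.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("checkerboard_views/*.png"):   # hypothetical folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# K corresponds to the intrinsic matrix A below; rvecs/tvecs give the
# extrinsic pose (R, t) of the board in each photographed view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.3f} px")
```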

The intrinsic parameters \(A\) of the camera include the principal point \((u_0, v_0)\), the scale factors \((\alpha, \beta)\), and the skew of the two image axes \(c\)22,23,24. This is denoted as:

$$\mathbf{A}=\left[\begin{array}{ccc}\alpha & c & u_{0}\\ 0 & \beta & v_{0}\\ 0 & 0 & 1\end{array}\right]$$

When combining the extrinsic parameters \((R, t)\) with the intrinsic parameters \(A\), a point in three-dimensional space, \(\mathbf{M}=[X,Y,Z,1]^{T}\), can be mapped to a point on the two-dimensional camera image, \(\mathbf{m}=[u,v,1]^{T}\), where \(s\) is the scale factor. This is represented by: \(s\,\mathbf{m}=\mathbf{A}\left[\mathbf{R}\;\mathbf{t}\right]\mathbf{M}\).
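A minimal numerical example of this mapping, using illustrative intrinsic and extrinsic values rather than the study's calibrated ones:

```python
# Minimal sketch of the pinhole mapping s*m = A [R t] M; all parameter
# values are illustrative, not the calibrated ones from the study.
import numpy as np

# Intrinsic matrix A: scale factors alpha/beta, skew c, principal point (u0, v0).
alpha, beta, c, u0, v0 = 800.0, 800.0, 0.0, 320.0, 240.0
A = np.array([[alpha, c,    u0],
              [0.0,   beta, v0],
              [0.0,   0.0,  1.0]])

# Extrinsic parameters: identity rotation, 100 mm translation along Z.
R = np.eye(3)
t = np.array([[0.0], [0.0], [100.0]])
Rt = np.hstack([R, t])                  # the 3x4 matrix [R t]

M = np.array([10.0, 5.0, 0.0, 1.0])     # homogeneous 3D point [X, Y, Z, 1]^T
sm = A @ Rt @ M                         # s * m
m = sm / sm[2]                          # divide out the scale factor s
print(m[:2])                            # image coordinates (u, v) in pixels
```

Here the point lands at (400, 280): 10 mm of lateral offset viewed at 100 mm depth shifts the image 80 pixels from the principal point.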

This link defines the spatial relationship between the camera's centre and the projector, creating a homogeneous transformation matrix \({}^{Cam}T_{Proj}\). A two-dimensional checkerboard image is projected onto the planar checkerboard surface used in the previous calibration step. The camera captures both patterns from various perspectives. Using the projector-camera calibration toolbox, the transformation between the camera and projector, \({}^{Cam}T_{Proj}\), is established25. The calibration requires linking the camera and the projector tracking markers, both of which are mounted on the projector enclosure (Fig. 2). Combining both calibration processes, the resulting transformation from the AR projector to the tracking marker is denoted by \({}^{Tracker}T_{Proj}={}^{Tracker}T_{Cam}\cdot{}^{Cam}T_{Proj}\).
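Conceptually, this composition is a product of homogeneous transforms; a small sketch with placeholder values:

```python
# Sketch of chaining the two calibration results into the tracker-to-
# projector transform; rotation/translation values are placeholders.
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Stand-ins for the camera-tracker and camera-projector calibrations.
T_tracker_cam = make_transform(np.eye(3), [5.0, 0.0, 0.0])
T_cam_proj = make_transform(np.eye(3), [0.0, 2.0, 0.0])

# T_tracker_proj = T_tracker_cam * T_cam_proj links the tracked markers
# to the projector frame for both translation and rotation.
T_tracker_proj = T_tracker_cam @ T_cam_proj
print(T_tracker_proj)
```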

The AR projection setup required confirmation of projection adequacy using an image guidance probe and the Polaris Spectra optical tracker (NDI) (Fig. 2). Using the image guidance probe, coordinates of the bony fiducials (drilled bone wells) and the projected fiducials (green dots) were captured. The difference between the coordinates served as the measure of projection accuracy (Fig. 3).

(A) Fiducial projection onto the surgical field was matched to the drilled wells, and (B) subsequent accuracy measurements were obtained with a tracking pointer tool placed within the drilled wells, where x-, y-, and z-coordinates were captured.
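The accuracy computation itself reduces to per-fiducial Euclidean distances; a sketch with hypothetical coordinates:

```python
# Sketch of the projection-accuracy check: distance between probe-captured
# drilled-well and projected-fiducial coordinates (hypothetical values, mm).
import numpy as np

drilled = np.array([[12.1, 40.3, 8.7],     # drilled bone wells
                    [25.4, 38.9, 6.2]])
projected = np.array([[12.6, 40.0, 8.9],   # projected green dots
                      [25.1, 39.4, 6.0]])

errors = np.linalg.norm(drilled - projected, axis=1)  # per-fiducial error
print(errors, f"mean = {errors.mean():.2f} mm")
```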

Post-operative and pre-operative scans were superimposed in Materialise Mimics, and centre-to-centre distances as well as angular differences on the axial plane were measured (Figs. 4, 5). For O-OSI placements, the centre of the O-OSI was used, whereas for BB-FMT placements, the centre of the FMT was used.

Accuracy measurements for centre-to-centre distances and angular accuracy.

Post-operative CT scans of (A) BB-FMT and (B) O-OSI placements following AR projector-guided surgery, with the paired pre-operative planning renderings shown in (C) and (D). In images (A) and (B), the pre-operative planning outline is superimposed. The blue arrow denotes post-operative placement, whereas the red arrow denotes pre-operative planning.
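For clarity, the two accuracy metrics can be sketched as follows; the centres, axes, and the axial-plane projection used here are hypothetical illustrations of the measurements described above, not the study's data.

```python
# Sketch of the two metrics: centre-to-centre (C-C) distance between planned
# and achieved implant centres, and angular difference of the implant axes
# projected onto the axial (x-y) plane. All values are hypothetical.
import numpy as np

planned_centre = np.array([34.2, 51.0, 22.5])   # mm, pre-operative plan
actual_centre = np.array([35.0, 49.7, 23.1])    # mm, post-operative CT
cc_distance = np.linalg.norm(actual_centre - planned_centre)

planned_axis = np.array([1.0, 0.2, 0.0])        # planned implant long axis
actual_axis = np.array([0.9, 0.4, 0.1])         # achieved implant long axis

def axial_angle(a, b):
    """Angle in degrees between two vectors projected onto the axial plane."""
    a2 = a[:2] / np.linalg.norm(a[:2])
    b2 = b[:2] / np.linalg.norm(b[:2])
    return np.degrees(np.arccos(np.clip(a2 @ b2, -1.0, 1.0)))

print(f"C-C: {cc_distance:.2f} mm; axial angle: "
      f"{axial_angle(planned_axis, actual_axis):.1f} deg")
```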

All participants completed a NASA Task Load Index (TLX) questionnaire assessing the use of AR, in addition to providing feedback in an open-ended questionnaire26. TLX results were used to generate raw TLX (RTLX) scores for the six domains, from which weighted workload scores were subsequently generated27.
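A sketch of the two TLX scores, assuming the standard scheme in which RTLX averages the six domain ratings and the weighted workload applies domain weights from the 15 pairwise comparisons; all ratings and weights below are hypothetical.

```python
# Sketch of NASA-TLX scoring (hypothetical ratings and weights): RTLX is
# the unweighted mean of the six domain ratings; the weighted workload
# multiplies each rating by its pairwise-comparison weight (weights sum to 15).
DOMAINS = ["mental", "physical", "temporal", "performance", "effort", "frustration"]
ratings = {"mental": 60, "physical": 25, "temporal": 40,
           "performance": 30, "effort": 55, "frustration": 20}   # 0-100 scale
weights = {"mental": 5, "physical": 1, "temporal": 2,
           "performance": 3, "effort": 3, "frustration": 1}      # sum to 15

rtlx = sum(ratings[d] for d in DOMAINS) / len(DOMAINS)
weighted = sum(ratings[d] * weights[d] for d in DOMAINS) / 15
print(f"RTLX: {rtlx:.1f}, weighted workload: {weighted:.1f}")
```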

Continuous data were examined for normality by reviewing histograms, quantile–quantile plots, and the Shapiro–Wilk test. Given the lack of normality and the repeated measurements, Wilcoxon signed-rank testing was used to compare centre-to-centre (C-C) and angular accuracies between the control and experimental arms. All analyses were performed using SPSS 26 (IBM Corp., Armonk, NY).
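As an illustration, the paired comparison could be run as below; the study performed its analyses in SPSS 26, so SciPy and the sample values here are stand-ins only.

```python
# Illustrative Wilcoxon signed-rank test on paired C-C accuracies
# (hypothetical data; the study performed this analysis in SPSS 26).
from scipy.stats import wilcoxon

control_cc = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.0]       # mm
experimental_cc = [3.1, 4.0, 3.5, 4.8, 3.9, 4.2, 3.6, 4.1]  # mm

stat, p_value = wilcoxon(control_cc, experimental_cc)
print(f"W = {stat}, p = {p_value:.3f}")
```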

All methods were carried out in accordance with relevant guidelines and regulations. This study was approved by the Sunnybrook Health Sciences Centre Research Ethics Board (Project Identification Number: 3541). Informed consent was obtained from all subjects and/or their legal guardian(s) by way of the University of Toronto's Division of Anatomy Body Donation Program. All subjects provided consent for the publication of identifying images in an online open-access publication.
