Proceedings Article | 15 February 2021
KEYWORDS: Biopsy, Prostate, Magnetic resonance imaging, Visualization, Image segmentation, Prostate cancer, Optical tracking, Optical simulations, Interfaces, Cancer
PURPOSE: Prostate cancer is the second most common cancer diagnosed in men. The rate is disproportionately high among men in sub-Saharan Africa where, unlike in North America and Western Europe, screening for prostate cancer has historically not been routine. Currently, as awareness of prostate health increases, more patients in this region are being referred for trans-rectal ultrasound (TRUS)-guided prostate biopsy, a diagnostic procedure that requires a strong understanding of prostate zonal anatomy. To aid in the instruction of this procedure, prostate biopsy training programs need to be implemented. Unfortunately, current TRUS-guided biopsy training tools are not readily reproducible in these West African countries. To address this challenge, we are developing an affordable, open-source training simulator for TRUS-guided prostate biopsy for use in Senegal. In this paper, we present the implementation of the training simulator's virtual interface, highlighting the generation and evaluation of its critical training component: zonal anatomy overlaid on TRUS.

METHODS: For the simulator's dataset, we registered TRUS and MRI volumes so that the zonal segmentation defined on the MRI volumes could be mapped onto the TRUS. After generating ten pairings of TRUS overlaid with zonal segmentation, we designed and implemented a virtual TRUS training system in open-source software. The objective of our simulator is to teach trainees to accurately identify the prostate's anatomical zones in TRUS. To confirm the system's usability for training zonal identification, we conducted a two-part survey on the quality of the zonal overlays with 7 urology experts. In the first part, they assessed the zonal overlay for visual correctness by rating 10 TRUS images from one patient, with the registered overlay, on a 5-point Likert scale. In the second part, they labelled the zonal anatomy on 10 plain TRUS volumes, and their labels were compared with those of our overlay.

RESULTS: On average, experts rated the zonal overlay's visual accuracy at 4 out of 5. Furthermore, 7 out of 7 experts labelled the peripheral, anterior, and transitional zones in the same regions we overlaid them, and 5 out of 7 labelled the central zone in the same region we overlaid it.

CONCLUSION: We created a prototype TRUS imaging simulator in open-source software. A vital training component, the zonal overlay, was generated from publicly accessible data and validated by expert urologists for prostate zone identification, confirming the concept.
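To make the MRI-to-TRUS overlay step in METHODS concrete, the sketch below shows one possible way to register an MRI volume to a TRUS volume and resample the MRI-derived zonal label map into TRUS space using SimpleITK. This is a minimal illustration under assumed inputs (the file names, the rigid transform model, and the mutual-information metric are all our assumptions), not the authors' actual pipeline or parameter choices.

```python
# Minimal sketch (assumptions, not the authors' pipeline): rigidly register an
# MRI volume to a TRUS volume, then resample the MRI-derived zonal segmentation
# into the TRUS frame so it can be displayed as an overlay.
import SimpleITK as sitk

# Placeholder file names; any volume format readable by SimpleITK works.
trus = sitk.ReadImage("trus_volume.nrrd", sitk.sitkFloat32)
mri = sitk.ReadImage("mri_volume.nrrd", sitk.sitkFloat32)
zones = sitk.ReadImage("mri_zonal_segmentation.nrrd")  # label map defined on MRI

# Initialize a rigid transform by aligning the geometric centres of the volumes.
initial = sitk.CenteredTransformInitializer(
    trus, mri, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

# Intensity-based rigid registration with Mattes mutual information.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(trus, mri)  # fixed = TRUS, moving = MRI

# Resample the zonal label map into the TRUS frame; nearest-neighbour
# interpolation preserves the integer zone labels.
zones_on_trus = sitk.Resample(zones, trus, transform,
                              sitk.sitkNearestNeighbor, 0, zones.GetPixelID())
sitk.WriteImage(zones_on_trus, "zones_in_trus_space.nrrd")
```

The resampled label map shares the TRUS volume's grid, so each zone can be rendered as a semi-transparent overlay on the corresponding TRUS slices in the training interface.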