Proceedings Article | 12 February 2018
KEYWORDS: Tissues, Cancer, Tissue optics, Biopsy, Hyperspectral imaging, Convolutional neural networks, Head, Neck, Diagnostics
Successful surgical cancer resection requires negative, cancer-free surgical margins. Currently, tissue samples are sent to pathology for diagnostic confirmation. Hyperspectral imaging (HSI) is an emerging, non-contact optical imaging technique, and a reliable optical method could provide real-time, non-invasive diagnosis of specimens without physical biopsy. Using convolutional neural networks (CNNs) as tissue classifiers, we developed an HSI-based method to perform an optical biopsy of ex-vivo surgical specimens collected from 21 patients undergoing surgical cancer resection. When trained and tested on samples from different patients, the CNN distinguishes squamous cell carcinoma (SCCa) from normal aerodigestive tract tissues with an area under the curve (AUC) of 0.82, 81% accuracy, 81% sensitivity, and 80% specificity. Additionally, normal oral tissues can be sub-classified into epithelium, muscle, and glandular mucosa using a decision tree method, with an average AUC of 0.94, 90% accuracy, 93% sensitivity, and 89% specificity. After separate training on thyroid tissue, the CNN differentiates thyroid carcinoma from normal thyroid with an AUC of 0.95, 92% accuracy, 92% sensitivity, and 92% specificity. Moreover, the CNN discriminates medullary thyroid carcinoma from benign multi-nodular goiter (MNG) with an AUC of 0.93, 87% accuracy, 88% sensitivity, and 85% specificity, and differentiates classical-type papillary thyroid carcinoma from benign MNG with an AUC of 0.91, 86% accuracy, 86% sensitivity, and 86% specificity. These preliminary results demonstrate that an HSI-based optical biopsy method using CNNs can provide multi-category diagnostic information for normal head-and-neck tissue, SCCa, and thyroid carcinomas. More patient data are needed to fully validate the proposed technique and to establish its reliability and generalizability.
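The abstract does not describe the network architecture, so the following is a minimal sketch of what a CNN tissue classifier for hyperspectral patches might look like in PyTorch. The band count (91), patch size (25x25), and layer choices are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class HSIPatchCNN(nn.Module):
    """Small CNN that classifies hyperspectral image patches.

    Each spectral band is treated as an input channel; the 91 bands
    and 25x25 patch size below are assumptions for illustration.
    """
    def __init__(self, n_bands=91, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_bands, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = HSIPatchCNN()
patches = torch.randn(8, 91, 25, 25)  # batch of 8 synthetic patches
logits = model(patches)               # shape: (8, 2), tumor vs. normal
```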
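The decision tree method for sub-classifying normal oral tissue is likewise not spelled out in the abstract; one plausible reading is a cascade of binary decisions over the three named classes. The node ordering and the `classify_normal_tissue` helper below are hypothetical.

```python
def classify_normal_tissue(patch, is_epithelium, is_muscle):
    """Route a patch through a hypothetical two-node decision cascade.

    is_epithelium and is_muscle are binary classifiers (e.g., CNNs
    trained on the corresponding one-vs-rest task) returning True or
    False for a patch. The abstract names the three classes but not
    the tree layout, so this ordering is an assumption.
    """
    if is_epithelium(patch):
        return "epithelium"
    if is_muscle(patch):
        return "muscle"
    return "glandular mucosa"

# Toy usage with stand-in classifiers:
print(classify_normal_tissue("patch", lambda p: False, lambda p: True))
# -> "muscle"
```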
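Finally, "trained and tested on samples from different patients" corresponds to patient-level (grouped) cross-validation, so that no patient contributes to both folds, and the four reported figures can be computed as sketched below. The `LeaveOneGroupOut` splitter and the 0.5 decision threshold are assumptions about the evaluation protocol, and the data here are synthetic placeholders.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix
from sklearn.model_selection import LeaveOneGroupOut

def evaluate(y_true, y_score, threshold=0.5):
    """Return the four figures quoted in the abstract: AUC, accuracy,
    sensitivity (true-positive rate), and specificity (true-negative rate)."""
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "auc": roc_auc_score(y_true, y_score),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Synthetic stand-ins for the real patches, labels, and patient IDs.
rng = np.random.default_rng(0)
X = rng.random((210, 91))                # 210 samples x 91 spectral bands
y = rng.integers(0, 2, size=210)         # tumor vs. normal labels
patients = np.repeat(np.arange(21), 10)  # 21 patients, as in the study

# Every sample from the held-out patient lands in the test fold only.
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=patients):
    pass  # train the CNN on train_idx, score test_idx, pool the scores

print(evaluate(y, rng.random(210)))  # metrics on random scores, for shape only
```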