We introduce a deep learning-based approach that uses pyramid sampling for the automated classification of HER2 status in immunohistochemically (IHC) stained breast cancer tissue images. By analyzing features across multiple scales of the IHC-stained images, the method manages the computational load effectively and addresses the heterogeneity of HER2 expression, capturing both detailed cellular features and broader tissue architecture. Applied to 523 core images, the model achieved a classification accuracy of 85.47%, demonstrating robustness to staining variability and tissue heterogeneity, which might improve the accuracy and timeliness of breast cancer treatment planning.
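As a rough illustration of the pyramid-sampling idea, the sketch below extracts co-centered patches at several magnifications and resizes them to a common input size. The patch size, scale factors, and the `pyramid_sample` helper are hypothetical choices; the abstract does not specify the model's exact sampling scheme.

```python
import numpy as np

def pyramid_sample(image: np.ndarray, center: tuple, base_size: int = 128,
                   scales=(1, 2, 4)) -> list:
    """Extract co-centered patches at several magnifications and resize
    each to base_size x base_size, so one classifier input captures both
    cellular detail (scale 1) and tissue architecture (scales 2 and 4).

    Hypothetical sketch: patch sizes and scales are illustrative only.
    """
    cy, cx = center
    patches = []
    for s in scales:
        half = (base_size * s) // 2
        # Crop a window s-times larger than the base patch (clipped to image).
        y0, y1 = max(cy - half, 0), min(cy + half, image.shape[0])
        x0, x1 = max(cx - half, 0), min(cx + half, image.shape[1])
        crop = image[y0:y1, x0:x1]
        # Downsample by striding so every patch ends up base_size x base_size.
        patches.append(crop[::s, ::s][:base_size, :base_size])
    return patches

# Usage: three co-registered views of one region of an IHC core image.
ihc = np.random.rand(1024, 1024, 3)  # stand-in for an IHC-stained core image
views = pyramid_sample(ihc, center=(512, 512))
print([v.shape for v in views])  # [(128, 128, 3), (128, 128, 3), (128, 128, 3)]
```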
We present a virtual immunohistochemical (IHC) staining method based on label-free autofluorescence imaging and deep learning. Using a trained neural network, we transform multi-band autofluorescence images of unstained tissue sections into bright-field equivalent HER2 images that match the microscopic images captured after standard IHC staining of the same tissue sections. Blind evaluations of HER2 scores by three pathologists, based on virtually stained and IHC-stained whole slide images, revealed statistically equivalent diagnostic value between the two methods. This virtual HER2 staining method provides a rapid, accurate, and low-cost alternative to standard IHC staining and allows tissue preservation.
Immunohistochemical (IHC) staining of the human epidermal growth factor receptor 2 (HER2) is routinely performed on breast cancer cases to guide immunotherapies and help predict the prognosis of breast tumors. We present a label-free virtual HER2 staining method enabled by deep learning as a digital alternative to standard IHC staining. Our blinded, quantitative analysis involving three board-certified breast pathologists revealed that HER2 scores evaluated from virtually stained HER2 whole slide images (WSIs) are as accurate as those from standard IHC-stained WSIs. This virtual HER2 staining can be extended to other IHC biomarkers to significantly improve disease diagnostics and prognostics.
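The following sketch illustrates only the input/output mapping at the heart of virtual staining: a network that translates a multi-band autofluorescence stack (here assumed to have four bands) into a 3-channel bright-field HER2 image. The tiny encoder-decoder `VirtualStainNet` is a hypothetical stand-in; the published approach trains a far larger network, typically with a GAN objective, which this sketch does not reproduce.

```python
import torch
import torch.nn as nn

class VirtualStainNet(nn.Module):
    """Minimal encoder-decoder sketch of the virtual-staining idea: map a
    multi-band autofluorescence input (assumed 4 bands) to a 3-channel
    bright-field HER2 image. Illustrative only, not the authors' design."""
    def __init__(self, in_bands: int = 4):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(in_bands, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):  # x: (N, in_bands, H, W) autofluorescence stack
        return self.decode(self.encode(x))  # (N, 3, H, W) virtual bright-field

model = VirtualStainNet()
autofluo = torch.rand(1, 4, 256, 256)   # stand-in multi-band input
virtual_her2 = model(autofluo)
print(virtual_her2.shape)  # torch.Size([1, 3, 256, 256])
```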
KEYWORDS: Visualization, Augmented reality, Tumors, Luminescence, Tissues, Surgery, Information visualization, Data acquisition, Real time imaging, Navigation systems
Real-time visualization of imaging data constitutes a critical part of the surgical workflow. Augmented reality (AR) is a promising complement to conventional surgical navigation systems. We have been developing an AR framework for clinical imaging and guidance using an optical see-through head-mounted display (OST-HMD) and fluorescence lifetime imaging (FLIm) instrumentation. This framework supports in vivo scanning of FLIm data and real-time visualization of diagnostic information overlaid on the interrogated tissue area. Given the high discriminative power of FLIm, our FLIm-AR concept has the potential to indicate tumor margins and assist with tumor excision surgery.
Breast cancer is the second most common cancer worldwide and by far the most frequent cancer among women. A major limiting factor for complete surgical resection is the physician's ability to intraoperatively assess the presence of tumor-positive resection margins. Many surgeons still rely on visual or tactile guidance, which leads to incomplete resection rates ranging between 20% and 50%. In this study we use multispectral time-resolved fluorescence spectroscopy (ms-TRFS), which allows dynamic raster scanning of tissue by merging a 450 nm aiming beam with the pulsed fluorescence excitation light in a single fiber. We developed a device that combines multispectral time-resolved fluorescence lifetime measurements with state-of-the-art machine learning techniques to delineate tumor margins of excised breast cancer specimens in real time. To train the classifier, we precisely registered ex vivo specimens with histology slides using fiducial markers and piecewise shape matching. A probabilistic random forest classifier was trained to rapidly delineate tumor regions. Moreover, the system not only provides a binary output on tumor regions but also quantifies the classifier's certainty for each prediction. This allows the surgeon to either rescan an ambiguous area to increase certainty or extend the resection area to decrease the probability of positive tumor margins. The outcome is visualized with a simple color scheme showing tumor in red and adipose and fibrous tissue in blue and green, with certainty encoded as color saturation. The system has been evaluated on n=10 lumpectomy specimens, showing promising agreement between the classifier's predictions and histology.
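A minimal sketch of that visualization step, assuming per-point class probabilities ordered (tumor, fibrous, adipose): hue picks the winning class and saturation encodes certainty, so ambiguous points fade toward white. The `certainty_overlay` helper and the HSV mapping are illustrative choices, not the authors' exact implementation.

```python
import numpy as np
from matplotlib.colors import hsv_to_rgb

# Hue per class, matching the color scheme described above:
# tumor -> red, fibrous -> green, adipose -> blue.
CLASS_HUES = np.array([0.0, 1/3, 2/3])

def certainty_overlay(probs: np.ndarray) -> np.ndarray:
    """Turn per-point class probabilities (H, W, 3), e.g. from a random
    forest, into an RGB overlay. Hue encodes the winning class; saturation
    encodes the classifier's certainty, so ambiguous points wash out toward
    white. A sketch of the described visualization, not the exact mapping."""
    winner = probs.argmax(axis=-1)                     # (H, W) class index
    certainty = probs.max(axis=-1)                     # (H, W) top probability
    hsv = np.stack([CLASS_HUES[winner],                # hue from class
                    certainty,                         # saturation from certainty
                    np.ones_like(certainty)], axis=-1) # full brightness
    return hsv_to_rgb(hsv)

# Usage: random probabilities for a 4x4 scan grid.
p = np.random.dirichlet(alpha=[1, 1, 1], size=(4, 4))
rgb = certainty_overlay(p)
print(rgb.shape)  # (4, 4, 3)
```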
An important step in establishing the diagnostic potential of emerging optical imaging techniques is accurate registration between imaging data and the corresponding tissue histopathology typically used as the gold standard in clinical diagnostics. We present a method to precisely register data acquired with a point-scanning spectroscopic imaging technique from fresh surgical tissue specimen blocks with corresponding histological sections. Using a visible aiming beam to augment point-scanning multispectral time-resolved fluorescence spectroscopy on video images, we evaluate two different marker types for the registration with histology: fiducial markers created using a 405 nm CW laser and the tissue block's outer shape characteristics. We compare benchmark methods that use either the fiducial markers or the outer shape characteristics alone against a hybrid method that uses both feature types. The hybrid method performed best, reaching an average error of 0.78 ± 0.67 mm. This approach provides a solid framework for validating the diagnostic capabilities of fiber-based optical techniques and, furthermore, enables the application of supervised machine learning techniques to automate tissue characterization.
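For the fiducial-marker component of such a registration, a least-squares rigid alignment of matched landmarks (the Kabsch algorithm) is the textbook building block. The sketch below is a minimal stand-in under that assumption; it omits the shape-matching term and the piecewise refinement of the hybrid method, and the landmark coordinates are made up.

```python
import numpy as np

def rigid_register(fixed: np.ndarray, moving: np.ndarray):
    """Least-squares rigid alignment (Kabsch algorithm) of matched 2-D
    landmark sets, e.g. fiducial marks seen both on the specimen video
    and on the histology section. Illustrative sketch only."""
    mu_f, mu_m = fixed.mean(0), moving.mean(0)
    U, _, Vt = np.linalg.svd((moving - mu_m).T @ (fixed - mu_f))
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = (U @ Vt).T
    t = mu_f - R @ mu_m
    return R, t

# Usage with three hypothetical fiducial marks (units: mm).
hist = np.array([[1.0, 2.0], [4.0, 1.5], [3.0, 5.0]])
theta = np.deg2rad(10)
Rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
video = hist @ Rot.T + np.array([0.5, -0.3])
R, t = rigid_register(fixed=hist, moving=video)
err = np.linalg.norm(hist - (video @ R.T + t), axis=1).mean()
print(f"mean registration error: {err:.2e} mm")  # ~0 for noise-free marks
```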
The current standard of care for early-stage breast cancer is breast-conserving surgery (BCS). BCS involves a lumpectomy procedure, during which the tumor is removed with a rim of normal tissue; if cancer cells are found in that rim, the margin is called positive, meaning part of the tumor remains in the breast. Currently there is no method to determine whether cancer cells exist at the margins of lumpectomy specimens aside from time-intensive histology, which results in reoperations in up to 38% of cases. We used fluorescence lifetime imaging (FLIm) to measure time-resolved autofluorescence from N=13 ex vivo human breast cancer specimens (N=10 patients undergoing lumpectomy or mastectomy) and compared our results to histology. Tumor (both invasive and ductal carcinoma in situ), fibrous tissue, fat, and fat necrosis each have distinct fluorescence signatures. For instance, in the 500-580 nm band, the fluorescence lifetime of tumor was shortest (4.7 ± 0.4 ns) compared to fibrous tissue (5.5 ± 0.7 ns) and fat (7.0 ± 0.1 ns), P<0.05 (ANOVA). These differences are due to the biochemical properties of lipids, nicotinamide adenine dinucleotide (NADH), and collagen fibers in the fat, tumor, and fibrous tissue, respectively. Additionally, the FLIm data are overlaid on video of the breast tissue with image processing algorithms that track a blue (450 nm) aiming beam used in parallel with the 355 nm excitation beam. This allows for accurate histologic co-registration and, in the future, will allow three-dimensional lumpectomy surfaces to be imaged for cancer margin delineation.
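To make the lifetime numbers concrete, the sketch below fits a mono-exponential decay to a synthetic pulse, assuming instrument-response deconvolution has already been performed. Real FLIm processing typically uses multi-exponential or Laguerre-basis models, so `fit_lifetime` is only an illustrative simplification.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, a, tau):
    """Single-exponential fluorescence decay model I(t) = a * exp(-t / tau)."""
    return a * np.exp(-t / tau)

def fit_lifetime(t_ns: np.ndarray, intensity: np.ndarray) -> float:
    """Estimate an average fluorescence lifetime (ns) by least-squares
    fitting of a mono-exponential decay. Simplified sketch of how tissue
    contrast such as tumor ~4.7 ns vs. fat ~7.0 ns would be quantified."""
    (a, tau), _ = curve_fit(mono_exp, t_ns, intensity,
                            p0=(intensity.max(), 5.0))
    return tau

# Usage: synthetic decay with a 4.7 ns lifetime (tumor-like, 500-580 nm band).
t = np.linspace(0, 30, 300)                    # time axis in ns
decay = mono_exp(t, a=1.0, tau=4.7) + 0.01 * np.random.randn(t.size)
print(f"fitted lifetime: {fit_lifetime(t, decay):.2f} ns")
```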