We present a fast virtual-staining framework for defocused autofluorescence images of unlabeled tissue that matches the performance of standard virtual-staining models using in-focus label-free images. For this, we introduced a virtual-autofocusing network to digitally refocus the defocused images; a successive neural network then transformed these refocused images into virtually stained H&E images. Using coarsely focused autofluorescence images, acquired with 4-fold fewer focus points and 2-fold lower focusing precision, we achieved virtual-staining performance equivalent to that of standard H&E virtual-staining networks using finely focused images, decreasing the total image acquisition time by ~32% and the autofocusing time by ~89% for each whole-slide image.
We demonstrate a reconfigurable diffractive deep neural network (termed R‑D2NN) with a single physical model performing a large set of unique permutation operations between an input and output field-of-view by rotating different layers within the diffractive network. Our study numerically demonstrated the efficacy of R‑D2NN by accurately approximating 256 distinct permutation matrices using 4 rotatable diffractive layers. We experimentally validated the proof-of-concept of reconfigurable diffractive networks using terahertz radiation and 3D-printed diffractive layers, achieving high concordance with numerical simulations. The reconfigurable design of R‑D2NN provides scalability with high computing speed and efficient use of materials within a single fabricated model.
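As a back-of-the-envelope illustration of how 4 rotatable layers can address 256 distinct configurations, the sketch below assumes (hypothetically; the abstract does not specify the rotation states) that each layer supports 4 discrete in-plane rotations, so that every combination of layer rotations selects one network configuration:

```python
from itertools import product

# Assumed setup: 4 rotatable diffractive layers, each with 4 discrete
# rotation states (0, 90, 180, 270 degrees). Each distinct combination
# of per-layer rotations defines one configuration of the network.
rotation_states = (0, 90, 180, 270)
num_layers = 4

configurations = list(product(rotation_states, repeat=num_layers))

# 4 states per layer across 4 layers yields 4**4 = 256 configurations,
# matching the count of distinct permutation matrices reported above.
print(len(configurations))  # 256
```

Under this assumption, the configuration count scales exponentially with the number of rotatable layers, which is the source of the scalability noted in the last sentence.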
We introduce a deep learning-based approach that uses pyramid sampling for the automated classification of HER2 status in immunohistochemistry (IHC)-stained breast cancer tissue images. By analyzing features across multiple scales of the IHC-stained tissue, pyramid sampling manages the computational load effectively and addresses the heterogeneity of HER2 expression, capturing both detailed cellular features and broader tissue architecture. Applied to 523 core images, the model achieved a classification accuracy of 85.47%, demonstrating its ability to counteract staining variability and tissue heterogeneity, which might improve the accuracy and timeliness of breast cancer treatment planning.
We present a rapid, stain-free, and automated viral plaque assay utilizing deep learning and time-lapse holographic imaging, which can significantly reduce the time needed for plaque-forming unit (PFU) detection and entirely bypass the chemical staining and manual counting processes. Demonstrated with vesicular stomatitis virus (VSV), our system identified the first PFU events as early as 5 hours into incubation and detected >90% of PFUs with 100% specificity in <20 hours, saving >24 hours compared to the traditional viral plaque assays that take ≥48 hours. Furthermore, our method was proven to adapt seamlessly to new types of viruses by transfer learning.
The traditional histochemical staining of autopsy tissue samples usually suffers from staining artifacts due to autolysis caused by delayed fixation of cadaver tissues. Here, we introduce an autopsy virtual staining technique to digitally convert autofluorescence images of unlabeled autopsy tissue sections into their hematoxylin and eosin (H&E) stained counterparts through a trained neural network. This technique was demonstrated to effectively mitigate autolysis-induced artifacts inherent in histochemical staining, such as weak nuclear contrast and color fading in the cytoplasmic-extracellular matrix. As a rapid, reagent-efficient, and high-quality histological staining approach, the presented technique holds great potential for widespread application in the future.
We present a virtual staining framework that can rapidly stain defocused autofluorescence images of label-free tissue, matching the performance of standard virtual staining models that use in-focus unlabeled images. We trained and blindly tested this deep learning-based framework using human lung tissue. Using coarsely focused autofluorescence images acquired with 4× fewer focus points and 2× lower focusing precision, we achieved performance equivalent to standard virtual staining using finely focused autofluorescence input images. We achieved a ~32% decrease in the total image acquisition time needed for virtual staining of a label-free whole-slide image, alongside an ~89% decrease in the autofocusing time.
We present a deep learning-based framework that virtually transfers images of H&E-stained tissue to other stain types using cascaded deep neural networks. This method, termed C-DNN, was trained in a cascaded manner: label-free autofluorescence images were fed to the first generator as input and transformed into H&E-stained images; these virtually stained H&E images were then transformed into periodic acid–Schiff (PAS)-stained images by the second generator. We trained and tested C-DNN on kidney needle-core biopsy tissue, and its output images showed better color accuracy and higher contrast on various histological features compared to other stain transfer models.
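The two-stage cascade described above can be sketched as follows. This is a minimal structural illustration only: the two "generators" here are stand-in fixed linear color mappings (assumed weights, assumed function names), not the trained networks from the abstract; the point is that the second stage consumes the first stage's virtual-H&E output.

```python
import numpy as np

rng = np.random.default_rng(0)

def stage1_autofluorescence_to_he(x):
    # Stand-in for the first generator: 2 autofluorescence bands -> RGB H&E.
    W = np.array([[0.6, 0.2], [0.1, 0.7], [0.3, 0.4]])
    return np.clip(x @ W.T, 0.0, 1.0)

def stage2_he_to_pas(x):
    # Stand-in for the second generator: virtual H&E RGB -> virtual PAS RGB.
    W = np.array([[0.9, 0.05, 0.05], [0.1, 0.8, 0.1], [0.2, 0.1, 0.7]])
    return np.clip(x @ W.T, 0.0, 1.0)

af = rng.random((8, 8, 2))           # toy 2-band autofluorescence image
he = stage1_autofluorescence_to_he(af)
pas = stage2_he_to_pas(he)           # cascade: second stage takes first stage's output
print(he.shape, pas.shape)           # (8, 8, 3) (8, 8, 3)
```

One design consequence of the cascade is that the second generator only ever sees H&E-domain inputs, which is what allows it to specialize in the H&E-to-PAS transform rather than learning from raw autofluorescence directly.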
We present a stain-free, rapid, and automated viral plaque assay using deep learning and holography, which needs significantly less sample incubation time than traditional plaque assays. A portable and cost-effective lens-free imaging prototype was built to record the spatio-temporal features of the plaque-forming units (PFUs) during their growth, without the need for staining. Our system detected the first cell lysing events as early as 5 hours into incubation and achieved a >90% PFU detection rate with 100% specificity in <20 hours, saving >24 hours compared to the traditional viral plaque assays that take ≥48 hours.
We present a high-throughput and automated system for the early detection and classification of bacterial colony-forming units (CFUs) using a thin-film transistor (TFT) image sensor. A lens-free imager was built using the TFT sensor with a ~7 cm² field-of-view to collect the time-lapse images of bacterial colonies. Two trained neural networks were used to detect and classify the bacterial colonies based on their spatio-temporal features. Our system achieved an average CFU detection rate of 97.3% at 9 hours of incubation and an average CFU recovery rate of 91.6% at ~12 hours, saving ~12 hours compared to the EPA-approved method.
We present a virtual immunohistochemical (IHC) staining method based on label-free autofluorescence imaging and deep learning. Using a trained neural network, we transform multi-band autofluorescence images of unstained tissue sections into their bright-field, HER2-stained equivalents, matching the microscopic images captured after the standard IHC staining of the same tissue sections. Three pathologists' blind evaluations of HER2 scores based on virtually stained and IHC-stained whole slide images revealed the statistically equivalent diagnostic values of the two methods. This virtual HER2 staining method provides a rapid, accurate, and low-cost alternative to the standard IHC staining methods and allows tissue preservation.
Immunohistochemical (IHC) staining of the human epidermal growth factor receptor 2 (HER2) is routinely performed on breast cancer cases to guide immunotherapies and help predict the prognosis of breast tumors. We present a label-free virtual HER2 staining method enabled by deep learning as an alternative digital staining approach. Our blinded, quantitative analysis performed by three board-certified breast pathologists revealed that HER2 scores evaluated from virtually stained HER2 whole slide images (WSIs) are as accurate as those from standard IHC-stained WSIs. This virtual HER2 staining can be extended to other IHC biomarkers to significantly improve disease diagnostics and prognostics.