Open Access
23 June 2020

Point-of-care, multispectral, smartphone-based dermascopes for dermal lesion screening and erythema monitoring
Ross D. Uthoff, Bofan Song, Melody Maarouf, Vivian Y. Shi, Rongguang Liang
Abstract

Significance: The rates of melanoma and nonmelanoma skin cancer are rising across the globe. Due to a shortage of board-certified dermatologists, the burden of dermal lesion screening and erythema monitoring has fallen to primary care physicians (PCPs). An adjunctive device for lesion screening and erythema monitoring would be beneficial because PCPs are not typically extensively trained in dermatological care.

Aim: We aim to examine the feasibility of using a smartphone-camera-based dermascope and a USB-camera-based dermascope utilizing polarized white-light imaging (PWLI) and polarized multispectral imaging (PMSI) to map dermal chromophores and erythema.

Approach: Two dermascopes integrating LED-based PWLI and PMSI with both a smartphone-based camera and a USB-connected camera were developed to capture images of dermal lesions and erythema. Image processing algorithms were implemented to provide chromophore concentrations and redness measures.

Results: PWLI images were successfully converted to an alternate colorspace for erythema measures, and the spectral bandwidth of the PMSI LED illumination was sufficient for mapping of deoxyhemoglobin, oxyhemoglobin, and melanin chromophores. Both types of dermascopes were able to achieve similar relative concentration results.

Conclusion: Chromophore mapping and erythema monitoring are feasible with PWLI and PMSI using LED illumination and smartphone-based cameras. These systems can provide a simpler, more portable geometry and reduce device costs compared with interference-filter-based or spectrometer-based clinical-grade systems. Future research should include a rigorous clinical trial to collect longitudinal data and a large enough dataset to train and implement a machine learning-based image classifier.

1. Introduction

The rates of melanoma and nonmelanoma skin cancers (NMSC) have been steadily rising,1,2 and early diagnosis is key for improved outcomes.3 Because there is a shortage of board-certified dermatologists,4,5 particularly in remote or underserved settings where <10% of dermatologists practice,6 most of the burden of diagnosis and treatment falls on primary care physicians (PCPs), who are not extensively trained in dermatological care.3,7 Dermoscopy is a tool used to improve the in vivo diagnostic accuracy in differentiating benign from malignant lesions, a unique skill that requires additional training, even among board-certified dermatologists. In remote settings, dermascopes may capture and document pigmented lesions that can be forwarded to expert colleagues through telemedicine for further analysis.8 Unfortunately, dermascopes and their accessories range from hundreds to thousands of dollars,9,10 which is potentially too expensive for general medical practice. Thus, there is a need for a low-cost, readily available dermoscopy tool to fill this clinical gap.

Lesion evaluation using visual, subjective methods such as the ABCDE criteria and the seven-point checklist is a useful tool for PCPs.3,11 The ABCDE criteria predict melanoma by a lesion's asymmetry, border irregularity, coloration, diameter >6 mm, and evolution, providing a sensitivity of 0.85 and specificity of 0.72.3,11 The seven-point checklist monitors a lesion's change in size, shape, and color and looks for diameters >7 mm, crusting or bleeding, and sensory change, providing a sensitivity of 0.77 and specificity of 0.80.3 Continuous monitoring has been shown to improve outcomes through early detection, as evidenced by mole mapping techniques12,13 and the increase in sensitivity and specificity with the addition of the evolution criterion to the ABCDE criteria.11

Adjunctive tools utilizing objective measures such as polarized multispectral imaging (PMSI) and polarized white-light imaging (PWLI) to map dermal chromophores [oxyhemoglobin (HbO2), deoxyhemoglobin (Hb), and melanin], quantify erythema, and perform image classification for lesion screening have the potential to increase early detection of melanoma by PCPs and even outside the physician's office, leading to reduced need for biopsy and improved outcomes.14–27 We propose a smartphone combined with LED illumination as the ideal platform for an adjunctive medical device, which will provide a portable system with easy-to-operate apps and native image capture, processing, and data transmission. These systems can reduce the costs associated with interference-filter-based14,15,20 or spectrometer-based21,23 systems while also providing a more compact, portable geometry for use in any testing environment compared with clinical-grade imaging systems.17–19

2. Materials

We have developed two point-of-care dermascope design concepts for skin lesion screening and erythema monitoring, implementing both PMSI and PWLI28 on an LG G5 (LG, Seoul, South Korea) smartphone platform. One system concept utilizes the embedded smartphone camera for imaging while the other uses a USB-connected camera module that connects to the smartphone. Both systems share a common illumination system and software application to enable PWLI and PMSI.

The PMSI and PWLI dermascope using the smartphone's embedded rear camera is shown in Figs. 1(c)–1(e). The main LG G5 camera consists of a Sony IMX234 Exmor RS sensor with 5312×2988, 1.12-μm pixels and a 5.95 mm×3.35 mm sensor size. The sensor is paired with an f/1.8, 4.42-mm focal length lens.

Fig. 1

Two dermascope implementations. The USB-camera-based PMSI and PWLI dermascope is shown in (a) and (b). (a) Various components of the handheld imaging module (the USB camera is hidden behind the imaging polarizer) and (b) the imaging module paired with the smartphone camera. The smartphone-camera-based PMSI and PWLI dermascope is shown in (c), (d), and (e). (c) The smartphone-based system’s side opposite the smartphone screen with the imaging annulus removed, where the LED PCB and smartphone camera are visible and other components are highlighted; (d) the system with the imaging annulus attached; and (e) the smartphone installed in the dermascope.


To decrease the working distance of the optical system to allow imaging of the epidermis, a 24-mm focal length achromatic doublet (Ross Optical, El Paso, Texas, USA) is placed 4 mm away from the principal plane of the smartphone optical system, providing a magnification of m=0.187 and a numerical aperture NA=0.04. After cropping, the field of view (FOV) is 9.96  mm×11.67  mm, as shown in Fig. 2. The imaging achromat is aligned to the smartphone camera using a machined PMMA disk installed in a removable 3D-printed annulus of VeroBlue RGD840 (Stratasys, Eden Prairie, Minnesota, USA) plastic. The annulus serves as an imaging guide; its length equals the optical system working distance (23 mm), so the PCP can contact the patient to stabilize the device and ensure correct focus. An additional 3D-printed structure serves as a mounting platform for the smartphone, imaging annulus, and LED electronics.

Fig. 2

Layout of the LG G5 smartphone camera and the added achromat. The smartphone camera lens system is modeled as a paraxial lens.


The alternative PMSI and PWLI dermascope [Figs. 1(a) and 1(b)] is also based on an LG smartphone platform, but it utilizes an external USB-connected RGB camera (OV5648, Omnivision, Santa Clara, California, USA; 5 MP, 3.67  mm×2.74  mm) with the vendor-supplied 2.8-mm focal length lens adjusted to a working distance of 30 mm. After cropping, the FOV is 27.5  mm×20  mm. In addition, the integrated infrared (IR) filter was removed. Again, the mechanical design of the annulus is matched to the working distance of the camera, providing in-focus imaging when the device contacts the patient.

For both systems, multispectral illumination is accomplished using a custom printed circuit board (PCB) with LEDs of various wavelengths (Lumileds, Amsterdam, The Netherlands; Vishay, Malvern, Pennsylvania, USA) installed as shown in Table 1. The color wavelengths were chosen based on commercial availability and the ability to probe both hemoglobin isosbestic points and separate oxygenated from deoxygenated hemoglobin content along the molar attenuation curves (Fig. 3).

Table 1

LED and camera settings for each illumination wavelength and each dermascope. LED wavelength (λ), LED part number, smartphone camera LED-driving current (I), smartphone camera LED flux for a single LED of the given color, smartphone camera ISO setting, smartphone camera exposure time, USB camera LED-driving current (I), USB camera LED flux for a single LED of the given color, USB camera brightness setting, and USB camera exposure time are provided.

λ (nm)   Part number    Smartphone camera                          USB camera
                        I (mA)   Flux     ISO    Exposure (ms)     I (mA)   Flux     Brightness   Exposure (s)
4000 K   LXZ1-4070      358      101 lm   100    3.8               620      161 lm   50           1.6
450      LXZ1-PR01      620      690 mW   100    0.5               620      690 mW   50           1.6
470      LXZ1-PB01      620      46 lm    100    0.7               620      46 lm    50           1.6
500      LXZ1-PE01      620      100 lm   100    2.6               620      100 lm   50           1.6
530      LXZ1-PM01      620      142 lm   100    3.0               620      142 lm   50           1.6
580      LXZ1-PL01      358      42 lm    100    5.0               620      67 lm    50           1.6
660      LXZ1-PA01      620      420 mW   100    0.9               620      420 mW   50           1.6
810      VSMY98145DS    620      700 mW   2390   250.0             620      700 mW   50           1.6
940      L1IZ-0940      358      403 mW   2300   180.0             620      700 mW   50           1.6

Fig. 3

Molar extinction coefficients, ε(λ), for Hb, HbO2, and melanin plotted on a log scale and the LED spectral flux probability density functions, ϕe,λ, plotted on a linear scale.


For the smartphone-based dermascope, the PMMA disk used for mounting the lens also extends over the illumination LEDs to provide mounting for a linear polarizer (Edmund Optics, Barrington, New Jersey, USA). An orthogonal linear polarizer is installed in front of the imaging channel, enabling both PMSI and PWLI and reducing the effect of specular reflection on the images.28 The LED sources’ spectral fluxes, ϕe,λ, shown in Fig. 3, were measured with a spectrometer (Ocean Optics).

The USB-camera-based dermascope uses the same LED PCB and wavelengths for illumination along with orthogonal polarizers in the illumination channel (Edmund Optics) and the imaging channel (Moxtek, Orem, Utah, USA). To help normalize white-light image luminance, an 18% gray color reference (Kodak, Rochester, New York, USA) is permanently installed on both sides of the image FOV. Because the 3D-printed mounting foundation does not need to mount the LED board and imaging annulus, a previously designed geometry is used for this system.29

The illumination PCB consists of three LEDs of each color soldered in a symmetrical pattern around the camera aperture to maximize uniformity without additional beam-shaping optics. The backside solder mask of the PCB was removed to expose the copper, which is attached to a copper heatsink with electrically insulating epoxy (DP240, 3M, St. Paul, Minnesota, USA). Numerous vias were placed on the PCB to ensure a low thermal resistance between the front and backside copper planes. The LEDs are driven with a switching boost power supply (LT3478, Linear Technology, Milpitas, California, USA) powered by two lithium-ion batteries (Orbtronic, Saint Petersburg, Florida, USA). Each LED color string can be turned on individually with a custom power-level setting, and illumination and image capture are synchronized by a custom Android application through a Bluetooth-connected microcontroller (MCU, IOIO-OTG, SparkFun Electronics, Niwot, Colorado, USA). The LED-driving currents, fluxes, and dermascopes' image capture settings are shown in Table 1. In addition, the smartphone camera uses the daylight white balance setting, and the white balance setting of the USB camera is inaccessible. A block diagram of the system electronics is shown in Fig. 4.30 The Android application controls the camera functions, synchronizes the LED illumination, and sets the camera exposure time. For the USB-camera-based dermascope, the Android app was modified to use the USB camera instead of the on-board smartphone camera. Images are connected to an ID assigned to each patient, removing identifiable information from the smartphone. Screenshots of the app are shown in Fig. 4.

Fig. 4

The system electronics block diagram is provided in (a) and Android application screenshots in (b) and (c).


3. Methods

3.1. Data Processing

The algorithms used to process collected dermal images are provided in Algorithms 1 and 2. Descriptions of the steps and related equations are provided in the following sections.

Algorithm 1

Processing of reference images.

1: procedure ProcessReferenceImages (reference images)
2:  for all reference images do
3:   convert sRGB to linear RGB ▹ Eq. (1)
4:   if white-light image then
5:    convert linear RGB to CIEXYZ ▹ Eq. (2)
6:   else if color image then
7:    convert linear RGB to Yequal ▹ Eq. (3)
8:   end if
9:   calculate luminance reference ▹ Eq. (4)
10:   calculate illumination uniformity reference ▹ Eq. (5)
11:  end for
12:  return optical density reference images
13:  return illumination uniformity images
14: end procedure

Algorithm 2

Processing of dermal images.

1: procedure ProcessDermalImages (dermal images)
2:  for all dermal images do
3:   if USB camera then
4:    correct white-light image luminance ▹ Eq. (8)
5:   end if
6:   convert sRGB to linear RGB ▹ Eq. (1)
7:   correct by illumination uniformity ▹ Eq. (6)
8:   if white-light image then
9:    convert linear RGB to CIEXYZ ▹ Eq. (2)
10:    convert CIEXYZ to CIELAB ▹ Eqs. (17) and (18)
11:   else if color image then
12:    convert linear RGB to Yequal ▹ Eq. (3)
13:    calculate optical density ▹ Eq. (7)
14:    calculate melanin content ▹ Eq. (19)
15:    calculate erythema ▹ Eq. (20)
16:    solve chromophore concentration ▹ Eq. (14)
17:   end if
18:  end for
19: end procedure

3.1.1. Image collection

When the dermascopes were first built, images of an 18% reflective gray card were collected by each system at each wavelength to serve as both the optical density (OD) and illumination uniformity references.

For dermal image collection, a pilot study was performed on human subjects at the University of Arizona College of Medicine, Division of Dermatology to determine feasibility of each multispectral dermascope. This study received institutional review board approval (#1612067061). All patients provided informed written and oral consent.

3.1.2. Colorspace conversions

The melanin content, erythema, and chromophore concentration measurements rely on conversion to the CIELAB and CIEXYZ colorspaces. The imaging systems natively capture in the sRGB colorspace, and the images are first converted to linear RGB space:31

Eq. (1)

\[
C_{\text{linear}} =
\begin{cases}
\dfrac{C_{\text{sRGB}}}{12.92}, & C_{\text{sRGB}} \le 0.04045 \\[1ex]
\left(\dfrac{C_{\text{sRGB}} + 0.055}{1.055}\right)^{2.4}, & C_{\text{sRGB}} > 0.04045,
\end{cases}
\]
where \(C_{\text{sRGB}}\) is each channel of the sRGB image \(I_{\text{sRGB}}\). Images are then converted from linear RGB to CIEXYZ using the transformation matrix,31

Eq. (2)

\[
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} =
\begin{bmatrix}
0.4124 & 0.3576 & 0.1805 \\
0.2126 & 0.7152 & 0.0722 \\
0.0193 & 0.1192 & 0.9505
\end{bmatrix}
\cdot
\begin{bmatrix} R_{\text{linear}} \\ G_{\text{linear}} \\ B_{\text{linear}} \end{bmatrix},
\]
where Y is the luminance value and is used to calculate ODs from the dermis images and reference. Luminance is a measure that scales optical radiation by the response of the human visual system.32 Because the images will be processed by a computer, accurate color representation for a human is not required, so an additional luminance measure, Yequal, is created using the equal sum of all three channels:

Eq. (3)

\[
Y_{\text{equal}} =
\begin{bmatrix} 1 & 1 & 1 \end{bmatrix}
\cdot
\begin{bmatrix} R_{\text{linear}} \\ G_{\text{linear}} \\ B_{\text{linear}} \end{bmatrix}.
\]
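The colorspace steps of Eqs. (1)–(3) can be sketched in NumPy as follows (an illustrative sketch, not the authors' implementation; function names are ours):

```python
import numpy as np

def srgb_to_linear(c):
    """Inverse sRGB gamma of Eq. (1); c is a channel value in [0, 1]."""
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

# sRGB-to-CIEXYZ transformation matrix of Eq. (2)
M_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])

def linear_rgb_to_luminances(rgb_linear):
    """Return (Y, Y_equal) for an H x W x 3 linear-RGB image.

    Y is CIE luminance (second row of the Eq. (2) matrix);
    Y_equal is the equal-weight channel sum of Eq. (3)."""
    Y = rgb_linear @ M_XYZ[1]          # dot product with the Y row
    Y_equal = rgb_linear.sum(axis=-1)  # [1 1 1] . [R G B]
    return Y, Y_equal
```

A 50% sRGB gray, for example, maps to roughly 21.4% linear reflectance, which is why the 18% gray card appears near mid-gray in the captured sRGB images.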

3.1.3. Reference and illumination uniformity correction

Using the reference images that have been converted to CIEXYZ or Yequal, reference luminance images are defined as

Eq. (4)

\[
I_0 = \overline{Y_{\text{ref}}},
\]
where \(Y_{\text{ref}}\) is the Y (luminance) channel of the CIEXYZ image or \(Y_{\text{equal}}\), and the overbar denotes the mean. The reference grayscale image is normalized to serve as the illumination reference for the dermal images.

Eq. (5)

\[
U = \frac{Y_{\text{ref}}}{\max(Y_{\text{ref}})},
\]
where U is now the illumination uniformity correction matrix.

The dermal CIEXYZ and Yequal images are corrected in the same way

Eq. (6)

\[
I_{\text{dermal,uniformity corrected}} = \frac{I_{\text{dermal}}}{U}\,\overline{U},
\]
where \(I_{\text{dermal,uniformity corrected}}\) is the illumination-uniformity-corrected dermal image with constant mean luminance. Finally, OD dermal images are calculated as

Eq. (7)

\[
\text{OD} = -\ln\!\left(\frac{I}{I_0}\right).
\]
Finally, the USB dermascope has sections of an 18% gray photography card mounted on either side of the FOV [Fig. 1(b)]. Knowing the card image should equal 50% levels of RGB, the luminance of the white-light image is scaled using the following equation:

Eq. (8)

\[
I_{\text{dermal,luminance corrected}} = I_{\text{dermal,uniformity corrected}} \cdot \frac{0.5}{\overline{Y_{\text{gray}}}}.
\]
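The reference and correction steps of Eqs. (4)–(7) can be sketched in NumPy (an illustrative sketch under our reading of the equations, including the sign convention OD = −ln(I/I0); function names are ours):

```python
import numpy as np

def uniformity_reference(Y_ref):
    """Eqs. (4) and (5): mean reference luminance and uniformity map."""
    I0 = Y_ref.mean()             # Eq. (4): scalar luminance reference
    U = Y_ref / Y_ref.max()       # Eq. (5): normalized uniformity matrix
    return I0, U

def correct_and_od(Y_dermal, I0, U):
    """Eq. (6): flatten the illumination while preserving mean luminance,
    then Eq. (7): convert to optical density against the reference."""
    corrected = Y_dermal / U * U.mean()
    od = -np.log(corrected / I0)
    return corrected, od
```

With a perfectly flat reference field, U is identically 1 and the correction is a no-op, so the OD reduces to the plain Beer–Lambert log ratio.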

3.1.4. Chromophore concentration

The Beer–Lambert law is utilized to measure the relative concentrations of Hb, oxyhemoglobin (HbO2), and melanin:17,22,3335

Eq. (9)

\[
I(\lambda) = I_0(\lambda)\exp[-c_n\,\varepsilon(\lambda)\,\ell(\lambda)],
\]
where I is the resulting intensity, \(I_0\) is the incident intensity, \(c_n\) is the concentration of the chromophore, \(\varepsilon(\lambda)\) is the molar attenuation coefficient of the chromophore at a particular wavelength, and \(\ell(\lambda)\) is the optical path length of the light in the medium for the incident wavelength. This is restated as OD:

Eq. (10)

\[
\text{OD} = -\log\!\left(\frac{I(\lambda)}{I_0(\lambda)}\right)
= c_{\text{Hb}}\,\varepsilon_{\text{Hb}}(\lambda)\,\ell(\lambda)
+ c_{\text{HbO}_2}\,\varepsilon_{\text{HbO}_2}(\lambda)\,\ell(\lambda)
+ c_{\text{melanin}}\,\varepsilon_{\text{melanin}}(\lambda)\,\ell(\lambda)
+ c_{\text{background}},
\]
where \(c_{\text{background}}\) is due to residual absorption from molecules present in the epidermis and dermis.

The molar extinction coefficients for Hb and HbO236 and melanin37 are shown in Fig. 3. Jacques’s εmelanin37 was fit with an exponential curve to extend the wavelength to 1000 nm, resulting in a fit of

Eq. (11)

\[
\varepsilon_{\text{melanin}} = 2.2858\times10^{4}\,\exp(-5.5028\times10^{-3}\,\lambda).
\]

Optical path lengths, \(\ell(\lambda)\), for the chromophores are calculated from a linear fit of Anderson's data38 in the region of the illumination wavelengths,

Eq. (12)

\[
\ell(\lambda) = 2.62\times10^{-4}\,\lambda - 9.87\times10^{-2},
\]
where λ is in units of nm and \(\ell(\lambda)\) is in units of cm.

Because the LEDs are broad spectrum, we integrate over the wavelength probability density function to calculate a total molar attenuation coefficient39,40 for each color

Eq. (13)

\[
\varepsilon_{\text{total}} = \int \phi_{e,\lambda}(\lambda)\,\varepsilon(\lambda)\,\mathrm{d}\lambda.
\]
The resulting molar attenuation coefficients for all of the chromophores are shown in Table 2.
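Eq. (13) can be evaluated numerically, e.g., with a trapezoidal rule over the measured LED spectrum (an illustrative sketch; the uniform test spectrum and function names below are our assumptions, not the authors' measured data):

```python
import numpy as np

def total_attenuation(wavelengths_nm, led_pdf, epsilon):
    """Eq. (13): integrate the LED spectral-flux probability density
    against the molar attenuation curve using the trapezoidal rule.

    led_pdf must integrate to 1 over wavelengths_nm."""
    f = led_pdf * epsilon
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(wavelengths_nm)))
```

For a narrow LED, the weighted coefficient stays close to the peak-wavelength value; for a broad LED sampling a steep region of the attenuation curve, the two can differ substantially, which is the point of Table 2.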

Table 2

Molar extinction coefficients calculated using Eq. (13) for each illumination wavelength compared with the molar extinction coefficients for the peak wavelength.

                  Coefficients from Eq. (13)           Coefficients at peak LED wavelength
Wavelength (nm)   Hb        HbO2      Melanin          Hb        HbO2      Melanin
450               199,864   82,747    1922             103,292   62,816    1921
470               35,937    36,662    1706             16,156    33,209    1721
500               26,659    25,521    1392             20,862    20,932    1459
530               37,824    34,851    1241             39,036    39,957    1237
580               22,606    13,258    869              37,010    50,104    940
660               3380      352       611              3227      320       605
810               845       812       281              717       864       265
940               656       1185      142              693       1214      130
(All coefficients in cm−1 M−1.)

A system of equations is built from the multispectral datacube and the molar attenuation coefficients shown in Table 2

Eq. (14)

\[
\begin{bmatrix}
\varepsilon_{\text{Hb}}(\lambda_1)\ell(\lambda_1) & \varepsilon_{\text{HbO}_2}(\lambda_1)\ell(\lambda_1) & \varepsilon_{\text{melanin}}(\lambda_1)\ell(\lambda_1) & 1 \\
\varepsilon_{\text{Hb}}(\lambda_2)\ell(\lambda_2) & \varepsilon_{\text{HbO}_2}(\lambda_2)\ell(\lambda_2) & \varepsilon_{\text{melanin}}(\lambda_2)\ell(\lambda_2) & 1 \\
\varepsilon_{\text{Hb}}(\lambda_3)\ell(\lambda_3) & \varepsilon_{\text{HbO}_2}(\lambda_3)\ell(\lambda_3) & \varepsilon_{\text{melanin}}(\lambda_3)\ell(\lambda_3) & 1 \\
\vdots & \vdots & \vdots & \vdots \\
\varepsilon_{\text{Hb}}(\lambda_n)\ell(\lambda_n) & \varepsilon_{\text{HbO}_2}(\lambda_n)\ell(\lambda_n) & \varepsilon_{\text{melanin}}(\lambda_n)\ell(\lambda_n) & 1
\end{bmatrix}
\begin{bmatrix} c_{\text{Hb}} \\ c_{\text{HbO}_2} \\ c_{\text{melanin}} \\ c_{\text{background}} \end{bmatrix}
=
\begin{bmatrix} \text{OD}(\lambda_1) \\ \text{OD}(\lambda_2) \\ \text{OD}(\lambda_3) \\ \vdots \\ \text{OD}(\lambda_n) \end{bmatrix},
\]
and the system is solved by linear least-squares techniques,33 where \(\text{OD}(\lambda_n)\) are the calculated OD matrices for each illumination wavelength.
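The per-pixel least-squares solve of Eq. (14) can be sketched with NumPy (an illustrative sketch, not the authors' code; the coefficient values passed in would come from Table 2 and Eq. (12)):

```python
import numpy as np

def solve_chromophores(od_stack, eps_ell):
    """Solve Eq. (14) per pixel by linear least squares.

    od_stack: (n_wavelengths, H, W) optical-density images.
    eps_ell:  (n_wavelengths, 3) products eps(lambda) * ell(lambda)
              for Hb, HbO2, and melanin at each illumination wavelength.
    Returns a (4, H, W) stack of relative concentrations:
    c_Hb, c_HbO2, c_melanin, c_background."""
    n, h, w = od_stack.shape
    A = np.hstack([eps_ell, np.ones((n, 1))])  # append background column
    b = od_stack.reshape(n, -1)                # one column per pixel
    c, *_ = np.linalg.lstsq(A, b, rcond=None)  # solves all pixels at once
    return c.reshape(4, h, w)
```

Solving all pixels as columns of one right-hand-side matrix avoids a Python loop over the image and is how the overdetermined system (more wavelengths than chromophores) is typically handled.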

The ability of the dermascopes to properly measure relative chromophore concentrations was validated using a finger occlusion test. Images were taken with both dermascopes and the chromophores mapped preocclusion, after 2 min of occlusion, postocclusion, and 5 min after ending the occlusion.41

3.1.5. Melanin and erythema

To measure melanin content and erythema, the white-light image is converted to the CIELAB42 colorspace using lightness (L*) as a measure of relative melanin content and the red color stimulus direction (a*) as a measure of redness, with more positive values indicating higher levels of erythema.43 Before converting to CIELAB, normalization constants must be calculated from the white-LED spectral content. Using the color matching functions,44 \(\bar{x}(\lambda)\), \(\bar{y}(\lambda)\), and \(\bar{z}(\lambda)\) (Fig. 5), X, Y, and Z are calculated as42

Eq. (15)

\[
X = \int_{380\,\text{nm}}^{780\,\text{nm}} \bar{x}(\lambda)\,\phi_{e,\lambda}\,\mathrm{d}\lambda;\quad
Y = \int_{380\,\text{nm}}^{780\,\text{nm}} \bar{y}(\lambda)\,\phi_{e,\lambda}\,\mathrm{d}\lambda;\quad
Z = \int_{380\,\text{nm}}^{780\,\text{nm}} \bar{z}(\lambda)\,\phi_{e,\lambda}\,\mathrm{d}\lambda,
\]
where \(\phi_{e,\lambda}\) is the relative spectral flux of the white-LED source as shown in Fig. 3. The normalization constants \(X_n\), \(Y_n\), and \(Z_n\) are calculated by

Eq. (16)

\[
X_n = \frac{X}{Y};\quad Y_n = \frac{Y}{Y};\quad Z_n = \frac{Z}{Y}.
\]

Fig. 5

Color matching curves used to determine normalization constants to convert to CIELAB along with the 4000 K white-LED spectrum.


The image is then converted to CIELAB by42

Eq. (17)

\[
L^* = 116\,f\!\left(\frac{Y}{Y_n}\right) - 16,\quad
a^* = 500\left[f\!\left(\frac{X}{X_n}\right) - f\!\left(\frac{Y}{Y_n}\right)\right],\quad
b^* = 200\left[f\!\left(\frac{Y}{Y_n}\right) - f\!\left(\frac{Z}{Z_n}\right)\right],
\]
where

Eq. (18)

\[
f(x) =
\begin{cases}
x^{1/3}, & x > (24/116)^3 \\[0.5ex]
(841/108)\,x + 16/116, & x \le (24/116)^3.
\end{cases}
\]
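The CIELAB conversion of Eqs. (17) and (18) can be sketched in NumPy (an illustrative sketch with our function names; the white-point values would be the Table 4 constants):

```python
import numpy as np

def f_lab(x):
    """Eq. (18): CIELAB compression function, cube root above the
    threshold and a matched linear segment below it."""
    x = np.asarray(x, dtype=float)
    thresh = (24.0 / 116.0) ** 3
    return np.where(x > thresh, np.cbrt(x), (841.0 / 108.0) * x + 16.0 / 116.0)

def xyz_to_lab(X, Y, Z, Xn, Yn, Zn):
    """Eq. (17): CIEXYZ -> CIELAB relative to white point (Xn, Yn, Zn)."""
    fx, fy, fz = f_lab(X / Xn), f_lab(Y / Yn), f_lab(Z / Zn)
    L = 116.0 * fy - 16.0
    a = 500.0 * (fx - fy)
    b = 200.0 * (fy - fz)
    return L, a, b
```

By construction, the white point itself maps to L* = 100 and a* = b* = 0, so a* directly reads out redness relative to the illuminant.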

In addition to the white-light image measures, melanin and erythema measures are constructed from the color-OD images. Melanin content16,45 is calculated as

Eq. (19)

\[
\text{Melanin} = \text{OD}_{660} - \text{OD}_{940}.
\]
As shown in Table 2, these two wavelengths maximize the difference in melanin absorption and minimize the effect of Hb and HbO2 absorption.

Erythema, due to increased blood content, results in increased blue light absorption but little change in red light absorption46 as shown in Table 2. Therefore, an erythema index is constructed as

Eq. (20)

\[
\text{Erythema} = \text{OD}_{470} - \text{OD}_{660}.
\]
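Both indices of Eqs. (19) and (20) are simple per-pixel OD differences (sketch with our function names):

```python
def melanin_index(od_660, od_940):
    """Eq. (19): OD difference at 660 and 940 nm, where melanin
    absorption differs most and hemoglobin absorption is minimal."""
    return od_660 - od_940

def erythema_index(od_470, od_660):
    """Eq. (20): blue-minus-red OD difference, sensitive to blood
    content and largely insensitive to red absorption changes."""
    return od_470 - od_660
```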

3.2. Optical System Characterization

The linearity of the camera responses was measured by adjusting the exposure time in the case of the smartphone-camera-based dermascope and image brightness in the case of the USB-camera-based dermascope, capturing images of the matte 18% gray photography card with each LED color, and measuring the image luminance mean at each wavelength.

The cutoff frequency and FOV of each imaging system were validated with a 1951 United States Air Force (USAF) resolution test chart, and the modulation transfer function (MTF) was measured using the slanted-edge method.47

Illumination uniformity was measured by illuminating the matte 18% gray photography card with each LED color and imaging the surface with the dermascope. The uniformity is quantified using the coefficient of variation, \(c_v\),48 on normalized data

Eq. (21)

\[
\text{Uniformity} = 1 - c_v = 1 - \frac{\sigma}{\bar{x}},
\]
where \(\bar{x}\) is the mean of the pixels in the image and σ is the standard deviation of the pixel values.
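Eq. (21) amounts to one line of NumPy (illustrative sketch; the function name is ours):

```python
import numpy as np

def illumination_uniformity(image):
    """Eq. (21): 1 - coefficient of variation of the pixel values.
    Returns 1.0 for a perfectly flat field, lower for uneven fields."""
    pixels = np.asarray(image, dtype=float)
    return 1.0 - pixels.std() / pixels.mean()
```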

4. Results

4.1. Clinical Results

Following are the RGB, chromophore, melanin, and erythema measures for cases of junctional nevus (JN) (Fig. 6) and squamous cell carcinoma (SCC) (Fig. 7); each case was captured with both the USB camera dermascope and the smartphone camera dermascope.

Fig. 6

The same JN imaged by both the smartphone and USB dermascopes. For the smartphone dermascope, (a) the RGB images after illumination uniformity correction, (b) the relative chromophore concentrations, and (d) lightness as measured by L*, redness as measured by a*, melanin calculated from Eq. (19), and erythema calculated from Eq. (20). The same measures are shown for the USB dermascope in (f), (c), and (e), respectively. A 5-mm scale bar is provided for both the smartphone-camera images and the USB camera images above the RGB image grids.


Fig. 7

The same SCC imaged by both the smartphone and USB dermascopes. For the smartphone dermascope, (a) the RGB images after illumination uniformity correction, (b) the relative chromophore concentrations, and (d) lightness as measured by L*, redness as measured by a*, melanin calculated from Eq. (19), and erythema calculated from Eq. (20). The same measures are shown for the USB dermascope in (f), (c), and (e), respectively. A 5-mm scale bar is provided for both the smartphone-camera images and the USB camera images above the RGB image grids.


The chromophore maps for both dermascopes at the chosen time points for the occlusion test are shown in Fig. 8.

Fig. 8

Finger occlusion test results for the smartphone camera and USB camera at preocclusion, after occlusion for 2 min, and postocclusion. The bottom plot provides the mean relative concentration for Hb and HbO2 inside the rectangle showing a dip in HbO2 and increase in Hb after occlusion.


4.2. Optical System Performance

Figure 9 shows the changes in the mean of the sum of the red, green, and blue image channels over varying exposure times for the smartphone-based camera and over brightness settings for the USB camera.

Fig. 9

Mean of the sum of the red, green, and blue channels over changing exposure times for the smartphone-based camera and changing brightness settings for the USB camera.


Figure 10 shows full-field and zoomed 1951 USAF resolution test chart images after cropping along with measured MTF data using the slanted-edge test for both dermascopes.

Fig. 10

(a)–(c) Smartphone camera and (d)–(f) USB camera results of 1951 USAF resolution test chart imaging along with measured MTFs from a slanted-edge test. The smartphone camera's measured MTF matches the USAF cutoff frequency of group 5, element 6 (57 lp/mm). The USB camera's measured MTF matches the USAF cutoff frequency of group 3, element 6 (14.25 lp/mm).


Maps of the illumination uniformities of both systems are shown in Fig. 11, and the coefficients of variation are given in Table 3.

Fig. 11

Normalized luminance maps showing the illumination uniformity of each device at each illumination wavelength, corresponding to U in Eq. (5).


Table 3

LED illumination uniformity according to Eq. (21).

Wavelength   Smartphone camera   USB camera
White        0.954               0.880
450          0.974               0.980
470          0.982               0.977
500          0.935               0.969
530          0.977               0.955
580          0.980               0.949
660          0.916               0.972
810          0.852               0.900
940          0.907               0.870

4.3. CIEXYZ Normalization

The CIEXYZ normalization constants calculated from the white-LED spectrum for the two dermascopes are shown in Table 4.

Table 4

Measured CIEXYZ normalization constants for both dermascopes.

Constant   Smartphone camera   USB camera
Xn         82.873              82.846
Yn         100                 100
Zn         34.567              48.757

5. Discussion

The distribution of polarized multispectral dermascopes based on smartphone platforms and low-cost color LEDs to PCPs (and eventually to consumers) has the potential to democratize dermal chromophore and melanoma mapping along with erythema monitoring, improving quantitative monitoring of lesions and increasing early detection of skin cancers.

This platform demonstrates a number of advantages compared with previous systems targeting chromophore mapping and skin cancer screening.14,15,17,18,20,22–24 The smartphone platform is a compact, low-cost, portable, easy-to-use system with native image capture and processing capabilities, which removes the need for expensive, clinical-grade imaging systems.17–19 The platform is flexible enough to use either the embedded camera for imaging or a separate USB-connected camera, depending on the desired ergonomics of the user. Additionally, the smartphone camera can be used for wide-field, white-light image capture of larger areas, either by removing the imaging annulus of the smartphone-camera-based dermascope [Fig. 1(c)] or by using the unobstructed camera of the USB camera's host smartphone.

The use of low-cost, compact, high-power, high-efficacy, surface mount LEDs improves on the costs and complexities associated with laser-based,22,24 interference-filter-based,14,15,20 and spectrometer-based21,23 systems. While these systems likely allow for better discrimination due to their narrow-bandwidth sources or detection schemes, the costs involved (with the possible exception of the laser-based systems) are prohibitive. High-reliability LEDs are available in myriad wavelengths to probe various points along the chromophore molar attenuation curves (Fig. 3) and can be powered with simple driving circuits. Surface-mount packages remove the bulk of transistor outline can packages (or larger packages) necessary for edge-emitting lasers, and the broad wavelength selection is wider than that of surface mount laser packages such as vertical-cavity surface-emitting lasers. The cost of LED sources compared with laser sources or interference filters allows for the use of multiple wavelengths in a single system while keeping bill of materials (BOM) costs low.

5.1. Clinical Testing

Initial testing of the systems is promising as both systems were able to capture full image datasets and return similar results of relative chromophore concentrations across multiple dermal lesions except for Hb in the JN case, as shown in Fig. 6. The deviation could be explained by the difference in IR imaging performance between the two dermascopes.

In addition, relative melanin content and erythema as measured through the CIELAB white-light images and OD color images agreed between systems and are reasonable based on visual examination. The USB camera and smartphone camera have differing levels of luminance in their white-light images as seen in Figs. 6 and 7, leading to a difference in baseline lightness and redness values, where the higher-luminance smartphone images show higher overall L* and a* values. However, as seen in Fig. 6, the relative changes are similar, where ΔL*≈3 between the nevus and surrounding skin and Δa*≈3 between the nevus and surrounding skin.

The occlusion test (Fig. 8) provided directionally correct results for both dermascopes, although the magnitudes of change in chromophore concentration were dissimilar between dermascopes. Again, this deviation could be explained by the difference in IR imaging performance between the two dermascopes. With the next system revision, the ability to measure absolute concentrations should be confirmed with known blood and melanin phantoms.

To fully validate the system, a full clinical trial of longitudinal data with multiple types of skin lesions in addition to testing patients with a wide range of baseline melanin levels will be necessary.49 Once a large dataset is collected along with biopsy and diagnosis results, classification algorithms can be built using machine learning, principal components analysis, or similar tools.25–27,50 The statistics of the large dataset and the classifier can then be used to predict the relationships between chromophores, lesion type, and diagnosis. In our two datasets, high melanin concentrations were present for the JN case but not for the SCC case. The classifier will help to determine whether this relationship holds more generally and how it might change in patients with high baseline levels of melanin. Likewise, while the Hb and HbO2 levels were similar in our two datasets, a larger dataset might reveal that cancerous activity increases blood flow,51 increasing both Hb and HbO2 and possibly the ratios between them. The classifier could use additional features and relationships in the images. For example, by Eq. (12), the optical path length increases as the wavelength increases, increasing the probe depth. Detecting lesion shape changes over depth through edge detection or similar means could provide another layer of information. Hints of these changes are apparent in both the JN and SCC cases as both have changing edges as the wavelength changes. Likewise, the classifier could potentially use additional measures such as blood contrasts16 and oxygenation percentages.52

5.2. Measured Optical Performance

Both cameras produced approximately linear responses when changing exposure time in the case of the smartphone camera dermascope and brightness in the case of the USB camera, providing confidence in the ability of the systems to have a linear response to intensity changes from illumination absorption.

For the smartphone dermascope, the measured MTF performance matched both the predicted diffraction-limited performance and the cutoff frequency measured with the USAF target, where group 5, element 6 (57 lp/mm) is resolvable. The root mean square error (RMSE) between the measured MTF and predicted diffraction-limited performance was RMSE=0.97. The USB dermascope's measured MTF performance did not match the predicted diffraction-limited performance (RMSE=0.384); however, full specifications of the imaging lens are not provided by the manufacturer, precluding a more accurate estimation of the true diffraction-limited performance. The lens's NA was estimated to be 0.004 based on the slanted-edge measurement. The measured MTF cutoff frequency matched the USAF target measurement, where group 3, element 6 (14.25 lp/mm) was resolvable. As shown in the dermal images, both dermascopes demonstrated sufficient image quality for most reasonably sized lesions, with the ability to resolve features as small as 17 μm for the smartphone dermascope and 70 μm for the USB dermascope.

Illumination uniformity was greater than 85% for all wavelengths with both dermascopes and was easily corrected in the image processing algorithms.
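One common way to apply such a correction is per-wavelength flat-fielding against a reference image of a uniform diffuse target captured under the same LED. The sketch below is illustrative only and is not the paper's Algorithm 1; the function and argument names are hypothetical:

```python
import numpy as np

def flat_field_correct(raw, flat, dark=None):
    """Correct non-uniform illumination with a per-wavelength flat field.

    raw: sample image; flat: image of a uniform diffuse reflectance target
    under the same illumination; dark: optional dark frame.
    """
    raw = np.asarray(raw, float)
    flat = np.asarray(flat, float)
    if dark is not None:
        raw = raw - dark
        flat = flat - dark
    # Normalize so the correction preserves the mean signal level,
    # guarding against division by zero in dark corners.
    gain = flat.mean() / np.clip(flat, 1e-9, None)
    return raw * gain
```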

5.3.

Next Steps

A number of improvements could be made to the systems before conducting a large-scale clinical trial. Currently, the processing does not incorporate color-to-color spatial image registration. The effect of this is most readily seen in Fig. 7, where the border markings do not completely overlap. Image capture of a full dataset takes about 20 s. Increasing capture speed would reduce the likelihood of blur between images, easing the need for color-to-color registration, and would also improve patient comfort. If capture speed cannot be increased, having the clinician deliberately add the markings would likely improve registration because they provide high-contrast, well-defined features to extract.
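One lightweight way to implement the missing color-to-color registration is phase correlation between spectral channels, which recovers a translational offset from the cross-power spectrum. The NumPy-only sketch below handles integer shifts and is offered as one possible approach, not the authors' method:

```python
import numpy as np

def phase_correlation_shift(ref, mov):
    """Estimate the integer (row, col) shift that aligns `mov` to `ref`.

    Returns the shift to pass to np.roll(mov, shift, axis=(0, 1)).
    """
    # Cross-power spectrum; normalize to keep only phase information.
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(mov))
    corr = np.fft.ifft2(F / np.clip(np.abs(F), 1e-12, None)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size to negative values.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

Sub-pixel refinements (e.g., fitting the correlation peak) or feature-based registration on the clinician's border markings could extend this basic approach.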

The USB dermascope could benefit from an improved lens design. Future systems could take advantage of smartphones with two rear cameras, adding stereoscopic 3D imaging to provide a topography of the skin lesion. Alternatively, the dual cameras could provide two FOVs or two NAs for imaging flexibility.

Additional illumination optics, such as diffusers,24 could increase illumination uniformity. The LED board was originally designed to take advantage of the dual cameras of the LG G5, but reducing the center aperture of the LED board could increase illumination uniformity and reduce system size. LED wavelengths could also be better tailored to the task or expanded into UV wavelengths to probe potential autofluorescence signatures.

Finally, to determine the effect of the IR filter on the mapping performance, an additional dermascope should be built and tested with the USB camera in which the IR filter is not removed.

6.

Conclusion

Two geometries of smartphone-based dermascopes for dermal lesion screening and erythema monitoring using PMSI and PWLI are described. These devices augment the capabilities of PCPs, with the potential for earlier detection of melanoma and NMSC along with quantitative monitoring of erythema. The combination of LED sources, 3D printing, and smartphone-based imaging enables low-cost (a high-volume BOM cost of <$40, excluding the smartphone, should be easily achievable), feature-rich, easy-to-use medical imaging devices using either the smartphone camera or a USB camera. While initial results are promising, a longitudinal clinical trial with histopathology gold standards will be necessary to validate the diagnostic performance of the devices across multiple lesion types and skin types.

7.

Appendix A: Processing Algorithms Visualized

Figures 12–14 provide flowcharts to help visualize the processing algorithms for the reference, white-light, and multispectral images provided in Algorithms 1 and 2.

Fig. 12

Visual process flow for the reference images as provided in Algorithm 1.


Fig. 13

Visual process flow for the white-light images as provided in Algorithm 2.


Fig. 14

Visual process flow for the multispectral images as provided in Algorithm 2.


8.

Appendix B: Finger Occlusion

While the Hb and HbO2 chromophore levels should change during the finger occlusion test, as shown in Fig. 8, the melanin and background measures should remain constant. Figure 15 shows these additional measures during the occlusion test, with the melanin measure divided by 100 and the background measure divided by 10,000 for easier comparison of changes between measures. The USB camera's melanin and background measurements are more stable across the time points than the smartphone camera's.
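The rescaling used for Fig. 15 is a simple change of plotting units, and the stability comparison can be made quantitative with a coefficient of variation across time points. The sketch below uses hypothetical dictionary keys and is not the paper's plotting code:

```python
import numpy as np

def rescale_for_comparison(measures):
    """Scale Eq. (14) outputs onto comparable ranges, as in Fig. 15:
    melanin divided by 100, background by 10,000."""
    out = dict(measures)
    out["melanin"] = measures["melanin"] / 100.0
    out["background"] = measures["background"] / 10_000.0
    return out

def stability(series):
    """Coefficient of variation across time points; lower is more stable."""
    s = np.asarray(series, float)
    return s.std() / abs(s.mean())
```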

Fig. 15

Finger occlusion test results for the smartphone camera and USB camera at preocclusion, after occlusion for 2 min, and postocclusion for Hb, HbO2, melanin, and background measures resulting from Eq. (14). Here, the melanin measure has been divided by 100 and the background measure divided by 10,000 for easier comparison with the changes in Hb and HbO2.


Disclosures

The authors declare that no competing interests exist.

Acknowledgments

The authors would like to thank Oliver Spires for assistance in manufacturing the PMMA disk, Pier Morgan and the Center for Gamma-ray Imaging for use and operation of the rapid prototype printer, David Coombs for assistance with PCB assembly, and Shaobai Li for collection of USAF and slanted-edge images. We are grateful for our funding sources. This research was supported by the National Institutes of Health Biomedical Imaging and Spectroscopy Training under Grant No. T32EB000809 and the National Institutes of Health under Grant No. S10OD018061.

References

1. F. Bray et al., "Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries," CA Cancer J. Clin. 68(6), 394–424 (2018). https://doi.org/10.3322/caac.21492

2. R. L. Siegel et al., "Cancer statistics, 2019," CA Cancer J. Clin. 69(1), 7–34 (2019). https://doi.org/10.3322/caac.21551

3. E. Harrington et al., "Diagnosing malignant melanoma in ambulatory care: a systematic review of clinical prediction rules," BMJ Open 7(3), e014096 (2017). https://doi.org/10.1136/bmjopen-2016-014096

4. A. Ehrlich et al., "Trends in dermatology practices and the implications for the workforce," J. Am. Acad. Dermatol. 77(4), 746–752 (2017). https://doi.org/10.1016/j.jaad.2017.06.030

5. M. R. Sargen et al., "The dermatology workforce supply model: 2015–2030," Dermatol. Online J. 23(9), 8 (2017).

6. J. Y. Yoo et al., "Trends in dermatology: geographic density of US dermatologists," Arch. Dermatol. 146(7), 779 (2010). https://doi.org/10.1001/archdermatol.2010.127

7. C. J. Koelink et al., "Skin lesions suspected of malignancy: an increasing burden on general practice," BMC Fam. Pract. 15(1), 29 (2014). https://doi.org/10.1186/1471-2296-15-29

8. A. J. Gewirtzman et al., "Computerized digital dermoscopy," J. Cosmet. Dermatol. 2(1), 14–20 (2003). https://doi.org/10.1111/j.1473-2130.2003.00009.x

11. N. R. Abbasi et al., "Early diagnosis of cutaneous melanoma: revisiting the ABCD criteria," JAMA 292(22), 2771 (2004). https://doi.org/10.1001/jama.292.22.2771

12. M. A. Weinstock et al., "Enhancing skin self-examination with imaging: evaluation of a mole-mapping program," J. Cutan. Med. Surg. 8(1), 5 (2004). https://doi.org/10.1177/120347540400800101

13. V. Chiu et al., "The use of mole-mapping diagrams to increase skin self-examination accuracy," J. Am. Acad. Dermatol. 55(2), 245–250 (2006). https://doi.org/10.1016/j.jaad.2006.02.026

14. B. Farina et al., "Multispectral imaging approach in the diagnosis of cutaneous melanoma: potentiality and limits," Phys. Med. Biol. 45(5), 1243–1254 (2000). https://doi.org/10.1088/0031-9155/45/5/312

15. A. Vogel et al., "Using noninvasive multispectral imaging to quantitatively assess tissue vasculature," J. Biomed. Opt. 12(5), 051604 (2007). https://doi.org/10.1117/1.2801718

16. D. Kapsokalyvas et al., "Imaging of human skin lesions with the multispectral dermoscope," Proc. SPIE 7548, 754808 (2010). https://doi.org/10.1117/12.841790

17. D. Jakovels et al., "RGB imaging system for mapping and monitoring of hemoglobin distribution in skin," Proc. SPIE 8158, 81580R (2011). https://doi.org/10.1117/12.893789

18. I. Kuzmina et al., "Contact and contactless diffuse reflectance spectroscopy: potential for recovery monitoring of vascular lesions after intense pulsed light treatment," J. Biomed. Opt. 16(4), 040505 (2011). https://doi.org/10.1117/1.3569119

19. I. Diebele et al., "Clinical evaluation of melanomas and common nevi by spectral imaging," Biomed. Opt. Express 3(3), 467 (2012). https://doi.org/10.1364/BOE.3.000467

20. F. Wang et al., "High-contrast subcutaneous vein detection and localization using multispectral imaging," J. Biomed. Opt. 18(5), 050504 (2013). https://doi.org/10.1117/1.JBO.18.5.050504

21. F. Vasefi et al., "Polarization-sensitive hyperspectral imaging in vivo: a multimode dermoscope for skin analysis," Sci. Rep. 4(1), 4924 (2015). https://doi.org/10.1038/srep04924

22. J. Spigulis et al., "Snapshot RGB mapping of skin melanin and hemoglobin," J. Biomed. Opt. 20(5), 050503 (2015). https://doi.org/10.1117/1.JBO.20.5.050503

23. F. Vasefi et al., "Multimode optical dermoscopy (SkinSpect) analysis for skin with melanocytic nevus," Proc. SPIE 9711, 971110 (2016). https://doi.org/10.1117/12.2214288

24. J. Spigulis et al., "Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination," J. Biomed. Opt. 22(9), 091508 (2017). https://doi.org/10.1117/1.JBO.22.9.091508

25. M. Elbaum et al., "Automatic differentiation of melanoma from melanocytic nevi with multispectral digital dermoscopy: a feasibility study," J. Am. Acad. Dermatol. 44(2), 207–218 (2001). https://doi.org/10.1067/mjd.2001.110395

26. G. Lu et al., "Medical hyperspectral imaging: a review," J. Biomed. Opt. 19(1), 010901 (2014). https://doi.org/10.1117/1.JBO.19.1.010901

27. B. Song et al., "Automatic classification of dual-modality, smartphone-based oral dysplasia and malignancy images using deep learning," Biomed. Opt. Express 9(11), 5318–5329 (2018). https://doi.org/10.1364/BOE.9.005318

28. R. R. Anderson, "Polarized light examination and photography of the skin," Arch. Dermatol. 127(7), 1000–1005 (1991). https://doi.org/10.1001/archderm.1991.01680060074007

29. R. D. Uthoff et al., "Small form factor, flexible, dual-modality handheld probe for smartphone-based, point-of-care oral and oropharyngeal cancer screening," J. Biomed. Opt. 24(10), 106003 (2019). https://doi.org/10.1117/1.JBO.24.10.106003

30. R. D. Uthoff, "Rossuthoff/multichannel_led_driver: KiCAD design files," (2018).

31. International Electrotechnical Commission, "Multimedia systems and equipment—colour measurement and management—Part 2-1: colour management—default RGB colour space—sRGB," (1999).

32. B. G. Grant, Field Guide to Radiometry, SPIE Press, Bellingham, Washington (2011).

33. I. Kuzmina et al., "Towards noncontact skin melanoma selection by multispectral imaging analysis," J. Biomed. Opt. 16(6), 060502 (2011). https://doi.org/10.1117/1.3584846

34. S. L. Jacques et al., "Rapid spectral analysis for spectral imaging," Biomed. Opt. Express 1(1), 157–164 (2010). https://doi.org/10.1364/BOE.1.000157

35. D. Jakovels et al., "2-D mapping of skin chromophores in the spectral range 500–700 nm," J. Biophotonics 3(3), 125–129 (2010). https://doi.org/10.1002/jbio.200910069

36. S. A. Prahl, "Tabulated molar extinction coefficient for hemoglobin in water," (1998). https://omlc.org/spectra/hemoglobin/summary.html

37. S. L. Jacques, "Melanosome absorption coefficient," (1998). https://omlc.org/spectra/melanin/mua.html

38. R. R. Anderson et al., "The optics of human skin," J. Invest. Dermatol. 77(1), 13–19 (1981). https://doi.org/10.1111/1523-1747.ep12479191

39. A. E. Siegman, Lasers, University Science Books, Mill Valley, California (1986).

40. A. Belay et al., "Determination of integrated absorption cross-section, oscillator strength and number density of caffeine in coffee beans by the integrated absorption coefficient technique," Int. J. Phys. Sci. 4(11), 722–728 (2009).

41. I. Nishidate et al., "Estimation of melanin and hemoglobin using spectral reflectance images reconstructed from a digital RGB image by the Wiener estimation method," Sensors 13(6), 7902–7915 (2013). https://doi.org/10.3390/s130607902

42. J. Schanda et al., Colorimetry: Understanding the CIE System, Wiley-Interscience, Hoboken, New Jersey (2007).

43. J. K. Wagner et al., "Comparing quantitative measures of erythema, pigmentation and skin response using reflectometry," Pigment Cell Res. 15(5), 379–384 (2002). https://doi.org/10.1034/j.1600-0749.2002.02042.x

44. C. Wyman et al., "Simple analytic approximations to the CIE XYZ color matching functions," J. Comput. Graph. Tech. 2(2), 11 (2013).

45. I. Diebele et al., "Analysis of skin basalioma and melanoma by multispectral imaging," Proc. SPIE 8427, 842732 (2012). https://doi.org/10.1117/12.922301

46. B. L. Diffey et al., "A portable instrument for quantifying erythema induced by ultraviolet radiation," Br. J. Dermatol. 111(6), 663–672 (1984). https://doi.org/10.1111/j.1365-2133.1984.tb14149.x

47. P. D. Burns, "Slanted-edge MTF for digital camera and scanner analysis," in Proc. IS&T, 135–138 (2000).

48. Z. Qin et al., "Analysis of condition for uniform lighting generated by array of light emitting diodes with large view angle," Opt. Express 18(16), 17460–17476 (2010). https://doi.org/10.1364/OE.18.017460

49. S. Sprigle et al., "Detection of skin erythema in darkly pigmented skin using multispectral images," Adv. Skin Wound Care 22(4), 172–179 (2009). https://doi.org/10.1097/01.ASW.0000305465.17553.1c

50. P. Rubegni et al., "Digital dermoscopy analysis and artificial neural network for the differentiation of clinically atypical pigmented skin lesions: a retrospective study," J. Invest. Dermatol. 119(2), 471–474 (2002). https://doi.org/10.1046/j.1523-1747.2002.01835.x

51. M. Lupu et al., "Vascular patterns in basal cell carcinoma: dermoscopic, confocal and histopathological perspectives," Oncol. Lett. 17(5), 4112–4125 (2019). https://doi.org/10.3892/ol.2019.10070

52. A. Cysewska-Sobusiak, "Noninvasive monitoring of arterial blood oxygenation with spectrophotometric technique," Proc. SPIE 1711, 311–323 (1993). https://doi.org/10.1117/12.155671

Biography

Ross D. Uthoff received his BS degree in optical engineering from the Rose-Hulman Institute of Technology and his PhD from the James C. Wyant College of Optical Sciences at the University of Arizona with a focus on biomedical imaging with smartphone cameras. He is currently a lidar engineer at Lumotive, working on lidar utilizing metasurface-based beam steering. He is a member of SPIE.

Biographies of the other authors are not available.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Ross D. Uthoff, Bofan Song, Melody Maarouf, Vivian Y. Shi, and Rongguang Liang "Point-of-care, multispectral, smartphone-based dermascopes for dermal lesion screening and erythema monitoring," Journal of Biomedical Optics 25(6), 066004 (23 June 2020). https://doi.org/10.1117/1.JBO.25.6.066004
Received: 1 November 2019; Accepted: 8 June 2020; Published: 23 June 2020
KEYWORDS: Cameras, Chromophores, Light emitting diodes, Imaging systems, Point-of-care devices, Modulation transfer functions, Image processing