Functional brain connectivity related to surgical skill dexterity in physical and virtual simulation environments
Arun Nemani, Anil Kamat, Yuanyuan Gao, Meryem A. Yucel, Denise Gee, Clairice Cooper, Steven D. Schwaitzberg, Xavier Intes, Anirban Dutta, Suvranu De
Abstract

Significance: Surgical simulators, both virtual and physical, are increasingly used as training tools for teaching and assessing surgical technical skills. However, the metrics used for assessment in these simulation environments are often subjective and inconsistent.

Aim: We propose functional activation metrics, derived from brain imaging measurements, to objectively assess the correspondence between brain activation and surgical motor skills for subjects with varying degrees of surgical skill.

Approach: Cortical activation, based on changes in oxygenated hemoglobin (HbO), was measured in 36 subjects using functional near-infrared spectroscopy at the prefrontal cortex (PFC), the primary motor cortex, and the supplementary motor area (SMA), regions chosen for their association with motor skill learning. Inter-regional functional connectivity metrics, namely, wavelet coherence (WCO) and wavelet phase coherence (WPCO), were derived from the HbO changes to objectively relate brain activity to surgical motor skill levels.

Results: A one-way multivariate analysis of variance found a statistically significant difference in the inter-regional WCO metrics for the physical simulator between experts and novices based on Wilks' Λ, F(10, 1) = 7495.5, p < 0.01. The partial eta squared effect size for the inter-regional WCO metrics was highest between the central prefrontal cortex (CPFC) and the SMA (CPFC-SMA, η² = 0.257). Two-tailed Mann–Whitney U tests with a 95% confidence interval showed baseline equivalence and a statistically significant (p < 0.001) difference in the CPFC-SMA WPCO metric between the physical simulator training group (0.960 ± 0.045) and the untrained control group (0.735 ± 0.177) following training for 10 consecutive days in addition to the pretest and posttest days.

Conclusion: We show that the WCO metric of brain functional connectivity corresponds to surgical motor skill levels on the laparoscopic physical simulator. Functional connectivity between the CPFC and the SMA is lower for subjects who exhibit expert surgical motor skills than for untrained subjects on laparoscopic physical simulators.

1. Introduction

Surgical training has traditionally followed an apprenticeship-based model in which technical skills are taught in the operating room.1,2 However, this approach is costly, time-consuming, and poses significant risks of adverse patient outcomes due to the trainee’s inexperience. Furthermore, with the advent of minimally invasive surgery and laparoscopic procedures, programs such as the Fundamentals of Laparoscopic Surgery (FLS) and the Fundamentals of Endoscopic Surgery (FES) have been adopted by the American Board of Surgery as accredited means for assessing technical surgical skills.3–11 Surgical skill assessment in these simulator-based training methods often relies on rating scales, rudimentary performance metrics, or direct observation to rate and assess surgical task performance.9,12–17 While the use of these metrics is standard practice in surgical skill training and assessment, they have been cited for low inter-rater reliability and poor correlation of simulator-based performance metrics with clinical outcomes in the operating room.2,18,19

Compounding the lack of robust surgical skill assessment metrics, there is a lack of studies that comprehensively address the underlying neurophysiological responses to varying surgical motor skill levels. Current studies have shown the potential of non-invasive brain imaging to quantify cortical activation differences for subjects with varying degrees of surgical motor skill.20–25 These studies have shown significant differences in functional activation in the prefrontal cortex (PFC) (Refs. 20–32) and, more recently, in the primary motor cortex (M1) and the supplementary motor area (SMA)33 as well, due to their involvement in motor skill learning. However, the temporal correlations between these anatomically separated cortical regions, and their relation to surgical motor skill, have not yet been studied systematically. In this regard, functional connectivity methodologies can leverage such temporal correlations to classify or distinguish subjects.34 Indeed, techniques to quantify brain functional connectivity, such as wavelet coherence (WCO) and wavelet phase coherence (WPCO), have been utilized in multiple functional near-infrared spectroscopy (fNIRS) studies.35–41 WCO and WPCO analyses can objectively quantify functional connectivity and strong temporal correlations by determining significantly high common power and phase-locked behavior between two specific cortical channels;36,42 this approach can address the neurophysiological knowledge gap regarding the effects of surgical motor skill learning on the brain.

Herein, we report the inter-regional functional connectivity between the three above-mentioned cortical regions, namely, the PFC, the M1, and the SMA. Beyond the difference in activation levels of each of these cortical regions demonstrated in Ref. 33, we hypothesize that surgical motor skill levels significantly affect the functional brain connectivity measured with WCO and WPCO during assessment on both virtual and physical surgical simulators. To test this hypothesis, subjects with varying degrees of surgical motor expertise performed a complex surgical training task on physical and virtual simulators while undergoing fNIRS imaging of oxygenated hemoglobin (HbO) changes in each brain region. To quantify inter-regional functional connectivity between cortical regions, WCO and WPCO were calculated from the HbO time series recorded while the subjects performed the surgical training task.

2. Methods and Materials

2.1. Subjects

Thirty-six right-handed subjects were recruited in this Institutional Review Board-approved study conducted at the Massachusetts General Hospital and the University at Buffalo. The subjects were split into two cohorts: the first cohort included novice and expert surgeons, and the second cohort included medical students in training. The second cohort was further divided into three distinct groups: the FLS training group, the virtual basic laparoscopic skills trainer (VBLaST) training group, and the control group. An a priori power analysis, based on two-sample t-tests, was completed to determine the minimum number of samples required for both cohorts. Using pilot study data and the power estimation software G*Power,43 we estimated conservative effect sizes for the FLS and VBLaST training groups based on FLS and VBLaST task performance scores, d = 5.67 and d = 2.57, respectively.25 With a 95% confidence interval (CI) and a minimum power of 0.80, a minimum of eight subjects each for the expert and novice surgeon groups, four subjects for the FLS training group, three subjects for the VBLaST training group, and four subjects for the control group were estimated for this study.
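For reference, the sample-size calculation described above can be reproduced approximately with an off-the-shelf power routine. The following is a minimal sketch using statsmodels rather than G*Power; the two-sided, two-sample t-test setup and alpha = 0.05 are assumptions, and the exact numbers may differ slightly from the G*Power output.

```python
# Approximate re-creation of the a priori power analysis (illustrative only).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for label, d in [("FLS", 5.67), ("VBLaST", 2.57)]:
    # Solve for the per-group sample size at alpha = 0.05 and power = 0.80
    n_per_group = analysis.solve_power(effect_size=d, alpha=0.05,
                                       power=0.80, alternative="two-sided")
    print(f"{label}: minimum subjects per group ~ {n_per_group:.1f}")
```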

All participants were instructed on how to perform the task with standardized verbal instructions indicating the goal of the task and the rules for task completion. The optical probe was positioned on each participant with great care to avoid any hair between the sources/detectors and the scalp and to ensure robust coupling with the skin. Each participant’s experimental protocol consisted of a block design of rest and stimulus (surgical cutting task) periods. Each surgeon performed five trials, whereas the control group performed three trials, as shown in Table 1. The surgical cutting task was performed until completion or stopped after 5 min, followed by a rest period of one minute. This cycle of task and rest continued for the number of trials indicated above. Subject demographics are summarized in Table 1, and further details on subject recruitment, compensation, and other pertinent study replication details can be found in Nemani et al.25

Table 1

Study subject demographics and training procedures completed.

Cohort | # of subjects | Mean age | Training/certification | Average # of laparoscopic procedures | # of completed FLS pattern cutting trials | # of completed VBLaST pattern cutting trials
Expert surgeon | 8 | 35 | Postgraduate year 4 to 5 or attending surgeon | 700 | 5 | 5
Novice surgeon | 9 | 31 | Postgraduate year 1 to 3 | 60 | 5 | 5
FLS training group | 9 | 25 | Medical school year 1 to 4 | 0 | >100 | 0
VBLaST training group | 8 | 24 | Medical school year 1 to 4 | 0 | 0 | >85
Control | 5 | 26 | Medical school year 1 to 4 | 0 | 3 | 3

2.2. Hardware and Study Design

We utilized two different surgical training simulators that employ the pattern cutting task. As the physical surgical trainer, we used the official FLS box trainer used in Board certification.9,44,45 As the virtual surgical trainer, we utilized the VBLaST, a virtual reality-based simulator that replicates the FLS training tasks.25,46–50 We employed a commercially available fNIRS system to measure functional brain activation during the surgical pattern cutting tasks (CW6 system, TechEn Inc., Massachusetts). Near-infrared light at 690 and 830 nm was delivered through eight sources, which were coupled to eight short separation detectors and 16 long separation detectors. Each long separation detector was placed 30 to 40 mm from its corresponding source to ensure depth specificity to the cortex. The short separation detectors were placed 8 mm from each source so that only superficial tissue, such as the scalp, skull, dura, and pia mater, was measured.
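For concreteness, the probe layout described above can be summarized in a small configuration structure. This is a descriptive sketch only; the region list and groupings are assumptions rather than the actual montage definition used in the study.

```python
# Illustrative summary of the fNIRS probe geometry (assumed layout, not the
# actual montage file used in the study).
probe_config = {
    "wavelengths_nm": (690, 830),           # near-infrared wavelengths
    "n_sources": 8,
    "n_short_separation_detectors": 8,      # 8 mm from each source (superficial signal)
    "n_long_separation_detectors": 16,      # 30 to 40 mm separation (cortical signal)
    "short_separation_mm": 8,
    "long_separation_mm": (30, 40),
    "regions_of_interest": ("LPFC", "CPFC", "RPFC", "LMM1", "SMA"),
}
```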

2.3. Protocol Design

All study participants were asked to perform the pattern cutting task. The objective was to use laparoscopic tools to cut a marked circle on a piece of gauze as accurately and quickly as possible. Each subject was instructed on how to perform the task using standardized verbal instructions indicating the pattern cutting task’s rules and goals. Each session consisted of the subject performing the pattern cutting task, with rest periods between subsequent trials. Further information regarding the study design can be found in Nemani et al.25

2.4. fNIRS Data Processing for the Hemodynamic Response Function

All fNIRS data processing was completed using HOMER2, a validated and published open-source software suite implemented in Matlab (Mathworks, Natick, Massachusetts), which provides a set of Matlab scripts for analyzing fNIRS data.51 Prior to any data processing, data channels that exhibited low signal-to-noise ratios, i.e., signal levels outside the range of 80 to 140 dB, were excluded from the analysis. The detectors’ raw intensity data were converted into optical density (hmrIntensity2OD). The fNIRS data can be contaminated by motion artifacts arising from the participant’s movements while performing the pattern cutting task. Any such large motion artifacts were corrected using principal component analysis (PCA) in HOMER2 (hmrMotionCorrectPCA);51–53 however, no filters were applied to the time-series data in order to preserve the entire frequency bandwidth of each channel. The PCA approach assumes that large motion artifacts dominate the variance of the fNIRS data; because the first principal component accounts for the largest proportion of that variance, it was removed from the original fNIRS data. Then, following the conversion of optical density to changes in oxy- and deoxyhemoglobin concentrations via the modified Beer–Lambert law (hmrOD2Conc) with partial path-length factors of 6.4 (690 nm) and 5.8 (830 nm), the short separation channels (inter-optode distance of 8 mm) were regressed from the long separation channels (inter-optode distance of 30 to 40 mm) using a general linear model (GLM) in HOMER2 to remove systemic physiology originating from non-cortical superficial regions.54,55 The hemodynamic response function (HRF) was then estimated by the GLM approach in HOMER2 (hmrDeconvHRF_DriftSS), which uses ordinary least squares. The HRFs were calculated using a consecutive sequence of Gaussian functions as the temporal basis.54,56–58 The result is a time series of HbO changes for each brain region that is specific to cortical activity. The functional connectivity metrics were computed between each pair of HbO time series from the following brain regions: left lateral prefrontal cortex (LPFC), central prefrontal cortex (CPFC), right lateral prefrontal cortex (RPFC), left medial primary motor cortex (LMM1), and SMA.
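To make the processing chain concrete, the sketch below re-implements its core steps (intensity-to-OD conversion, PCA-based motion correction, the modified Beer–Lambert law, and short separation regression) in plain NumPy. This is not the HOMER2 code; the extinction coefficients, array shapes, and regression setup are illustrative assumptions.

```python
# Minimal NumPy sketch of the fNIRS preprocessing steps described above
# (illustrative only; HOMER2 was used in the actual study).
import numpy as np

def intensity_to_od(intensity):
    """Raw intensity (time x channels) -> optical density changes."""
    return -np.log(intensity / intensity.mean(axis=0, keepdims=True))

def pca_motion_correct(od, n_remove=1):
    """Remove the leading principal component(s), assumed to capture large motion."""
    mean = od.mean(axis=0, keepdims=True)
    u, s, vt = np.linalg.svd(od - mean, full_matrices=False)
    s[:n_remove] = 0.0                              # drop the dominant component(s)
    return u @ np.diag(s) @ vt + mean

def od_to_hb(od_690, od_830, distance_cm=3.0, ppf=(6.4, 5.8)):
    """Modified Beer-Lambert law for one long-separation channel.
    Extinction coefficients below (cm^-1 per mol/L) are assumed literature values."""
    ext = np.array([[276.0, 2051.0],                # 690 nm: [HbO, HbR]
                    [974.0,  693.0]])               # 830 nm: [HbO, HbR]
    dod = np.vstack([od_690 / (distance_cm * ppf[0]),
                     od_830 / (distance_cm * ppf[1])])
    hbo, hbr = np.linalg.solve(ext, dod)            # concentration changes over time
    return hbo, hbr

def short_separation_regress(long_hbo, short_hbo):
    """Regress superficial physiology (short channel) out of the long channel."""
    design = np.column_stack([short_hbo, np.ones_like(short_hbo)])
    beta, *_ = np.linalg.lstsq(design, long_hbo, rcond=None)
    return long_hbo - design @ beta                 # residual ~ cortical HbO
```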

2.5. Wavelet Coherence and Wavelet Phase Coherence Metrics of Functional Connectivity

To objectively quantify functional connectivity between time series from different cortical regions, we utilize the WCO and WPCO metrics. WCO as a function of frequency is defined below:59,60

Eq. (1)

WCO(f) = \frac{\left[\frac{1}{N}\sum_{n=1}^{N} w_1(t_n)\, w_2^{*}(t_n)\right] \left[\frac{1}{N}\sum_{m=1}^{N} w_1^{*}(t_m)\, w_2(t_m)\right]}{P_1(f)\, P_2(f)}
where w1 and w2 are the complex oscillatory Morlet wavelet transforms of the first and second time series, N is the total number of time steps of each time series, * denotes the complex conjugate, and P1(f) and P2(f) are the wavelet powers at frequency f. The time-averaged WPCO is defined below:33,36,59,60

Eq. (2)

WPCO(f) = \sqrt{\overline{\cos\Delta\phi(f)}^{\,2} + \overline{\sin\Delta\phi(f)}^{\,2}}

Eq. (3)

\overline{\cos\Delta\phi(f)} = \frac{1}{N}\sum_{n=1}^{N} \cos\Delta\phi(f, t_n)

Eq. (4)

\overline{\sin\Delta\phi(f)} = \frac{1}{N}\sum_{n=1}^{N} \sin\Delta\phi(f, t_n)
where Δφ(f, tn) is the instantaneous phase difference between the two complex oscillatory time series. The coefficients cos Δφ(f, tn) and sin Δφ(f, tn) are then time-averaged across the entire time series. The significance of these metrics is that they can objectively quantify correlations between two independent time series with specificity to frequency and temporal changes.35 A value of 0 for both WCO and WPCO indicates that the two time series are entirely unrelated in phase changes and coherence magnitudes. A value of 1 for both WCO and WPCO indicates a significant linear relationship between the two time series and that the oscillatory phase changes are significantly correlated.35,38,61,62 As shown in Table 2 below, the entire frequency bandwidth of the resulting WCO and WPCO vectors is split into five intervals that correspond to different physiological activities. Furthermore, results from the WCO and WPCO analysis are shown for two example fNIRS time series in Fig. 1. Figure 1(a) shows two example channels, the left lateral PFC and the left medial M1, for one subject performing the FLS pattern cutting task. Figure 1(b) shows the corresponding WCO magnitude plot for each frequency and time step between the two example channels. Figures 1(c) and 1(d) show the time-averaged WCO and WPCO magnitudes, respectively; the frequency intervals are depicted to show the specific coherence magnitude ranges for each associated physiology. It is worth noting that only WCO and WPCO values within the cone of influence, depicted as a dashed white line, are used for analysis, due to edge effects that may otherwise bias the analysis. The inter-regional functional connectivity metrics (WCO and WPCO) were computed between all pairs of the five regions: LPFC-CPFC, LPFC-RPFC, LPFC-SMA, LPFC-LMM1, CPFC-RPFC, CPFC-SMA, CPFC-LMM1, RPFC-SMA, RPFC-LMM1, and SMA-LMM1.
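A compact way to see how Eqs. (1)–(4) translate into computation is the sketch below, which uses a complex Morlet continuous wavelet transform from PyWavelets. The wavelet parameters, the frequency grid, and the omission of cone-of-influence masking are illustrative simplifications, not the settings used in the study.

```python
# Illustrative implementation of Eqs. (1)-(4); not the analysis code of the study.
import numpy as np
import pywt

def wco_wpco(x, y, fs, freqs):
    """Time-averaged WCO and WPCO between two HbO time series x and y,
    evaluated at the requested physical frequencies (Hz)."""
    wavelet = "cmor1.5-1.0"                                 # complex Morlet (assumed parameters)
    scales = pywt.central_frequency(wavelet) * fs / freqs   # map frequencies to scales
    w1, _ = pywt.cwt(x, scales, wavelet, sampling_period=1.0 / fs)
    w2, _ = pywt.cwt(y, scales, wavelet, sampling_period=1.0 / fs)

    # Eq. (1): time-averaged cross terms normalized by the wavelet powers
    cross = (w1 * np.conj(w2)).mean(axis=1)
    p1 = (np.abs(w1) ** 2).mean(axis=1)
    p2 = (np.abs(w2) ** 2).mean(axis=1)
    wco = np.abs(cross) ** 2 / (p1 * p2)

    # Eqs. (2)-(4): time averages of cos/sin of the instantaneous phase difference
    dphi = np.angle(w1) - np.angle(w2)
    wpco = np.sqrt(np.cos(dphi).mean(axis=1) ** 2 + np.sin(dphi).mean(axis=1) ** 2)

    # Cone-of-influence masking (used in the study) is omitted here for brevity.
    return wco, wpco
```

With this convention, both metrics fall between 0 and 1, consistent with the interpretation given above.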

Table 2

Frequency bandwidth intervals with their associated physiology.35–38,67

Frequency interval | Frequency range (Hz) | Associated physiology
I | 0.6 to 2 | Cardiac activity
II | 0.15 to 0.6 | Respiratory activity
III | 0.05 to 0.15 | Myogenic smooth muscle activity
IV | 0.02 to 0.05 | Neurovascular coupling and autonomic control in the cortex
V | 0.005 to 0.02 | Nitric oxide-related endothelial metabolic activity
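Given WCO or WPCO spectra such as those returned by the sketch above, the per-band values analyzed in this work (e.g., interval IV for neurovascular coupling) can be obtained by simple band averaging. The frequency grid below is an arbitrary illustrative choice, not the grid used in the study.

```python
# Band-averaging a WCO/WPCO spectrum over the intervals of Table 2 (sketch).
import numpy as np

freqs = np.linspace(0.005, 2.0, 512)     # illustrative grid spanning intervals I-V
bands = {"I": (0.6, 2.0), "II": (0.15, 0.6), "III": (0.05, 0.15),
         "IV": (0.02, 0.05), "V": (0.005, 0.02)}

def band_average(spectrum, band, freqs=freqs):
    """Mean WCO/WPCO within one frequency interval (e.g., 'IV' for 0.02 to 0.05 Hz)."""
    lo, hi = bands[band]
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.mean(spectrum[mask]))
```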

Fig. 1

An illustrative example of WCO between two different fNIRS time-series data. (a) Time-series data from the left lateral PFC (LPFC) and left medial M1 (LMM1) channels for a surgical expert during one FLS task trial. (b) WCO magnitude between the two time-series data in the time and frequency domains. WCO magnitude values are shown via the color bar. Only values within the cone of influence range, indicated by a dashed white line, are included for WCO power magnitude and phase coherence calculations. (c) Time-averaged WCO magnitudes and (d) WPCO magnitudes between the two example time series shown in (a).


2.6. Statistical Testing

The inter-regional functional connectivity metrics (WCO and WPCO) from the first cohort of novice and expert surgeons were used to conduct a one-way multivariate analysis of variance (one-way MANOVA) in SPSS version 27 (IBM) to determine whether there was any significant difference between novice and expert surgeons in the inter-regional (i.e., LPFC-CPFC, LPFC-RPFC, LPFC-SMA, LPFC-LMM1, CPFC-RPFC, CPFC-SMA, CPFC-LMM1, RPFC-SMA, RPFC-LMM1, and SMA-LMM1) functional connectivity metrics using Wilks’ Lambda. The Shapiro–Wilk test was used to test the normality of each dependent variable (i.e., each inter-regional functional connectivity metric) for each level of the independent variable (novice and expert). Levene’s test was used to test the homogeneity of variance. All significance levels were set at p < 0.01. Then, to determine how the dependent variables (i.e., the inter-regional functional connectivity metrics) differed between the levels of the independent variable (novice versus expert), the partial eta squared effect size was used, with Bonferroni correction of the alpha level. The dependent variable with the largest partial eta squared effect size was selected to investigate the effects of surgical training in medical students. Nonparametric two-tailed Mann–Whitney U tests were used with a 95% CI to determine baseline equivalence and any significant difference in the WCO and WPCO metrics between the training group and the control group following surgical training.
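The workflow above maps onto standard SciPy/statsmodels routines as sketched below. The DataFrame layout and column names (underscored metric names and a 'group' column) are assumptions for illustration; SPSS was used in the actual analysis.

```python
# Hedged sketch of the statistical testing pipeline (SPSS was used in the study).
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

def run_group_tests(df):
    """df: one row per subject; 'group' holds 'expert'/'novice'; the remaining
    columns hold the ten inter-regional metrics (e.g., 'CPFC_SMA')."""
    metrics = [c for c in df.columns if c != "group"]

    # Shapiro-Wilk normality test per dependent variable and group (alpha = 0.01)
    for m in metrics:
        for g, sub in df.groupby("group"):
            print(f"Shapiro-Wilk {m} ({g}): p = {stats.shapiro(sub[m]).pvalue:.3f}")

    # Levene's test of homogeneity of variance per dependent variable
    for m in metrics:
        samples = [sub[m].to_numpy() for _, sub in df.groupby("group")]
        print(f"Levene {m}: p = {stats.levene(*samples).pvalue:.3f}")

    # One-way MANOVA; Wilks' lambda appears in the multivariate test table
    manova = MANOVA.from_formula(" + ".join(metrics) + " ~ group", data=df)
    print(manova.mv_test())

def training_vs_control(train_vals, control_vals):
    """Two-tailed Mann-Whitney U test for the training group vs. the control group."""
    return stats.mannwhitneyu(train_vals, control_vals, alternative="two-sided")
```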

3. Results

3.1. Functional Connectivity Differences between Expert and Novice Surgeons

To investigate significant inter-regional functional connectivity differences between expert and novice surgeons on the physical (FLS) and virtual (VBLaST) simulators, we report in Fig. 2 the mean WCO and WPCO metrics, with error bars representing 95% CIs, for LPFC-CPFC, LPFC-RPFC, LPFC-SMA, LPFC-LMM1, CPFC-RPFC, CPFC-SMA, CPFC-LMM1, RPFC-SMA, RPFC-LMM1, and SMA-LMM1. The Shapiro–Wilk test showed that each of the inter-regional functional connectivity metrics is normally distributed at a significance level of p < 0.01. Levene’s test of equality of error variances was also satisfied at a significance level of p < 0.01. The one-way MANOVA found a statistically significant difference only in the inter-regional WCO metrics for the physical (FLS) simulator based on Wilks’ Λ for expert versus novice, F(10, 1) = 7495.5, p < 0.01. The partial eta squared effect size for the inter-regional WCO metrics was highest between the CPFC and SMA (CPFC-SMA, η² = 0.257). The other partial eta squared effect sizes were LPFC-CPFC: η² = 0.003, LPFC-RPFC: η² = 0.001, LPFC-SMA: η² = 0.024, LPFC-LMM1: η² = 0.015, CPFC-RPFC: η² = 0.013, CPFC-LMM1: η² = 0.058, RPFC-SMA: η² = 0.005, RPFC-LMM1: η² = 0.121, and SMA-LMM1: η² = 0.195. Figure 2 shows that the CPFC-SMA WCO and WPCO metrics were higher in novices than in experts on the physical (FLS) simulator, while they were higher in experts than in novices on the virtual (VBLaST) simulator. All MANOVA results are provided in Figures S1 to S4 in the Supplementary Materials.

Fig. 2

WCO and WPCO magnitude changes between expert (E) and novice (N) surgeons on the physical (FLS) and virtual (VBLaST) simulators. (a), (b) WCO and WPCO magnitudes for FLS experts (blue) versus novices (green) within the neurovascular coupling activity frequency range (0.02 to 0.05 Hz). (c), (d) WCO and WPCO magnitudes for VBLaST experts (blue) versus novices (green) within the neurovascular coupling activity frequency range. Error bars represent 95% CIs.


3.2. CPFC-SMA Functional Connectivity Changes during FLS Surgical Training

Since the functional connectivity between the CPFC and SMA (CPFC-SMA) was responsive to surgical motor skill proficiency in experts versus novices during the physical (FLS) simulator task, we calculated the CPFC-SMA WCO and WPCO metrics for FLS practice in medical student trainees. Figure 3 shows the longitudinal functional connectivity results of the FLS training group and the control group. Two-tailed Mann–Whitney U tests with a 95% CI showed baseline equivalence and a statistically significant (p < 0.001) difference in the WPCO metric between the FLS training group (0.960 ± 0.045) and the untrained control group (0.735 ± 0.177).

Fig. 3

Longitudinal WPCO with FLS surgical skill training. WPCO magnitudes within the neurovascular coupling activity frequency range (0.02 to 0.05 Hz) between the CPFC and SMA channels for the untrained control group and FLS training group during surgical training over ten consecutive days.


4. Discussion

While surgical simulators are gaining significant ground for surgical skill training and assessment,2 the underlying neurological mechanisms and the functional connectivity between correlated cortical regions remain largely unstudied. This study compares the functional connectivity of cortical regions associated with fine motor skills for subjects with varying degrees of surgical motor skill. We found a statistically significant (p < 0.01) difference in the inter-regional WCO metrics for the physical (FLS) simulator task for expert versus novice surgeons based on Wilks’ Λ. Among the inter-regional WCO metrics, the largest partial eta squared effect size was found between the CPFC and SMA. Furthermore, a statistically significant (p < 0.001) difference in the CPFC-SMA WPCO metric was found for the physical (FLS) simulator training group compared to the untrained control group following baseline equivalence and training for 10 consecutive days (in addition to the pretest and posttest days).

Inter-regional functional connectivity within the neurovascular coupling frequency range (0.02 to 0.05 Hz) is postulated to be related to neuronal communication. Neurovascular coupling is the interaction between neural activity and the vascular response in terms of regional cerebral blood supply and HbO during brain activity. The interaction between the HbO time series of two brain regions can be assessed using various methods, including WCO, a measure of the correlation between two time series, and WPCO, which is based on the degree of coincidence of the instantaneous phase over the entire time series.63 The lower WCO and WPCO metrics of functional connectivity between the CPFC and SMA (Fig. 2) in experts compared to novices are postulated to be related to more implicit-knowledge-based physical (FLS) surgical task performance in experts.64 The PFC has been shown to be engaged during explicit motor-sequence learning, whereas implicit knowledge activates the SMA;64 therefore, CPFC-SMA functional connectivity indicates an interplay between explicit motor-sequence learning and implicit knowledge during FLS surgical task performance and learning in novices. Our study provides preliminary evidence that CPFC-SMA functional connectivity can assess this interplay between motor and frontal regions in experts versus novices on physical (FLS) simulators, where tactile and proprioceptive feedback are available.64 However, inter-regional PFC WCO and WPCO metrics, including LPFC-RPFC and CPFC-RPFC, which were higher in novices than in experts, were found to be more relevant during virtual (VBLaST) surgical task performance (Fig. 2). Here, higher inter-regional PFC WCO and WPCO metrics may indicate explicit motor-sequence learning64 in novices during VBLaST surgical task performance, which relies primarily on visual feedback.

For the surgical tasks in the physical and virtual simulators, our neuroimaging approach utilized recent advances in portable functional brain imaging using fNIRS, with increased specificity to cortical tissue due to short separation regression.57,58 Such methods provide a more accurate estimation of the cortical tissue’s hemodynamics during complex bimanual surgical tasks in virtual and physical simulators (Refs. 55, 65, and 66), which has not been reported previously. Our results quantified inter-regional functional connectivity based solely on WCO and WPCO metrics, which showed promise in assessing surgical motor skill proficiency and can be utilized for learning assessment during surgical training in the future. The differences in cortical activation and inter-regional functional connectivity between the physical (FLS) and virtual (VBLaST) simulators need further investigation.

5. Conclusion

This study showed that functional connectivity changes based on WCO and WPCO metrics corresponded to surgical motor skill proficiency, and these connectivity changes occurred in the neurovascular coupling frequency range across the studied cortical regions. Our study showed that surgical experts and surgically trained subjects exhibited distinct correlations of functional activation and coincidence of instantaneous phase between the CPFC and SMA time series. These results further our understanding of the neural correlates of the interplay between motor and frontal regions related to the fine motor learning associated with surgical training and can be used for future assessment paradigms.

Disclosures

The authors have no relevant financial interests, competing interests, or other potential conflicts of interest.

Acknowledgments

This work was supported by the National Institutes of Health (NIH) Award Nos. NIBIB 1R01EB014305, NHLBI 1R01HL119248, and NCI 1R01CA197491, and by the Medical Technology Enterprise Consortium (MTEC) Award No. W81XWH2090019 (2020-628). The authors would like to thank the attending surgeons, residents, and medical student subjects for their dedication to this study. We would also like to thank Arthur “Buzz” DiMartino and his team at TechEn for graciously providing support with the CW6 spectrometer.

References

1. 

B. Zendejas et al., “State of the evidence on simulation-based training for laparoscopic surgery: a systematic review,” Ann. Surg., 257 (4), 586 –593 (2013). https://doi.org/10.1097/SLA.0b013e318288c40b Google Scholar

2. 

S. R. Dawe et al., “Systematic review of skills transfer after surgical simulation‐based training,” Br. J. Surg., 101 (9), 1063 –1076 (2014). https://doi.org/10.1002/bjs.9482 Google Scholar

3. 

S. A. Fraser et al., “Evaluating laparoscopic skills: setting the pass/fail score for the MISTELS system,” Surg. Endosc. Other Interv. Tech., 17 (6), 964 –967 (2003). https://doi.org/10.1007/s00464-002-8828-4 Google Scholar

4. 

S. A. Fraser et al., “Characterizing the learning curve for a basic laparoscopic drill,” Surg. Endosc. Other Interv. Tech., 19 (12), 1572 –1578 (2005). https://doi.org/10.1007/s00464-005-0150-5 Google Scholar

5. 

G. M. Fried et al., “Proving the value of simulation in laparoscopic surgery,” Ann. Surg., 240 (3), 518 (2004). https://doi.org/10.1097/01.sla.0000136941.46529.56 Google Scholar

6. 

A. L. McCluney et al., “FLS simulator performance predicts intraoperative laparoscopic skill,” Surg. Endosc., 21 (11), 1991 –1995 (2007). https://doi.org/10.1007/s00464-007-9451-1 Google Scholar

7. 

D. J. Scott et al., “Certification pass rate of 100% for fundamentals of laparoscopic surgery skills after proficiency-based training,” Surg. Endosc., 22 (8), 1887 –1893 (2008). https://doi.org/10.1007/s00464-008-9745-y Google Scholar

8. 

R. M. Satava, “Emerging trends that herald the future of surgical simulation,” Surg. Clin. N. Am., 90 (3), 623 –633 (2010). https://doi.org/10.1016/j.suc.2010.02.002 Google Scholar

9. 

G. M. Fried, “FLS assessment of competency using simulated laparoscopic tasks,” J. Gastrointest. Surg., 12 (2), 210 –212 (2008). https://doi.org/10.1007/s11605-007-0355-0 Google Scholar

10. 

B. K. Poulose et al., “Fundamentals of endoscopic surgery cognitive examination: Development and validity evidence,” Surg. Endosc., 28 (2), 631 –638 (2014). https://doi.org/10.1007/s00464-013-3220-0 Google Scholar

11. 

M. C. Vassiliou et al., “Fundamentals of endoscopic surgery: creation and validation of the hands-on test,” Surg. Endosc., 28 (3), 704 –711 (2014). https://doi.org/10.1007/s00464-013-3298-4 Google Scholar

12. 

R. Aggarwal et al., “An evaluation of the feasibility, validity, and reliability of laparoscopic skills assessment in the operating room,” Ann. Surg., 245 (6), 992 (2007). https://doi.org/10.1097/01.sla.0000262780.17950.e5 Google Scholar

13. 

M. C. Vassiliou et al., “A global assessment tool for evaluation of intraoperative laparoscopic skills,” Am. J. Surg., 190 (1), 107 –113 (2005). https://doi.org/10.1016/j.amjsurg.2005.04.004 AJOOA7 0096-6347 Google Scholar

14. 

J. D. Doyle, E. M. Webber and R. S. Sidhu, “A universal global rating scale for the evaluation of technical skills in the operating room,” Am. J. Surg., 193 (5), 551 –555 (2007). https://doi.org/10.1016/j.amjsurg.2007.02.003 AJOOA7 0096-6347 Google Scholar

15. 

B. Zendejas, R. K. Ruparel and D. A. Cook, “Validity evidence for the fundamentals of laparoscopic surgery (FLS) program as an assessment tool: a systematic review,” Surg. Endosc., 30 (2), 512 –520 (2016). https://doi.org/10.1007/s00464-015-4233-7 Google Scholar

16. 

B. Zheng et al., “Validity of using fundamentals of laparoscopic surgery (FLS) program to assess laparoscopic competence for gynecologists,” Surg. Endosc., 24 (1), 152 –160 (2010). https://doi.org/10.1007/s00464-009-0539-7 Google Scholar

17. 

M. C. Vassiliou et al., “FLS and FES: comprehensive models of training and assessment,” Surg. Clin. N. Am., 90 (3), 535 –558 (2010). https://doi.org/10.1016/j.suc.2010.02.012 Google Scholar

18. 

K. Moorthy et al., “Objective assessment of technical skills in surgery,” Br. Med. J., 327 (7422), 1032 –1037 (2003). https://doi.org/10.1136/bmj.327.7422.1032 BMJOAE 0007-1447 Google Scholar

19. 

N. J. Hogle et al., “Validation of laparoscopic surgical skills training outside the operating room: a long road,” Surg. Endosc., 23 (7), 1476 –1482 (2009). https://doi.org/10.1007/s00464-009-0379-5 Google Scholar

20. 

D. R. C. James et al., “The ergonomics of natural orifice translumenal endoscopic surgery (NOTES) navigation in terms of performance, stress, and cognitive behavior,” Surgery, 149 (4), 525 –533 (2011). https://doi.org/10.1016/j.surg.2010.11.019 SURGAZ 0039-6060 Google Scholar

21. 

J. Andreu-Perez et al., “Disparity in frontal lobe connectivity on a complex bimanual motor task aids in classification of operator skill level,” Brain Connect., 6 (5), 375 –388 (2016). https://doi.org/10.1089/brain.2015.0350 Google Scholar

22. 

D. R. Leff et al., “Assessment of the cerebral cortex during motor task behaviours in adults: a systematic review of functional near infrared spectroscopy (fNIRS) studies,” NeuroImage, 54 (4), 2922 –2936 (2011). https://doi.org/10.1016/j.neuroimage.2010.10.058 NEIMEF 1053-8119 Google Scholar

23. 

D. R. Leff et al., “Changes in prefrontal cortical behaviour depend upon familiarity on a bimanual co-ordination task: an fNIRS study,” Neuroimage, 39 (2), 805 –813 (2008). https://doi.org/10.1016/j.neuroimage.2007.09.032 NEIMEF 1053-8119 Google Scholar

24. 

H. N. Modi et al., “A decade of imaging surgeons’ brain function (part I): terminology, techniques, and clinical translation,” Surgery, 162 (5), 1121 –1130 (2017). https://doi.org/10.1016/j.surg.2017.05.021 SURGAZ 0039-6060 Google Scholar

25. 

A. Nemani et al., “Convergent validation and transfer of learning studies of a virtual reality-based pattern cutting simulator,” Surg. Endosc., 32 (3), 1265 –1272 (2018). https://doi.org/10.1007/s00464-017-5802-8 Google Scholar

26. 

K. Ohuchida et al., “The frontal cortex is activated during learning of endoscopic procedures,” Surg. Endosc., 23 (10), 2296 –2301 (2009). https://doi.org/10.1007/s00464-008-0316-z Google Scholar

27. 

D. M. Wolpert, J. Diedrichsen and J. R. Flanagan, “Principles of sensorimotor learning,” Nat. Rev. Neurosci., 12 (12), 739 –751 (2011). https://doi.org/10.1038/nrn3112 NRNAAN 1471-003X Google Scholar

28. 

S. Miyachi et al., “Differential roles of monkey striatum in learning of sequential hand movement,” Exp. Brain Res., 115 (1), 1 –5 (1997). https://doi.org/10.1007/PL00005669 EXBRAP 0014-4819 Google Scholar

29. 

S. P. Swinnen and J. Gooijers, “Bimanual coordination,” Brain Mapping, 2 475 –482 (2015). https://doi.org/10.1016/B978-0-12-397025-1.00030-0 Google Scholar

30. 

C. Maes et al., “Two hands, one brain, and aging,” Neurosci. Biobehav. Rev., 75 234 –256 (2017). https://doi.org/10.1016/j.neubiorev.2017.01.052 NBREDE 0149-7634 Google Scholar

31. 

S. P. Swinnen and N. Wenderoth, “Two hands, one brain: cognitive neuroscience of bimanual skill,” Trends Cognit. Sci., 8 (1), 18 –25 (2004). https://doi.org/10.1016/j.tics.2003.10.017 Google Scholar

32. 

S. P. Swinnen, “Intermanual coordination: from behavioural principles to neural-network interactions,” Nat. Rev. Neurosci., 3 (5), 348 –359 (2002). https://doi.org/10.1038/nrn807 NRNAAN 1471-003X Google Scholar

33. 

A. Nemani et al., “Assessing bimanual motor skills with optical neuroimaging,” Sci. Adv., 4 (10), eaat3807 (2018). https://doi.org/10.1126/sciadv.aat3807 STAMCV 1468-6996 Google Scholar

34. 

K. J. Friston, “Functional and effective connectivity: a review,” Brain Connect., 1 (1), 13 –36 (2011). https://doi.org/10.1089/brain.2011.0008 Google Scholar

35. 

Q. Tan et al., “Frequency-specific functional connectivity revealed by wavelet-based coherence analysis in elderly subjects with cerebral infarction using NIRS method,” Med. Phys., 42 (9), 5391 –5403 (2015). https://doi.org/10.1118/1.4928672 MPHYA6 0094-2405 Google Scholar

36. 

L. Xu et al., “Functional connectivity analysis using fNIRS in healthy subjects during prolonged simulated driving,” Neurosci. Lett., 640 21 –28 (2017). https://doi.org/10.1016/j.neulet.2017.01.018 NELED5 0304-3940 Google Scholar

37. 

X. Cui, D. M. Bryant and A. L. Reiss, “NIRS-based hyperscanning reveals increased interpersonal coherence in superior frontal cortex during cooperation,” Neuroimage, 59 (3), 2430 –2437 (2012). https://doi.org/10.1016/j.neuroimage.2011.09.003 NEIMEF 1053-8119 Google Scholar

38. 

R. Cui et al., “Wavelet coherence analysis of spontaneous oscillations in cerebral tissue oxyhemoglobin concentrations and arterial blood pressure in elderly subjects,” Microvasc. Res., 93 14 –20 (2014). https://doi.org/10.1016/j.mvr.2014.02.008 MIVRA6 0026-2862 Google Scholar

39. 

L. Holper, F. Scholkmann and M. Wolf, “Between-brain connectivity during imitation measured by fNIRS,” Neuroimage, 63 (1), 212 –222 (2012). https://doi.org/10.1016/j.neuroimage.2012.06.028 NEIMEF 1053-8119 Google Scholar

40. 

A. Bandrivskyy et al., “Wavelet phase coherence analysis: application to skin temperature and blood flow,” Cardiovasc. Eng., 4 (1), 89 –93 (2004). https://doi.org/10.1023/B:CARE.0000025126.63253.43 Google Scholar

41. 

F. Tian et al., “Wavelet coherence analysis of dynamic cerebral autoregulation in neonatal hypoxic-ischemic encephalopathy,” Neuroimage Clin., 11 124 –132 (2016). https://doi.org/10.1016/j.nicl.2016.01.020 Google Scholar

42. 

B. Biswal et al., “Functional connectivity in the motor cortex of resting human brain using echo‐planar MRI,” Magn. Reson. Med., 34 (4), 537 –541 (1995). https://doi.org/10.1002/mrm.1910340409 MRMEEN 0740-3194 Google Scholar

43. 

F. Faul et al., “G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences,” Behav. Res. Methods, 39 (2), 175 –191 (2007). https://doi.org/10.3758/BF03193146 Google Scholar

44. 

N. J. Soper and G. M. Fried, “The fundamentals of laparoscopic surgery: its time has come,” Bull. Am. Coll. Surg., 93 (9), 30 –32 (2008). Google Scholar

45. 

J. H. Peters et al., “Development and validation of a comprehensive program of education and assessment of the basic fundamentals of laparoscopic surgery,” Surgery, 135 (1), 21 –27 (2004). https://doi.org/10.1016/S0039-6060(03)00156-9 SURGAZ 0039-6060 Google Scholar

46. 

L. Zhang et al., “Characterizing the learning curve of the VBLaST-PT© (virtual basic laparoscopic skill trainer),” Surg. Endosc., 27 (10), 3603 –3615 (2013). https://doi.org/10.1007/s00464-013-2932-5 Google Scholar

47. 

A. Maciel et al., “Development of the VBLaST™: a virtual basic laparoscopic skill trainer,” Int. J. Med. Robot. Comput. Assist. Surg., 4 (2), 131 –138 (2008). https://doi.org/10.1002/rcs.185 Google Scholar

48. 

V. S. Arikatla et al., “Face and construct validation of a virtual peg transfer simulator,” Surg. Endosc., 27 (5), 1721 –1729 (2013). https://doi.org/10.1007/s00464-012-2664-y Google Scholar

49. 

G. Sankaranarayanan et al., “Preliminary face and construct validation study of a virtual basic laparoscopic skill trainer,” J. Laparoendosc. Adv. Surg. Tech., 20 (2), 153 –157 (2010). https://doi.org/10.1089/lap.2009.0030 Google Scholar

50. 

A. Chellali et al., “Preliminary evaluation of the pattern cutting and the ligating loop virtual laparoscopic trainers,” Surg. Endosc., 29 (4), 815 –821 (2015). https://doi.org/10.1007/s00464-014-3764-7 Google Scholar

51. 

T. J. Huppert et al., “HomER: a review of time-series analysis methods for near-infrared spectroscopy of the brain,” Appl. Opt., 48 (10), D280 –D298 (2009). https://doi.org/10.1364/AO.48.00D280 APOPAI 0003-6935 Google Scholar

52. 

M. A. Franceschini et al., “Diffuse optical imaging of the whole head,” J. Biomed. Opt., 11 (5), 054007 (2006). https://doi.org/10.1117/1.2363365 JBOPFO 1083-3668 Google Scholar

53. 

Y. Zhang et al., “Eigenvector-based spatial filtering for reduction of physiological interference in diffuse optical imaging,” J. Biomed. Opt., 10 (1), 011014 (2005). https://doi.org/10.1117/1.1852552 JBOPFO 1083-3668 Google Scholar

54. 

L. Gagnon et al., “Improved recovery of the hemodynamic response in diffuse optical imaging using short optode separations and state-space modeling,” Neuroimage, 56 (3), 1362 –1371 (2011). https://doi.org/10.1016/j.neuroimage.2011.03.001 NEIMEF 1053-8119 Google Scholar

55. 

M. A. Yücel et al., “Short separation regression improves statistical significance and better localizes the hemodynamic response obtained by near-infrared spectroscopy for tasks with differing autonomic responses,” Neurophotonics, 2 (3), 035005 (2015). https://doi.org/10.1117/1.nph.2.3.035005 Google Scholar

56. 

J. C. Ye et al., “NIRS-SPM: statistical parametric mapping for near-infrared spectroscopy,” Neuroimage, 44 (2), 428 –447 (2009). https://doi.org/10.1016/j.neuroimage.2008.08.036 NEIMEF 1053-8119 Google Scholar

57. 

L. Gagnon et al., “Short separation channel location impacts the performance of short channel regression in NIRS,” Neuroimage, 59 (3), 2518 –2528 (2012). https://doi.org/10.1016/j.neuroimage.2011.08.095 NEIMEF 1053-8119 Google Scholar

58. 

L. Gagnon et al., “Further improvement in reducing superficial contamination in NIRS using double short separation measurements,” Neuroimage, 85 127 –135 (2014). https://doi.org/10.1016/j.neuroimage.2013.01.073 NEIMEF 1053-8119 Google Scholar

59. 

Q. Tan et al., “Frequency-specific functional connectivity revealed by wavelet-based coherence analysis in elderly subjects with cerebral infarction using NIRS method,” Med. Phys., 42 (9), 5391 –5403 (2015). https://doi.org/10.1118/1.4928672 MPHYA6 0094-2405 Google Scholar

60. 

R. Cui et al., “Wavelet coherence analysis of spontaneous oscillations in cerebral tissue oxyhemoglobin concentrations and arterial blood pressure in elderly subjects,” Microvasc. Res., 93 14 –20 (2014). https://doi.org/10.1016/j.mvr.2014.02.008 MIVRA6 0026-2862 Google Scholar

61. 

A. Bernjak et al., “Coherence between fluctuations in blood flow and oxygen saturation,” Fluctuation Noise Lett., 11 (1), 1240013 (2012). https://doi.org/10.1142/S0219477512400135 Google Scholar

62. 

A. B. Rowley et al., “Synchronization between arterial blood pressure and cerebral oxyhaemoglobin concentration investigated by wavelet cross-correlation,” Physiol. Meas., 28 (2), 161 (2007). https://doi.org/10.1088/0967-3334/28/2/005 PMEAE3 0967-3334 Google Scholar

63. 

G. Xu et al., “Functional connectivity analysis of distracted drivers based on the wavelet phase coherence of functional near-infrared spectroscopy signals,” PLoS One, 12 (11), e0188329 (2017). https://doi.org/10.1371/journal.pone.0188329 POLNCL 1932-6203 Google Scholar

64. 

J. N. Sanes, “Neocortical mechanisms in motor learning,” Curr. Opin. Neurobiol., 13 (2), 225 –231 (2003). https://doi.org/10.1016/S0959-4388(03)00046-1 COPUEN 0959-4388 Google Scholar

65. 

S. Wu et al., “Suppressing systemic interference in fNIRS monitoring of the hemodynamic cortical response to motor execution and imagery,” Front. Hum. Neurosci., (2018). https://doi.org/10.3389/fnhum.2018.00085 Google Scholar

66. 

F. Deligianni et al., “Expertise and task pressure in fNIRS-based brain connectomes,” (2020). Google Scholar

67. 

Y. Shiogai, A. Stefanovska and P. V. E. McClintock, “Nonlinear dynamics of cardiovascular ageing,” Phys. Rep., 488 (2-3), 51 –110 (2010). https://doi.org/10.1016/j.physrep.2009.12.003 PRPLCM 0370-1573 Google Scholar

Biographies of the authors are not available.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Arun Nemani, Anil Kamat, Yuanyuan Gao, Meryem A. Yucel, Denise Gee, Clairice Cooper, Steven D. Schwaitzberg, Xavier Intes, Anirban Dutta, and Suvranu De "Functional brain connectivity related to surgical skill dexterity in physical and virtual simulation environments," Neurophotonics 8(1), 015008 (3 March 2021). https://doi.org/10.1117/1.NPh.8.1.015008
Received: 6 August 2020; Accepted: 11 February 2021; Published: 3 March 2021