EEG Mapping of Cortical Activation Related to Emotional Stroop with Facial Expressions: A TREFACE Study

Abstract

TREFACE (Test for Recognition of Facial Expressions with Emotional Conflict) is a computerized model, based on the Stroop paradigm, for investigating the emotional factor in executive functions through the recognition of emotional expressions in human faces. To investigate the influence of the emotional component at the cortical level, electroencephalographic (EEG) recording was used to measure the involvement of cortical areas during the execution of the task. Thirty Brazilian native Portuguese-speaking graduate students were evaluated on their anxiety and depression levels and on their well-being at the time of the session. The EEG was recorded from 19 channels during the execution of the TREFACE test in the 3 stages established by the model (guided training, reading, and recognition), each with congruent conditions, when the image corresponds to the word shown, and incongruent conditions, when there is no correspondence. The results showed better performance in the reading stage and in congruent conditions, and greater intensity of cortical activation in the recognition stage and in incongruent conditions. Complementarily, specific frontal activations were observed: intense theta activation extending over the left hemisphere, representing frontal recruitment of posterior regions during information processing, and alpha activation along the right frontotemporal line, reflecting executive processing in the control of attention, together with a dorsal prefrontal manifestation related to emotional performance. Activations in beta and gamma frequencies were more intensely and widely distributed in the recognition stage. The mapping of cortical activity in our study can help to understand how words and images of faces can be regulated in everyday life and in clinical contexts, suggesting an integrated model that includes the neural bases of the regulation strategy.


1. Introduction

TREFACE (Test for Recognition of Facial Expressions with Emotional Conflict) is a neuropsychological assessment model in Portuguese based on the emotional Stroop paradigm, which assesses components of executive functions, specifically working memory, inhibitory control, and cognitive flexibility. It is a computational tool composed of a collection of emotional facial expressions and words that may or may not be congruent with those facial expressions, reproducing characteristics of the emotional Stroop test. Previous results with the model revealed that a task based on word reading allows better performance than a task based on face recognition. It was also identified that when the word coincides with the image (congruent condition), there is an advantage in terms of the hit rate, whereas recognition is more difficult when the image is not congruent with the word. In general, these results suggest that the emotional attribute can compromise the ability to recognize faces, affecting the functioning of mechanisms such as cognitive control and emotion regulation.

In view of these behavioral results, taking a step forward, we sought in the present study to investigate the neurobiological substrate of this phenomenon from electrophysiological analyses.

The electroencephalographic (EEG) recording is a method for measuring electrical brain activity that allows correlations to be established between patterns of cortical activation and the behavior presented by the individual. The EEG records the postsynaptic activity of cortical neurons, corresponding to the cortical response to a given event, and can also reveal the activity of subcortical structures [1]. This activity manifests itself in the EEG acquisition as electrical waves, which are classified according to their frequency band. The dominance of each class of brainwaves can be interpreted in terms of the physiological and psychological states to which the individual is subjected. The high temporal resolution of this technique favors the assessment of the cerebral manifestation of a phenomenon at the moment closest to its occurrence [2]. This important feature also enables the use of EEG both for studies of normal brain activity and in the presence of pathology [3].

Electrophysiological methods have been used to measure activity related to executive functions (EF), discerning patterns, and defining neurobiological markers [4] [5] [6] [7] . The EF construct is strongly related to behavior control processes and is closely associated with frontal and limbic circuits, bilaterally [8] [9] . Important fronto-subcortical circuits that accompany the development of the EF processes originate from the prefrontal cortex (PFC): 1) the Dorsolateral (DLPFC); 2) the Ventrolateral (VLPFC); 3) the Orbitofrontal (OFPFC); 4) the Ventromedial (VMPFC); and 5) the Anterior Cingulate (ACPFC) [7] [10] [11] [12] .

The PFC also receives inputs from the visual and auditory processing areas of the occipital and temporal lobes, and, in particular, the OFPFC presents a strong interconnection with the cognitive and emotional processing areas. This circuit originates in the inferior lateral and anterior ventral prefrontal cortex, projects to the ventromedial caudate nucleus, and receives information from other cortical areas. It has therefore been suggested that it could be involved in certain aspects of social behavior such as inhibitory control, empathy, and compliance with social rules [13]. According to Diamond [14], such processes have a supervisory role, letting us act against our instincts or intuition and giving preference to premeditated, controlled, and planned behaviors.

The visual processing of facial stimuli has been gaining interest in neuroscience. The facial recognition (FR) mechanism is actively linked to the functioning of EFs. An individual’s emotional state is conveyed both by their behavioral context and by their facial expressions [15] [16] [17]. Several biological mechanisms are involved in the processing of emotional meaning. There is broad consensus in the literature on the existence of active neural activity in the amygdala and OFPFC [18] [19] [20] [21] [22]; in turn, these two structures are of great importance in recognizing the emotional expressions of the face. In addition, other much more specific neural subsystems associated with the recognition of certain emotions have been identified, such as fear [23] [24], joy [25], anger and disgust [15]. Cognitive processes such as memory, language, attention control, and the basic components of EF are also strongly involved in the efficiency of emotional facial processing [26] [27] [28] [29].

By means of the Stroop paradigm, it is possible to recognize how emotional attributes influence cognitive processes. Some studies have shown that the speed of naming words that convey emotion could be an indicator of the subjects’ concerns or anxieties; thus, words that refer to emotions produce greater interference than “neutral” words [19] [30] [31]. These studies, therefore, demonstrate the importance of the emotional Stroop test as an effective tool to establish a model of emotional conflict. Thus, the testing model proposed by TREFACE can explore the quality of the active cognitive control mechanisms that mediate conflict, as well as those that provide the resources to identify and solve the problem [32]. Furthermore, these findings support existing theories of emotional regulation, which involve the dorsolateral prefrontal cortex and the anterior cingulate cortex in situations of cognitive conflict. Conflict detection resides in medial prefrontal structures, including the dorsal anterior cingulate cortex, which extends dorsally into premotor regions. Once conflict detection is triggered, information flows to the cognitive control system, which initiates the resolution of emotional conflicts and modulates the processing of information, facilitating the processing of the appropriate response while hampering the processing of the incorrect response, with the dorsolateral prefrontal cortex involved in the resolution [19].

Additionally, several EEG studies have revealed the involvement of activities in specific bands related to EF. Oscillations in the theta band, particularly in the frontal lobe, have been identified as an indicator of demands on working memory due to the active recruitment of cognitive resources for task resolution [7] [33] [34] [35] [36] [37] .

Alpha band activity, in turn, is related to the inhibition of brain activities that are not involved in the mental task at hand, suggesting predictive markers of performance during working memory tasks [7] [36] [38] [39] [40]. Beta activity reflects a memory-promoting state, moderated by modality-independent attentional or inhibitory processes [7] [37] [41]. Finally, gamma oscillations support the maintenance of feature-specific information and reflect the formation of content representations in visual processing [7] [42].

The emotional component involved in perceptual and cognitive processing makes recognition more complex, requiring greater activation of brain networks and mental resources to reach an efficient solution. From this perspective, the present work aimed to substantiate the model proposed by TREFACE for monitoring emotional conflict with evidence of neurophysiological mechanisms specifically identified by mapping cortical activity via EEG recording. Thus, it is possible to provide details related to both behavioral performance after exposure to the task at hand and the cortical mechanisms involved in solving problems.

2. Methods

Participants

Thirty volunteers aged 18 to 25 years participated in this study (15 women, 20.27 ± 0.60, and 15 men, 21.40 ± 0.69 years old). The participants were Brazilian Portuguese native speakers recruited through an advertisement within the Campus of the University of Brasília (UnB), Brasília, Brazil, and on social media (Facebook and WhatsApp), who agreed with the ethical terms for research with humans in Resolution 196/96 CNS/MS. All participants declared that they had no history of neurological or psychiatric disorders and had not consumed drugs, alcoholic beverages, or energy drinks in the 24 hours prior to the EEG recording. None of them had impaired hearing or visual or speech problems. In addition to not reporting poor sleep the night before the evaluation, they presented scores below 50 on the State-Trait Anxiety Inventory (STAI-E/T), below 20 on the Beck Depression Inventory (BDI-II), and above 60% on the Self-Perception of Quality of Life Questionnaire (WHOQOL-bref).

Procedures

Test of Recognition of Facial Expressions with Emotional Conflict (TREFACE)

TREFACE [43] is composed of a set of stimuli formed by black-and-white photographs, extracted from the Pictures of Facial Affect (POFA®) collection by Ekman and Friesen (1976), and words in red that correspond to the basic emotions present in the images. During the task, the participant has to respond verbally according to the instructions presented before each step. The first stage is guided recognition (TG), in which an image is presented, followed by a word after a delay. The subject is instructed to silently observe the relationship between the facial expression and the written word. This step is intended to familiarize the participant with the task, and no recording or correction is performed.

In the following steps, the word is presented in the center of the image, which may or may not be congruent with the expression, and a verbal response is expected according to the instructions given. In the second stage, total reading (LT), the participant is instructed to quickly read the written word, and in the third stage, total recognition (RT), to name the emotional expression shown on the character’s face, ignoring the written word. All participants were exposed to the same stimuli, so the content read was identical across participants.

Each stage had 70 stimuli with an inter-stimulus interval of 100 ms, in a predefined sequence, and the image-word delay in the first stage was 1000 ms, according to the scheme in Figure 1.
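To make the structure of a stage concrete, the sketch below (in Python, for illustration only; the study itself used the TestPlatform tool described next) shows one way the 70 face-word trials of a stage could be represented. The emotion labels, file names, even split between congruent and incongruent trials, and the fixed seeds standing in for the predefined sequence shared by all participants are assumptions, not details taken from the test.

```python
# Illustrative representation of a TREFACE-like stage; all stimulus details
# (emotion set, file names, congruent/incongruent split) are hypothetical.
import random
from dataclasses import dataclass

EMOTIONS = ["happiness", "sadness", "fear", "anger", "disgust", "surprise"]  # assumed basic-emotion set

@dataclass
class Trial:
    face_image: str      # photograph of a facial expression (POFA-style)
    word: str            # emotion word overlaid in red
    congruent: bool      # True when the word matches the expression
    isi_ms: int = 100    # inter-stimulus interval reported for all stages
    delay_ms: int = 0    # image-word delay (1000 ms in the guided stage only)

def build_stage(n_trials: int = 70, guided: bool = False, seed: int = 0) -> list:
    """Build one stage of n_trials face-word pairs, half congruent, half not."""
    rng = random.Random(seed)          # fixed seed: same sequence for everyone
    trials = []
    for i in range(n_trials):
        emotion = rng.choice(EMOTIONS)
        congruent = i % 2 == 0
        word = emotion if congruent else rng.choice([e for e in EMOTIONS if e != emotion])
        trials.append(Trial(face_image=f"face_{emotion}_{i:02d}.bmp",
                            word=word,
                            congruent=congruent,
                            delay_ms=1000 if guided else 0))
    rng.shuffle(trials)
    return trials

if __name__ == "__main__":
    stage_tg = build_stage(guided=True)   # guided recognition (TG)
    stage_lt = build_stage(seed=1)        # total reading (LT)
    stage_rt = build_stage(seed=2)        # total recognition (RT)
    print(len(stage_tg), stage_lt[0])
```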

This model was developed at the Neuroscience and Behavior Laboratory at the University of Brasília and implemented as a computerized test in the TestPlatform tool, version 1.1 [44].

This platform allows the recording of the test execution in files, with the description and display time of each stimulus, and the audio with the responses dictated by the subject. It also allows the automatic marking of the EEG record at each stimulus presentation, a useful feature for this study.

Figure 1. Example of the TREFACE presentation structure. (a) TG: Guided recognition. (b) LT: Reading the written word. (c) RT: Recognition of the emotional expression of faces. Letters below the pictures: C = Congruent condition. I = Incongruent condition.

Electroencephalography recording

Measurements of cortical activity by EEG were collected using the Neuron-Spectrum-4/EPM device (NeuroSoft®, Ivanovo, Russia). The EEG recording was performed using a 19-channel cap placed on the scalp, according to the international 10/20 system, with two sponges soaked in saline solution under points Fp1 and Fp2 to reduce the pressure of the cap on the skin. The impedance at the cap electrodes was checked constantly and kept below 10 kΩ throughout the session by applying conductive gel to each of the cap’s passive sensor electrodes with the aid of a syringe with a special tip. All data were sampled at 1024 Hz, with amplifier bandwidth settings between 0.1 and 70.0 Hz.

Data collection was performed at the Laboratory of Neuroscience and Behavior of the University of Brasília in a session divided into two segments: the screening and the experiment. When scheduling their participation, participants were instructed to be available for a one-hour period; to abstain from alcoholic beverages, drugs, relaxants, stimulants, and energy drinks for 24 hours beforehand; to avoid strenuous physical exercise for at least three hours prior to the test; to try to get a good night’s sleep the day before; to wash their hair well with neutral shampoo; and to bring prescription glasses if used frequently. After the screening, the participant was taken to the EEG recording room, which had lighting adequate for reading and noise control. The individuals were comfortably seated at a distance of 40 cm from a computer screen on which the stimuli would be presented, and the electrode cap was placed for the simultaneous EEG recording. After preparation, the baseline corresponding to basal cortical activity was recorded with eyes closed for three minutes. The test then began, on a 17" computer screen, following the aforementioned steps (TG, LT, RT), each preceded by a baseline measurement; at the end of the last step, another baseline was recorded. During the entire recording, the marker inserted by the TestPlatform at each stimulus presentation was visually verified in the EEG trace. At the end of the experimental session, the cap was removed.

Data analysis

For the behavioral analysis in TREFACE, the performance and audio files provided by the platform were used to calculate the number of hits (when the dictated answer coincided with the expected answer), the number of errors (when the answers did not coincide), and the number of omissions (when there was no response). Reaction time was also calculated with a voice detection module, developed in Matlab, which uses the stimulus presentation time recorded in the performance file as the reference from which voice onset is detected in the audio file.
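The voice detection module itself is not described beyond this outline. Purely as an illustration of the idea, a simple energy-threshold onset detector in Python (the original module was written in Matlab) could look like the sketch below; the audio file name, frame length, and threshold factor are assumptions.

```python
# Illustration only: estimate the verbal reaction time as the delay between
# stimulus onset and the first audio frame whose energy clearly exceeds the
# pre-stimulus noise floor. This is not the authors' Matlab module.
import numpy as np
from scipy.io import wavfile

def voice_onset_rt_ms(wav_path, stimulus_onset_ms, frame_ms=10.0, k=5.0):
    """Return the reaction time in ms, or None if no voice onset is found."""
    fs, audio = wavfile.read(wav_path)          # sampling rate and samples
    audio = audio.astype(float)
    if audio.ndim > 1:                          # mix stereo to mono
        audio = audio.mean(axis=1)
    frame = int(fs * frame_ms / 1000)           # samples per analysis frame
    n_frames = len(audio) // frame
    energy = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n_frames)])
    start = int(stimulus_onset_ms / frame_ms)   # frame of stimulus onset
    noise = energy[:start].mean() if start > 0 else energy.min() + 1e-12
    above = np.nonzero(energy[start:] > k * noise)[0]
    if above.size == 0:
        return None                             # treated as an omission
    return (start + above[0]) * frame_ms - stimulus_onset_ms

# Hypothetical usage: stimulus shown 500 ms into the recording.
# rt = voice_onset_rt_ms("subject01_trial07.wav", stimulus_onset_ms=500.0)
```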

For behavioral data, the SigmaStat 3.5 statistical program was used, and a Wilcoxon test for paired samples was performed to compare the reading and recognition steps. A two-way ANOVA for repeated measures was used to compare conditions within the steps of TREFACE, in which factor 1 was the step, with two levels (reading the emotional word and recognition of the emotional expression of the face), factor 2 was the condition, with two levels (congruent and incongruent), and the dependent variable was the number of hits or the reaction time. The significance level established for the analyses was p < 0.05.
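As a rough sketch of the same behavioral comparisons in open-source form (SciPy and statsmodels instead of SigmaStat), one could organize the data as a long-format table with one hit-rate value per participant, step, and condition; the values below are simulated for illustration and are not the study's data.

```python
# Simulated re-creation of the behavioral statistics using open-source tools
# (the study used SigmaStat 3.5); the numbers are placeholders, not real data.
import numpy as np
import pandas as pd
from scipy.stats import wilcoxon
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for subject in range(30):
    rows += [
        {"subject": subject, "step": "reading", "condition": "congruent", "hits": 100.0},
        {"subject": subject, "step": "reading", "condition": "incongruent", "hits": 100.0},
        {"subject": subject, "step": "recognition", "condition": "congruent", "hits": rng.normal(82, 5)},
        {"subject": subject, "step": "recognition", "condition": "incongruent", "hits": rng.normal(63, 10)},
    ]
df = pd.DataFrame(rows)

# Wilcoxon paired test: reading vs recognition, collapsed over condition.
reading = df[df.step == "reading"].groupby("subject").hits.mean()
recognition = df[df.step == "recognition"].groupby("subject").hits.mean()
print(wilcoxon(reading, recognition))

# Two-way repeated-measures ANOVA: step x condition on the hit rate.
print(AnovaRM(df, depvar="hits", subject="subject", within=["step", "condition"]).fit())
```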

All data were processed using codes programmed in Matlab, integrated into EEGLAB, version 9.0.4.5 ([45]; http://sccn.ucsd.edu/eeglab). Initially, the data collected at the acquisition rate were resampled to 200 Hz to optimize processing. The fragments corresponding to the TREFACE steps were extracted from the continuous records and, in each fragment, the record was digitally divided into non-overlapping epochs, according to the markings made during the performance, later labeled according to the task condition. The fragments and epochs were submitted to the Infomax algorithm for decomposition into independent components (ICA; [46]). Components related to blinking or eye movement were removed from the original data, and the record was then recalculated from the remaining components, filtered, and processed to extract measurements. The precomputed EEG data were converted to spectral power and displayed for analysis in the frequency ranges theta (4 - 8 Hz), alpha (8 - 13 Hz), beta (13 - 30 Hz), and gamma (30 - 70 Hz). Topographic maps of cortical activity were generated using the power recorded at each electrode and a smoothing technique around the channels to project the activation onto the spaces between them. The time window length used to analyze the EEG signals was set to 2 seconds. Spectral power (in μV) was computed and the data were made available for statistical analysis (paired t-test; p < 0.05) within EEGLAB itself to compare the topographic maps of each stage and condition of TREFACE.
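The quantitative core of this pipeline can be sketched in a few lines. The example below (NumPy/SciPy rather than the EEGLAB/Matlab code actually used) computes band-limited spectral power per electrode over 2-second windows and runs a per-channel paired comparison between two stages; the array shapes, simulated data, and the way significant channels are collected are assumptions for illustration.

```python
# Illustrative sketch (not the EEGLAB/Matlab pipeline used in the study) of
# band-limited spectral power per electrode and a paired comparison between
# two task stages. Assumes artifact-cleaned epochs are already available as
# arrays of shape (n_subjects, n_channels, n_samples) sampled at 200 Hz.
import numpy as np
from scipy.signal import welch
from scipy.stats import ttest_rel

FS = 200          # Hz, after resampling
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 70)}

def band_power(data, band):
    """Mean spectral power in `band` for each subject x channel."""
    freqs, psd = welch(data, fs=FS, nperseg=2 * FS, axis=-1)   # 2 s windows
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    return psd[..., mask].mean(axis=-1)

def compare_stages(stage_a, stage_b, alpha=0.05):
    """Paired t-test per channel and band; returns channels with p < alpha."""
    results = {}
    for name, band in BANDS.items():
        pa, pb = band_power(stage_a, band), band_power(stage_b, band)
        t, p = ttest_rel(pa, pb, axis=0)           # pair over subjects
        results[name] = np.nonzero(p < alpha)[0]   # significant channel indices
    return results

# Hypothetical usage with simulated data: 30 subjects, 19 channels, 2 s epochs.
rng = np.random.default_rng(0)
lt = rng.standard_normal((30, 19, 2 * FS))
rt = rng.standard_normal((30, 19, 2 * FS)) * 1.2
print(compare_stages(lt, rt))
```

In the study itself, the analogous per-electrode paired t-tests were carried out within EEGLAB, and it is these channel-level statistics that the red dots in Figure 2 and Figure 3 summarize.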

3. Results

The analysis with the Wilcoxon test for paired samples, comparing the performance of the participants, based on the rate of correct answers in the reading (LT) and recognition (RT) stages of TREFACE, showed a statistically significant difference (Z = −4.790, p < 0.001; LT > RT), higher for LT (100.00 ± 0.00) than for RT (72.80 ± 1.39). When comparing the average reaction times of the two steps, the analysis showed a significant difference (Z = −2.523, p = 0.012; LT > RT), in which the average of total reaction time was higher for LT (307.70 ± 10.46) than for RT (253.69 ± 15.77).

Comparing the performance of the participants according to the rate of hits in the congruent (C) and incongruent (I) conditions within the TREFACE steps, the two-way ANOVA for repeated measures identified a statistically significant effect of the condition factor (F[1,29] = 123.103, p < 0.001) and of the step factor (F[1,29] = 381.444, p < 0.001). It also showed a statistically significant effect of the interaction between condition and step (F[1,29] = 123.103, p < 0.001).

The post hoc analysis for multiple comparisons (Bonferroni’s t test) of performance showed that, regarding the condition factor, the congruent condition differed from the incongruent condition (t = 11.095, p < 0.001; C > I), with a higher hit rate for C (91.25 ± 1.23) than for I (81.55 ± 2.62). Additionally, in the step factor, reading differed from recognition (t = 19.531, p < 0.001; LT > RT), with a higher number of correct answers for LT (100.00 ± 0.00) than for RT (72.80 ± 1.71).

The analysis also identified that, in the step factor within the congruent condition, the participants’ performance in reading differed from their performance in recognition (t = 10.641, p < 0.001; L-C > R-C), with a higher hit rate for L-C (100.00 ± 0.00) than for R-C (82.50 ± 0.95). The same result was observed in the incongruent condition (t = 22.440, p < 0.001; L-I > R-I), higher for L-I (100.00 ± 0.00) than for R-I (63.10 ± 2.12). Finally, for the condition factor within each step, no significant difference was found in the reading step; in the recognition step, a difference was found between conditions (t = 15.691, p < 0.001; R-C > R-I), higher in R-C (82.50 ± 0.95) than in R-I (63.10 ± 2.12).

When comparing the performance obtained by the participants from the average reaction time in the conditions within the TREFACE steps, the two-way ANOVA for repeated measures showed a statistically significant effect of the condition factor (F[1,29] = 10.241, p = 0.003) and of the step factor (F[1,29] = 6.229, p = 0.019). However, there was no statistically significant effect of the interaction between condition and step (F[1,29] = 0.568, p = 0.457).

A post hoc analysis for multiple comparisons (Bonferroni’s t test) of the reaction time showed that, for the condition factor, the incongruent condition differed from the congruent one (t = 3.200, p = 0.003; I > C), with longer reaction times for I (293.58 ± 13.41) than for C (267.81 ± 9.64). Additionally, in the step factor, reading differed from recognition (t = 2.496, p = 0.019; LT > RT), with longer reaction times for LT (307.86 ± 8.43) than for RT (254.82 ± 13.53). The analysis also identified that, in the step factor within the congruent condition, reaction times in reading differed from those in recognition (t = 2.602, p = 0.013; L-C > R-C), longer for L-C (297.86 ± 9.76) than for R-C (238.73 ± 12.64). The same result was observed in the incongruent condition (t = 2.074, p = 0.045; L-I > R-I), longer for L-I (317.54 ± 11.47) than for R-I (270.39 ± 20.60). Finally, for the condition factor, there was no statistically significant difference within the reading stage, while within the recognition stage the analysis indicated a statistically significant difference between incongruent and congruent conditions (t = 2.804, p = 0.007; R-I > R-C), longer for R-I (270.39 ± 20.60) than for R-C (238.73 ± 12.64).

Cortical mapping in TREFACE

The topographic maps generated from the EEG data were compared between the reading (L) and recognition (R) stages and, within these, between the TREFACE conditions, congruent (C) and incongruent (I), considering the frequency bands theta (4 - 8 Hz), alpha (8 - 13 Hz), beta (13 - 30 Hz), and gamma (30 - 70 Hz).

In the global analysis, comparing the TREFACE steps in a paired t test in the recording channels, there was greater activation in the recognition step (RT) in relation to the reading step (LT) (Figure 2). For the theta band, there was an extended activation, with a statistically significant difference regarding the level of electrical activity, in the frontotemporal region, including parietal and occipital regions of the left hemisphere (F7, F3, T3, C3, T5, P3, O1), in the midline, between Fz and Cz, and across the entire frontotemporal line (F8, T4, T6) of the right hemisphere, greater for RT.

Figure 2. Topographic maps of cortical activation in the study bands comparing the stages of TREFACE, based on the participants’ evaluation. The red dots in the representation on the right indicate the electrodes for which statistically significant differences were found. Paired t test (p < 0.05). LT = Total Reading. RT = Total Recognition.

The alpha frequency range showed greater activation for RT, with a significant difference in the left frontal (F3) and central (Fz) regions, in addition to the right frontotemporal region (F8, T4, T6). Electrical activity in the beta frequency was significantly higher for RT in the bilateral frontal region (F7, F3, F4), the left temporal region (T3, T5), the frontoparietal midline (Fz, Cz, Pz), and the parietotemporal region of the right hemisphere (P4 and T6). Similarly, gamma activity was significantly higher for RT in the bilateral frontal region (F7, F3, F4), the left posterior temporal region (T5), the frontoparietal midline (Fz, Cz, Pz), and the parietotemporal region of the right hemisphere (P4 and T6), with C4 completing the right frontoparietal line.

In the comparison of the conditions, congruent (C) and incongruent (I), within the reading (L) and recognition (R) steps, no significant results were found for the conditions within the steps (L-C vs L-I and R-C vs R-I). The findings for the comparisons of the steps between conditions (L-C vs R-C, L-I vs R-I, L-C vs R-I, and L-I vs R-C) are shown in Figure 3. For the theta band (θ), greater cortical activity was identified in the left frontotemporal and central parietal extension, higher in the recognition stage across all conditions, with a significant difference in F7, F3, Fz, T3, C3, T5 (except L-C vs R-C), P3, Pz (except L-I vs R-I and L-I vs R-C), and O1; in the right hemisphere, frontotemporal activation was observed with significant differences in F8, F4, T4, and T6, including F4 for L-C vs R-I and C4 for L-C vs R-I and L-I vs R-I, also higher for the recognition stage.

Figure 3. Topographic maps of cortical activation in the study bands, comparing the steps of TREFACE according to the condition of the stimuli presented, based on the participants’ evaluation. The red dots in the representation on the right indicate the electrodes for which statistically significant differences were found. Paired t test (p < 0.05). L-C = Congruent Reading. L-I = Incongruent Reading. R-C = Congruent Recognition. R-I = Incongruent Recognition.

On the other hand, higher alpha activity was identified for the recognition step in all conditions, with a statistically significant difference in the left and central frontal regions at F3 and Fz, at Pz only for L-C vs R-I, and in the right frontotemporal region at F8, T4, and T6 (except for L-C vs R-C). For the beta and gamma bands, a significant difference was observed for all conditions in all regions, higher for the recognition step in all comparisons, except at F8 for L-I vs R-I and L-I vs R-C.

4. Discussion

TREFACE is a computerized model for investigating the emotional factor in executive functions, based on the Stroop paradigm for the Brazilian population, by means of the recognition of emotional expressions in human faces. To investigate the influence of the emotional component at the cortical level, the EEG recording technique was used to describe the involvement of cortical areas during the execution of the task.

The behavioral analyses of this study revealed that performance was better in the reading stage than in the recognition stage, as reported in the first TREFACE study [43]. Complementarily, reaction times were longer in the reading stage than in the recognition stage. It is thus possible to say that, when reading the textual word written over the face stimuli, participants needed more time to identify the letters, recognize the word as a whole, and access its meaning, taking more time overall to produce an efficient response [47]. On the other hand, studies have shown that the facial recognition mechanism analyzes visual information on the order of milliseconds, which implies less time spent on the analysis and, in this case, impaired the quality of recognition [48] [49].

The same pattern of behavior across the steps, i.e., better performance for reading than for recognition, was found when comparing the different conditions, congruent and incongruent, as in the previous study [43]. In general, better performance was observed in the congruent condition, while the reaction time was longer in the incongruent condition, regardless of the step. This finding may indicate that conflict, in addition to compromising task performance, tends to require more time for the response, so that the attention mechanism is impaired in responding effectively [50]. Reading incongruent stimuli requires more time, while image-word competition impairs the quality of recognition processing even when responses are slowed [51]. The effort revealed in the incongruent condition, observed in the reaction time, suggests a temporal interference effect, which delays the arrival of the inhibitory activity required to suppress the irrelevant element in favor of the relevant one when deciding on the correct answer [52] [53].

With regard to cortical mapping, the EEG recording performed during the execution of the task showed diverse rhythms predominating in the frontal areas, the midline, and the temporal lines in the reading (LT) and recognition (RT) stages. The stronger theta activity during the recognition (RT) stage in the left hemisphere demonstrates the intense recruitment of cognitive resources by the frontal region for visual information processing [35] [36]. Theta oscillations are commonly associated with memory processes, mental manipulation of information, and effort in audiovisual tasks [38] [54] [55]. They may indicate communication between the temporal region and the hippocampus, given the strong connections between these two structures [56] [57]. Other evidence suggests that during emotional states the amygdala produces differentiated theta activity [58], which has also been identified during functional inhibition [59]. In this case, theta oscillations are observed mainly in the frontal cortex, expanding to other brain structures as a means of greater sensory-perceptual integration to initiate mechanisms of inhibition, as observed in the midline activation.

In general, active participation of the right hemisphere in the control and orientation of executive attention has been reported in executive functioning tasks [60]. In this study, activation along the right frontotemporal line was observed for both theta and alpha frequencies. Oscillations at the alpha frequency were found in the right hemisphere, with no difference between steps, consistent with prior evidence that top-down processing in a working memory task increases alpha power in the prefrontal area, reinforcing the idea of selective cortical recruitment that can extend from sensory cortices to the frontoparietal attention network [33] [61]. Furthermore, alpha oscillations are modulated during visual sensory stimulation [62] [63], which is a characteristic of stimulus presentation in TREFACE. Prefrontal alpha activity is directly involved in the neural mechanisms responsible for the maintenance of working memory [64], in addition to its relationship with the cognitive processes of memory [65] and attention [66], required for efficient task resolution in TREFACE. Evidence of this activation in the left dorsolateral region may indicate processing of emotional response assessment [67].

For the higher frequencies, beta and gamma, similar activation patterns were identified, with synchronization in the recognition stage. In both beta and gamma, the frontal and posterior temporal regions showed greater power in the right hemisphere. The frontal and lower left temporal regions also differed in their electrical activity. Interestingly, beta frequencies are also modulated during the performance of tasks that require sensorimotor interaction [68] [69].

Regarding gamma oscillations, these have been associated with functional inhibition [70], attention and information processing [71] [72], as well as the active maintenance of content, memory [73], and conscious awareness [74]. The intensity of the gamma rhythm is also associated with the demand for cognitive flexibility, verified in the task of this study, so that the findings on alpha, beta, and gamma oscillations can be related to previous studies proposing distinct roles for these bands in hierarchical perceptual inference and predictive coding [37] [75]. Beta and gamma waves are involved in attentional processes [73]. Gamma-band activity has received increasing consideration in recent years due to its role in different cognitive processes [37] [75]. Indeed, phase synchronization of gamma activity seems to be involved in attentional processes [37], and its measurement provides an important indicator of the relationship between executive functions and the prefrontal cortex [37]. Our results are in agreement with these suggestions with respect to the right hemisphere.

The overall activation pattern observed across the TREFACE steps was also observed for each of the conditions. A broader participation of the sensory cortex during the reading of congruent stimuli suggests an engagement motivated by aspects of autobiographical memory, resulting in a better composition of the response [76]. More intensely, recognition in the incongruent condition signals a broader recruitment of right-hemisphere resources, strengthening the hypothesis of an executive role in orienting attention toward task resolution [37] [60]. As for beta and gamma waves, extensive synchrony encompassing all regions was observed, possibly due to the need to modulate the effect of the emotional attribute, consistent with the evidence of left dorsolateral frontal activation, and due to the cognitive effort required in the task, thus conferring greater integration of brain areas [77].

In general, this work made it possible to verify the influence of the emotional component induced during the execution of TREFACE, particularly the involvement of an extensive network of frontomedial and temporal cortical circuits, with the participation of both cerebral hemispheres, in the monitoring of the conflict experimentally produced in a situation of facial emotion recognition in which the emotional word is unrelated to the face. It can be indicated that the various skills tested by TREFACE in the monitoring of emotional conflict can functionally produce an overlap of brain rhythms of different amplitudes, durations, and frequency ranges, with the participation of both hemispheres. This model proposes that the attention system required for control and resolution is disturbed by emotional interference, i.e., a word unrelated to the emotional expression of the face; the cortical mapping demonstrated in this study corroborates previous studies [19] [27] [29] [48].

Likewise, the regulation of these simultaneous oscillatory patterns can be functionally associated with a large and complex neural network involving frontal areas and their extensive interaction with central parietal and temporal regions, so that the result is translated into an organization of behavior that is efficient for solving the TREFACE task. On the other hand, testing in future work the effect produced by TREFACE under conditions in which the neural circuits involved are clinically compromised (neurologically or psychiatrically) will broaden the understanding of the neurophysiological and neuroanatomical substrates associated with the functioning of the executive components. Thus, the results of this mapping of cortical activity can help to understand how words and images of faces can be regulated in everyday life and in clinical contexts, suggesting an integrated model that includes the neural bases of the regulation strategy.

Additionally, and in a complementary way, this study substantiates TREFACE as a new neuropsychological assessment tool, computerized and suitable for the Portuguese language, which may be of theoretical and practical relevance for future research in Neurosciences.

Acknowledgements

This work was supported partly by a FAPEMA grant to Carlos Tomaz (FAPEMA No. 01102/16). Edward Prada was recipient of a PhD fellowship from the Pontifical Bolivarian University, Bucaramanga, Colombia, and Lia Martinez was recipient of a PhD fellowship from the Student Program—Graduate Studies Plan (PEC-PG), CAPES/CNPq, Brazil. Maria C H Tavares was recipient of a research fellowship from CNPq/Brazil (311582/2015-0).

Conflicts of Interest

The authors report no conflict of interest.

References

[1] Nunez, P.L. and Srinivasan, R. (2006) Electric Fields of the Brain: The Neurophysics of EEG. Oxford University Press, New York.
https://doi.org/10.1093/acprof:oso/9780195050387.001.0001
[2] Smith, E.E. and Kosslyn, S.M. (2006) Attention. In: Cognitive Psychology: Mind and Brain, Psychology Stanford, Stanford, 103-146.
[3] Teplan, M. (2002) Fundamentals of EEG Measurement. Measurement Science Review, 2, 1-11.
[4] Garcia, A., Uribe, C.E., Tavares, M.C.H. and Tomaz, C. (2011) EEG and Autonomic Responses during Performance of Matching and Non-Matching to Sample Working Memory Tasks with Emotional Content. Frontiers in Behavioral Neuroscience, 5, Article 82.
https://doi.org/10.3389/fnbeh.2011.00082
[5] Tye, C., McLoughlin, G., Kuntsi, J. and Asherson, P. (2011) Electrophysiological Markers of Genetic Risk for Attention Deficit Hyperactivity Disorder. Expert Reviews in Molecular Medicine, 13, e9.
https://doi.org/10.1017/S1462399411001797
[6] Kim, N.Y., Wittenberg, E. and Nam, C.S. (2017) Behavioral and Neural Correlates of Executive Function: Interplay between Inhibition and Updating Processes. Frontiers in Neuroscience, 11, Article 378.
https://doi.org/10.3389/fnins.2017.00378
[7] Basharpoor, S., Heidari, F. and Molavi, P. (2021) EEG Coherence in Theta, Alpha, and Beta Bands in Frontal Regions and Executive Functions. Applied Neuropsychology: Adult, 28, 310-317.
https://doi.org/10.1080/23279095.2019.1632860
[8] Miller, E.K. and Wallis, J.D. (2008) Chap. 52. The Prefrontal Cortex and Executive Brain Functions. In: Squire, L.R., et al., Eds., Fundamental Neuroscience, 3rd Edition, Academic Press, London, 1199-1222.
[9] Purper-Ouakil, D., Ramoz, N., Lepagnol-Bestel, A.M., Gorwood, P. and Simonneau, M. (2011) Neurobiology of Attention Deficit/Hyperactivity Disorder. Pediatric Research, 69, 69-76.
https://doi.org/10.1203/PDR.0b013e318212b40f
[10] Bonelli, R.M. and Cummings, J.L. (2007) Frontal-Subcortical Circuitry and Behavior. Dialogues in Clinical Neuroscience, 9, 141-151.
[11] Siddiqui, S.V., Chatterjee, U., Kumar, D., Siddiqui, A. and Goyal, N. (2008) Neuropsychology of Prefrontal Cortex. Indian Journal of Psychiatry, 50, 202-208.
https://doi.org/10.4103/0019-5545.43634
[12] Stuss, D.T. and Levine, B. (2002) Adult Clinical Neuropsychology. Annual Review of Psychology, 53, 401-433.
https://doi.org/10.1146/annurev.psych.53.100901.135220
[13] Beer, J.S., John, O.P., Scabini, D. and Knight, R.T. (2006) Orbitofrontal Cortex and Social Behavior: Integrating Self-Monitoring and Emotion-Cognition Interactions. Journal of Cognitive Neuroscience, 18, 871-879.
https://doi.org/10.1162/jocn.2006.18.6.871
[14] Diamond, A. (2013) Executive Functions. Annual Review of Psychology, 64, 135-168.
https://doi.org/10.1146/annurev-psych-113011-143750
[15] Fox, E., Lester, V., Russo, R., Bowles, R.J., Pichler, A. and Dutton, K. (2000) Facial Expressions of Emotion: Are Angry Faces Detected More Efficiently? Cognition & Emotion, 14, 61-92.
https://doi.org/10.1080/026999300378996
[16] Calder, A.J. and Young, A.W. (2005) Understanding the Recognition of Facial Identity and Facial Expression. Nature Reviews Neuroscience, 6, 641-651.
https://doi.org/10.1038/nrn1724
[17] Yadav, S.P. (2021) Emotion Recognition Model Based on Facial Expressions. Multimedia Tools and Applications, 80, 26357-26379.
https://doi.org/10.1007/s11042-021-10962-5
[18] Dolan, R.J. (2002) Emotion, Cognition, and Behavior. Science, 298, 1191-1194.
https://doi.org/10.1126/science.1076358
[19] Etkin, A., Egner, T., Peraza, D.M., Kandel, E.R. and Hirsch, J. (2006) Resolving Emotional Conflict: A Role for the Rostral Anterior Cingulate Cortex in Modulating Activity in the Amygdala. Neuron, 51, 871-882.
https://doi.org/10.1016/j.neuron.2006.07.029
[20] Fusar-Poli, P., et al. (2009) Functional Atlas of Emotional Faces Processing: A Voxel-Based Meta-Analysis of 105 Functional Magnetic Resonance Imaging Studies. Journal of Psychiatry and Neuroscience, 34, 418-432.
[21] Jehna, M., et al. (2011) The Functional Correlates of Face Perception and Recognition of Emotional Facial Expressions as Evidenced by fMRI. Brain Research, 1393, 73-83.
https://doi.org/10.1016/j.brainres.2011.04.007
[22] Šimić, G., Tkalčić, M., Vukić, V., Mulc, D., Španić, E., Šagud, M., Olucha-Bordonau, F.E., Vukšić, M. and Hof, P.R. (2021) Understanding Emotions: Origins and Roles of the Amygdala. Biomolecules, 11, Article 823.
https://doi.org/10.3390/biom11060823
[23] Morris, J.S., Friston, K.J., Büchel, C., Frith, C.D., Young, A.W., Calder, A.J. and Dolan, R.J. (1998) A Neuromodulatory Role for the Human Amygdala in Processing Emotional Facial Expressions. Brain: A Journal of Neurology, 121, 47-57.
https://doi.org/10.1093/brain/121.1.47
[24] Bliss-Moreau, E., Williams, L.A. and Karaskiewicz, C.L. (2021) Evolution of Emotion in Social Context. In: Shackelford, T.K. and Weekes-Shackelford, V.A., Eds., Encyclopedia of Evolutionary Psychological Science, Springer, Cham, 2587-2499.
https://doi.org/10.1007/978-3-319-19650-3_2459
[25] Hennenlotter, A. and Schroeder, U. (2006) Partly Dissociable Neural Substrates for Recognizing Basic Emotions: A Critical Review. Progress in Brain Research, 156, 443-456.
https://doi.org/10.1016/S0079-6123(06)56024-8
[26] Adolphs, R., Damasio, H., Tranel, D. and Damasio, A.R. (1996) Cortical Systems for the Recognition of Emotion in Facial Expressions. Journal of Neuroscience, 16, 7678-7687.
https://doi.org/10.1523/JNEUROSCI.16-23-07678.1996
[27] Clayson, P.E. and Larson, M.J. (2013) Adaptation to Emotional Conflict: Evidence from a Novel Face Emotion Paradigm. PLOS ONE, 8, e75776.
https://doi.org/10.1371/journal.pone.0075776
[28] Weber-Fox, C.M. and Neville, H.J. (1999) Functional Neural Subsystems Are Differentially Affected by Delays in Second Language Immersion: ERP and Behavioral Evidence in Bilinguals. In: Birdsong, D., Ed., Second Language Acquisition and the Critical Period Hypothesis, Lawrence Erlbaum Associates Publishers, Hillsdale, NJ, 23-38.
[29] Reeck, C. and Egner, T. (2011) Affective Privilege: Asymmetric Interference by Emotional Distracters. Frontiers in Psychology, 2, Article 232.
https://doi.org/10.3389/fpsyg.2011.00232
[30] Botvinick, M.M., Braver, T.S., Barch, D.M., Carter, C.S. and Cohen, J.D. (2001) Conflict Monitoring and Cognitive Control. Psychological Review, 108, 624-652.
https://doi.org/10.1037/0033-295X.108.3.624
[31] Módenes, P.F. and Cabaco, A.S. (2008) Saber Envejecer: Aspectos Positivos Y Nuevas Perspectivas. Foro de Educación, 6, 369-383.
[32] Zhu, X.R., Zhang, H.J., Wu, T.T., Luo, W.B. and Luo, Y.J. (2010) Emotional Conflict Occurs at an Early Stage: Evidence from the Emotional Face-Word Stroop Task. Neuroscience Letters, 478, 1-4.
https://doi.org/10.1016/j.neulet.2010.04.036
[33] Sauseng, P., Klimesch, W., Schabus, M. and Doppelmayr, M. (2005) Fronto-Parietal EEG Coherence in Theta and Upper Alpha Reflect Central Executive Functions of Working Memory. International Journal of Psychophysiology, 57, 97-103.
https://doi.org/10.1016/j.ijpsycho.2005.03.018
[34] Langer, N., Von Bastian, C.C., Wirz, H., Oberauer, K. and Jäncke, L. (2013) The Effects of Working Memory Training on Functional Brain Network Efficiency. Cortex, 49, 2424-2438.
https://doi.org/10.1016/j.cortex.2013.01.008
[35] Roux, F. and Uhlhaas, P.J. (2014) Working Memory and Neural Oscillations: Alpha-Gamma versus Theta-Gamma Codes for Distinct WM Information? Trends in Cognitive Sciences, 18, 16-25.
https://doi.org/10.1016/j.tics.2013.10.010
[36] Dai, Z., De Souza, J., Lim, J., Ho, P.M., Chen, Y., Li, J., et al. (2017) EEG Cortical Connectivity Analysis of Working Memory Reveals Topological Reorganization in Theta and Alpha Bands. Frontiers in Human Neuroscience, 11, Article 237.
https://doi.org/10.3389/fnhum.2017.00237
[37] Rasoulzadeh, V., Sahan, M.I., van Dijck, J.P., Abrahamse, E., Marzecova, A., Verguts, T. and Fias, W. (2021) Spatial Attention in Serial Order Working Memory: An EEG Study. Cerebral Cortex, 31, 2482-2493.
https://doi.org/10.1093/cercor/bhaa368
[38] Klimesch, W., Doppelmayr, M., Röhm, D., Pöllhuber, D. and Stadler, W. (2000) Simultaneous Desynchronization and Synchronization of Different Alpha Responses in the Human Electroencephalograph: A Neglected Paradox? Neuroscience Letters, 284, 97-100.
https://doi.org/10.1016/S0304-3940(00)00985-X
[39] Klimesch, W., Sauseng, P. and Hanslmayr, S. (2007) EEG Alpha Oscillations: The Inhibition-Timing Hypothesis. Brain Research Reviews, 53, 63-88.
https://doi.org/10.1016/j.brainresrev.2006.06.003
[40] Gevins, A., Chan, C.S. and Sam-Vargas, L. (2012) Towards Measuring Brain Function on Groups of People in the Real World. PLOS ONE, 7, e44676.
https://doi.org/10.1371/journal.pone.0044676
[41] Scholz, S., Schneider, S.L. and Rose, M. (2017) Differential Effects of Ongoing EEG Beta and Theta Power on Memory Formation. PLOS ONE, 12, e0171913.
https://doi.org/10.1371/journal.pone.0171913
[42] Honkanen, R., Rouhinen, S., Wang, S.H., Palva, J.M. and Palva, S. (2015) Gamma Oscillations Underlie the Maintenance of Feature-Specific Information and the Contents of Visual Working Memory. Cerebral Cortex, 25, 3788-3801.
https://doi.org/10.1093/cercor/bhu263
[43] Prada, E., Satler, C., Tavares, M.C., Garcia, A., Martinez, L., Alves, C., Lacerda, E. and Tomaz, C. (2022) TREFACE: A New Computerized Test of Emotional Stroop with Facial Expressions. Journal of Behavioral and Brain Science, 12, 342-358.
https://doi.org/10.4236/jbbs.2022.127020
[44] Garcia, A., Fleury, F., Silva, G., Honda, H. and Tavares, M.C.H. (2019) Plataforma de Avaliação de Funções Executivas com Testes Neuropsicológicos Computadorizados. In: Conferências IADIS Ibero-Americanas WWW/Internet e Computação Aplicada 2019, 140-146.
https://doi.org/10.33965/ciaca2019_201914L018
[45] Delorme, A. and Makeig, S. (2004) EEGLAB: An Open Source Toolbox for Analysis of Single-Trial EEG Dynamics Including Independent Component Analysis. Journal of Neuroscience Methods, 134, 9-21.
https://doi.org/10.1016/j.jneumeth.2003.10.009
[46] Delorme, A., Sejnowski, T. and Makeig, S. (2007) Enhanced Detection of Artifacts in EEG Data Using Higher-Order Statistics and Independent Component Analysis. NeuroImage, 34, 1443-1449.
https://doi.org/10.1016/j.neuroimage.2006.11.004
[47] Salles, J.F.D. and Paula, F.V.D. (2016) Text Reading Comprehension and Its Relationship with Executive Functions. Educar em Revista, No. 62, 53-67.
https://doi.org/10.1590/0104-4060.48332
[48] Ovaysikia, S., Tahir, K.A., Chan, J.L. and DeSouza, J.F. (2011) Word Wins over Face: Emotional Stroop Effect Activates the Frontal Cortical Network. Frontiers in Human Neuroscience, 4, Article 234.
https://doi.org/10.3389/fnhum.2010.00234
[49] Sauerland, M., Wolfs, A.C., Crans, S. and Verschuere, B. (2019) Testing a Potential Alternative to Traditional Identification Procedures: Reaction Time-Based Concealed Information Test Does Not Work for Lineups with Cooperative Witnesses. Psychological Research, 83, 1210-1222.
https://doi.org/10.1007/s00426-017-0948-5
[50] Jansma, J.M., Ramsey, N.F., Slagter, H.A. and Kahn, R.S. (2001) Functional Anatomical Correlates of Controlled and Automatic Processing. Journal of Cognitive Neuroscience, 13, 730-743.
https://doi.org/10.1162/08989290152541403
[51] Zhang, Y., Chen, Y., Bressler, S.L. and Ding, M. (2008) Response Preparation and Inhibition: The Role of the Cortical Sensorimotor Beta Rhythm. Neuroscience, 156, 238-246.
https://doi.org/10.1016/j.neuroscience.2008.06.061
[52] Hart, S.J., Green, S.R., Casp, M. and Belger, A. (2010) Emotional Priming Effects during Stroop Task Performance. NeuroImage, 49, 2662-2670.
https://doi.org/10.1016/j.neuroimage.2009.10.076
[53] Kurtić, A. and Pranjić, N. (2011) Facial Expression Recognition Accuracy of Valence Emotion among High and Low Indicated PTSD. Primenjena Psihologija, 4, 5-11.
https://doi.org/10.19090/pp.2011.1.5-11
[54] Kawasaki, M., Kitajo, K. and Yamaguchi, Y. (2014) Fronto-Parietal and Fronto-Temporal Theta Phase Synchronization for Visual and Auditory-Verbal Working Memory. Frontiers in Psychology, 5, Article 200.
https://doi.org/10.3389/fpsyg.2014.00200
[55] Daume, J., Graetz, S., Gruber, T., Engel, A.K. and Friese, U. (2017) Cognitive Control during Audiovisual Working Memory Engages Frontotemporal Theta-Band Interactions. Scientific Reports, 7, Article No. 12585.
https://doi.org/10.1038/s41598-017-12511-3
[56] Mitchell, D.J., McNaughton, N., Flanagan, D. and Kirk, I.J. (2008) Frontal-Midline Theta from the Perspective of Hippocampal “Theta”. Progress in Neurobiology, 86, 156-185.
https://doi.org/10.1016/j.pneurobio.2008.09.005
[57] Zhang, H. and Jacobs, J. (2015) Traveling Theta Waves in the Human Hippocampus. Journal of Neuroscience, 35, 12477-12487.
https://doi.org/10.1523/JNEUROSCI.5102-14.2015
[58] Schönfeld, L.M. and Wojtecki, L. (2019) Beyond Emotions: Oscillations of the Amygdala and Their Implications for Electrical Neuromodulation. Frontiers in Neuroscience, 13, Article 366.
https://doi.org/10.3389/fnins.2019.00366
[59] Huster, R.J., Enriquez-Geppert, S., Lavallee, C.F., Falkenstein, M. and Herrmann, C. S. (2013) Electroencephalography of Response Inhibition Tasks: Functional Networks and Cognitive Contributions. International Journal of Psychophysiology, 87, 217-233.
https://doi.org/10.1016/j.ijpsycho.2012.08.001
[60] Spagna, A., Kim, T.H., Wu, T. and Fan, J. (2020) Right Hemisphere Superiority for Executive Control of Attention. Cortex, 122, 263-276.
https://doi.org/10.1016/j.cortex.2018.12.012
[61] Misselhorn, J., Friese, U. and Engel, A.K. (2019) Frontal and Parietal Alpha Oscillations Reflect Attentional Modulation of Cross-Modal Matching. Scientific Reports, 9, Article No. 5030.
https://doi.org/10.1038/s41598-019-41636-w
[62] Schürmann, M. and Başar, E. (2001) Functional Aspects of Alpha Oscillations in the EEG. International Journal of Psychophysiology, 39, 151-158.
https://doi.org/10.1016/S0167-8760(00)00138-0
[63] Ronconi, L., Busch, N.A. and Melcher, D. (2018) Alpha-Band Sensory Entrainment Alters the Duration of Temporal Windows in Visual Perception. Scientific Reports, 8, Article No. 11810.
https://doi.org/10.1038/s41598-018-29671-5
[64] Jensen, O., Gelfand, J., Kounios, J. and Lisman, J.E. (2002) Oscillations in the Alpha Band (9 - 12 Hz) Increase with Memory Load during Retention in a Short-Term Memory Task. Cerebral Cortex, 12, 877-882.
https://doi.org/10.1093/cercor/12.8.877
[65] Klimesch, W. (1997) EEG-Alpha Rhythms and Memory Processes. International Journal of Psychophysiology, 26, 319-340.
https://doi.org/10.1016/S0167-8760(97)00773-3
[66] Hanslmayr, S., Gross, J., Klimesch, W. and Shapiro, K.L. (2011) The Role of Alpha Oscillations in Temporal Attention. Brain Research Reviews, 67, 331-343.
https://doi.org/10.1016/j.brainresrev.2011.04.002
[67] Wang, Y., Lu, J., Gu, C. and Hu, B. (2018) Mapping the Frontal Alpha Asymmetry Indicators of Habitual Emotion Regulation: A Data-Driven Approach. NeuroReport, 29, 1288-1292.
https://doi.org/10.1097/WNR.0000000000001109
[68] Neuper, C. and Pfurtscheller, G. (2001) Event-Related Dynamics of Cortical Rhythms: Frequency-Specific Features and Functional Correlates. International Journal of Psychophysiology, 43, 41-58.
https://doi.org/10.1016/S0167-8760(01)00178-7
[69] Kilavik, B.E., Zaepffel, M., Brovelli, A., MacKay, W.A. and Riehle, A. (2013) The Ups and Downs of Beta Oscillations in Sensorimotor Cortex. Experimental Neurology, 245, 15-26.
https://doi.org/10.1016/j.expneurol.2012.09.014
[70] Merker, B. (2013) Cortical Gamma Oscillations: The Functional Key Is Activation, Not Cognition. Neuroscience & Biobehavioral Reviews, 37, 401-417.
https://doi.org/10.1016/j.neubiorev.2013.01.013
[71] Fries, P., Reynolds, J.H., Rorie, A.E. and Desimone, R. (2001) Modulation of Oscillatory Neuronal Synchronization by Selective Visual Attention. Science, 291, 1560-1563.
https://doi.org/10.1126/science.1055465
[72] Womelsdorf, T., Fries, P., Mitra, P.P. and Desimone, R. (2006) Gamma-Band Synchronization in Visual Cortex Predicts Speed of Change Detection. Nature, 439, 733-736.
https://doi.org/10.1038/nature04258
[73] Herrmann, C.S., Munk, M.H. and Engel, A.K. (2004) Cognitive Functions of Gamma-Band Activity: Memory Match and Utilization. Trends in Cognitive Sciences, 8, 347-355.
https://doi.org/10.1016/j.tics.2004.06.006
[74] Singer, W. (2001) Consciousness and the Binding Problem. Annals of the New York Academy of Sciences, 929, 123-146.
https://doi.org/10.1111/j.1749-6632.2001.tb05712.x
[75] Bauer, M., Stenner, M.P., Friston, K.J. and Dolan, R.J. (2014) Attentional Modulation of Alpha/Beta and Gamma Oscillations Reflect Functionally Distinct Processes. Journal of Neuroscience, 34, 16117-16125.
https://doi.org/10.1523/JNEUROSCI.3474-13.2014
[76] Walden, K., Pornpattananangkul, N., Curlee, A., McAdams, D.P. and Nusslock, R. (2015) Posterior versus Frontal Theta Activity Indexes Approach Motivation during Affective Autobiographical Memories. Cognitive, Affective, & Behavioral Neuroscience, 15, 132-144.
https://doi.org/10.3758/s13415-014-0322-7
[77] Canabarro, S.L.S., Garcia, A., Satler, C. and Tavares, M.C.H. (2017) Interaction between Neural and Cardiac Systems during the Execution of the Stroop Task by Young Adults: Electroencephalographic Activity and Heart Rate Variability. AIMS Neuroscience, 4, 28-51.
https://doi.org/10.3934/Neuroscience.2017.1.28

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.