Review

Entropy and the Brain: An Overview

The Thomas N. Sato BioMEC-X Laboratories, Advanced Telecommunications Research Institute International (ATR), Kyoto 619-0237, Japan
Submission received: 3 July 2020 / Revised: 25 July 2020 / Accepted: 19 August 2020 / Published: 21 August 2020
(This article belongs to the Collection Do Entropic Approaches Improve Understanding of Biology?)

Abstract

Entropy is a powerful tool for quantification of the brain function and its information processing capacity. This is evident in its broad domain of applications that range from functional interactivity between the brain regions to quantification of the state of consciousness. A number of previous reviews summarized the use of entropic measures in neuroscience. However, these studies either focused on the overall use of nonlinear analytical methodologies for quantification of the brain activity or their contents pertained to a particular area of neuroscientific research. The present study aims at complementing these previous reviews in two ways. First, by covering the literature that specifically makes use of entropy for studying the brain function. Second, by highlighting the three fields of research in which the use of entropy has yielded highly promising results: the (altered) state of consciousness, the ageing brain, and the quantification of the brain networks’ information processing. In so doing, the present overview identifies that the use of entropic measures for the study of consciousness and its (altered) states led the field to substantially advance the previous findings. Moreover, it realizes that the use of these measures for the study of the ageing brain resulted in significant insights on various ways that the process of ageing may affect the dynamics and information processing capacity of the brain. It further reveals that their utilization for analysis of the brain regional interactivity formed a bridge between the previous two research areas, thereby providing further evidence in support of their results. It concludes by highlighting some potential considerations that may help future research to refine the use of entropic measures for the study of brain complexity and its function. The present study helps realize that (despite their seemingly differing lines of inquiry) the study of consciousness, the ageing brain, and the brain networks’ information processing are highly interrelated. Specifically, it identifies that the complexity, as quantified by entropy, is a fundamental property of conscious experience, which also plays a vital role in the brain’s capacity for adaptation and therefore whose loss by ageing constitutes a basis for diseases and disorders. Interestingly, these two perspectives neatly come together through the association of entropy and the brain capacity for information processing.

1. Introduction

Over the course of the last two decades, neuroscientific research has accumulated comprehensive empirical [1,2,3] and theoretical [4,5,6] evidence for the crucial role of the brain signal variability in its function. This variability has been shown to emerge from the interplay between individual neurons and their neuronal circuits [7,8] and to extend over wide spatiotemporal scales in the brain [9,10]. Substantial findings relate its occurrence to the brain's self-organized criticality [11,12,13,14,15] in which the brain maximizes its information capacity [16,17].
These findings, in turn, help explain the pivotal role of entropy (Appendix A) in quantification of the brain’s information processing [18,19,20,21], given the direct correspondence between the variance and the amount of information [22]. Today’s applications of entropy for quantification of the brain function range from information capacity of working memory (WM) [23] and the neural coding [24,25] to the interplay between neural adaptation and behaviour [26], functional interactivity between the brain regions [27], and the state of consciousness [28,29].
Entropy has also manifested as a robust feature for classification of individuals' mental states in such diverse domains as detection of the onset of epileptic seizures [30] and changes in the state of vigilance [31]. For instance, Zheng and Lu [32] showed that the use of entropy for emotion classification based on electroencephalography (EEG) time series outperformed such feature spaces as differential asymmetry (DASM), rational asymmetry (RASM), and power spectral density (PSD). Similarly, Keshmiri et al. [33] verified the discriminative power of entropy in comparison with the dominant feature spaces in functional near-infrared spectroscopy (fNIRS) analysis of the n-back WM task [34]. Most recently, Liu and colleagues [35] used a considerably large resting-state functional magnetic resonance imaging dataset from the Human Connectome Project (998 individuals). They demonstrated that the information from cortical entropy profiles could effectively predict diverse facets of human subjects' cognitive ability. In fact, there is ample evidence for underlying relations between entropic measures on the one hand and (brain) signal variability on the other hand [36,37,38,39,40,41,42,43,44,45]. These studies have advanced our understanding of the practical considerations in using these measures [46,47,48,49]. This, in turn, has paved the way for the development of information-theoretic software and libraries [50,51,52,53,54,55,56,57,58] that facilitate the deployment of these measures for the study of the brain.
In the past, several reviews have summarized the use of entropic measures in neuroscience. For instance, Takahashi [59] reviewed the literature on the utility of measuring neurophysiological complexity associated with mental disorders in light of shared neurobiological conditions. However, this review was not primarily based on the use of entropic measures and instead covered different nonlinear analytical methodologies. Borst and Theunissen [25] highlighted the pros and cons of the use of information theory, in general, for capturing the information content of neural responses to stimuli. Similarly, Quiroga and Panzeri [20] showed how decoding methodologies may yield further insights on neuronal population responses through the adaptation of information theory. However, their review primarily centred on the application of information theory in decoding strategies pertinent to single-trial population analyses. On the other hand, Carhart-Harris [18] and Carhart-Harris et al. [60] considered entropy as a means of studying the state of consciousness in relation to the effect of psychedelic drugs. The authors provided a comprehensive coverage of the related literature, which formed a solid foundation for their hypothesis: the “entropic brain” hypothesis that proposes entropy as a marker of conscious states in spontaneous brain activity.

The Present Overview

The present overview aims at complementing the aforementioned reviews by focusing on three domains of research: the (altered) state of consciousness, the ageing brain, and the quantification of the brain networks’ information processing. The reasons why these three areas of research are covered by the present overview are twofold. First, the study of consciousness, the ageing brain, and the brain networks’ information processing are, to the best of our knowledge, among the most active fields of research where the use of entropy has resulted in highly promising findings. Second (and perhaps more importantly), the use of entropy in these areas helped realize that their seemingly different lines of research are indeed highly intertwined and related to each other. On the one hand, a number of theses consider complexity as a fundamental property of conscious experience [61,62] in which the evolution of the brain’s complexity manifests as an entropy-enhancing process [63]. On the other hand, there is growing evidence that identifies the vital role of physiologic complexity in organisms’ capacity for adaptation [26] and pp. 3, 93–94 in [64], thereby relating the age-related loss of complexity with diseases and disorders [59,60,65,66,67]. Interestingly, these two propositions substantially overlap through their association with the brain capacity for information processing [13,68,69,70].
In the case of (altered) states of consciousness, the present overview highlights the utility of entropy for differentiating the brain activity in three states of wakefulness, anaesthesia, and unresponsive wakefulness syndrome. It also summarizes the findings that show the effectiveness of entropy for quantification of the brain function during hypnosis. In the case of the ageing brain, it brings together the studies that identify the correspondence between the brain complexity and ageing and how such a relation translates into the change in the brain function with ageing. For the case of quantification of the brain networks’ information processing, it summarizes the findings that shed further light on the functional connectivity studies in terms of brain networks’ complexity and information processing.
There were two ways to organize the sections of this overview: (1) to focus on the types of entropic measures, or (2) to categorize their contents by domain of application. The first choice would allow for highlighting the different formulations of entropic measures and their subsequent modifications. However, it could also make it more inconvenient for readers to pinpoint the fields of research in which these measures yielded substantial results. On the other hand, the second option would allow for more tangibly bringing together the incremental findings in each area of research, while putting less emphasis on the types of entropic measures that were utilized. To more efficiently appreciate the recent progress based on the application of entropy in neuroscientific studies, the present overview adopted the second approach. For interested readers, it also provides a brief summary of the three most commonly used entropic measures in Appendix B. These measures are (differential) entropy (DE) [48,71], multiscale entropy (MSE) [72], and permutation entropy (PE) [73].
Last, the use of entropy for quantification of the interplay between psychedelic drugs and consciousness is an exciting avenue in which neuroscientific research has yielded interesting and promising findings. However, the present study did not cover this topic. Interested readers may find a comprehensive coverage of this topic in Carhart-Harris [18] and Carhart-Harris et al. [60].

2. (Altered) State of Consciousness

Classical approaches to the study of consciousness primarily focus on the contribution of specific brain areas or groups of neurons [62]. However, such approaches fall short in realizing that the modulation of information does not necessarily involve a change in the brain's regional power and/or activity [74]. Such a shortcoming can be addressed by analyzing the brain's activity using entropic measures that are, by principle, designed for quantification of the amount of information. This, in turn, allows for identification of the brain signal's complexity (Appendix A.2) as a fundamental property of conscious experience.

2.1. Various Stages of (Un)consciousness

King et al. [75] demonstrated that the state of consciousness was associated with markedly increased information sharing among distributed brain regions. They further identified that such an increase in the brain's distributed information was particularly observable over medium- and long-range brain connectivity. Interestingly, the degree of change in such distributed information sharing was a reliable marker to robustly distinguish between unresponsive wakefulness syndrome, minimally conscious state, and conscious state. These results complemented the findings on the relation between information sharing and loss of consciousness that were reported using spectral-based functional connectivity measures [76,77]. Furthermore, and unlike these synchrony-based measures, their approach allowed for minimization of the effect of common-source artefacts, thereby robustly differentiating between distinct states of consciousness. Their findings also provided substantial evidence for a number of theoretical models of consciousness that envisioned the brain's large-scale information sharing as a consistent signature of conscious processing [62,78,79,80,81]. The authors posited that their approach may enhance the current behavioural and neuroimaging methodologies to more effectively diagnose consciousness-related disorders [82].
Zhang et al. [29] used the entropy of Markov trajectories [83] to demonstrate that human consciousness relies on a temporal circuit whose dynamics are characterized by the balanced and reciprocal accessibility of the default mode network (DMN) [84] and the dorsal attention network (DAT) [85]. Their findings provided further support for the involvement of these two brain networks as two distinct cortical systems that support consciousness [85,86,87,88]. These findings further explained the diminished anti-correlation between these two networks during unconsciousness [89,90,91] in terms of spatiotemporal brain dynamics. They also advanced the study of the conscious brain by revealing that the disruption of this spatiotemporal brain dynamic might be the common signature of various forms of unresponsiveness.
Demertzi and colleagues [28] further advanced these findings by identifying the presence of conscious brain dynamics in sustained patterns of long-range coordination. These patterns showed a low similarity to the anatomical connectivity. Their analyses also associated reduced or absent conscious processing with brain dynamics that exhibited a low interregional coordination. Such alternating states of high and low brain dynamics have been known to constitute a basis for complex cognitive functions in which integrated states support faster and more accurate cognitive performance [92]. These observations were also in accord with the theories that relate the state of consciousness to brain-scale activity in which self-sustained coordination [93,94,95] allows for the manifestation of perception, emotion, and cognition [96,97,98]. Their findings further extended to humans the previous research on nonhuman primates, which showed that anaesthetized brain activity mostly resembled the brain's anatomical connectivity patterns [99]. They also provided empirical evidence for the theoretical stance that such alternating patterns of brain dynamics may constitute a fundamental property of the brain's information processing [87].
Miskovic et al. [100] studied the change in human brain signal variability during the sleep cycle and found that the brain signal entropy throughout the sleep cycle was strongly time-scale dependent. Their results indicated that slow wave sleep (SWS) was associated with a reduced complexity at short time scales, accompanied by an increased complexity at long time scales. They further noted that the temporal signal complexity at short time scales and the slope of the EEG power spectrum captured a neuronal noise that potentially reflected the cortical balance between excitation and inhibition [64]. These results established an explicit link between the brain's entropy and the 1/f (Appendix C, Definition A2) component [36] of power spectra [101,102]. This, in turn, extended the previous research showing that the slope of the power spectral density (PSD) of large-scale field potentials acts as a measure of neuronal population spiking [103,104]. They further verified the association between entropy at smaller time scales and the brain's local information processing [105,106,107,108,109,110]. Additionally, they provided further support for the observation that an increase in sleep depth is accompanied by greater inhibitory activity [111] and a lowered global level of consciousness [112,113]. Another interesting observation in this study was the higher level of entropy at large time scales in stage 3 non-rapid-eye-movement (NREM-3) sleep compared to NREM-2 and REM sleep [114]. This suggested the switching of cortical activity into a global bistable pattern of depolarized and hyperpolarized states that is characteristic of SWS [108].

2.2. Anaesthesia

Olofsen et al. [115] observed that the change in entropy tracked the qualitative effect of an anaesthetic drug from the awake state to the sedated/lightly anaesthetized and eventually the deeply anaesthetized state. They further observed that, compared to other anaesthetic indices, their index did not require long segments of EEG data. Additionally, it required minimal pre-processing of data and was highly resistant to blink artefacts while being computationally efficient. The authors also noted that the open-access nature of their index presented a reliable alternative to proprietary anaesthetic indices, whose results could not be readily interpreted (due to the unavailability of their algorithmic logic). These findings were further replicated by other studies [116,117,118].

2.3. Hypnosis

Hypnosis has received growing interest from cognitive neuroscience research, in part due to its utility for studying consciousness [119,120]. To realize the effect of hypnosis on the brain, a number of biomarkers have been introduced. They include the structural synchrony measure [121], the Phase Lag Index [122], topographic variability in the beta and gamma bands [123], coherence (COH) [124], and the imaginary component of coherence (iCOH) [125] (for a brief description of the brain waves' frequency bands, see Appendix C, Definition A3). Although these measures provided encouraging results, their applicability appeared to be limited. Whereas Deivanayagi et al. [126] found that COH associated the state of hypnosis with lowered theta and alpha frequency bands, Sabourin et al. [127] found that COH indicated an increase in theta power during hypnosis in both low and high hypnotizable individuals. In the same vein, both COH and iCOH [128] were required to analyze different frequency bands (i.e., theta and beta1, respectively). Furthermore, they did not identify any significant differences in power [128].
Interestingly, entropy provided a potential unifying measure to reproduce these previous observations that were based on multiple biomarkers [129]. Precisely, it identified a large bi-hemispheric effect of hypnotic suggestibility (with comparable strength in effect sizes) on the information content of the theta, alpha, and beta frequency bands. This observation was in accord with Han et al. [130], who showed that the signals carried by individual cortical neurons concurrently contributed to multiple functional pathways in both hemispheres, thereby providing promising evidence in support of global brain neural excitation in response to stimuli. Entropy also identified a significantly higher functional connectivity (FC) in highly hypnotically suggestible participants' theta and alpha bands that was more pronounced in the parietal (in the case of theta) and centroparietal (for alpha) regions and that was accompanied by a non-significant smaller FC in the beta band in the central region. In this respect, Jamieson and Burgess [128] also reported similar changes in FC from the pre-hypnosis to the hypnosis state using iCOH (increase in theta) [125] and COH (decrease in beta1) [124]. However, their analyses, which were primarily based on the state of hypnosis (i.e., without observing the participants' responses to hypnotic suggestions), did not identify any significant differences [128] between the pre-hypnosis and hypnosis states on these bands. It is also worthy of note that entropy's identification of the role of the alpha and theta bands during the brain's processing of hypnotic suggestions was in line with the previous findings on the pivotal role of these bands in information processing and the transfer of information between functionally connected brain regions [121,122,131,132,133,134].

3. The Ageing Brain

Section 2 identified the substantial contribution of the brain’s entropy in formation of a conscious experience. In the same vein, a growing number of recent studies also hint at the interplay between physiological complexity and the brain’s capacity for adaptation. This section highlights some of these findings that show the significant contribution of the brain’s signal complexity to its development and also recognize the loss of such a complexity through the process of ageing as a major contributing factor to various age-related cognitive declines and deficiencies.

3.1. The Interplay between Brain Signal Variability and Its Development

McIntosh et al. [135] showed that brain signal variability increases with age. They also showed that such variability correlated with reduced behavioural variability and, therefore, with more accurate performance in cognitive tasks. Their findings also identified the observed variability in brain activity to be a critical feature of its function [3,7].
These findings advanced the previous research on developing brain by validating that the neural system maturation promotes an increased physiological variability, thereby allowing for better environmental adaptability [136,137]. More specifically, they pinpointed the vital role of the signal variability in enabling the brain to parse weak and ambiguous incoming signals [138,139,140]. This, in turn, verified the central role of the brain signal variability in facilitating such functions as inter-neuronal signal exchange [141,142], transitional states in metastable systems [143], and the formation of functional networks [144,145].

3.2. The Power-Law Scaling of the Brain Signal Variability in Adulthood

Takahashi and colleagues [146] studied the effect of photic stimulation on the brain activity of healthy younger and older adults. They identified a significant increase in the brain signal complexity that was only present among the younger individuals. This indicated that unlike older adults, the brain of younger individuals exhibited a power-law scaling property that corresponded to the long-range temporal correlation between their brain regions.
These findings extended the previous research on the effect of ageing on brain function in three ways. First, the absence of power-law scaling in the older adults' brain dynamics helped establish a relation between ageing and the brain's reduced/diminished ability to respond to external stimuli [147,148]. Second, it identified that such scaling, which corresponds to the intrinsic complexity of physiological systems [65], is also vital for healthy brain functioning. Third, they further advanced the proposal [149] that the loss of complexity results in functional decline of the organism by diminishing the range of available, adaptive responses to the events of everyday life [66].

3.3. Ageing and the Default Mode of the Brain

Yang and colleagues [150] demonstrated that, compared to younger adults, the older individuals’ default mode network (DMN) [84] exhibited a reduced complexity. They further verified that such a decrease in complexity was more significant in DMN’s posterior cingulate gyrus and hippocampal components.
These results contributed to the prior studies of the brain’s normal ageing in three distinct ways. First, they provided empirical support for the hypothesis that considers the ageing to underlie the reduced network’s complexity and information integration of the brain [151,152,153,154]. Second, they verified that the reduced brain complexity is at the core of the older adults’ significantly lower magnitude of DMN co-activation in the posterior cingulate cortex [155,156]. Third, they verified that the reduced network efficiency in older individuals’ frontotemporal and limbic brain regions [157] could be consequentially explained in terms of observed decrease in brain’s signal variability.

3.4. Ageing and the Brain’s Distributed Versus Local Information Processing

McIntosh and colleagues [109] studied the interplay between ageing brain’s local and distributed information processing using two independent datasets of EEG and magnetoencephalography (MEG) recordings. These datasets included both younger and older human subjects.
Their results indicated that whereas most brain regions exhibited an increase in local information processing, the change in distributed brain’s complexity was age-related and was, subsequently, reduced across hemispheres by ageing. More importantly, they observed that unlike early life maturation in which the changes in the brain complexity were more widespread, such changes during adulthood exhibited a strong spatiotemporal dependency.
An important implication of these findings was the attribution of older adults' observed higher brain modularity [158] to the brain's increased local information processing, which could potentially be due to its reduced capacity for distributed information processing. Furthermore, the observed age-related temporal changes in brain complexity in this study extended the prior results on changes in spectral power during normal ageing [159,160] that identified a general decrease/increase in power in the lower/higher brain frequency bands. As a result, they extended the findings on increased brain regional long-range interactions and reduced local complexity in early life [107] to the case of the ageing brain. Additionally, the predominantly cross-hemispheric nature of the observed decreases in complexity in their study complemented the coherence-based studies that noted a reduction in inter-hemispheric functional connections with age [161,162]. Taken together, the study by McIntosh et al. [109] supported the dedifferentiation hypothesis [67,163,164,165,166,167], which states that the age-related decline in the brain and behaviour and the loss in brain complexity are synonymous and integrated.

4. Quantification of the Brain Networks’ Information Processing

Section 2 and Section 3 highlighted the correspondence of the brain's entropy to the manifestation of various states of conscious experience on the one hand and the decline of the brain's function in terms of the loss of such inherent complexity through the process of ageing on the other hand. As the current section reveals, such a correspondence is not a coincidence but a substantive relation that emerges from the association between the brain's inherent complexity and its capacity for information processing.

4.1. Emergence of the Distinct Functional Networks

Through the introduction of a novel entropic functional connectivity index, Tononi and colleagues [27] showed that certain subsets of brain regions may interact more strongly among themselves than with the rest of the brain. They further argued that such variations among different brain sub-regions might be crucial for the emergence of functional boundaries from widespread global brain connectivity [168].
These results further advanced the cognitive neuroscience studies by demonstrating that while many brain regions are active in the control of cognition and behaviour, only a subset of active neuronal groups is directly correlated with conscious experience [169,170].

4.2. Variability in the Brain’s Distributed Functional Networks

McDonough and Nashiro [110] found that the patterns of brain signal variability were distinct from noise. In addition, these patterns were differentially expressed among such distributed networks as the DMN, the cingulo-opercular network, and the left and right fronto-parietal (attention) networks [171] (Chapter 7, pp. 274–323). In particular, the complexity of these networks was negatively/positively associated with functional connectivity at small/large time scales (see Appendix B.1 for details on the time-scaled quantification of brain signal variability through the signal's coarse-graining process). This suggested that the DMN might accommodate a higher degree of information processing across distributed connections (e.g., medial frontal and medial parietal regions). It is worth noting that the authors quantified functional connectivity using dual regression analyses [172,173] in which the regression weights were interpreted as the strength of the functional connectivity [174].
The contributions of these findings were threefold. First (and perhaps most importantly), they indicated that brain signal variability is distinct from noise [1,3,7,8]. Second, they showed that, relative to the other networks, the DMN exhibited the smallest degree of network complexity at short time scales and the largest at mid to large time scales. Third, they identified that the presence of such temporal scalings might be critical for understanding the dynamics of the inter-neuronal transfer of information [175]. This was in accord with the thesis that considers the DMN's regions to serve as critical gateways for information processing and integration within the local and distributed brain networks [176,177,178]. For instance, the DMN has been related to episodic memory, imagining the future, self-reflection, mentalizing, divergent thinking, working memory, reading comprehension, and constructing moral judgments [171] (Chapter 13, pp. 515–565) and [179,180,181,182,183].
Additionally, the observed neural complexity in this study may also be interpreted as evidence for the presence of differential range or capacity of the brain for exploring the alternative brain states [3]. In fact, Wang and colleagues [184] found further evidence for these results and proposed that the complexity of the regional neural activity may serve as an index of the brain’s capacity for information processing. Specifically, they envisioned the presence of this increase in complexity to indicate the brain’s capacity for transitioning between its different states and networks to promote a greater propensity for information processing.

4.3. Variability and the Distributed Functional Synchrony

Liu et al. [185] observed (in mice under anaesthesia and wakefulness) a strong negative spatiotemporal linear correlation between functional connectivity and entropy. They also observed that entropy correlated positively with the complexity of the cortical activity at small time scales; this correlation was negative at large time scales. They further explored the validity of their results using simulated human brain activity (i.e., simulated blood-oxygen-level-dependent (BOLD) signals). Using these simulated data, they observed that functional connectivity and complexity provided partial assessments (although with a reduced resolution) of the structural and dynamical variation of the cortical entropy.
This study contributed to the study of brain function in terms of the change in its signal variability in two substantial ways. First, it identified a scale-dependent correlation between complexity and entropy. Precisely, this finding verified that whereas the brain signal variability at small time scales was associated with local information processing, at larger scales it reflected the brain networks' long-range communication [107,109]. Second, it demonstrated that anaesthesia affected the brain function by (1) reducing its entropy, (2) strengthening its functional connectivity, and (3) decreasing/increasing its signal variability at small/large time scales. In this respect, these observations extended the previous sleep research that reported an increase in functional connectivity (within the dorsal attention network) during light sleep [186] and a global decrease in complexity at small/large time scales during deep sleep [100,114].

5. Concluding Remarks

The present overview sought to bring together three areas of neuroscientific research in which the use of entropy for quantification of brain activity has yielded promising results. These areas were the (altered) state of consciousness, the ageing brain, and the quantification of the brain networks' information processing. In doing so, it highlighted three main observations. First, it identified that the use of entropic measures for the study of consciousness and its (altered) states led the field to substantially advance the previous findings. For instance, they helped verify the theoretical models which envisioned that the brain's large-scale information sharing underlies its conscious processing [75], that the DMN and DAT networks are crucially involved in such processes [29], and that the conscious brain's dynamics sustain patterns of long-range coordination that are substantially distinct from its anatomical connectivity [28]. These studies further validated the utility of entropic measures as potential biomarkers for the study of differential states of consciousness [115,116,117,118,129]. Second, it realized that the use of these measures for the study of the ageing brain provided significant insights on various ways that the process of ageing may affect the brain's dynamics and information processing capacity. For example, they identified that brain signal variability is a critical feature of its function [135] whose loss by ageing may help explain its functional decline [146] and reduced information integration [109,150]. Third, it revealed that their utilization for the analysis of the brain's regional interactivity formed a bridge between the previous two research areas. This included the evidence for the emergence of functional boundaries from stronger interaction among subsets of the brain's global connectivity [27], the involvement of the DMN in its higher information processing [110], and its reduced functional connectivity by ageing [185,187,188].
Collectively, these studies hinted at the brain’s complexity as its fundamental property that underlies the manifestation of such phenomena as consciousness [61,62] and adaptability [26] and pp. 3, 93–94 in [64]. This, in turn, resulted in attribution of various brain-related deficiencies and disorders [59,60,65,66,67] to the decline of this inherent complexity. These findings appeared to converge on the utility of brain’s complexity in its function and information processing [13,68,69,70].
The use of entropy for the analysis of the brain's regional interactivity also shed further light on the correspondence between the brain's regional synchrony on the one hand and its dynamical complexity on the other hand. In this respect, Reid et al. [189] noted that associations among brain regions in terms of correlation and synchronized activity can arise in a variety of ways that may not relate to the extent of the influence among these co-occurring processes. As a result, such measures may fall short of capturing a more comprehensive mapping between the observed associations and their underlying neural substrates [189,190,191]. These shortcomings can be addressed by future research through the utilization of such tools as Granger causality (GA) [192,193,194] and transfer entropy (TE) [19,195,196]. These tools can enable researchers to devise analytical approaches that study the brain networks' complexity in light of the directed flow of information among their components. An advantage of TE over GA is that whereas the latter is based on linear vector autoregressive (VAR) modelling [197,198] (and is hence linear in nature), the former is a nonlinear directional measure of the flow of information [41,199]. Therefore, its adaptation for the study of the brain networks' information processing and dynamics may yield more informative results.
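To make the notion of directed information flow concrete, the following is a minimal sketch (added here for illustration; it is not taken from the cited literature) of a plug-in TE estimator with a history length of one sample and a coarse equal-count discretization. Practical analyses would typically rely on the dedicated toolboxes cited above [50,51,52,53,54,55,56,57,58] and on more careful estimators (e.g., kernel- or nearest-neighbour-based [196,230,231]).

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """Plug-in (histogram) transfer entropy TE(X -> Y), history length 1.

    TE(X -> Y) = sum over (y_next, y, x) of
        p(y_next, y, x) * log2[ p(y_next | y, x) / p(y_next | y) ].
    Both signals are first discretized into equal-count bins.
    """
    def discretize(s):
        s = np.asarray(s, dtype=float)
        edges = np.quantile(s, np.linspace(0.0, 1.0, bins + 1)[1:-1])
        return np.searchsorted(edges, s)

    xd, yd = discretize(x), discretize(y)
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))  # (y_next, y, x)
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))         # (y, x)
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))          # (y_next, y)
    singles = Counter(yd[:-1])                        # (y,)
    n = len(yd) - 1

    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]          # p(y_next | y, x)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]  # p(y_next | y)
        te += p_joint * np.log2(p_cond_full / p_cond_hist)
    return te

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)  # y is driven by past x
print(transfer_entropy(x, y))  # clearly positive: information flows x -> y
print(transfer_entropy(y, x))  # near zero: no flow in the reverse direction
```

Unlike a linear VAR-based Granger test, this construction makes no assumption about the functional form of the coupling, which is the nonlinearity advantage noted above.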
In the same vein, future research can further advance the study of (altered) states of consciousness and their relation to brain complexity through the utilization of such entropic frameworks as integrated information theory (IIT) [62,81,93,200]. This is a particularly interesting avenue, considering the recent growth of our understanding of this measure's interpretability [201]. This observation becomes more intriguing given the availability of toolboxes that substantially facilitate its computation [56,202], even for relatively large-scale brain networks [203].
Another area of research for future exploration is the relation between entropy and neuropsychological disorders. In this regard, the review by Takahashi [59] found inconsistent results among studies that utilized nonlinear analytical methodologies for this purpose. As noted by the author, such studies can benefit from methodologies that take into account the brain signal complexity and its dynamics across multiple time scales [42,72]. Such studies can also benefit from the recent findings [35] that demonstrate the utility of entropy for quantification of diverse facets of human subjects' cognitive ability and its effectiveness for capturing the effect of psychedelic drugs while studying such disorders [18,60].
Entropic measures have also appeared in the literature pertinent to meditation [204] and mediated social communication [205]. Although these are potentially interesting findings, they are primarily based on the association between observed changes in brain signal variability and the individuals' subjective, self-reported mental states. In this respect, a study by MacDuffie et al. [206] that included 1256 human subjects concluded that such self-report ratings were unrelated to measured neural activation. The effect of this shortcoming can indeed be observed in meditation and mediated social communication studies. For instance, in the case of meditation, whereas Vyšata et al. [204] reported a decrease in global entropy during meditation (interpreted as a sign of relaxation by the authors), Kakumanu et al. [207] observed an increase that was only present in the case of experienced meditators. On the other hand, the mediated social communication studies assumed both lower [205] and higher [208,209] entropy to indicate such self-assessed feelings as a relaxed mood, interest, and the feeling of human presence. Furthermore, the mediated social communication studies were based on a surprisingly limited number of brain sites (e.g., only two forehead sites in [205,208,209]). As a result, it was unclear whether the observed changes in entropy were due to the individuals' interaction with the media or potentially a residue of overall brain signal variability [1,3] which may not necessarily pertain to the effect of the stimuli. Most importantly, their results were inconsistent with the findings that pinpointed the substantial contrast between human–human and human–agent interactions (HHI and HAI, respectively) [210,211,212,213]. Therefore, future research to clarify these discrepancies is necessary to allow for robust interpretation of the potential change in brain signal variability in response to such stimuli as meditation [204] and mediated social communication [205].
The entropic measures primarily adopted by the studies covered in the present overview included differential entropy (DE) [48], permutation entropy (PE) [214], and sample entropy (SE) [215], the latter of which forms the computational backbone of multiscale entropy (MSE) [72] (see Appendix B). There is an interesting underlying computational similarity between these measures on the one hand and the classical Shannon entropy [22] on the other hand. Specifically, whereas the original Shannon entropy was proposed for the case of discrete random variables, the computation of DE, PE, SE, and MSE is based on estimates of the entropy of a continuous time series through its discretization. Furthermore (although in a much more restricted sense), DE and PE both relate to SE computation through the realization that the latter is based on discretized (binary, in this case) summary statistics of a given continuous time series. In this respect, it would be interesting for future research to further examine the effect of such discretization strategies on the level of correspondence between the entropies estimated by these algorithms. Another possibility worth investigating in future research would be to reevaluate the use of PE for estimation of brain variability based on its more recent variant, i.e., multiscale permutation entropy (MPE) [45].

Funding

This research was funded by the Japan Society for the Promotion of Science (JSPS) KAKENHI grant number JP19K20746.

Acknowledgments

This study was carried out in accordance with the recommendations of the ethical committee of the Advanced Telecommunications Research Institute International (ATR). The protocol was approved by the ATR ethical committee (approval code: 0221200702002).

Conflicts of Interest

The author declares no conflict of interest. The funder had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
WM: Working Memory
EEG: Electroencephalography
DASM: Differential Asymmetry
RASM: Rational Asymmetry
PSD: Power Spectral Density
fNIRS: Functional Near-Infrared Spectroscopy
DMN: Default Mode Network
DAT: Dorsal Attention Network
SWS: Slow Wave Sleep
REM: Rapid-Eye-Movement
NREM: Non Rapid-Eye-Movement
COH: Coherence
iCOH: Imaginary Component of Coherence
FC: Functional Connectivity
MEG: Magnetoencephalography
BOLD: Blood-Oxygen-Level-Dependent
GA: Granger Causality
TE: Transfer Entropy
VAR: Vector Autoregressive
IIT: Integrated Information Theory
MSE: Multiscale Entropy
DE: Differential Entropy
PE: Permutation Entropy
SE: Sample Entropy
MPE: Multiscale Permutation Entropy
HHI: Human-Human Interaction
HAI: Human-Agent Interaction
PPN: Person Perception Network
TOM: Theory of Mind

Appendix A. Entropy

In the context of information theory [71], the entropy (derived from Greek to mean “transformation content”) of a random variable can be defined as the average level of “information”, “surprise”, or “uncertainty” that is inherent in its possible outcomes. In other words, it is the measure of information as the measure of uncertainty or, in the words of Shannon [22,216], a measure “of how much ‘choice’ is involved in the selection of the event or of how uncertain we are of the outcome.” Shannon demonstrated that such an amount of information can be measured as a function of the probabilities of the events involved:
H = −Σ_i p_i log_2(p_i)    (A1)
where H stands for the computed entropy and p_i is the probability with which the ith event may occur.
Equation (A1) uses the base-2 logarithm (as originally proposed by Shannon), thereby expressing the entropy in bits of information. Although this presentation helps facilitate the interpretation of the computed entropy (e.g., Miller [23]), it is not a strict requirement; the base could be replaced by any other (e.g., 10 or e) as long as the computed entropy can be conveniently interpreted in the context of its application.
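As a concrete illustration of Equation (A1), the following minimal sketch (added here; the function name and the example distributions are illustrative choices, not part of the original article) computes the entropy of a discrete distribution in bits.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution p (Equation (A1)).

    Terms with p_i = 0 contribute nothing, following the convention
    0 * log(0) = 0.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # drop zero-probability outcomes
    return -np.sum(p * np.log(p)) / np.log(base)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))    # 1.0 bit
print(shannon_entropy([0.9, 0.1]))    # ~0.469 bits
```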

Appendix A.1. Principle of Maximum Entropy

A substantial insight into the understanding and interpretation of Shannon entropy came from Edwin T. Jaynes, who argued that among all possible distributions, the one that maximizes the Shannon information entropy must be preferred [217,218] (i.e., the principle of maximum entropy; see also Cover and Thomas [71] (p. 410, Theorem 12.11) for a formal statement). The principle of maximum entropy has a demonstrated utility in further advancing the mathematical modelling of the brain's neural activity [16,44,75,219].
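The principle can be illustrated numerically: among all distributions over a fixed number of outcomes, none exceeds the entropy of the uniform one. The following sketch (an illustration added here, not from the article) verifies this with randomly drawn distributions.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    # Equation (A1); zero-probability outcomes contribute nothing.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

rng = np.random.default_rng(0)
k = 8
uniform = np.full(k, 1.0 / k)
print(shannon_entropy(uniform))  # 3.0 bits = log2(8), the attainable maximum

# Randomly drawn distributions over the same k outcomes never exceed it.
for _ in range(3):
    q = rng.dirichlet(np.ones(k))
    print(shannon_entropy(q) <= shannon_entropy(uniform))  # always True
```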

Appendix A.2. Information, Complexity, and Randomness

A crucial (and perhaps quite often confusing/misleading) issue is the relation between such concepts as “information”, “complexity” and “randomness”. As the extensive works by Solomonoff [220], Kolmogorov [221], and Chaitin [222] have demonstrated, these are in fact equivalent when it comes to information. Interested readers can find a fascinating informal treatment of the subject in Gleick [223] (Chapter 12).

Appendix A.3. Entropic Interpretation of Brain’s Function: A Simplified Example

To realize how entropy may facilitate the study and analysis of the brain's neuronal activity, it is useful to imagine the brain's neural population as a system that resides (i.e., as a whole) in a baseline state in the absence of any stimulus/stressor. (This is a gross oversimplification of how the brain may actually function. Nonetheless, such an abstraction and oversimplification helps capture the essence of the entropic interpretation of the brain's function.)
For a neuronal population of size N, there are 2^N possible ways in which the firing patterns of this neuronal ensemble can be arranged. For example, if N = 3 (i.e., three neurons [n_1, n_2, n_3]), their firing patterns could be arranged in 2^3 = 8 different ways: [n_1 = 0, n_2 = 0, n_3 = 0], [n_1 = 0, n_2 = 0, n_3 = 1], …, [n_1 = 1, n_2 = 1, n_3 = 1].
If one assumes one of these possible arrangements to represent the brain's state in the absence of any stimulus/stressor, it is quite apparent that the non-baseline-state arrangements far outnumber the representation of the brain's function at its baseline state. Continuing this line of abstraction, one soon realizes that there are many states in which the brain's neuronal firing may take place, only a small subset of which (i.e., relative to all of their possible arrangements) represents the brain's function at its non-stimulated, unstressed state.
To respond to a stimulus, the brain's neuronal population drifts/deviates from its baseline firing patterns, so that the subset representing these baseline patterns has a lower probability of occurrence compared to all other potential ways that the brain's neuronal population may fire in response to the given stimulus/stressor. Referring to Equation (A1), this situation translates into an increase in the brain's entropy (i.e., with respect to its baseline neuronal firing pattern) while responding to a stimulus/stressor.
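The following toy sketch (illustrative; the probability values are arbitrary assumptions chosen for this example) renders the argument numerically: concentrating probability mass on one baseline firing pattern yields a low entropy, while spreading the mass across the 2^3 = 8 patterns, as in a stimulated state, raises it towards the maximum of 3 bits.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    # Equation (A1); zero-probability outcomes contribute nothing.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

n_patterns = 2 ** 3  # N = 3 neurons -> 8 possible firing patterns

# Baseline: most of the probability mass sits on one arrangement.
baseline = np.full(n_patterns, 0.3 / (n_patterns - 1))
baseline[0] = 0.7
print(shannon_entropy(baseline))    # ~1.72 bits

# Response to a stimulus: firing drifts over many arrangements.
stimulated = np.full(n_patterns, 1.0 / n_patterns)
print(shannon_entropy(stimulated))  # 3.0 bits, the maximum for 8 patterns
```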

Appendix B. An Overview of Multiscale Entropy (MSE), Permutation Entropy (PE), and Differential Entropy (DE)

Although DE, MSE, and PE measure the variability of a given time series, they achieve this objective through different quantification procedures. This Appendix provides a brief overview of these measures and their distinct operational principles.

Appendix B.1. MSE

To understand MSE, one first needs to comprehend the concept of sample entropy (SE) [215]. Therefore, we begin with an overview of SE. We then elaborate on the use of SE for MSE computation.

Appendix B.1.1. SE

SE forms the backbone of MSE computation. There are two parameters involved in SE computation: the pattern length m and the similarity criterion r (i.e., a positive real number, typically within 10.0% to 20.0% of the time series' standard deviation [50]). SE can be formally defined as
Definition A1.
Given a time series X = {x_1, x_2, …, x_N} of length N, its SE for a pattern length m and a similarity criterion r is computed as the negative natural logarithm of the conditional probability that if the data points of two subsequences X_m ⊂ X of length m have distance ≤ r from each other, then the data points of the corresponding subsequences X_{m+1} ⊂ X of length m + 1 also have distance ≤ r:
SE = −log(P/Q)    (A2)
where P is the number of subset pairs with d(X_{m+1}(i), X_{m+1}(j)) ≤ r and Q represents the number of subset pairs with d(X_m(i), X_m(j)) ≤ r. In this definition, d(x, y) is a function that returns the distance between its two arguments. Although the Chebyshev distance has been frequently used for this purpose, there is indeed no restriction on the choice of this function, and any similarity-distance function (including the Euclidean distance) can be used.
Figure A1A illustrates the process of SE computation for a time series X = {x_1, x_2, …} and the pattern length m = 2. The dotted horizontal lines around the data points x_1, x_2, and x_3 represent the x_1 ± r, x_2 ± r, and x_3 ± r boundaries. In this setting, two data points are said to match each other (i.e., to be indistinguishable) if the absolute difference between them is ≤ r. For instance, x_9 and x_13 in Figure A1A are data points that match x_1 (i.e., all orange-coloured data points). In the same vein, x_14 and x_17 match x_2 and x_3, respectively. In this subplot, there exists one orange–red subsequence that matches the subsequence (x_1, x_2) (i.e., (x_13, x_14)) and one orange–red–green subsequence that matches (x_1, x_2, x_3) (i.e., (x_13, x_14, x_17)). SE computation proceeds by first acquiring the total counts of such subsequences of length two and three for the entire time series and then finding their ratio. This ratio is the probability that subsequences that match each other for their first two data points will also match for their next point (i.e., as subsequences of length three). The SE of the given time series X is the negative logarithm of this ratio. SE has been shown to provide reliable power for the study of young versus older adults based on short functional magnetic resonance imaging (fMRI) data [224].
Figure A1. (A) Sample entropy (SE) forms the backbone of MSE computation. For a given time series X = {x_1, x_2, …, x_N} of length N, its SE (Definition A1) for a pattern length m and a similarity criterion r is computed as the negative natural logarithm of the probability that if the data points of two subsequences X_m ⊂ X of length m have distance ≤ r from each other (e.g., (x_1, x_2) in this subplot, given a choice of similarity distance d in Definition A1), then the data points of the corresponding subsequences X_{m+1} ⊂ X of length m + 1 also have distance ≤ r (e.g., (x_1, x_2, x_3) in this subplot). The dotted lines enclosing the red-, orange-, and green-coloured points signify the distance threshold r in Definition A1 (typically a positive real number within 10.0% to 20.0% of the time series' standard deviation [50]). (B) Coarse-graining of a time series in the MSE procedure. MSE computation requires a scale factor m for coarse-graining of the signal. This subplot illustrates the scenario in which the scale factor m = 3 is adopted. In this setting, the MSE of the given time series comprises the SEs of each of these coarse-grained versions (i.e., three values; MSE1, MSE2, and MSE3 in this example). (C) PE computation. This subplot illustrates a scenario in which the permutation order m = 3 is adopted. The ordinal symbolic transformations associated with m = 3 (i.e., 3! = 6) are shown in dark-grey boxes below the time series (i.e., S_1 through S_6). PE computation uses the counts (i.e., total occurrences of each of these symbols) to compute their respective probabilities of occurrence. The PE of the sequence is the Shannon entropy of these probabilities.
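For readers who prefer code, the following is a minimal Python sketch of Definition A1 (added here for illustration; production analyses would use the validated implementations cited above [50,51,52,53,54,55,56,57,58]). It uses the Chebyshev distance and excludes self-matches; conventions for counting templates vary slightly across implementations.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy (Definition A1) of a 1-D time series.

    m : pattern length; r : similarity criterion (defaults to 0.2 * SD,
    within the 10-20% range quoted in the text).
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(length):
        # All overlapping templates (subsequences) of the given length.
        templates = np.array([x[i:i + length]
                              for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    q = count_matches(m)      # matching pairs of length m
    p = count_matches(m + 1)  # matching pairs of length m + 1
    return -np.log(p / q) if p > 0 and q > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(500)))                 # irregular: high SE
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))  # regular: low SE
```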

Appendix B.1.2. MSE Computation Using SE

MSE incorporates SE computation in a procedure known as coarse-graining of a given time series. Coarse-graining requires a scale factor m (typically set to 20), which quantifies how many rounds of the signal's coarse-graining and subsequent SE computation are performed. Figure A1B illustrates this procedure in which the scale factors m = 1, 2, 3 are applied. In essence, MSE starts with scale factor 1 (i.e., the original time series X), thereby computing the SE (using Equation (A2)) of the original time series. It then repeats this SE computation for scale factor 2. The use of scale factor 2 generates a new coarse-grained copy of the original time series in which every new coarse-grained data point y_j is the average of the two neighbouring data points x_{2j−1} and x_{2j}. It is important to note that the y_j are generated from non-overlapping segments of the original time series. Repeating this process for scale factors m = 1, …, M, where M represents the upper limit for coarse-graining the given time series (M = 3 in this example), produces the MSEs of the given time series X. It is apparent that the number of MSEs computed for X equals the selected scale factor (e.g., three MSEs for the example in Figure A1B). Courtiol et al. [225] present comprehensive analytical guidelines and considerations for the use of MSE.
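Building on the sample_entropy sketch above, the coarse-graining step and the resulting MSE curve can be expressed compactly (again an illustrative sketch; holding r fixed from the original series is one of several conventions discussed in the literature [225]).

```python
import numpy as np
# Assumes the sample_entropy function from the sketch in Appendix B.1.1.

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (Figure A1B)."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def multiscale_entropy(x, max_scale=20, m=2, r=None):
    """MSE: the sample entropy of each coarse-grained copy of the signal."""
    if r is None:
        r = 0.2 * np.std(x)  # fixed from the original series
    return [sample_entropy(coarse_grain(x, s), m=m, r=r)
            for s in range(1, max_scale + 1)]

rng = np.random.default_rng(0)
# For white noise, the MSE curve typically decreases with the scale factor.
print(multiscale_entropy(rng.standard_normal(3000), max_scale=10))
```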

Appendix B.2. PE

PE utilizes the reoccurrences of ordinal patterns in the signal to estimate the signal's variability through the application of Shannon entropy [22]. PE requires a permutation order m. This parameter is set in such a way that 3 ≤ m! ≪ N [46,226,227], where N is the length of the time series. Figure A1C illustrates the process of PE computation for the permutation order m = 3. The bottom subplot shows the ordinal symbols that correspond to this permutation order. One can consider the use of such an ordinal transformation as a discretization of the original time series. In essence, PE computes the number of occurrences of each of these symbols in the given time series (Figure A1C, top subplot). It then converts these counts to the probabilities of their respective symbols' occurrences:
p_i = |S_i| / |S|,   S_i ⊆ S,   i = 1, …, m!    (A3)
where S_i ⊆ S is the subset pertinent to the ith symbol (e.g., S_1 in Figure A1C) and |·| returns the cardinality of its argument. S comprises all symbol occurrences in the series (with m! distinct symbols, where m is the permutation order), and p_i captures the probability of the ith symbol. These probabilities can then be conveniently used to compute the permutation entropy of the time series using the Shannon entropy −Σ_i p_i log_b(p_i), i = 1, …, N (i.e., Equation (A1)), where N refers to the total number of distinct symbols (i.e., 6 in Figure A1C), p_i is the probability associated with the ith symbol (i.e., S_i), and b is the base of the logarithm.
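A minimal sketch of this procedure follows (added here for illustration; ties in the ordinal ranking are broken by index, one common convention).

```python
import numpy as np
from collections import Counter

def permutation_entropy(x, m=3, base=2.0):
    """Permutation entropy for permutation order m (Figure A1C).

    Each window of m consecutive points is mapped to the ordinal pattern
    (via argsort) describing the relative ranks of its values; PE is the
    Shannon entropy of the pattern frequencies (Equation (A3)).
    """
    x = np.asarray(x, dtype=float)
    patterns = [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]
    counts = Counter(patterns)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p)) / np.log(base)

# A monotone series contains a single ordinal pattern -> PE = 0.
print(permutation_entropy(np.arange(100.0)))                 # 0.0
# White noise makes all m! patterns roughly equally likely
# -> PE near log2(3!) ~ 2.585 bits.
rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(10_000)))
```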

Appendix B.3. DE

DE, or differential entropy, refers to the entropy of a continuous random variable [71]. DE computation is based on the estimation of the probability distribution that underlies the observed time series data. When a signal follows a parametric distribution (e.g., the normal distribution), its entropy can be computed using a model-based estimator that utilizes the signal's parameters. For instance, in the case of a Gaussian signal, one can compute the signal's DE as
DE(X) = 0.5 log_b(2π e σ^2)    (A4)
where DE is the differential entropy of the time series X and σ^2 represents X's variance [71]. It is also worth noting that the Gaussian in Equation (A4) is in fact one of the parametric distributions that satisfies the principle of maximum entropy (Appendix A.1) when the time series is considered variance (e.g., power) stationary. Another maximum entropy preserving parametric distribution is the exponential distribution, which can be used in the case of a mean stationary signal. The DE in the case of the exponential distribution can be computed as [228] (p. 125)
$DE(X) = \log_b(e \mu)$ (A5)
with $X$ and $DE(X)$ representing the time series and its differential entropy, and $\mu$ being $X$'s mean [71]. In Equation (A5), $e \approx 2.71828$ is the mathematical constant (i.e., the base of the natural logarithm) and $b$ is the base of the logarithm. As in the case of Equation (A1), there is no restriction on the choice of this base; it can be any permissible base ($b$ = 2, 10, or $e$) as long as its use facilitates the interpretation of the time series' quantified entropy.
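A minimal sketch of these two model-based estimators follows; the function names are ours, and each assumes the input was indeed generated by the corresponding distribution.

```python
import numpy as np

def de_gaussian(x, base=2):
    """Equation (A4): DE of a Gaussian signal from its sample variance."""
    return 0.5 * np.log(2.0 * np.pi * np.e * np.var(x)) / np.log(base)

def de_exponential(x, base=2):
    """Equation (A5): DE of an exponential signal from its sample mean."""
    return np.log(np.e * np.mean(x)) / np.log(base)
```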
On the other hand, when no such assumption about the time series' distribution is warranted, model-free approaches are employed. These approaches approximate the signal's probability distribution directly from its data points. The most intuitive model-free approach utilizes histograms to approximate the time series' distribution. However, this approach suffers from bias and from a strong dependence on the quantization level (i.e., the number of histogram bins) [229]. These weaknesses of histogram-based entropy computation are mitigated by density estimators that make use of kernels [196] or of nearest neighbours [230,231]. Xiong et al. [48] present a comprehensive treatment of this topic.
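As a sketch of the model-free route, the following pairs a simple histogram estimator with a one-dimensional nearest-neighbour (Kozachenko-Leonenko) estimator in the spirit of [230,231]; both are illustrative, and the bin count and k are arbitrary defaults whose tuning the cited work discusses.

```python
import numpy as np
from scipy.special import digamma

def de_histogram(x, bins=32):
    """Histogram DE estimate (nats): discrete entropy plus log bin width."""
    counts, edges = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p)) + np.log(edges[1] - edges[0])

def de_knn(x, k=3):
    """Kozachenko-Leonenko DE estimate (nats) for a 1-D sample."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Distance from each point to its k-th nearest neighbour
    # (index 0 is the point itself; assumes distinct samples, so eps > 0).
    eps = np.array([np.sort(np.abs(x - xi))[k] for xi in x])
    return digamma(n) - digamma(k) + np.mean(np.log(2.0 * eps))
```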

Appendix C. Definitions

Definition A2.
$1/f$ noise is a form of fluctuation in which the power spectral density is inversely proportional to the frequency of the signal (i.e., $S(f) \sim 1/f$). The $1/f$ noise is also referred to as pink or flicker noise [232].
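Because the definition is spectral, it can be checked empirically by fitting the slope of a signal's log-log power spectrum; the following is a minimal sketch assuming a sampling rate fs and SciPy's Welch estimator.

```python
import numpy as np
from scipy.signal import welch

def spectral_exponent(x, fs=250.0):
    """Estimate a in S(f) ~ 1/f**a via a log-log fit of the Welch PSD."""
    f, pxx = welch(x, fs=fs, nperseg=1024)
    keep = f > 0  # drop the DC bin before taking logarithms
    slope, _ = np.polyfit(np.log(f[keep]), np.log(pxx[keep]), 1)
    return -slope  # a close to 1 indicates pink (1/f) noise
```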
Definition A3.
Brain waves (first reported by the German psychiatrist Hans Berger in 1929 [64] (p. 87)). These waves are associated with the following frequency bands [171] (p. 100): delta (1–4 Hz), theta (4–8 Hz), alpha (7.5–12.5 Hz), beta (13–30 Hz), gamma (30–70 Hz), and high gamma (>70 Hz). As noted by Gazzaniga et al. (ibid.), although the brain waves are described in terms of these frequency bands, we cannot assume that individual neurons oscillate at these frequencies. Interestingly, these frequency bands are associated with different mental states. Some examples include [64] (p. 87): delta with sleep and drowsiness; theta with such mental states as unconscious processing (and also with engagement in cognitively demanding tasks [171] (p. 100)); alpha with changes in the level of arousal (e.g., an increase in alpha signals reduced states of attention [171] (p. 100)); and beta with cognitive and emotional activation. Gamma, at present, is presumed to reflect some of the highest functions of the human brain, such as perceptual and higher cognitive processes [233].

References

1. Garrett, D.D.; Kovacevic, N.; McIntosh, A.R.; Grady, C.L. The importance of being variable. J. Neurosci. 2011, 31, 4496–4503.
2. Heisz, J.J.; Shedden, J.M.; McIntosh, A.R. Relating brain signal variability to knowledge representation. Neuroimage 2012, 63, 1384–1392.
3. Garrett, D.D.; Samanez-Larkin, G.R.; MacDonald, S.W.S.; Lindenberger, U.; McIntosh, A.R.; Grady, C.L. Moment-to-moment brain signal variability: A next frontier in human brain mapping? Neurosci. Biobehav. Rev. 2013, 37, 610–624.
4. Pouget, A.; Drugowitsch, J.; Kepecs, A. Confidence and certainty: Distinct probabilistic quantities for different goals. Nat. Neurosci. 2016, 19, 366–374.
5. Friston, K. The free-energy principle: A unified brain theory? Nat. Rev. Neurosci. 2010, 11, 127–138.
6. Friston, K.J.; Wiese, W.; Hobson, J.A. Sentience and the Origins of Consciousness: From Cartesian Duality to Markovian Monism. Entropy 2020, 22, 516.
7. Stein, R.B.; Gossen, E.R.; Jones, K.E. Neuronal variability: Noise or part of the signal? Nat. Rev. Neurosci. 2005, 6, 389–397.
8. Faisal, A.A.; Selen, L.P.; Wolpert, D.M. Noise in the nervous system. Nat. Rev. Neurosci. 2008, 9, 292–303.
9. Avena-Koenigsberger, A.; Misic, B.; Sporns, O. Communication dynamics in complex brain networks. Nat. Rev. Neurosci. 2018, 19, 17–33.
10. Muller, L.; Chavane, F.; Reynolds, J.; Sejnowski, T.J. Cortical travelling waves: Mechanisms and computational principles. Nat. Rev. Neurosci. 2018, 19, 255–268.
11. Bak, P.; Tang, C.; Wiesenfeld, K. Self-organized criticality: An explanation of the 1/f noise. Phys. Rev. Lett. 1987, 59, 381–384.
12. Shew, W.L.; Plenz, D. The functional benefits of criticality in the cortex. Neuroscientist 2013, 19, 88–100.
13. Fagerholm, E.D.; Lorenz, R.; Scott, G.; Dinov, M.; Hellyer, P.J.; Mirzaei, N.; Leeson, C.; Carmichael, D.W.; Sharp, D.J.; Shew, W.L.; et al. Cascades and cognitive state: Focused attention incurs subcritical dynamics. J. Neurosci. 2015, 35, 4626–4634.
14. Palva, S.; Palva, J.M. Roles of brain criticality and multiscale oscillations in temporal predictions for sensorimotor processing. Trends Neurosci. 2018, 41, 729–743.
15. Ma, Z.; Turrigiano, G.G.; Wessel, R.; Hengen, K.B. Cortical circuit dynamics are homeostatically tuned to criticality in vivo. Neuron 2019, 104, 655–664.
16. Laughlin, S. A simple coding procedure enhances a neuron's information capacity. Z. Naturforsch. 1981, 36, 910–912.
17. Shew, W.L.; Yang, H.; Yu, S.; Roy, R.; Plenz, D. Information capacity and transmission are maximized in balanced cortical networks with neuronal avalanches. J. Neurosci. 2011, 31, 55–63.
18. Carhart-Harris, R.L. The entropic brain-revisited. Neuropharmacology 2018, 142, 167–178.
19. Lungarella, M.; Sporns, O. Mapping information flow in sensorimotor networks. PLoS Comput. Biol. 2006, 2, e144.
20. Quiroga, R.Q.; Panzeri, S. Extracting information from neuronal populations: Information theory and decoding approaches. Nat. Rev. Neurosci. 2009, 10, 173–185.
21. Sengupta, B.; Stemmler, M.B.; Friston, K.J. Information and efficiency in the nervous system—A synthesis. PLoS Comput. Biol. 2013, 9, e1003157.
22. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
23. Miller, G.A. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychol. Rev. 1956, 63, 81–97.
24. Strong, S.P.; Koberle, R.; de Ruyter van Steveninck, R.R.; Bialek, W. Entropy and information in neural spike trains. Phys. Rev. Lett. 1998, 80, 197.
25. Borst, A.; Theunissen, F.E. Information theory and neural coding. Nat. Neurosci. 1999, 2, 947–957.
26. Sharpee, T.O.; Calhoun, A.J.; Chalasani, S.H. Information theory of adaptation in neurons, behavior, and mood. Curr. Opin. Neurobiol. 2014, 25, 47–53.
27. Tononi, G.; McIntosh, A.R.; Russell, D.P.; Edelman, G.M. Functional clustering: Identifying strongly interactive brain regions in neuroimaging data. Neuroimage 1998, 7, 133–149.
28. Demertzi, A.; Tagliazucchi, E.; Dehaene, S.; Deco, G.; Barttfeld, P.; Raimondo, F.; Martial, C.; Fernández-Espejo, D.; Rohaut, B.; Voss, H.U.; et al. Human consciousness is supported by dynamic complex patterns of brain signal coordination. Sci. Adv. 2019, 5, eaat7603.
29. Huang, Z.; Zhang, J.; Wu, J.; Mashour, G.A.; Hudetz, A.G. Temporal circuit of macroscale dynamic brain activity supports human consciousness. Sci. Adv. 2020, 6, eaaz0087.
30. Cao, Y.; Tung, W.W.; Gao, J.B.; Protopopescu, V.A.; Hively, L.M. Detecting dynamical changes in time series using the permutation entropy. Phys. Rev. E 2004, 70, 046217.
31. Shi, L.C.; Jiao, Y.Y.; Lu, B.L. Differential entropy feature for EEG-based vigilance estimation. In Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 6627–6630.
32. Zheng, W.L.; Lu, B.L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans. Auton. Ment. Dev. 2015, 7, 162–175.
33. Keshmiri, S.; Sumioka, H.; Yamazaki, R.; Ishiguro, H. A non-parametric approach to the overall estimate of cognitive load using NIRS time series. Front. Hum. Neurosci. 2017, 11, 15.
34. Owen, A.M.; McMillan, K.M.; Laird, A.R.; Bullmore, E. N-back working memory paradigm: A meta-analysis of normative functional neuroimaging studies. Hum. Brain Mapp. 2005, 25, 46–59.
35. Liu, M.; Liu, X.; Hildebrandt, A.; Zhou, C. Individual cortical entropy profile: Test–retest reliability, predictive power for cognitive ability, and neuroanatomical foundation. Cereb. Cortex Commun. 2020, 1, tgaa015.
36. Zhang, Y.C. Complexity and 1/f noise. A phase space approach. J. Phys. I 1991, 1, 971–977.
37. Bandt, C.; Shiha, F. Order patterns in time series. J. Time Ser. Anal. 2007, 28, 646–665.
38. Haruna, T.; Nakajima, K. Permutation complexity via duality between values and orderings. Phys. D Nonlinear Phenom. 2011, 240, 1370–1377.
39. Amigó, J.M.; Kennel, M.B.; Kocarev, L. The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems. Phys. D Nonlinear Phenom. 2005, 210, 77–95.
40. Amigó, J.M. Permutation Complexity in Dynamical Systems: Ordinal Patterns, Permutation Entropy and All That; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2010.
41. Barnett, L.; Bossomaier, T. Transfer entropy as a log-likelihood ratio. Phys. Rev. Lett. 2012, 109, 138105.
42. Gao, J.; Hu, J.; Liu, F.; Cao, Y. Multiscale entropy analysis of biological signals: A fundamental bi-scaling law. Front. Comput. Neurosci. 2015, 9, 64.
43. Keshmiri, S.; Sumioka, H.; Okubo, M.; Ishiguro, H. An Information-Theoretic Approach to Quantitative Analysis of the Correspondence Between Skin Blood Flow and Functional Near-Infrared Spectroscopy Measurement in Prefrontal Cortex Activity. Front. Neurosci. 2018, 13, 79.
44. Keshmiri, S.; Sumioka, H.; Yamazaki, R.; Ishiguro, H. Differential Entropy Preserves Variational Information of Near-Infrared Spectroscopy Time Series Associated with Working Memory. Front. Neuroinform. 2018, 12, 33.
45. Dávalos, A.; Jabloun, M.; Ravier, P.; Buttelli, O. On the Statistical Properties of Multiscale Permutation Entropy: Characterization of the Estimator's Variance. Entropy 2019, 21, 450.
46. Riedl, M.; Müller, A.; Wessel, N. Practical considerations of permutation entropy. Eur. Phys. J. Spec. Top. 2013, 222, 249–262.
47. Yang, A.C.; Tsai, S.J.; Lin, C.P.; Peng, C.K. A strategy to reduce bias of entropy estimates in resting-state fMRI signals. Front. Neurosci. 2018, 12, 398.
48. Xiong, W.; Faes, L.; Ivanov, P.C. Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations. Phys. Rev. E 2017, 95, 062114.
49. Mediano, P.A.; Seth, A.K.; Barrett, A.B. Measuring integrated information: Comparison of candidate measures in theory and simulation. Entropy 2019, 21, 17.
50. Goldberger, A.; Amaral, L.; Glass, L.; Hausdorff, J.; Ivanov, P.; Mark, R. PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation 2000, 101, e215–e220.
51. Ince, R.A.; Petersen, R.S.; Swan, D.C.; Panzeri, S. Python for information theoretic analysis of neural data. Front. Neuroinform. 2009, 3, 4.
52. Lindner, M.; Vicente, R.; Priesemann, V.; Wibral, M. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy. BMC Neurosci. 2011, 12, 119.
53. Szabó, Z. Information theoretical estimators toolbox. J. Mach. Learn. Res. 2014, 15, 283–287.
54. Lizier, J.T. JIDT: An information-theoretic toolkit for studying the dynamics of complex systems. Front. Robot. AI 2014, 1, 11.
55. Ince, R.A.; Giordano, B.L.; Kayser, C.; Rousselet, G.A.; Gross, J.; Schyns, P.G. A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula. Hum. Brain Mapp. 2017, 38, 1541–1573.
56. Mayner, W.G.; Marshall, W.; Albantakis, L.; Findlay, G.; Marchman, R.; Tononi, G. PyPhi: A toolbox for integrated information theory. PLoS Comput. Biol. 2018, 14, e1006343.
57. Wollstadt, P.; Lizier, J.T.; Vicente, R.; Finn, C.; Martinez-Zarzuela, M.; Mediano, P.; Novelli, L.; Wibral, M. IDTxl: The Information Dynamics Toolkit xl: A Python package for the efficient analysis of multivariate information dynamics in networks. J. Open Source Softw. 2018, 4, 1081.
58. Timme, N.M.; Lapish, C. A tutorial for information theory in neuroscience. eNeuro 2018, 5.
59. Takahashi, T. Complexity of spontaneous brain activity in mental disorders. Prog. Neuro Psychopharmacol. Biol. Psychiatry 2013, 45, 258–266.
60. Carhart-Harris, R.L.; Leech, R.; Hellyer, P.J.; Shanahan, M.; Feilding, A.; Tagliazucchi, E.; Chialvo, D.R.; Nutt, D. The entropic brain: A theory of conscious states informed by neuroimaging research with psychedelic drugs. Front. Hum. Neurosci. 2014, 8, 20.
61. Tononi, G.; Koch, C. Consciousness: Here, there and everywhere? Philos. Trans. R. Soc. B Biol. Sci. 2015, 370, 20140167.
62. Tononi, G.; Edelman, G.M. Consciousness and complexity. Science 1998, 282, 1846–1851.
63. Jeffery, K.J.; Rovelli, C. Transitions in brain evolution: Space, time and entropy. Trends Neurosci. 2020, 43, 467–474.
64. Panksepp, J. Affective Neuroscience: The Foundations of Human and Animal Emotions; Oxford University Press: Oxford, UK, 2004; pp. 128–129.
65. Goldberger, A.L.; Peng, C.K.; Lipsitz, L.A. What is physiologic complexity and how does it change with aging and disease? Neurobiol. Aging 2002, 23, 23–26.
66. Manor, B.; Lipsitz, L.A. Physiologic complexity and aging: Implications for physical function and rehabilitation. Prog. Neuro Psychopharmacol. Biol. Psychiatry 2013, 45, 287–293.
67. Sleimen-Malkoun, R.; Temprado, J.J.; Hong, S.L. Aging induced loss of complexity and dedifferentiation: Consequences for coordination dynamics within and between brain, muscular and behavioral levels. Front. Aging Neurosci. 2014, 6, 140.
68. Beggs, J.M. The criticality hypothesis: How local cortical networks might optimize information processing. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2008, 366, 329–343.
69. Stam, C.J. Nonlinear dynamical analysis of EEG and MEG: Review of an emerging field. Clin. Neurophysiol. 2005, 116, 2266–2301.
70. Pereda, E.; Quiroga, R.Q.; Bhattacharya, J. Nonlinear multivariate analysis of neurophysiological signals. Prog. Neurobiol. 2005, 77, 1–37.
71. Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2006.
72. Costa, M.; Goldberger, A.L.; Peng, C.K. Multiscale entropy analysis of complex physiologic time series. Phys. Rev. Lett. 2002, 89, 068102.
73. Zanin, M.; Zunino, L.; Rosso, O.A.; Papo, D. Permutation entropy and its main biomedical and econophysics applications: A review. Entropy 2012, 14, 1553–1577.
74. Wutz, A.; Loonis, R.; Roy, J.E.; Donoghue, J.A.; Miller, E.K. Different levels of category abstraction by different dynamics in different prefrontal areas. Neuron 2018, 97, 1–11.
75. King, J.R.; Sitt, J.D.; Faugeras, F.; Rohaut, B.; El Karoui, I.; Cohen, L.; Naccache, L.; Dehaene, S. Information sharing in the brain indexes consciousness in noncommunicative patients. Curr. Biol. 2013, 23, 1914–1919.
76. Lehembre, R.; Bruno, M.A.; Vanhaudenhuyse, A.; Chatelle, C.; Cologan, V.; Leclercq, Y.; Soddu, A.; Macq, B.; Laureys, S.; Noirhomme, Q. Resting-state EEG study of comatose patients: A connectivity and frequency analysis to find differences between vegetative and minimally conscious states. Funct. Neurol. 2012, 27, 41–47.
77. Fingelkurts, A.A.; Fingelkurts, A.A.; Bagnato, S.; Boccagni, C.; Galardi, G. DMN operational synchrony relates to self-consciousness: Evidence from patients in vegetative and minimally conscious states. Open Neuroimaging J. 2012, 6, 55–68.
78. Baars, B.J. A Cognitive Theory of Consciousness; Cambridge University Press: Cambridge, UK, 1989.
79. Rees, G.; Kreiman, G.; Koch, C. Neural correlates of consciousness in humans. Nat. Rev. Neurosci. 2002, 3, 261–270.
80. Dehaene, S.; Changeux, J.P.; Naccache, L.; Sackur, J.; Sergent, C. Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends Cogn. Sci. 2006, 10, 204–211.
81. Tononi, G.; Boly, M.; Massimini, M.; Koch, C. Integrated information theory: From consciousness to its physical substrate. Nat. Rev. Neurosci. 2016, 17, 450–461.
82. Laureys, S.; Schiff, N.D. Coma and consciousness: Paradigms (re)framed by neuroimaging. Neuroimage 2012, 61, 478–491.
83. Ekroot, L.; Cover, T.M. The entropy of Markov trajectories. IEEE Trans. Inf. Theory 1993, 39, 1418–1421.
84. Raichle, M.E.; MacLeod, A.M.; Snyder, A.Z.; Powers, W.J.; Gusnard, D.A.; Shulman, G.L. A default mode of brain function. Proc. Natl. Acad. Sci. USA 2001, 98, 676–682.
85. Corbetta, M.; Shulman, G.L. Control of goal-directed and stimulus-driven attention in the brain. Nat. Rev. Neurosci. 2002, 3, 201–215.
86. Fox, M.D.; Snyder, A.Z.; Vincent, J.L.; Corbetta, M.; Van Essen, D.C.; Raichle, M.E. The human brain is intrinsically organized into dynamic, anticorrelated functional networks. Proc. Natl. Acad. Sci. USA 2005, 102, 9673–9678.
87. Demertzi, A.; Soddu, A.; Laureys, S. Consciousness supporting networks. Curr. Opin. Neurobiol. 2013, 23, 239–244.
88. Carhart-Harris, R.L.; Friston, K.J. The default-mode, ego-functions and free-energy: A neurobiological account of Freudian ideas. Brain 2010, 133, 1265–1283.
89. Boveroux, P.; Vanhaudenhuyse, A.; Bruno, M.A.; Noirhomme, Q.; Lauwick, S.; Luxen, A.; Degueldre, C.; Plenevaux, A.; Schnakers, C.; Phillips, C.; et al. Breakdown of within- and between-network resting state functional magnetic resonance imaging connectivity during propofol-induced loss of consciousness. Anesthesiology 2010, 113, 1038–1053.
90. Di Perri, C.; Bahri, M.A.; Amico, E.; Thibaut, A.; Heine, L.; Antonopoulos, G.; Charland-Verville, V.; Wannez, S.; Gomez, F.; Hustinx, R.; et al. Neural correlates of consciousness in patients who have emerged from a minimally conscious state: A cross-sectional multimodal imaging study. Lancet Neurol. 2016, 15, 830–842.
91. Threlkeld, Z.D.; Bodien, Y.G.; Rosenthal, E.S.; Giacino, J.T.; Nieto-Castanon, A.; Wu, O.; Whitfield-Gabrieli, S.; Edlow, B.L. Functional networks reemerge during recovery of consciousness after acute severe traumatic brain injury. Cortex 2018, 106, 299–308.
92. Shine, J.M.; Bissett, P.G.; Bell, P.T.; Koyejo, O.; Balsters, J.H.; Gorgolewski, K.J.; Moodie, C.A.; Poldrack, R.A. The dynamics of functional brain networks: Integrated network states during cognitive task performance. Neuron 2016, 92, 544–554.
93. Tononi, G. Consciousness as integrated information: A provisional manifesto. Biol. Bull. 2008, 215, 216–242.
94. Dehaene, S.; Changeux, J.P. Experimental and theoretical approaches to conscious processing. Neuron 2011, 70, 200–227.
95. Northoff, G.; Huang, Z. How do the brain's time and space mediate consciousness and its different dimensions? Temporo-spatial theory of consciousness (TTC). Neurosci. Biobehav. Rev. 2017, 80, 630–645.
96. Deco, G.; Kringelbach, M.L. Metastability and coherence: Extending the communication through coherence hypothesis using a whole-brain computational perspective. Trends Neurosci. 2016, 39, 125–135.
97. Lindquist, K.A.; Satpute, A.B.; Wager, T.D.; Weber, J.; Barrett, L.F. The brain basis of positive and negative affect: Evidence from a meta-analysis of the human neuroimaging literature. Cereb. Cortex 2016, 26, 1910–1922.
98. Keshmiri, S.; Shiomi, M.; Ishiguro, H. Entropy of the Multi-Channel EEG Recordings Identifies the Distributed Signatures of Negative, Neutral and Positive Affect in Whole-Brain Variability. Entropy 2019, 21, 1228.
99. Barttfeld, P.; Uhrig, L.; Sitt, J.D.; Sigman, M.; Jarraya, B.; Dehaene, S. Signature of consciousness in the dynamics of resting-state brain activity. Proc. Natl. Acad. Sci. USA 2015, 112, 887–892.
100. Miskovic, V.; MacDonald, K.J.; Rhodes, L.J.; Cote, K.A. Changes in EEG multiscale entropy and power-law frequency scaling during the human sleep cycle. Hum. Brain Mapp. 2019, 40, 538–551.
101. Sheehan, T.C.; Sreekumar, V.; Inati, S.K.; Zaghloul, K.A. Signal complexity of human intracranial EEG tracks successful associative-memory formation across individuals. J. Neurosci. 2018, 38, 1744–1755.
102. Waschke, L.; Wöstmann, M.; Obleser, J. States and traits of neural irregularity in the age-varying human brain. Sci. Rep. 2017, 7, 1.
103. Voytek, B.; Knight, R.T. Dynamic network communication as a unifying neural basis for cognition, development, aging, and disease. Biol. Psychiatry 2015, 77, 1089–1097.
104. Voytek, B.; Kramer, M.A.; Case, J.; Lepage, K.Q.; Tempesta, Z.R.; Knight, R.T.; Gazzaley, A. Age-related changes in 1/f neural electrophysiological noise. J. Neurosci. 2015, 35, 13257–13265.
105. Heisz, J.J.; Gould, M.; McIntosh, A.R. Age-related shift in neural complexity related to task performance and physical activity. J. Cogn. Neurosci. 2015, 27, 605–613.
106. Mizuno, T.; Takahashi, T.; Cho, R.Y.; Kikuchi, M.; Murata, T.; Takahashi, K.; Wada, Y. Assessment of EEG dynamical complexity in Alzheimer's disease using multiscale entropy. Clin. Neurophysiol. 2010, 121, 1438–1446.
107. Vakorin, V.A.; Lippé, S.; McIntosh, A.R. Variability of brain signals processed locally transforms into higher connectivity with brain development. J. Neurosci. 2011, 31, 6405–6413.
108. Wang, H.; McIntosh, A.R.; Kovacevic, N.; Karachalios, M.; Protzner, A.B. Age-related multiscale changes in brain signal variability in pre-task versus post-task resting-state EEG. J. Cogn. Neurosci. 2016, 28, 971–984.
109. McIntosh, A.R.; Vakorin, V.; Kovacevic, N.; Wang, H.; Diaconescu, A.; Protzner, A.B. Spatiotemporal dependency of age-related changes in brain signal variability. Cereb. Cortex 2014, 24, 1806–1817.
110. McDonough, I.M.; Nashiro, K. Network complexity as a measure of information processing across resting-state networks: Evidence from the Human Connectome Project. Front. Hum. Neurosci. 2014, 8, 409.
111. Tononi, G.; Koch, C. The neural correlates of consciousness: An update. Ann. N. Y. Acad. Sci. 2008, 1124, 239–261.
112. Cote, K.A. Probing awareness during sleep with the auditory odd-ball paradigm. Int. J. Psychophysiol. 2002, 46, 227–241.
113. Cote, K.A.; Etienne, L.; Campbell, K.B. Neurophysiological evidence for the detection of external stimuli during sleep. Sleep 2001, 24, 1–13.
114. Shi, W.; Shang, P.; Ma, Y.; Sun, S.; Yeh, C.H. A comparison study on stages of sleep: Quantifying multiscale complexity using higher moments on coarse-graining. Commun. Nonlinear Sci. Numer. Simul. 2016, 44, 292–303.
115. Olofsen, E.; Sleigh, J.W.; Dahan, A. Permutation entropy of the electroencephalogram: A measure of anaesthetic drug effect. Br. J. Anaesth. 2008, 101, 810–821.
116. Jordan, D.; Schneider, G.; Kochs, E.F. EEG permutation entropy separates consciousness from unconsciousness during anesthesia. Anesthesiology 2006, 105, A1551.
117. Jordan, D.; Stockmanns, G.; Kochs, E.F.; Schneider, G. Permutation entropy of the EEG indicates increase and decrease of the anesthetic level. Anesthesiology 2007, 101, A800.
118. Silva, A.; Cardoso-Cruz, H.; Silva, F.; Galhardo, V.; Antunes, L. Comparison of anesthetic depth indexes based on thalamocortical local field potentials in rats. Anesthesiology 2010, 112, 355–363.
119. Rainville, P.; Price, D.D. Hypnosis phenomenology and the neurobiology of consciousness. Int. J. Clin. Exp. Hypn. 2003, 51, 105–129.
120. Rainville, P.; Hofbauer, R.K.; Bushnell, M.C.; Duncan, G.H.; Price, D.D. Hypnosis modulates activity in brain structures involved in the regulation of consciousness. J. Cogn. Neurosci. 2002, 14, 887–901.
121. Fingelkurts, A.A.; Fingelkurts, A.A.; Kallio, S.; Revonsuo, A. Cortex functional connectivity as a neurophysiological correlate of hypnosis: An EEG study. Neuropsychologia 2007, 45, 1452–1462.
122. Terhune, D.B.; Cardeña, E.; Lindgren, M. Differential frontal-parietal phase synchrony during hypnosis as a function of hypnotic suggestibility. Psychophysiology 2011, 48, 1444–1447.
123. Cardeña, E.; Jonsson, P.; Terhune, D.B.; Marcusson-Clavertz, D. The neurophenomenology of neutral hypnosis. Cortex 2013, 49, 375–385.
124. Shaw, J.C. Correlation and coherence analysis of the EEG—A selective tutorial review. Int. J. Psychophysiol. 1984, 1, 255–266.
125. Nolte, G.; Bai, O.; Wheaton, L.; Mari, Z.; Vorbach, S.; Hallett, M. Identifying true brain interaction from EEG data using the imaginary part of coherency. Clin. Neurophysiol. 2004, 115, 2292–2307.
126. Deivanayagi, S.; Manivannan, M.; Fernandez, P. Spectral analysis of EEG signals during hypnosis. Int. J. Syst. Cybern. Inform. 2007, 4, 75–80.
127. Sabourin, M.E.; Cutcomb, S.D.; Crawford, H.J.; Pribram, K. EEG correlates of hypnotic susceptibility and hypnotic trance: Spectral analysis and coherence. Int. J. Psychophysiol. 1990, 10, 125–142.
128. Jamieson, G.A.; Burgess, A.P. Hypnotic induction is followed by state-like changes in the organization of EEG functional connectivity in the theta and beta frequency bands in high-hypnotically susceptible individuals. Front. Hum. Neurosci. 2014, 8, 528.
129. Keshmiri, S.; Alimardani, M.; Shiomi, M.; Sumioka, H.; Ishiguro, H.; Hiraki, K. Higher hypnotic suggestibility is associated with the lower EEG signal variability in theta, alpha, and beta frequency bands. PLoS ONE 2020, 15, e0230853.
130. Han, Y.; Kebschull, J.M.; Campbell, R.A.A.; Cowan, D.; Imhof, F.; Zador, A.M.; Mrsic-Flogel, T.D. The logic of single-cell projections from visual cortex. Nature 2018, 556, 51–56.
131. Burgess, A.P.; Gruzelier, J.H. Short duration power changes in the EEG during recognition memory for words and faces. Psychophysiology 2000, 37, 596–606.
132. Buzsaki, G. The hippocampo-neocortical dialogue. Cereb. Cortex 1996, 6, 81–92.
133. Tesche, C.D.; Karhu, J. Theta oscillations index human hippocampal activation during a working memory task. Proc. Natl. Acad. Sci. USA 2000, 97, 919–924.
134. Klimesch, W. EEG alpha and theta oscillations reflect cognitive and memory performance: A review and analysis. Brain Res. Brain Res. Rev. 1999, 29, 169–195.
135. McIntosh, A.R.; Kovacevic, N.; Itier, R.J. Increased brain signal variability accompanies lower behavioral variability in development. PLoS Comput. Biol. 2008, 4, e1000106.
136. Anokhin, A.P.; Lutzenberger, W.; Nikolaev, A.; Birbaumer, N. Complexity of electrocortical dynamics in children: Developmental aspects. Dev. Psychobiol. J. Int. Soc. Dev. Psychobiol. 2000, 36, 9–22.
137. Meyer-Lindenberg, A. The evolution of complexity in human brain development: An EEG study. Electroencephalogr. Clin. Neurophysiol. 1996, 99, 405–411.
138. Douglass, J.K.; Wilkens, L.; Pantazelou, E.; Moss, F. Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance. Nature 1993, 365, 337–340.
139. Destexhe, A.; Contreras, D. Neuronal computations with stochastic network states. Science 2006, 314, 85–90.
140. Traynelis, S.F.; Jaramillo, F. Getting the most out of noise in the central nervous system. Trends Neurosci. 1998, 21, 137–145.
141. Stacey, W.C.; Durand, D.M. Stochastic resonance improves signal detection in hippocampal CA1 neurons. J. Neurophysiol. 2000, 83, 1394–1402.
142. Manjarrez, E.; Rojas-Piloni, G.; Méndez, I.; Flores, A. Stochastic resonance within the somatosensory system: Effects of noise on evoked field potentials elicited by tactile stimuli. J. Neurosci. 2003, 23, 1997–2001.
143. McNamara, B.; Wiesenfeld, K. Theory of stochastic resonance. Phys. Rev. A 1989, 39, 4854–4869.
144. Ward, L.M.; Doesburg, S.M.; Kitajo, K.; MacLean, S.E.; Roggeveen, A.B. Neural synchrony in stochastic resonance, attention, and consciousness. Can. J. Exp. Psychol. 2006, 60, 319–326.
145. Fuchs, E.; Ayali, A.; Robinson, A.; Hulata, E.; Ben-Jacob, E. Coemergence of regularity and complexity during neural network development. Dev. Neurobiol. 2007, 67, 1802–1814.
146. Takahashi, T.; Cho, R.Y.; Murata, T.; Mizuno, T.; Kikuchi, M.; Mizukami, K.; Kosaka, H.; Takahashi, K.; Wada, Y. Age-related variation in EEG complexity to photic stimulation: A multiscale entropy analysis. Clin. Neurophysiol. 2009, 120, 476–483.
147. Kyriazis, M. Practical applications of chaos theory to the modulation of human ageing: Nature prefers chaos to regularity. Biogerontology 2003, 4, 75–90.
148. Pincus, S.M. Assessing serial irregularity and its implications for health. Ann. N. Y. Acad. Sci. 2001, 954, 245–267.
149. Lipsitz, L.A.; Goldberger, A.L. Loss of 'complexity' and aging: Potential applications of fractals and chaos theory to senescence. JAMA 1992, 267, 1806–1809.
150. Yang, A.C.; Huang, C.C.; Yeh, H.L.; Liu, M.E.; Hong, C.J.; Tu, P.C.; Chen, J.F.; Huang, N.E.; Peng, C.K.; Lin, C.P.; et al. Complexity of spontaneous BOLD activity in default mode network is correlated with cognitive function in normal male elderly: A multiscale entropy analysis. Neurobiol. Aging 2013, 34, 428–438.
151. Fox, M.D.; Snyder, A.Z.; Zacks, J.M.; Raichle, M.E. Coherent spontaneous activity accounts for trial-to-trial variability in human evoked brain responses. Nat. Neurosci. 2006, 9, 23–25.
152. Friston, K.J. Theoretical neurobiology and schizophrenia. Br. Med. Bull. 1996, 52, 644–655.
153. Garrett, D.D.; Kovacevic, N.; McIntosh, A.R.; Grady, C.L. Blood oxygen level-dependent signal variability is more than just noise. J. Neurosci. 2010, 30, 4914–4921.
154. Nir, Y.; Mukamel, R.; Dinstein, I.; Privman, E.; Harel, M.; Fisch, L.; Gelbard-Sagiv, H.; Kipervasser, S.; Andelman, F.; Neufeld, M.Y.; et al. Interhemispheric correlations of slow spontaneous neuronal fluctuations revealed in human sensory cortex. Nat. Neurosci. 2008, 11, 1100–1108.
155. Damoiseaux, J.S.; Beckmann, C.F.; Arigita, E.S.; Barkhof, F.; Scheltens, P.; Stam, C.J.; Smith, S.M.; Rombouts, S.A.R.B. Reduced resting-state brain activity in the "default network" in normal aging. Cereb. Cortex 2008, 18, 1856–1864.
156. Koch, W.; Teipel, S.; Mueller, S.; Buerger, K.; Bokde, A.L.; Hampel, H.; Coates, U.; Reiser, M.; Meindl, T. Effects of aging on default mode network activity in resting state fMRI: Does the method of analysis matter? Neuroimage 2010, 51, 280–287.
157. Achard, S.; Bullmore, E. Efficiency and cost of economical brain functional networks. PLoS Comput. Biol. 2007, 3, e17.
158. Meunier, D.; Achard, S.; Morcom, A.; Bullmore, E. Age-related changes in modular organization of human brain functional networks. Neuroimage 2009, 44, 715–723.
159. Dustman, R.E.; Shearer, D.E.; Emmerson, R.Y. EEG and event-related potentials in normal aging. Prog. Neurobiol. 1993, 41, 369–401.
160. Dustman, R.E.; Shearer, D.E.; Emmerson, R.Y. Life-span changes in EEG spectral amplitude, amplitude variability and mean frequency. Clin. Neurophysiol. 1999, 110, 1399–1409.
161. Duffy, F.H.; McAnulty, G.B.; Albert, M.S. Effects of age upon interhemispheric EEG coherence in normal adults. Neurobiol. Aging 1996, 17, 587–599.
162. Kikuchi, M.; Wada, Y.; Koshino, Y.; Nanbu, Y.; Hashimoto, T. Effect of normal aging upon interhemispheric EEG coherence: Analysis during rest and photic stimulation. Clin. Electroencephalogr. 2000, 31, 170–174.
163. Baltes, P.B.; Cornelius, S.W.; Spiro, A.; Nesselroade, J.R.; Willis, S.L. Integration versus differentiation of fluid/crystallized intelligence in old age. Dev. Psychol. 1980, 16, 625.
164. Baltes, P.B.; Lindenberger, U.; Staudinger, U.M. Life-span theory in developmental psychology. In Theoretical Models of Human Development, 5th ed.; Lerner, R.M., Ed.; Wiley: New York, NY, USA, 1998; pp. 1029–1143.
165. Birren, J.E. Age changes in speed of behavior: Its central nature and physiological correlates. Behav. Aging Nerv. Syst. 1965, 191–216.
166. Li, S.C.; Lindenberger, U.; Sikström, S. Aging cognition: From neuromodulation to representation. Trends Cogn. Sci. 2001, 5, 479–486.
167. Lindenberger, U.; Ghisletta, P. Cognitive and sensory declines in old age: Gauging the evidence for a common cause. Psychol. Aging 2009, 24, 1.
168. Tononi, G.; Edelman, G.M.; Libet, B.; Koch, C.; Gray, J. Consciousness and the integration of information in the brain. Discussion. Adv. Neurol. 1998, 77, 245–280.
169. Leopold, D.A.; Logothetis, N.K. Activity changes in early visual cortex reflect monkeys' percepts during binocular rivalry. Nature 1996, 379, 549–553.
170. Tononi, G.; Srinivasan, R.; Russell, D.P.; Edelman, G.M. Investigating neural correlates of conscious perception by frequency-tagged neuromagnetic responses. Proc. Natl. Acad. Sci. USA 1998, 95, 3198–3203.
171. Gazzaniga, M.S.; Ivry, R.B.; Mangun, G.R. Cognitive Neuroscience: The Biology of the Mind, 5th ed.; Norton: New York, NY, USA, 2019.
172. Beckmann, C.F.; Mackay, C.E.; Filippini, N.; Smith, S.M. Group comparison of resting-state FMRI data using multi-subject ICA and dual regression. Neuroimage 2009, 47, S148.
173. Filippini, N.; MacIntosh, B.J.; Hough, M.G.; Goodwin, G.M.; Frisoni, G.B.; Smith, S.M.; Matthews, P.M.; Beckmann, C.F.; Mackay, C.E. Distinct patterns of brain activity in young carriers of the APOE-ε4 allele. Proc. Natl. Acad. Sci. USA 2009, 106, 7209–7214.
174. Calhoun, V.D.; Adali, T. Multisubject independent component analysis of fMRI: A decade of intrinsic networks, default mode, and neurodiagnostic discovery. IEEE Rev. Biomed. Eng. 2012, 5, 60–73.
175. Baptista, M.S.; Kurths, J. Transmission of information in active networks. Phys. Rev. E 2008, 77, 026205.
176. Mesulam, M.M. From sensation to cognition. Brain J. Neurol. 1998, 121, 1013–1052.
177. Bassett, D.S.; Bullmore, E.D. Small-world brain networks. Neuroscientist 2006, 12, 512–523.
178. Buckner, R.L.; Sepulcre, J.; Talukdar, T.; Krienen, F.M.; Liu, H.; Hedden, T.; Andrews-Hanna, J.R.; Sperling, R.A.; Johnson, K.A. Cortical hubs revealed by intrinsic functional connectivity: Mapping, assessment of stability, and relation to Alzheimer's disease. J. Neurosci. 2009, 29, 1860–1873.
179. Buckner, R.L.; Andrews-Hanna, J.R.; Schacter, D.L. The brain's default network: Anatomy, function, and relevance to disease. Ann. N. Y. Acad. Sci. 2008, 1124, 1–38.
180. Li, Y.; Liu, Y.; Li, J.; Qin, W.; Li, K.; Yu, C.; Jiang, T. Brain anatomical network and intelligence. PLoS Comput. Biol. 2009, 5, e1000395.
181. Spreng, R.N.; Mar, R.A.; Kim, A.S. The common neural basis of autobiographical memory, prospection, navigation, theory of mind, and the default mode: A quantitative meta-analysis. J. Cogn. Neurosci. 2009, 21, 489–510.
182. Van Den Heuvel, M.P.; Stam, C.J.; Kahn, R.S.; Pol, H.E.H. Efficiency of functional brain networks and intellectual performance. J. Neurosci. 2009, 29, 7619–7624.
183. Hampson, M.; Driesen, N.R.; Skudlarski, P.; Gore, J.C.; Constable, R.T. Brain connectivity related to working memory performance. J. Neurosci. 2006, 26, 13338–13343.
184. Wang, D.J.; Jann, K.; Fan, C.; Qiao, Y.; Zang, Y.F.; Lu, H.; Yang, Y. Neurophysiological basis of multi-scale entropy of brain complexity and its relationship with functional connectivity. Front. Neurosci. 2018, 12, 352.
185. Liu, M.; Song, C.; Liang, Y.; Knöpfel, T.; Zhou, C. Assessing spatiotemporal variability of brain spontaneous activity by multiscale entropy and functional connectivity. Neuroimage 2019, 198, 198–220.
186. Larson-Prior, L.J.; Zempel, J.M.; Nolan, T.S.; Prior, F.W.; Snyder, A.Z.; Raichle, M.E. Cortical network functional connectivity in the descent to sleep. Proc. Natl. Acad. Sci. USA 2009, 106, 4489–4494.
187. Dennis, E.L.; Thompson, P.M. Functional brain connectivity using fMRI in aging and Alzheimer's disease. Neuropsychol. Rev. 2014, 24, 49–62.
188. Ferreira, L.K.; Busatto, G.F. Resting-state functional connectivity in normal brain aging. Neurosci. Biobehav. Rev. 2013, 37, 384–400.
189. Reid, A.T.; Headley, D.B.; Mill, R.D.; Sanchez-Romero, R.; Uddin, L.Q.; Marinazzo, D.; Lurie, D.J.; Valdés-Sosa, P.A.; Hanson, S.J.; Biswal, B.B.; et al. Advancing functional connectivity research from association to causation. Nat. Neurosci. 2019, 22, 1751–1760.
190. Horwitz, B. The elusive concept of brain connectivity. Neuroimage 2003, 19, 466–470.
191. Friston, K.J. Functional and effective connectivity: A review. Brain Connect. 2011, 1, 3–36.
192. Friston, K.; Moran, R.; Seth, A.K. Analysing connectivity with Granger causality and dynamic causal modelling. Curr. Opin. Neurobiol. 2013, 23, 172–178.
193. Seth, A.K.; Barrett, A.B.; Barnett, L. Granger causality analysis in neuroscience and neuroimaging. J. Neurosci. 2015, 35, 3293–3297.
194. Wilber, A.A.; Skelin, I.; Wu, W.; McNaughton, B.L. Laminar organization of encoding and memory reactivation in the parietal cortex. Neuron 2017, 95, 1406–1419.
195. Schreiber, T. Measuring information transfer. Phys. Rev. Lett. 2000, 85, 461–464.
196. Kaiser, A.; Schreiber, T. Information transfer in continuous processes. Phys. D Nonlinear Phenom. 2002, 166, 43–62.
197. Granger, C.W.J. Investigating causal relations by econometric models and cross-spectral methods. Econometrica 1969, 37, 424–438.
198. Geweke, J. Measures of conditional linear dependence and feedback between time series. J. Am. Stat. Assoc. 1984, 79, 907–915.
199. Barnett, L.; Barrett, A.B.; Seth, A.K. Granger causality and transfer entropy are equivalent for Gaussian variables. Phys. Rev. Lett. 2009, 103, 238701.
200. Tononi, G. An information integration theory of consciousness. BMC Neurosci. 2004, 5, 42.
201. Seth, A.K.; Izhikevich, E.; Reeke, G.N.; Edelman, G.M. Theories and measures of consciousness: An extended framework. Proc. Natl. Acad. Sci. USA 2006, 103, 10799–10804.
202. Barrett, A.B.; Seth, A.K. Practical measures of integrated information for time-series data. PLoS Comput. Biol. 2011, 7, e1001052.
203. Toker, D.; Sommer, F.T. Information integration in large brain networks. PLoS Comput. Biol. 2019, 15, e1006807.
204. Vyšata, O.; Schätz, M.; Kopal, J.; Burian, J.; Procházka, A.; Jiri, K.; Hort, J.; Vališ, M. Non-linear EEG measures in meditation. J. Biomed. Sci. Eng. 2014, 7, 731–738.
205. Sumioka, H.; Keshmiri, S.; Ishiguro, H. Information-theoretic investigation of impact of huggable communication medium on prefrontal brain activation. Adv. Robot. 2019, 33, 1019–1029.
206. MacDuffie, K.E.; Knodt, A.R.; Radtke, S.R.; Strauman, T.J.; Hariri, A.R. Self-rated amygdala activity: An auto-biological index of affective distress. Personal. Neurosci. 2019, 2, e1.
207. Kakumanu, R.J.; Nair, A.K.; Venugopal, R.; Sasidharan, A.; Ghosh, P.K.; John, J.P.; Mehrotra, S.; Panth, R.; Kutty, B.M. Dissociating meditation proficiency and experience dependent EEG changes during traditional Vipassana meditation practice. Biol. Psychol. 2018, 135, 65–75.
208. Keshmiri, S.; Sumioka, H.; Yamazaki, R.; Ishiguro, H. Multiscale Entropy Quantifies the Differential Effect of the Medium Embodiment on Older Adults Prefrontal Cortex during the Story Comprehension: A Comparative Analysis. Entropy 2019, 21, 199.
209. Keshmiri, S.; Sumioka, H.; Yamazaki, R.; Ishiguro, H. Differential Effect of the Physical Embodiment on the Prefrontal Cortex Activity as Quantified by Its Entropy. Entropy 2019, 21, 875.
210. Henschel, A.; Hortensius, R.; Cross, E.S. Social Cognition in the Age of Human-Robot Interaction. Trends Neurosci. 2020, 43, 373–384.
211. Rauchbauer, B.; Nazarian, B.; Bourhis, M.; Ochs, M.; Prévot, L.; Chaminade, T. Brain activity during reciprocal social interaction investigated using conversational robots as control condition. Philos. Trans. R. Soc. B 2019, 374, 20180033.
212. Klapper, A.; Ramsey, R.; Wigboldus, D.; Cross, E.S. The control of automatic imitation based on bottom-up and top-down cues to animacy: Insights from brain and behavior. J. Cogn. Neurosci. 2014, 26, 2503–2513.
213. Price, C.J. A review and synthesis of the first 20 years of PET and fMRI studies of heard speech, spoken language and reading. Neuroimage 2012, 62, 816–847.
214. Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2002, 88, 174102.
215. Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Heart Circ. Physiol. 2000, 278, H2039–H2049.
216. Shannon, C.E. A Mathematical Theory of Communication. ACM SIGMOBILE Mob. Comput. Commun. Rev. 2001, 5, 3–55.
217. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620.
218. Jaynes, E.T. Information theory and statistical mechanics. II. Phys. Rev. 1957, 108, 171.
219. Dayan, P.; Abbott, L.F. Information Theory. In Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems; MIT Press: Cambridge, MA, USA, 2001; pp. 129–135.
220. Solomonoff, R.J. A formal theory of inductive inference. Part I. Inf. Control 1964, 7, 1–22.
221. Kolmogorov, A.N. Combinatorial foundations of information theory and the calculus of probabilities. Russ. Math. Surv. 1983, 38, 29–40.
222. Chaitin, G.J. Information, Randomness & Incompleteness: Papers on Algorithmic Information Theory; World Scientific: Singapore, 1990.
223. Gleick, J. The Information: A History, a Theory, a Flood; Harper Collins Publishers: London, UK, 2011.
224. Sokunbi, M.O. Sample entropy reveals high discriminative power between young and elderly adults in short fMRI data sets. Front. Neuroinform. 2014, 8, 69.
225. Courtiol, J.; Perdikis, D.; Petkoski, S.; Müller, V.; Huys, R.; Sleimen-Malkoun, R.; Jirsa, V.K. The multiscale entropy: Guidelines for use and interpretation in brain signal analysis. J. Neurosci. Methods 2016, 273, 175–190.
226. Little, D.J.; Kane, D.M. Permutation entropy of finite-length white-noise time series. Phys. Rev. E 2016, 94, 022118.
227. Cuesta-Frau, D.; Murillo-Escobar, J.P.; Orrego, D.A.; Delgado-Trejos, E. Embedded Dimension and Time Series Length. Practical Influence on Permutation Entropy and Its Applications. Entropy 2019, 21, 385.
228. Stone, J.V. Information Theory: A Tutorial Introduction; Sebtel Press: Sheffield, UK, 2015.
229. Victor, J.D. Binless strategies for estimation of information from neural data. Phys. Rev. E 2002, 66, 051903.
230. Charzyńska, A.; Gambin, A. Improvement of the k-NN entropy estimator with applications in systems biology. Entropy 2016, 18, 13.
231. Kraskov, A.; Stögbauer, H.; Grassberger, P. Estimating mutual information. Phys. Rev. E 2004, 69, 066138.
232. Gilden, D.L.; Thornton, T.; Mallon, M.W. 1/f noise in human cognition. Science 1995, 267, 1837–1839.
233. Munk, M.H.; Roelfsema, P.R.; König, P.; Engel, A.K.; Singer, W. Role of reticular activation in the modulation of intracortical synchronization. Science 1996, 272, 271–274.
