Article

Implications of Noise on Neural Correlates of Consciousness: A Computational Analysis of Stochastic Systems of Mutually Connected Processes

Academy of Integrated Science, Division of Systems Biology, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061, USA
Submission received: 1 April 2021 / Revised: 26 April 2021 / Accepted: 28 April 2021 / Published: 8 May 2021
(This article belongs to the Special Issue Integrated Information Theory and Consciousness)

Abstract

Random fluctuations in neuronal processes may contribute to variability in perception and increase the information capacity of neuronal networks. Various sources of random processes have been characterized in the nervous system at different levels. However, in the context of neural correlates of consciousness, the robustness of mechanisms of conscious perception against inherent noise in neural dynamical systems is poorly understood. In this paper, a stochastic model is developed to study the implications of noise on dynamical systems that mimic neural correlates of consciousness. We computed power spectral densities and spectral entropy values for dynamical systems that contain a number of mutually connected processes. Interestingly, we found that spectral entropy decreases linearly with each doubling of the number of processes in the system. Further, power spectral density frequencies shift to higher values as system size increases, revealing an increasing impact of negative feedback loops and regulation on the dynamics of larger systems. Overall, our stochastic modeling and analysis results reveal that large dynamical systems of mutually connected and negatively regulated processes are more robust against inherent noise than small systems.

1. Introduction

Noise is ubiquitous in neuronal circuits [1,2]. Hence, neuronal networks may produce highly variable responses to the repeated presentation of the same stimulus [3,4]. The dominant sources of noise in neuronal systems include voltage-gated channel noise [5,6] and synaptic noise [7,8], as well as sensory-motor sources of noise [2]. Several studies have examined the implications of noise on membrane potential [1], propagation of action potential [9], and spike train coding [10].
In bistable and threshold-like systems, noise can significantly affect the processing of sub-threshold periodic signals, which are more likely to cross the threshold in the presence of noise. Therefore, the propagation of weak periodic signals can be enhanced by a certain level of noise; this phenomenon is called stochastic resonance. Noise-induced transmission of information has been observed in cat visual neurons [11], rat cutaneous mechanoreceptors [12], and crayfish mechanoreceptors [13]. Several theoretical studies of information processing in threshold-like neuronal systems show that additive noise can increase the mutual information of threshold neurons [14,15,16]. Stochastic resonance can also modulate behavioral responses. For example, noise has been shown to improve human motor performance and balance control [17] and enhance the normal feeding behavior of river paddlefish [18]. Therefore, noise can influence perception and behavior by altering information transmission and processing in nervous systems.
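The threshold mechanism behind stochastic resonance can be made concrete with a few lines of code. The following Python sketch is our own illustration, not part of the original study; the threshold value, signal amplitude, and noise levels are arbitrary assumptions chosen only to show that a sub-threshold sine wave produces threshold crossings only when noise is present:

import numpy as np

# A hypothetical threshold detector: the sub-threshold periodic signal alone
# never crosses the threshold, but moderate added noise produces crossings
# clustered near the signal peaks, carrying the signal's timing information.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 10000)
threshold = 1.0
signal = 0.8 * np.sin(2 * np.pi * t)  # sub-threshold periodic signal

def count_crossings(x, threshold):
    # Count upward crossings of the threshold.
    above = x >= threshold
    return int(np.sum(~above[:-1] & above[1:]))

for sigma in [0.0, 0.3, 3.0]:  # no noise, moderate noise, excessive noise
    noisy = signal + sigma * rng.normal(size=t.size)
    print(f"noise sigma = {sigma}: {count_crossings(noisy, threshold)} crossings")

With sigma = 0 there are no crossings; moderate noise yields crossings timed to the signal peaks (information transfer); very large noise produces crossings everywhere, swamping the signal.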
Overall, noise is a key component of the sensorimotor loop and, thus, may have direct behavioral consequences [2]. Noise may also increase the information capacity of neuronal networks and maximize mutual information [16]. Average mutual information across all bipartitions of a neural network is used as a metric for neural complexity in the information integration theory of consciousness [19,20]. Thus, noise can be directly linked with conscious perception. Undoubtedly, it is also important to consider noise in studies of neural correlates of consciousness, which pave the way for understanding neural mechanisms of perception and consciousness [21,22,23,24]. Neural correlates of consciousness are identified as the minimal set of neuronal events sufficient for any one specific conscious percept [23]. Noise can have a substantial impact on system dynamics and must therefore be considered when this minimal set of neuronal events is defined.
In this work, we study the implications of stochastic noise on systems of processes exhibiting dynamics isomorphic with a specific conscious percept [25,26]. During the past few decades, several comprehensive theoretical models of consciousness have been developed based on dynamical and operational system frameworks [27,28,29,30], information theory [19,31], temporo-spatial theory [32,33], and several other mathematical and physical frameworks [34]. It is widely accepted that consciousness is a dynamic process that requires the proper execution of physical neuronal operations [19]. Conscious states are observed only when the neuronal processes are correctly executed and running and the connections between different parts of neuronal networks (brain areas) are largely maintained and functional [35]. It has also been proposed that a specific conscious percept can be associated with a particular functional organization of a neuronal system [36], or that the operations of transient neuronal assemblies must be functionally isomorphic with phenomenal qualities [28]. Our dynamic model exhibits this important property.
To study the implications of noise on neural correlates of consciousness, we developed a stochastic model based on our previous dynamical system framework for a conscious percept [25,26]. The main purpose of this work is to investigate how noise affects dynamical systems of different sizes. We computed the spectral entropy for different systems and determined the dependence of entropy on system size. Overall, we believe that our stochastic model can help us better understand the impact of noise on neural correlates of consciousness. Understanding the impact of noise on perception can help us deduce processes related to conscious perception, apply this knowledge to modulate behavioral responses [17,18], and design computer vision systems [37,38].

2. Materials and Methods

To generate stochastic trajectories for processes, we used Gillespie’s stochastic simulation algorithm. This method is often applied to handle stochastic fluctuations in chemical reaction systems [39,40,41]. In Gillespie’s formalism, the kinetic terms describing the rates of biochemical reactions are treated as stochastic propensities for a reaction event to occur over a small interval of time. The algorithm is most accurate for elementary reaction mechanisms expressed in terms of mass-action rate laws [39]. However, it has also been used to handle stochastic fluctuations in chemical reaction systems described by other phenomenological rate laws (such as Hill functions) that are commonly used in deterministic models of biochemical systems [40]. Overall, Gillespie’s scheme is often considered a convenient way to turn a deterministic simulation into a stochastic realization by reinterpreting kinetic rate laws as reaction propensities. Gillespie’s stochastic simulation algorithm has also been applied to simulate reaction-diffusion systems [42], predator-prey cycles [43], and the evolutionary dynamics of a multi-species population [44]. Here, we use Gillespie’s scheme to introduce randomness into the system of processes that we previously described by deterministic differential equations [25].
The deterministic Systems (2) and (3) were converted into stochastic models using the Gillespie method, where the right-hand-side terms of the ordinary differential equations define propensity functions that determine the probability for a process $p_i$ to increase, decrease, or transition to a process $z_i$. The algorithm can be described by the following general Gillespie scheme [39] (a minimal Python sketch is given after the list):
  1. Initialize the process state vector, $P$, and set the initial time to 0.
  2. Calculate the propensities, $a_k(P)$.
  3. Generate a uniform random number, $r_1$.
  4. Compute the time for the next event, $\tau = \frac{1}{\sum_k a_k(P)} \ln\left(\frac{1}{r_1}\right)$.
  5. Generate a uniform random number, $r_2$.
  6. Find which event occurs next: $I = i$, if $\frac{\sum_{k=1}^{i-1} a_k(P)}{\sum_k a_k(P)} \le r_2 < \frac{\sum_{k=1}^{i} a_k(P)}{\sum_k a_k(P)}$.
  7. Update the state vector, $P \to P + y_I$, where $y_I$ is the state change produced by event $I$.
  8. Update the time, $t \to t + \tau$.
  9. Repeat steps (2)–(8).
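The scheme above can be implemented in a few lines of general-purpose code. The following minimal Python sketch is our own illustration, not code from this study; the single-species birth-death propensities are arbitrary placeholders used only to show the event-selection and time-update steps:

import numpy as np

rng = np.random.default_rng(1)

# Placeholder example: a birth-death process for one species.
# Event 0: P -> P + 1 (birth); event 1: P -> P - 1 (death).
state_changes = np.array([+1, -1])

def propensities(P, birth_rate=10.0, death_rate=0.1):
    # Step 2: propensities a_k(P) for each event.
    return np.array([birth_rate, death_rate * P])

P, t, t_end = 50, 0.0, 100.0
trajectory = [(t, P)]
while t < t_end:
    a = propensities(P)
    a_total = a.sum()
    if a_total == 0:
        break  # no event can occur
    tau = -np.log(rng.random()) / a_total  # step 4: time to the next event
    # Steps 5-6: pick event I with probability a_I / a_total.
    i = np.searchsorted(np.cumsum(a) / a_total, rng.random(), side="right")
    P += state_changes[i]  # step 7: update the state vector
    t += tau               # step 8: update the time
    trajectory.append((t, P))

Note that tau = -ln(r)/a_total is equivalent to the form in step 4, since -ln(r) = ln(1/r).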
XPP/XPPAUT software (http://www.math.pitt.edu/~bard/xpp/xpp.html, accessed on 24 March 2021) was used to solve systems of ordinary differential equations, compute one-parameter bifurcation diagrams (Figure 3A), and simulate stochastic models. XPPAUT codes that were used to simulate results in Figures 1, 3A and 4A,D,G are provided in Appendix A.
The eigenvalues shown in Figure 3B were computed using Wolfram Mathematica software.
To implement Gillespie’s stochastic simulation algorithm in the XPPAUT codes (provided in Appendix A), we first computed the cumulative sums of event propensities, $x_i = \sum_{k=1}^{i} a_k(P)$, for all values of $i$; we then found the next event (steps 5 and 6), updated the time (steps 4 and 8), and updated the states of the processes $P$ (step 7).
We used spectrum analysis and spectral entropy to quantify noise effects in the system of processes. These techniques are common tools that are often applied to analyze data obtained in neurophysiological studies [45,46,47,48]. They are also commonly applied in signal processing, control systems engineering, and fault diagnosis. For example, spectrum techniques are often used to monitor the dynamics of complex machines and to diagnose their faults [49,50]. Spectral entropy in our work is based on Shannon’s entropy formalism, which is a foundational concept of information theory [51]. The entropy metric is also an essential part of the information integration theory of consciousness [19,31]; in particular, it is used to quantify the integrated information of a system composed of interacting elements. We used spectral entropy to quantify the effects of noise on the system of interacting processes and to determine how the impact of noise changes when the system size increases.
The fast Fourier transform (FFT) of each process $p(t)$ was computed using the Fourier Analysis function in Excel’s Analysis ToolPak. A total of 1024 points was used to obtain the signal $p(f)$ in the frequency domain; these points corresponded to a total simulation time of ~370 arb. u., with the period of oscillations ranging between ~5 and 10 arb. u. (see Figure 4C,F,I). The sampling frequency, $f$, was computed by dividing the number of points by the time interval, $\Delta t$. The frequency magnitudes were calculated using Excel’s IMABS function. The power spectral density was defined as $PSD_i = |p(f_i)|^2/(2\Delta f)$. We used 512 data points to compute spectral densities. The spectral entropy was computed using the following equation:
$$S = -k \sum_{i=1}^{512} \widehat{PSD}_i \log_2\left(\widehat{PSD}_i\right), \quad (1)$$
where $k = 1/\log_2(512) \approx 0.1$ and $\widehat{PSD}$ is the normalized power spectral density, which was computed by dividing the power spectral density by the total power [52].
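For readers who do not use Excel, the same pipeline (FFT, one-sided power spectral density, normalization by total power, and Equation (1)) can be reproduced as follows. This Python sketch is our own illustration; it assumes a uniformly sampled trajectory, and the mean removal is an assumption we add to suppress the DC component:

import numpy as np

def spectral_entropy(p, dt):
    # FFT of the trajectory; keep the one-sided spectrum (n/2 = 512 points
    # for n = 1024 samples, as in the text).
    p = np.asarray(p, dtype=float) - np.mean(p)  # remove the DC component
    n = len(p)
    spectrum = np.fft.fft(p)[: n // 2]
    df = 1.0 / (n * dt)  # frequency resolution
    psd = np.abs(spectrum) ** 2 / (2 * df)  # PSD_i = |p(f_i)|^2 / (2*df)
    psd_hat = psd / psd.sum()  # normalize by the total power
    psd_hat = psd_hat[psd_hat > 0]  # avoid log2(0)
    k = 1.0 / np.log2(n // 2)  # k = 1/log2(512) for n = 1024
    return -k * np.sum(psd_hat * np.log2(psd_hat))

# White noise has a nearly flat spectrum (entropy close to 1), whereas a
# periodic signal concentrates its power near one frequency (low entropy).
rng = np.random.default_rng(2)
t = np.arange(1024) * 0.36  # ~370 arb. u. of simulated time
print(spectral_entropy(rng.normal(size=1024), dt=0.36))
print(spectral_entropy(np.sin(2 * np.pi * t / 7.0), dt=0.36))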

3. Results

We considered a dynamical model that describes mutually connected processes. In this model, the processes maintain specific dynamic relationships, which are designed to be isomorphic with a hypothetical conscious percept to mimic a mechanism for neural correlates of consciousness [26]. The specific conscious percept was represented as a specific function (property) that was performed and maintained by the dynamical system. This is in line with William James’s interpretation of consciousness as a dynamical process, not a capacity, memory, or information [53].
A nonlinear system of two processes and linear systems of four, eight, and sixteen processes were used to mimic specific conscious percepts exhibited by these dynamical systems. The former system was used to illustrate the concept that a system can execute and maintain a specific relationship between processes, $p_1 = f(p_2)$ and $p_2 = f(p_1)$, where $f(\cdot)$ is a nonlinear function. The latter systems were used to investigate the effect of system size. In the linear systems of mutually connected processes, each process could be expressed through all other processes as $P = AP$, where $P$ is a vector of processes and $A$ is a hollow matrix (a matrix whose diagonal entries are all zero). The corresponding stochastic models were used to investigate the effects of noise on these systems.
First, we consider a system of two nonlinear differential equations:
$$\frac{dp_1}{dt} = -p_2 + p_1\left(R^2 - p_1^2 - p_2^2\right), \qquad \frac{dp_2}{dt} = p_1 + p_2\left(R^2 - p_1^2 - p_2^2\right). \quad (2)$$
System (2) has a periodic solution, $p_1 = R\cos(t)$, $p_2 = R\sin(t)$, shown in Figure 1A. The solution can also be represented by a limit cycle in the $p_1p_2$-phase plane (see Figure 1B). Thus, the relationship between the two processes is defined by the limit cycle, which is maintained in time. The parameter $R$ defines the radius of the limit cycle. For System (2), the specific dynamic relationship between $p_1$ and $p_2$ is isomorphic with a circular motion. This relationship is itself a part of the dynamical system. We assume that the specific conscious percept is represented by the dynamical property that emerges in the dynamics of the system, which, in this case, is isomorphic with a circular motion.
We introduced noise into System (2) by applying a stochastic formulation. Here, we used the Gillespie algorithm described in the Materials and Methods section, which allowed us to describe the evolution of processes using propensities derived from the rates that govern the dynamics of System (2). The simulated stochastic trajectories for the $p_1$ and $p_2$ processes are shown in Figure 1C, and the corresponding limit cycle in the $p_1p_2$-phase plane is shown in Figure 1D. From the stochastic trajectories, we computed the power spectral densities (see Figure 1E) as well as the spectral entropy using Equation (1), as described in the Materials and Methods section. For the stochastic version of the nonlinear System (2), we found that the spectral entropy was ~0.5 for both the $p_1$ and $p_2$ processes.
Nonlinear relationships among processes are expected for any nonlinear system, and a system consisting of more than two processes can represent a challenge for mathematical analysis. Thus, to investigate how system size alters the impact of stochastic noise, we used a scalable linear system of interacting oscillating processes: a system of coupled oscillators described by a set of linear differential equations [25]. This system of coupled oscillators represents a set of interacting processes with the following two properties: (i) each process in the set can be represented by a linear combination of all other processes in the set, and (ii) the relationships among the processes are isomorphic to a distance matrix. We then developed a stochastic model describing this system to study the implications of noise on a system of mutually connected processes. We consider two sets of $n$ processes, $P = (p_1, p_2, \ldots, p_n)$ and $Z = (z_1, z_2, \ldots, z_n)$, which are described by the following system of equations:
$$\frac{dP}{dt} = AP - (Z + P), \qquad \frac{dZ}{dt} = P. \quad (3)$$
The deterministic System (3) was extensively analyzed in Ref. [25]; in this paper, our goal is to analyze the stochastic version of System (3). System (3) has oscillatory solutions such that $P = AP$, where $A$ is a hollow distance matrix [25]. Matrix $A$ defines how each process $p_i$ in the system is connected to all other processes, and we considered the following relationship between processes:
$$p_i = \sum_{j=1}^{n} (i - j)^2\, \varepsilon\, p_j \quad (4)$$
and thus
$$A = \begin{pmatrix} 0 & \varepsilon & 4\varepsilon & \cdots & (n-1)^2\varepsilon \\ \varepsilon & 0 & \varepsilon & \cdots & (n-2)^2\varepsilon \\ 4\varepsilon & \varepsilon & 0 & \cdots & (n-3)^2\varepsilon \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ (n-1)^2\varepsilon & (n-2)^2\varepsilon & (n-3)^2\varepsilon & \cdots & 0 \end{pmatrix}.$$
Figure 2 shows a wiring diagram presenting the relationships among processes described by System (3).
System (3) has one parameter, ε, which is considered the bifurcation parameter for the system. A typical one-parameter bifurcation diagram for System (3) is shown in Figure 3A. The Hopf bifurcation value of ε depends on the number of processes in the system. For systems consisting of 4, 6, …, 20 processes, the Hopf bifurcations occur at ε = ±1, ±1/4, ±1/10, ±1/20, ±1/35, ±1/56, ±1/84, ±1/120, and ±1/165, respectively. For a system of two $p_i$ and two $z_i$ processes, the stability of the steady states of System (3) is described by four eigenvalues. The real and imaginary parts of these eigenvalues, as functions of the parameter ε, are shown in Figure 3B. The real parts of all eigenvalues are negative for −1 < ε < 1, indicating a spiral sink in this range of ε. For |ε| > 1, a spiral source is observed. For ε = ±1, the system exhibits oscillations with a constant amplitude.
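The eigenvalues in Figure 3B were computed with Mathematica; as an independent cross-check, the Jacobian of System (3) can also be assembled and diagonalized numerically. The following Python sketch is our own illustration, assuming the state ordering $(p_1, \ldots, p_n, z_1, \ldots, z_n)$:

import numpy as np

def jacobian(n, eps):
    # Jacobian of System (3): dP/dt = A*P - (Z + P), dZ/dt = P,
    # with the hollow distance matrix A_ij = (i - j)^2 * eps.
    i, j = np.indices((n, n))
    A = (i - j) ** 2 * eps
    I = np.eye(n)
    top = np.hstack([A - I, -I])               # derivatives of dP/dt
    bottom = np.hstack([I, np.zeros((n, n))])  # derivatives of dZ/dt
    return np.vstack([top, bottom])

# For two p_i and two z_i processes, the real parts of all four eigenvalues
# are negative for -1 < eps < 1 (spiral sink) and cross zero at eps = +/-1
# (Hopf bifurcation), in agreement with Figure 3B.
for eps in [-1.5, -1.0, -0.5, 0.5, 1.0, 1.5]:
    eigs = np.linalg.eigvals(jacobian(2, eps))
    print(f"eps = {eps:+.1f}: max Re(lambda) = {eigs.real.max():+.3f}")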
Noise was introduced into System (3) by applying the Gillespie stochastic formulation described in the Materials and Methods section. The stochastic model was then used to investigate how noise affects dynamical systems consisting of different numbers of processes. We performed simulations for systems of four ($p_1$, $p_2$, $z_1$, $z_2$), eight ($p_1$,…,$p_4$, $z_1$,…,$z_4$), and sixteen ($p_1$,…,$p_8$, $z_1$,…,$z_8$) processes that interact as shown in Figure 2. Stochastic trajectories for the processes $p_i$ in the three systems consisting of two, four, and eight $p_i$ processes are shown in Figure 4A,D,G, respectively. Note that the number of $z_i$ processes in the system is always the same as the number of $p_i$ processes. However, we only describe the dynamics of the $p_i$ processes because their dynamics exhibit the property that each process $p_i$ has a specific relationship with all other processes (see Equation (4)), and this relationship is isomorphic with the distance matrix. Figure 4 also shows the distribution functions (see Figure 4B,E,H) and the normalized power spectral densities (see Figure 4C,F,I) computed using the stochastic trajectories for the selected process $p_2$. The method used to compute and normalize power spectral densities is described in the Materials and Methods section. For the other processes in the system, the distribution functions and power spectral density plots look nearly identical to those shown for process $p_2$ in Figure 4.
To quantify the implications of noise on systems of different sizes, we computed spectral entropy values for all processes in the systems using Equation (1). The results are summarized in Table 1. We observed that the spectral entropy values decrease as the number of processes in the system increases. Figure 5 shows the average spectral entropy as a function of system size, represented by the number of processes $p_i$ constituting the system. Interestingly, the spectral entropy decreases linearly with each doubling of the system size. This indicates that a larger system is more robust against the influence of inherent fluctuations than smaller systems.
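The linear trend in Figure 5 can be checked directly from the average entropy values in Table 1. The following Python sketch is our own illustration; it fits the averages for the systems of two, four, and eight $p_i$ processes against log2 of the system size:

import numpy as np

# Average spectral entropy values from Table 1.
n_processes = np.array([2, 4, 8])
avg_entropy = np.array([0.5714, 0.538, 0.48])

# "Decreases linearly with each doubling" means linear in log2(n).
slope, intercept = np.polyfit(np.log2(n_processes), avg_entropy, deg=1)
print(f"entropy ~ {slope:.4f} * log2(n) + {intercept:.4f}")
# The negative slope is the entropy drop per doubling of the system size.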

4. Discussion

In this paper, we studied how inherent noise impacts dynamical systems that mimic a mechanism for neural correlates of consciousness. Our modeled system exhibits dynamic behavior that is isomorphic with a specific conscious percept [25,26]. Details on how phenomenal conscious states are assumed to arise in dynamical systems are provided in our previous works [25,26]. The neural correlates of consciousness are defined as a minimal mechanism sufficient for any one specific conscious percept. Here, our analysis concentrates on the implications of noise for mechanisms that are scaled to different sizes. To study and characterize noise effects on the dynamic behavior of the mechanism, we developed a stochastic version of our deterministic model described in Ref. [25].
The main finding of our study is that larger systems of mutually connected and negatively regulated processes are more robust against the influence of inherent fluctuations. We found that the spectral entropy of the system decreases linearly with each doubling of system size (see Figure 5 and Table 1). In addition, we found that the frequency domain in which the power spectral density values are significant shrinks as system size increases (see Figure 4C,F,I). This indicates that the impact of noise becomes more restricted as the number of regulatory feedback loops (see Figure 2) in the mechanism increases. Our results agree with several other experimental and computational studies of noise in biological regulatory circuits, which revealed that negative feedback loops suppress the effects of noise on the dynamic behavior of the circuits [54,55].
Comparing the power spectral densities shown in Figure 4C,F,I, we also observed that the dominant frequencies shift from low to high values as system size increases. This agrees with independent studies of noise effects in gene regulatory networks, where it has been shown that negative feedback loops shift noise frequency from low to high values compared with non-regulated circuits [56,57]. Therefore, we conclude that the shift to higher frequencies occurs due to the stronger influence of negative feedback loops in the larger systems analyzed in this work. This indicates that network wiring and architecture can influence the noise spectra. Remarkably, this noise suppression strategy in biological systems differs from the standard frequency-domain techniques commonly used in control systems engineering, electronics, and signal processing [58,59].
Interestingly, the spectral entropy value obtained for the small nonlinear system was smaller than the spectral entropy values for the larger linear systems shown in Table 1. This may indicate that nonlinear systems can control and suppress noise more effectively than linear systems. However, the systems described by Equations (2) and (3) are very different and cannot be used for a conclusive comparison.
Power spectral density and spectral entropy are common tools that are often used to characterize electroencephalography and magnetic resonance encephalography recordings. For example, spectral analysis of electroencephalograms is used to study the neurophysiology of sleep [45], to detect differences in the brain activities of subjects under normal and hypnosis conditions [46], and to distinguish healthy subjects from schizophrenic patients [47]. Electroencephalography and magnetic resonance encephalography recordings in patients with drug-resistant epilepsy reveal altered spectral entropy of electrophysiological and hemodynamic signals [48]. Spectral entropy of electroencephalographic signals can also be used to predict changes in memory performance [52]. We used these spectral analysis tools to characterize the possible impacts of noise on signals generated by dynamical systems that are isomorphic with hypothetical specific conscious percepts.
Overall, our study showed that negative feedback loops in dynamical systems can suppress noise and shift it to higher frequencies. This property can be relevant for neuronal dynamical systems that involve negative feedback loops: higher noise frequencies in a neuronal network can be more easily filtered out by other parts of a system composed of several connected subnetworks. We also conclude that understanding the contribution of noise to the dynamics of neural systems is important for successfully determining a minimal mechanism sufficient for any one specific conscious percept. Our study and analysis of a simple dynamical model that mimics a mechanism for neural correlates of consciousness revealed the impact of inherent fluctuations on the system and the influence of system size and architecture on the noise spectra.

Funding

This research received no external funding.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

The XPPAUT code A was used to simulate results in Figure 1A,B.
# code A
par R=1
init p1=-0.1412 p2=-0.2316
 
p1'=-p2+p1*(R^2-p1^2-p2^2)
p2'=p1+p2*(R^2-p1^2-p2^2)
 
@ dt=.025, total=40, xplot=t,yplot=p1
@ xmin=0,xmax=30,ymin=-1,ymax=1
done
The XPPAUT code B was used to simulate results in Figure 1C,D.
#code B
 
par R=100
init p1=-14.12,p2=-23.16
 
# compute the sum of all event propensities
x1=abs(p2)
x2=x1+abs((R^2-p1*p1-p2*p2)*p1)
x3=x2+abs(p1)
x4=x3+abs((R^2-p1*p1-p2*p2)*p2)
 
# choose random event
s2=ran(1)*x4
y1=(s2<x1)
y2=(s2<x2)&(s2>=x1)
y3=(s2<x3)&(s2>=x2)
y4=(s2>=x3)
 
# time for next event
tr'=tr-log(ran(1))/x4

p1'=p1-sign(p2)*y1+sign((R^2-p1*p1-p2*p2)*p1)*y2
p2'=p2+sign(p1)*y3+sign((R^2-p1*p1-p2*p2)*p2)*y4
 
@ bound=10000000000,meth=discrete,total=1000000,njmp=1000
@ xp=tr,yp=p1
@ xlo=0,ylo=-100,xhi=30,yhi=100
 
done
The XPPAUT code C was used to simulate the one-parameter bifurcation diagram in Figure 3A.
# code C
init p1=1 p2=0 z1=0 z2=0
par eps=-1.0
p1'=eps*p2-z1-p1
z1'=p1
p2'=eps*p1-z2-p2
z2'=p2
@ dt=.025, total=40, xplot=t,yplot=p1
@ xmin=0,xmax=40,ymin=-1,ymax=1
done
The XPPAUT code D was used to simulate the results in Figure 4A.
# code D
 
par eps=-1
init p1=1000,p2=0, z1=0, z2=0
 
# compute the sum of all event propensities
x1=abs(eps*p2)
x2=x1+abs(eps*p1)
x3=x2+abs(z1)
x4=x3+abs(z2)
x5=x4+abs(p1)
x6=x5+abs(p2)
 
# choose random event#
s2=ran(1)*x6
y1=(s2<x1)
y2=(s2<x2)&(s2>=x1)
y3=(s2<x3)&(s2>=x2)
y4=(s2<x4)&(s2>=x3)
y5=(s2<x5)&(s2>=x4)
y6=(s2>=x5)
 
# time for the next event
tr'=tr-log(ran(1))/x6

p1'=p1+sign(eps)*sign(p2)*y1-sign(z1)*y3-sign(p1)*y5
p2'=p2+sign(eps)*sign(p1)*y2-sign(z2)*y4-sign(p2)*y6
z1'=z1+sign(p1)*y5
z2'=z2+sign(p2)*y6
 
@ bound=100000000,meth=discrete,total=1000000,njmp=1000
@ xp=tr,yp=p1
@ xlo=0,ylo=-1000,xhi=40,yhi=1000
 
done
The XPPAUT code E was used to simulate the results in Figure 4D.
# code E
 
par eps=-0.1
init p1=1000,p2=0, p3=0, p4=0, z1=0, z2=0, z3=0, z4=0
 
# compute the sum of all event propensities
x11=abs(eps*p2)
x12=x11+abs(4*eps*p3)
x13=x12+abs(9*eps*p4)
x21=x13+abs(eps*p1)
x22=x21+abs(eps*p3)
x23=x22+abs(4*eps*p4)
x31=x23+abs(4*eps*p1)
x32=x31+abs(eps*p2)
x33=x32+abs(eps*p4)
x41=x33+abs(9*eps*p1)
x42=x41+abs(4*eps*p2)
x43=x42+abs(eps*p3)
 
x51=x43+abs(z1)
x52=x51+abs(z2)
x53=x52+abs(z3)
x54=x53+abs(z4)
x61=x54+abs(p1)
x62=x61+abs(p2)
x63=x62+abs(p3)
x64=x63+abs(p4)
 
# choose random event
s2=ran(1)*x64
y1=(s2<x11)
y2=(s2<x12)&(s2>=x11)
y3=(s2<x13)&(s2>=x12)
y4=(s2<x21)&(s2>=x13)
y5=(s2<x22)&(s2>=x21)
y6=(s2<x23)&(s2>=x22)
y7=(s2<x31)&(s2>=x23)
y8=(s2<x32)&(s2>=x31)
y9=(s2<x33)&(s2>=x32)
y10=(s2<x41)&(s2>=x33)
y11=(s2<x42)&(s2>=x41)
y12=(s2<x43)&(s2>=x42)
y13=(s2<x51)&(s2>=x43)
y14=(s2<x52)&(s2>=x51)
y15=(s2<x53)&(s2>=x52)
y16=(s2<x54)&(s2>=x53)
y17=(s2<x61)&(s2>=x54)
y18=(s2<x62)&(s2>=x61)
y19=(s2<x63)&(s2>=x62)
y20=(s2>=x63)
 
# time for next event
tr'=tr-log(ran(1))/x64

p1'=p1+sign(eps)*sign(p2)*y1+sign(eps)*sign(p3)*y2+sign(eps)*sign(p4)*y3-sign(z1)*y13-sign(p1)*y17
p2'=p2+sign(eps)*sign(p1)*y4+sign(eps)*sign(p3)*y5+sign(eps)*sign(p4)*y6-sign(z2)*y14-sign(p2)*y18
p3'=p3+sign(eps)*sign(p1)*y7+sign(eps)*sign(p2)*y8+sign(eps)*sign(p4)*y9-sign(z3)*y15-sign(p3)*y19
p4'=p4+sign(eps)*sign(p1)*y10+sign(eps)*sign(p2)*y11+sign(eps)*sign(p3)*y12-sign(z4)*y16-sign(p4)*y20
z1'=z1+sign(p1)*y17
z2'=z2+sign(p2)*y18
z3'=z3+sign(p3)*y19
z4'=z4+sign(p4)*y20
 
@ bound=100000000,meth=discrete,total=1000000,njmp=1000
@ xp=tr,yp=p1
@ xlo=0,ylo=-1000,xhi=40,yhi=1000
 
done
The XPPAUT code F was used to simulate the results in Figure 4G.
# code F
 
par eps=-0.0119047619
init p1=1000,p2=0, p3=0, p4=0, p5=0, p6=0, p7=0, p8=0, z1=0, z2=0, z3=0, z4=0, z5=0, z6=0, z7=0, z8=0
 
# compute the sum of all event propensities
x11=abs(eps*p2)
x12=x11+abs(4*eps*p3)
x13=x12+abs(9*eps*p4)
x14=x13+abs(16*eps*p5)
x15=x14+abs(25*eps*p6)
x16=x15+abs(36*eps*p7)
x17=x16+abs(49*eps*p8)
x21=x17+abs(eps*p1)
x22=x21+abs(eps*p3)
x23=x22+abs(4*eps*p4)
x24=x23+abs(9*eps*p5)
x25=x24+abs(16*eps*p6)
x26=x25+abs(25*eps*p7)
x27=x26+abs(36*eps*p8)
x31=x27+abs(4*eps*p1)
x32=x31+abs(eps*p2)
x33=x32+abs(eps*p4)
x34=x33+abs(4*eps*p5)
x35=x34+abs(9*eps*p6)
x36=x35+abs(16*eps*p7)
x37=x36+abs(25*eps*p8)
x41=x37+abs(9*eps*p1)
x42=x41+abs(4*eps*p2)
x43=x42+abs(eps*p3)
x44=x43+abs(eps*p5)
x45=x44+abs(4*eps*p6)
x46=x45+abs(9*eps*p7)
x47=x46+abs(16*eps*p8)
x51=x47+abs(16*eps*p1)
x52=x51+abs(9*eps*p2)
x53=x52+abs(4*eps*p3)
x54=x53+abs(eps*p4)
x55=x54+abs(eps*p6)
x56=x55+abs(4*eps*p7)
x57=x56+abs(9*eps*p8)
x61=x57+abs(25*eps*p1)
x62=x61+abs(16*eps*p2)
x63=x62+abs(9*eps*p3)
x64=x63+abs(4*eps*p4)
x65=x64+abs(eps*p5)
x66=x65+abs(eps*p7)
x67=x66+abs(4*eps*p8)
x71=x67+abs(36*eps*p1)
x72=x71+abs(25*eps*p2)
x73=x72+abs(16*eps*p3)
x74=x73+abs(9*eps*p4)
x75=x74+abs(4*eps*p5)
x76=x75+abs(eps*p6)
x77=x76+abs(eps*p8)
x81=x77+abs(49*eps*p1)
x82=x81+abs(36*eps*p2)
x83=x82+abs(25*eps*p3)
x84=x83+abs(16*eps*p4)
x85=x84+abs(9*eps*p5)
x86=x85+abs(4*eps*p6)
x87=x86+abs(eps*p7)
 
x91=x87+abs(z1)
x92=x91+abs(z2)
x93=x92+abs(z3)
x94=x93+abs(z4)
x95=x94+abs(z5)
x96=x95+abs(z6)
x97=x96+abs(z7)
x98=x97+abs(z8)
x101=x98+abs(p1)
x102=x101+abs(p2)
x103=x102+abs(p3)
x104=x103+abs(p4)
x105=x104+abs(p5)
x106=x105+abs(p6)
x107=x106+abs(p7)
x108=x107+abs(p8)
 
# choose random event#
s2=ran(1)*x108
y1=(s2<x11)
y2=(s2<x12)&(s2>=x11)
y3=(s2<x13)&(s2>=x12)
y4=(s2<x14)&(s2>=x13)
y5=(s2<x15)&(s2>=x14)
y6=(s2<x16)&(s2>=x15)
y7=(s2<x17)&(s2>=x16)
y8=(s2<x21)&(s2>=x17)
y9=(s2<x22)&(s2>=x21)
y10=(s2<x23)&(s2>=x22)
y11=(s2<x24)&(s2>=x23)
y12=(s2<x25)&(s2>=x24)
y13=(s2<x26)&(s2>=x25)
y14=(s2<x27)&(s2>=x26)
y15=(s2<x31)&(s2>=x27)
y16=(s2<x32)&(s2>=x31)
y17=(s2<x33)&(s2>=x32)
y18=(s2<x34)&(s2>=x33)
y19=(s2<x35)&(s2>=x34)
y20=(s2<x36)&(s2>=x35)
y21=(s2<x37)&(s2>=x36)
y22=(s2<x41)&(s2>=x37)
y23=(s2<x42)&(s2>=x41)
y24=(s2<x43)&(s2>=x42)
y25=(s2<x44)&(s2>=x43)
y26=(s2<x45)&(s2>=x44)
y27=(s2<x46)&(s2>=x45)
y28=(s2<x47)&(s2>=x46)
y29=(s2<x51)&(s2>=x47)
y30=(s2<x52)&(s2>=x51)
y31=(s2<x53)&(s2>=x52)
y32=(s2<x54)&(s2>=x53)
y33=(s2<x55)&(s2>=x54)
y34=(s2<x56)&(s2>=x55)
y35=(s2<x57)&(s2>=x56)
y36=(s2<x61)&(s2>=x57)
y37=(s2<x62)&(s2>=x61)
y38=(s2<x63)&(s2>=x62)
y39=(s2<x64)&(s2>=x63)
y40=(s2<x65)&(s2>=x64)
y41=(s2<x66)&(s2>=x65)
y42=(s2<x67)&(s2>=x66)
y43=(s2<x71)&(s2>=x67)
y44=(s2<x72)&(s2>=x71)
y45=(s2<x73)&(s2>=x72)
y46=(s2<x74)&(s2>=x73)
y47=(s2<x75)&(s2>=x74)
y48=(s2<x76)&(s2>=x75)
y49=(s2<x77)&(s2>=x76)
y50=(s2<x81)&(s2>=x77)
y51=(s2<x82)&(s2>=x81)
y52=(s2<x83)&(s2>=x82)
y53=(s2<x84)&(s2>=x83)
y54=(s2<x85)&(s2>=x84)
y55=(s2<x86)&(s2>=x85)
y56=(s2<x87)&(s2>=x86)
y57=(s2<x91)&(s2>=x87)
y58=(s2<x92)&(s2>=x91)
y59=(s2<x93)&(s2>=x92)
y60=(s2<x94)&(s2>=x93)
y61=(s2<x95)&(s2>=x94)
y62=(s2<x96)&(s2>=x95)
y63=(s2<x97)&(s2>=x96)
y64=(s2<x98)&(s2>=x97)
y65=(s2<x101)&(s2>=x98)
y66=(s2<x102)&(s2>=x101)
y67=(s2<x103)&(s2>=x102)
y68=(s2<x104)&(s2>=x103)
y69=(s2<x105)&(s2>=x104)
y70=(s2<x106)&(s2>=x105)
y71=(s2<x107)&(s2>=x106)
y72=(s2>=x107)
 
# time for the next event
tr'=tr-log(ran(1))/x108

p1'=p1+sign(eps)*sign(p2)*y1+sign(eps)*sign(p3)*y2+sign(eps)*sign(p4)*y3+sign(eps)*sign(p5)*y4+sign(eps)*sign(p6)*y5+sign(eps)*sign(p7)*y6+sign(eps)*sign(p8)*y7-sign(z1)*y57-sign(p1)*y65
p2'=p2+sign(eps)*sign(p1)*y8+sign(eps)*sign(p3)*y9+sign(eps)*sign(p4)*y10+sign(eps)*sign(p5)*y11+sign(eps)*sign(p6)*y12+sign(eps)*sign(p7)*y13+sign(eps)*sign(p8)*y14-sign(z2)*y58-sign(p2)*y66
p3'=p3+sign(eps)*sign(p1)*y15+sign(eps)*sign(p2)*y16+sign(eps)*sign(p4)*y17+sign(eps)*sign(p5)*y18+sign(eps)*sign(p6)*y19+sign(eps)*sign(p7)*y20+sign(eps)*sign(p8)*y21-sign(z3)*y59-sign(p3)*y67
p4'=p4+sign(eps)*sign(p1)*y22+sign(eps)*sign(p2)*y23+sign(eps)*sign(p3)*y24+sign(eps)*sign(p5)*y25+sign(eps)*sign(p6)*y26+sign(eps)*sign(p7)*y27+sign(eps)*sign(p8)*y28-sign(z4)*y60-sign(p4)*y68
p5'=p5+sign(eps)*sign(p1)*y29+sign(eps)*sign(p2)*y30+sign(eps)*sign(p3)*y31+sign(eps)*sign(p4)*y32+sign(eps)*sign(p6)*y33+sign(eps)*sign(p7)*y34+sign(eps)*sign(p8)*y35-sign(z5)*y61-sign(p5)*y69
p6'=p6+sign(eps)*sign(p1)*y36+sign(eps)*sign(p2)*y37+sign(eps)*sign(p3)*y38+sign(eps)*sign(p4)*y39+sign(eps)*sign(p5)*y40+sign(eps)*sign(p7)*y41+sign(eps)*sign(p8)*y42-sign(z6)*y62-sign(p6)*y70
p7'=p7+sign(eps)*sign(p1)*y43+sign(eps)*sign(p2)*y44+sign(eps)*sign(p3)*y45+sign(eps)*sign(p4)*y46+sign(eps)*sign(p5)*y47+sign(eps)*sign(p6)*y48+sign(eps)*sign(p8)*y49-sign(z7)*y63-sign(p7)*y71
p8'=p8+sign(eps)*sign(p1)*y50+sign(eps)*sign(p2)*y51+sign(eps)*sign(p3)*y52+sign(eps)*sign(p4)*y53+sign(eps)*sign(p5)*y54+sign(eps)*sign(p6)*y55+sign(eps)*sign(p7)*y56-sign(z8)*y64-sign(p8)*y72

z1'=z1+sign(p1)*y65
z2'=z2+sign(p2)*y66
z3'=z3+sign(p3)*y67
z4'=z4+sign(p4)*y68
z5'=z5+sign(p5)*y69
z6'=z6+sign(p6)*y70
z7'=z7+sign(p7)*y71
z8'=z8+sign(p8)*y72

@ bound=100000000,meth=discrete,total=1000000,njmp=1000
@ xp=tr,yp=p1
@ xlo=0,ylo=-1000,xhi=40,yhi=1000

done

References

  1. Jacobson, G.A.; Diba, K.; Yaron-Jakoubovitch, A.; Oz, Y.; Koch, C.; Segev, I.; Yarom, Y. Subthreshold voltage noise of rat neocortical pyramidal neurones. J. Physiol. 2005, 564, 145–160.
  2. Faisal, A.A.; Selen, L.P.; Wolpert, D.M. Noise in the nervous system. Nat. Rev. Neurosci. 2008, 9, 292–303.
  3. Azouz, R.; Gray, C.M. Cellular mechanisms contributing to response variability of cortical neurons in vivo. J. Neurosci. 1999, 19, 2209–2223.
  4. Deweese, M.R.; Zador, A.M. Shared and private variability in the auditory cortex. J. Neurophysiol. 2004, 92, 1840–1855.
  5. Steinmetz, P.N.; Manwani, A.; Koch, C.; London, M.; Segev, I. Subthreshold voltage noise due to channel fluctuations in active neuronal membranes. J. Comput. Neurosci. 2000, 9, 133–148.
  6. White, J.A.; Rubinstein, J.T.; Kay, A.R. Channel noise in neurons. Trends Neurosci. 2000, 23, 131–137.
  7. Calvin, W.H.; Stevens, C.F. Synaptic noise and other sources of randomness in motoneuron interspike intervals. J. Neurophysiol. 1968, 31, 574–587.
  8. Fellous, J.M.; Rudolph, M.; Destexhe, A.; Sejnowski, T.J. Synaptic background noise controls the input/output characteristics of single cells in an in vitro model of in vivo activity. Neuroscience 2003, 122, 811–829.
  9. Faisal, A.A.; Laughlin, S.B. Stochastic simulations on the reliability of action potential propagation in thin axons. PLoS Comput. Biol. 2007, 3, e79.
  10. Van Rossum, M.C.; O’Brien, B.J.; Smith, R.G. Effects of noise on the spike timing precision of retinal ganglion cells. J. Neurophysiol. 2003, 89, 2406–2419.
  11. Longtin, A.; Bulsara, A.; Moss, F. Time-interval sequences in bistable systems and the noise-induced transmission of information by sensory neurons. Phys. Rev. Lett. 1991, 67, 656–659.
  12. Collins, J.J.; Imhoff, T.T.; Grigg, P. Noise-enhanced information transmission in rat SA1 cutaneous mechanoreceptors via aperiodic stochastic resonance. J. Neurophysiol. 1996, 76, 642–645.
  13. Douglass, J.K.; Wilkens, L.; Pantazelou, E.; Moss, F. Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance. Nature 1993, 365, 337–340.
  14. Kosko, B.; Mitaim, S. Stochastic resonance in noisy threshold neurons. Neural Netw. 2003, 16, 755–761.
  15. Mitaim, S.; Kosko, B. Adaptive stochastic resonance in noisy neurons based on mutual information. IEEE Trans. Neural Netw. 2004, 15, 1526–1540.
  16. Bulsara, A.R.; Zador, A. Threshold detection of wideband signals: A noise-induced maximum in the mutual information. Phys. Rev. E 1996, 54, R2185–R2188.
  17. Priplata, A.A.; Niemi, J.B.; Harry, J.D.; Lipsitz, L.A.; Collins, J.J. Vibrating insoles and balance control in elderly people. Lancet 2003, 362, 1123–1124.
  18. Russell, D.F.; Wilkens, L.A.; Moss, F. Use of behavioural stochastic resonance by paddle fish for feeding. Nature 1999, 402, 291–294.
  19. Seth, A.K.; Izhikevich, E.; Reeke, G.N.; Edelman, G.M. Theories and measures of consciousness: An extended framework. Proc. Natl. Acad. Sci. USA 2006, 103, 10799–10804.
  20. Tononi, G.; Edelman, G.M. Consciousness and complexity. Science 1998, 282, 1846–1851.
  21. Metzinger, T. Neural Correlates of Consciousness: Empirical and Conceptual Questions; Bradford Books; MIT Press: Cambridge, MA, USA, 2000; 350p.
  22. Crick, F.; Koch, C. A framework for consciousness. Nat. Neurosci. 2003, 6, 119–126.
  23. Koch, C.; Massimini, M.; Boly, M.; Tononi, G. Neural correlates of consciousness: Progress and problems. Nat. Rev. Neurosci. 2016, 17, 307–321.
  24. Crick, F.; Koch, C. Consciousness and neuroscience. Cereb. Cortex 1998, 8, 97–107.
  25. Kraikivski, P. Systems of oscillators designed for a specific conscious percept. New Math. Nat. Comput. 2020, 16, 73–88.
  26. Kraikivski, P. Building systems capable of consciousness. Mind Matter 2017, 15, 185–195.
  27. Freeman, W.J. Neurodynamics: An Exploration in Mesoscopic Brain Dynamics; Perspectives in Neural Computing; Springer: London, UK, 2000; 398p.
  28. Fingelkurts, A.A.; Fingelkurts, A.A.; Neves, C.F.H. Phenomenological architecture of a mind and operational architectonics of the brain: The unified metastable continuum. New Math. Nat. Comput. 2009, 5, 221–244.
  29. Fingelkurts, A.A.; Fingelkurts, A.A.; Neves, C.F.H. Consciousness as a phenomenon in the operational architectonics of brain organization: Criticality and self-organization considerations. Chaos Solitons Fractals 2013, 55, 13–31.
  30. Freeman, W.J. A field-theoretic approach to understanding scale-free neocortical dynamics. Biol. Cybern. 2005, 92, 350–359.
  31. Tononi, G. An information integration theory of consciousness. BMC Neurosci. 2004, 5, 42.
  32. Northoff, G. What the brain’s intrinsic activity can tell us about consciousness? A tri-dimensional view. Neurosci. Biobehav. Rev. 2013, 37, 726–738.
  33. Northoff, G.; Huang, Z.R. How do the brain’s time and space mediate consciousness and its different dimensions? Temporo-spatial theory of consciousness (TTC). Neurosci. Biobehav. Rev. 2017, 80, 630–645.
  34. Reggia, J.A. The rise of machine consciousness: Studying consciousness with computational models. Neural Netw. 2013, 44, 112–131.
  35. Laureys, S. The neural correlate of (un)awareness: Lessons from the vegetative state. Trends Cogn. Sci. 2005, 9, 556–559.
  36. Chalmers, D.J. Absent qualia, fading qualia, dancing qualia. In Conscious Experience; Metzinger, T., Ed.; Imprint Academic: Exeter, UK, 1995.
  37. Martinez-Garcia, M.; Gordon, T. A new model of human steering using far-point error perception and multiplicative control. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 1245–1250.
  38. Zhang, Y.; Martinez-Garcia, M.; Gordon, T. Human response delay estimation and monitoring using gamma distribution analysis. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 807–812.
  39. Gillespie, D.T. Exact stochastic simulation of coupled chemical reactions. J. Phys. Chem. 1977, 81, 2340–2361.
  40. Rao, C.V.; Arkin, A.P. Stochastic chemical kinetics and the quasi-steady-state assumption: Application to the Gillespie algorithm. J. Chem. Phys. 2003, 118, 4999–5010.
  41. Tyson, J.J.; Laomettachit, T.; Kraikivski, P. Modeling the dynamic behavior of biochemical regulatory networks. J. Theor. Biol. 2019, 462, 514–527.
  42. Bernstein, D. Simulating mesoscopic reaction-diffusion systems using the Gillespie algorithm. Phys. Rev. E 2005, 71, 041103.
  43. McKane, A.J.; Newman, T.J. Predator-prey cycles from resonant amplification of demographic stochasticity. Phys. Rev. Lett. 2005, 94, 218102.
  44. Mather, W.H.; Hasty, J.; Tsimring, L.S. Fast stochastic algorithm for simulating evolutionary population dynamics. Bioinformatics 2012, 28, 1230–1238.
  45. Prerau, M.J.; Brown, R.E.; Bianchi, M.T.; Ellenbogen, J.M.; Purdon, P.L. Sleep neurophysiological dynamics through the lens of multitaper spectral analysis. Physiology 2017, 32, 60–92.
  46. Tuominen, J.; Kallio, S.; Kaasinen, V.; Railo, H. Segregated brain state during hypnosis. Neurosci. Conscious. 2021, 2021, niab002.
  47. Thilakavathi, B.; Devi, S.S.; Malaiappan, M.; Bhanu, K. EEG power spectrum analysis for schizophrenia during mental activity. Australas. Phys. Eng. Sci. Med. 2019, 42, 887–897.
  48. Helakari, H.; Kananen, J.; Huotari, N.; Raitamaa, L.; Tuovinen, T.; Borchardt, V.; Rasila, A.; Raatikainen, V.; Starck, T.; Hautaniemi, T.; et al. Spectral entropy indicates electrophysiological and hemodynamic changes in drug-resistant epilepsy—A multimodal MREG study. Neuroimage Clin. 2019, 22, 101763.
  49. Yunusa-Kaltungo, A.; Sinha, J.K.; Elbhbah, K. An improved data fusion technique for faults diagnosis in rotating machines. Measurement 2014, 58, 27–32.
  50. Luwei, K.C.; Yunusa-Kaltungo, A.; Sha’aban, Y.A. Integrated fault detection framework for classifying rotating machine faults using frequency domain data fusion and artificial neural networks. Machines 2018, 6, 59.
  51. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  52. Tian, Y.; Zhang, H.; Xu, W.; Zhang, H.; Yang, L.; Zheng, S.; Shi, Y. Spectral entropy can predict changes of working memory performance reduced by short-time training in the delayed-match-to-sample task. Front. Hum. Neurosci. 2017, 11, 437.
  53. James, W. Does ‘consciousness’ exist? J. Philos. Psychol. Sci. Methods 1904, 1, 477–491.
  54. Dublanche, Y.; Michalodimitrakis, K.; Kummerer, N.; Foglierini, M.; Serrano, L. Noise in transcription negative feedback loops: Simulation and experimental analysis. Mol. Syst. Biol. 2006, 2, 41.
  55. Nacher, J.C.; Ochiai, T. Transcription and noise in negative feedback loops. Biosystems 2008, 91, 76–82.
  56. Austin, D.W.; Allen, M.S.; McCollum, J.M.; Dar, R.D.; Wilgus, J.R.; Sayler, G.S.; Samatova, N.F.; Cox, C.D.; Simpson, M.L. Gene network shaping of inherent noise spectra. Nature 2006, 439, 608–611.
  57. Simpson, M.L.; Cox, C.D.; Sayler, G.S. Frequency domain analysis of noise in autoregulated gene circuits. Proc. Natl. Acad. Sci. USA 2003, 100, 4551–4556.
  58. Boll, S. Suppression of acoustic noise in speech using spectral subtraction. IEEE Trans. Acoust. Speech Signal Process. 1979, 27, 113–120.
  59. Yoshizawa, T.; Hirobayashi, S.; Misawa, T. Noise reduction for periodic signals using high-resolution frequency analysis. EURASIP J. Audio Speech Music Process. 2011, 2011, 5.
Figure 1. Dynamic behavior of a deterministic system (2) and the corresponding stochastic system. (A) Trajectories for processes p1 and p2 described by the system of differential equations (2), (B) the limit cycle in the p1p2-phase plane, (C) stochastic trajectories, (D) the limit cycle describing the solution of the stochastic system, (E) the normalized power spectral density characterizing the noise spectra in stochastic trajectories. The parameter R = 1 for the deterministic system (2) and R = 100 for the corresponding stochastic system.
Figure 2. The influence diagram for processes described by the system of equations (3). Arrow-headed lines represent a positive influence and bar-headed lines represent a negative influence of one process on another or itself. The dot-headed lines represent a positive or negative influence depending on the sign of the ε parameter. Different line colors are used for tracking purposes. Red lines represent interactions between $p_{i-1}$, $p_{i+1}$, and $p_i$; green lines represent interactions between $p_i$, $p_{i+1}$, and $p_{i-1}$; blue lines wire $p_{i-1}$ and $p_i$ with $p_{i+1}$.
Figure 3. One-parameter bifurcation diagram for the system of two $p_i$ and two $z_i$ processes (A), and real parts (blue curves) and imaginary parts (red curves) of eigenvalues as a function of parameter ε (B). Hopf bifurcation points (HB) were obtained at ε = ±1. The solid black line indicates the values of ε for which a spiral sink solution was obtained, the dashed black line indicates the values of ε for which a spiral source solution was observed, and open circles indicate periodic solutions. Further, the spiral sink solution was confirmed by the fact that the real parts of all eigenvalues are negative for −1 < ε < 1, as shown in (B).
Figure 4. The implication of noise on system dynamics depends on system size (the number of processes in the system): (A,D,G) stochastic trajectories for all processes $p_i$, (B,E,H) distribution histograms for process $p_2$, and (C,F,I) normalized power spectral densities for process $p_2$, which were obtained using systems of two, four, and eight $p_i$ processes, respectively. The power spectral density for a process $p_i$ depends on the number of processes constituting the system.
Figure 5. Spectral entropy decreases as a function of system size (the number of processes in the system). Open circles represent the average values of spectral entropy provided in Table 1. The solid line is a linear fit with the function displayed in the chart area.
Table 1. The dependence of spectral entropy for processes on system size.

The system of two $p_i$ processes:
p1: 0.5735
p2: 0.5693
Average entropy value = 0.5714

The system of four $p_i$ processes:
p1: 0.539
p2: 0.542
p3: 0.5375
p4: 0.5343
Average entropy value = 0.538

The system of eight $p_i$ processes:
p1: 0.483
p2: 0.466
p3: 0.474
p4: 0.512
p5: 0.4986
p6: 0.4688
p7: 0.4686
p8: 0.4664
Average entropy value = 0.48
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
