
Leveraging heterogeneity for neural computation with fading memory in layer 2/3 cortical microcircuits

  • Renato Duarte,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    r.duarte@fz-juelich.de

    Affiliations Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6) and JARA Institute Brain Structure-Function Relationships (JBI-1 / INM-10), Jülich Research Centre, Jülich, Germany, Bernstein Center Freiburg, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany, Faculty of Biology, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany, Institute of Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom

  • Abigail Morrison

    Roles Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Software, Supervision, Validation, Writing – original draft, Writing – review & editing

    Affiliations Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6) and JARA Institute Brain Structure-Function Relationships (JBI-1 / INM-10), Jülich Research Centre, Jülich, Germany, Bernstein Center Freiburg, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany, Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany

Abstract

Complexity and heterogeneity are intrinsic to neurobiological systems, manifest in every process, at every scale, and are inextricably linked to the systems’ emergent collective behaviours and function. However, the majority of studies addressing the dynamics and computational properties of biologically inspired cortical microcircuits tend to assume (often for the sake of analytical tractability) a great degree of homogeneity in both neuronal and synaptic/connectivity parameters. While simplification and reductionism are necessary to understand the brain’s functional principles, disregarding the existence of the multiple heterogeneities in the cortical composition, which may be at the core of its computational proficiency, will inevitably fail to account for important phenomena and limit the scope and generalizability of cortical models. We address these issues by studying the individual and composite functional roles of heterogeneities in neuronal, synaptic and structural properties in a biophysically plausible layer 2/3 microcircuit model, built and constrained by multiple sources of empirical data. This approach was made possible by the emergence of large-scale, well curated databases, as well as the substantial improvements in experimental methodologies achieved over the last few years. Our results show that variability in single neuron parameters is the dominant source of functional specialization, leading to highly proficient microcircuits with much higher computational power than their homogeneous counterparts. We further show that fully heterogeneous circuits, which are closest to the biophysical reality, owe their response properties to the differential contribution of different sources of heterogeneity.

Author summary

Cortical microcircuits are highly inhomogeneous dynamical systems whose information processing capacity is determined by the characteristics of their heterogeneous components and their complex interactions. The high degree of variability that characterizes macroscopic population dynamics, both during ongoing, spontaneous activity and during active processing states, reflects this underlying complexity and heterogeneity, which has the potential to dramatically expand the space of functions that any given circuit can compute, leading to richer and more expressive information processing systems. In this study, we identify different tentative sources of heterogeneity and assess their differential and cumulative contribution to the microcircuit’s dynamics and information processing capacity. We study these properties in a generic Layer 2/3 cortical microcircuit model, built and constrained by multiple sources of experimental data, and demonstrate that heterogeneity in neuronal properties and microconnectivity structure are important sources of functional specialization, greatly improving the circuit’s processing capacity, while capturing various important features of cortical physiology.

Introduction

Heterogeneity and diversity are ubiquitous design principles in neurobiology (or in any biological system, for that matter), covering components and mechanisms at every descriptive scale [1]. While many of these specializations, as well as their inherent complexity and diversity, are functionally meaningful, intrinsically linked and responsible for the brain’s computational capacity and efficiency (see e.g. [2–4]), others are bound to reflect epiphenomena, by-products of evolution, bearing little or no functional significance [5], or to subserve metabolic/maintenance tasks [6] that, while crucial for healthy function, are not directly involved in the computational process.

In order to study the functional role of heterogeneity in cortical processing, we need to modularize complexity [7]: exploit the degenerate nature of the system [8, 9] and heuristically identify groups of components that may behave as singular modules (depending on the scale and processes of interest). Once these tentative ‘building blocks’ are identified, we need to specify adequate levels of descriptive complexity that may shed light onto the underlying functional principles. These pursuits, however, pose severe epistemological problems as we currently have no clear intuition as to what ‘adequacy’ means in this context (see, e.g. [10–12]).

Despite substantial progress, our ability to clearly identify the system’s core component ‘building blocks’ [3, 13] and to systematically characterize their relative contributions and potential functional roles is still a daunting task given the multiple spatial and temporal scales at which they operate, their complex, nested interactions and the, often incomplete or inconsistent, empirical evidence. Nevertheless, when studying neuronal computation, one needs to keep in mind that, despite its tremendous complexity or because of it, the brain is a machine ‘fit-for-purpose’ and optimized to process information and operate in complex, dynamic and uncertain environments, whose spatiotemporal structure it must extract in order to reliably compute [1, 4]. As such, it is important to disentangle and quantify which and how the different components of the system may modulate functional neurodynamics in meaningful ways that can be paralleled with related experimental observations, and to which degree these specializations affect the system’s operations.

In the next section, we attempt to identify an appropriate partition of potential building blocks of complexity and heterogeneity in neocortical microcircuits. We then proceed to build a biologically inspired and strongly data-driven microcircuit model that explicitly employs this partition, in an attempt to understand and quantify the differential functional roles and consequences of these different sources of heterogeneity, applying systematic and generic benchmarks. We will focus on generic properties in both the microcircuit architecture and its information processing capacity. Rather than modelling specific microcircuits in specific cortical regions and corresponding functional specializations (as in e.g. [14, 15]), we generalize our approach and collate data pertaining to layer 2/3 microcircuit features (anatomical and physiological) from different cortical regions, while evaluating the differential contributions of different forms of heterogeneity to the system’s dynamics and generic computational properties, focusing particularly on their suitability for continuous, online processing with fading memory [16–18]. The approach also aims at providing an intermediate level of descriptive complexity for studying computation in cortical microcircuit models, as discussed below.

Heterogeneous building blocks in the neocortical circuitry

On a macroscopic level, hierarchical modularity is easily identifiable as a parsimonious design principle underlying various structural and functional aspects of cortical organization [19–23]. Different anatomophysiological [24, 25], genetic / biochemical [26–30] or functional [31–33] criteria give rise to slightly different modular parcellations but, in combination, these criteria reveal the relevant ‘building blocks’, the most important features whose variations and recombinations give rise to the complexity and diversity of the cortical tissue [34].

For convenience, these features can be coarsely (and tentatively) grouped into neuronal, synaptic and structural components (see also [13]). Neuronal features refer to the different cell classes and their laminar and regional distributions [35] along with their characteristic electrophysiological and biochemical diversity [36, 37]. Synaptic components refer to a molecular default organization characterized by variations in the differential expression and transcription of genes involved in synaptic transmission [26, 28], which is reflected, for example, in regional receptor architectonics [38–40]. Structural aspects include variations in cortical thickness and laminar depth [41] along with neuronal and synaptic density [42, 43] and input-output (both local and long-range) connectivity patterns [44, 45]. In combination, these features highlight default organizational principles whose variations across the cortical sheet are likely to contribute to the corresponding functional specializations.

Based either on morphological, electrophysiological or biochemical features (or, preferably, a combination thereof), several different classes of neurons can be identified throughout the neocortex (see e.g. [35–37, 46–48]). Apart from pronounced regional and laminar differences in the types of neurons that make up the cortex and their relative spatial distributions, every microcircuit in every cortical column is composed of diverse neuron types, with heterogeneous properties and heterogeneous behaviour.

Electrochemical communication between these diverse neuronal classes is an intricate, dynamic and very complex process involving a multitude of nested inter- and intracellular signalling networks [49–51]. Their functional range spans multiple spatial and temporal scales [52–54] and has, arguably, the most critical role in modulating microcircuit dynamics and information processing within and across neuronal populations [2, 3, 55]. The specificities of receptor composition and kinetics underlie the substantial diversity observed in the elicited post-synaptic potentials [56, 57] across different synapse and neuronal types (see, e.g. [58–62]). This occurs because the receptors mediating these events have distinct biochemical and physiological properties depending on the type of neuron they are expressed in and, naturally, the type of neurotransmitter they are responsive to. These varying properties have known and non-negligible implications in the characteristic kinetics of synaptic transmission events occurring between different neurons [63] and strongly constrain the circuit’s operations.

Additionally, cortical microcircuits are not randomly coupled, but over-express specific connectivity motifs [64–70], which bias and skew the network’s degree distributions [71] and/or introduce correlations among specific connections [72], thus selectively modifying the impact of specific pre-synaptic neurons on their post-synaptic targets. Analogously to the heterogeneities in neuronal and synaptic properties, such structural features are known to significantly impact the circuit’s properties [73–77].

Descriptive adequacy

As discussed above, the variations in the anatomy and physiology of a cortical microcircuit are experimentally well established and have been shown to influence computational properties. However, the absence of complete descriptions of biophysical heterogeneity and well-substantiated empirical evidence to support them (primarily due to technical limitations) has forced computational studies to take a simplified approach. This has multiple additional advantages, such as greater likelihood of analytical tractability and lower overhead for the researcher in specifying the network parameters. Simplified, homogeneous models of spiking networks have proven to be a valuable tool for a theoretically grounded exploration of microcircuit dynamics, emerging from the interaction of excitatory and inhibitory populations [78–81].

The generic principles established by studying simple balanced random networks have subsequently been applied to model specific cortical microcircuits with integrated connectivity maps and realistic numbers of neurons and synapses [82]. This approach revealed that some prominent features of spontaneous and stimulus-evoked activity and its dynamic flow through a cortical column can be accounted for by the macroscopic connectivity structure, mediated by local and long-range interactions [83]. However, by focusing on emergent dynamics, these studies neglect the functional aspects and the fact that cortical interactions serve computational purposes (but, see [84] for a study on the computational properties of the [82] microcircuit model). In addition, although there are good reasons for taking a minimalist approach, assuming uniformity and homogeneity in every component of the system tends to lack cogency with respect to established anatomical and physiological facts and to disregard biophysical and biochemical plausibility.

Some of these limitations were circumvented by [85], who not only accounted for detailed and empirically-informed connectivity maps, but also employed more biologically motivated models of neuronal and synaptic dynamics and placed them in an explicit functional/computational context. In line with the results obtained with the simpler microcircuit models [82, 84], this study demonstrated that considering realistic structural constraints is beneficial and significantly improves the computational capabilities of the circuit. Several other studies provide important steps to move away from homogeneous systems by incorporating variability (e.g. [75, 86, 87]), but tend to do so in a relatively arbitrary manner and/or focusing on specific forms of heterogeneity while retaining homogeneity in other components (depending on the scientific objectives of the study).

A completely different set of priorities for modelling cortical microcircuits is espoused by [88] in the framework of the continuous efforts of the Blue Brain project [89]. The Blue Brain approach lies at the other extreme of the descriptive scale, in that it attempts to model a cortical column in full detail, explicitly accounting for the complexities of cellular composition (based on neuronal morphology and electrophysiology), synaptic anatomy and physiology, as well as thalamic innervation, essentially constituting an in silico reconstruction of a cortical column (see also [90]). This approach is, naturally, extremely computationally expensive and its explanatory power is limited. The model complexity at this end of the spectrum is so close to the biophysical reality that it might not lend itself to a comprehensive understanding of dissociable and important functional principles any more readily than studying the real thing does. Nonetheless, it provides valuable insights in that it carefully replicates many in vivo and in vitro responses of a real cortical column, while generating a wealth of complete and comprehensive data [88, 91].

Thus, we conclude that while simpler models are preferable, as they are generic enough to be broadly insightful and allow us to uncover general principles, we should ask the question: what is the cost of simplification? If a model simplifies away the core computational elements of the system, our ability to account for its operations is lost. The findings discussed above indicate that heterogeneity may be critical for the mechanisms of computation; therefore models aiming at uncovering computational principles in specific biophysical systems, such as a cortical column or microcircuit, should account for these features.

In this study, we attempt to bridge this descriptive gap by building microcircuit models, inspired and constrained by the composition of Layer 2/3, that account for key heterogeneities in neuronal, synaptic and structural properties. We implement all types of heterogeneity such that they can be switched on or off, thus enabling us to systematically disentangle and evaluate the roles played by the different types of heterogeneity in the different tentative building blocks, and how they collectively interact to shape the circuit’s dynamics and information processing capacity.

The choices and characteristics of the models and parameter sets used throughout this study, as well as the general microcircuit composition, are constrained and inspired by multiple sources of experimental data (see section Data-driven microcircuit model and S1 Appendix). They account for the prevalence of different neuronal sub-types and their heterogeneous physiological and biochemical properties, the specificities of instantaneous synaptic kinetics and its inherent diversity, as well as specific structural biases in cortical micro-connectivity. All models and model parameters were, as far as possible, chosen to directly match relevant experimental reports and minimize the introduction of arbitrary model parameters, in order to ensure that the effects observed are caused by realistic forms of complexity and heterogeneity and to avoid imposing excessive assumptions or preconceptions on the systems studied, i.e. to “allow biology to speak for itself”.

In section Data-driven microcircuit model (complemented by the Methods section and the Supplementary Materials), we explain all the details of the models and model parameters used to build and constrain the microcircuit, as well as the underlying empirical observations that motivate the choices. After specifying and fixing all the relevant parameters to, as closely as possible, match multiple sources of empirical data, we study the effects of heterogeneity on population dynamics in a quiet state, where the circuit is passively excited by background noise (section Emergent population dynamics) and in an active state, where the circuit is directly engaged in information processing (section Active processing and computation). We evaluate the circuit’s sensitivity and responsiveness, as well as its memory and processing capacity, demonstrating a clear and unambiguous role of heterogeneity in shaping the proficiency of the system by greatly increasing the space of computable functions.

Results

Data-driven microcircuit model

In this section, we describe the process of building a complex data-driven cortical microcircuit model capturing some of the fundamental features of layer 2/3. We specify the detailed architecture, composition and dynamics of the microcircuits explored throughout this study as well as the motivation behind all model and parameter choices. In each relevant section, we highlight the differences between the respective homogeneous and heterogeneous conditions. A summarized, tabular description of the main models is provided in S1 Table, along with a list of the primary sources of experimental data used to constrain the model parameters, provided in S1 Appendix.

All the circuits analysed throughout this study are composed of N = 2500 point neurons (roughly corresponding to the size of a layer 2/3 microcircuit in an anatomically defined cortical column; [92]), divided into NE = 0.8 N excitatory, glutamatergic neurons and NI = 0.2 N inhibitory, GABAergic neurons. In addition, we further subdivide the inhibitory population into two sub-classes, I1 and I2, corresponding to fast-spiking and non-fast-spiking interneurons, respectively (see Neuronal properties). Accordingly, there are nine different synapse types (all possible connections between neuronal populations), with distinct, specific response properties (see Synaptic properties). Similarly, there are nine connection probabilities from which random connections are drawn (see Structural properties).
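The bookkeeping above can be sketched in a few lines. This is a minimal illustration of the composition described in the text (population sizes from the text; the synapse-type labels follow the post–pre convention used later in the Synaptic properties section):

```python
# Population sizes as described above: N = 2500 point neurons, 80% excitatory.
N = 2500
N_E = int(0.8 * N)  # excitatory, glutamatergic
N_I = int(0.2 * N)  # inhibitory, GABAergic; further split into I1 and I2

populations = ["E", "I1", "I2"]
# Nine synapse types, labelled <post><pre> (e.g. "I1E" = E -> I1), with one
# connection probability p_syn per type.
synapse_types = [post + pre for post in populations for pre in populations]
```

With three populations this enumeration yields exactly the nine types listed in the Synaptic properties section.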

For each of the key features of neuronal, synaptic and structural properties, we differentiate between the homogeneous case, where all properties are identical, and the heterogeneous case, where properties are drawn from appropriately chosen distributions. In this way, we can tease apart the differential effects of the three sources of heterogeneity considered here: neuronal, synaptic and structural.

For consistency, all the circuits’ structural (and synaptic) features are constrained primarily by the composition of layer 2/3 in the C2 barrel column in the mouse primary somatosensory cortex (S1), given the availability of direct, complete and significantly explored experimental datasets (e.g. [9294]).

Neuronal properties.

We divide the neurons into one excitatory (E: glutamatergic, pyramidal neurons) and two inhibitory (I1: GABAergic, fast spiking interneurons, I2: GABAergic, non-fast spiking) classes, consistent with the reports in [92, 93] and [94]. The three classes differ in relative excitability and firing properties, providing substantially more electrophysiological diversity than commonly exhibited in cortical models on the abstraction level of point neurons (e.g. [81, 82, 95–97]), while still being of a manageable degree of complexity. All neurons are modelled using a simplified adaptive leaky integrate-and-fire scheme (see Methods and [98, 99]).

The parameters used for the different neuron types (summarized in Table 1) were chosen to match the respective ranges reported in the literature, considering both the data collected in the NeuroElectro database [36, 100] encompassing tens of unique data points from different experimental sources (see S1 Appendix) and those reported in [92, 93, 101, 102], and [94] given the completeness of these reports and the direct similarities with our case study.

Table 1. Single neuron parameter sets.

In the heterogeneous condition, the three neuronal types have the values of the described parameters randomly drawn from a normal or lognormal distribution. The parameter values for these distributions were determined taking into account multiple sources of experimental data (see S1 Appendix). Note that, to make comparisons simpler, the values displayed for the lognormal distributions correspond to the mean and standard deviation of the distribution, not the actual μ and σ parameters (see Materials and methods).
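Since Table 1 reports lognormal distributions by their mean and standard deviation rather than the underlying μ and σ, the conversion is a standard transformation. A minimal sketch (the example membrane time constant values are illustrative assumptions, not Table 1 entries):

```python
import math
import random

def lognormal_from_mean_std(mean, std):
    """Recover the underlying (mu, sigma) of a lognormal distribution from
    its mean and standard deviation (the convention used in Table 1)."""
    sigma2 = math.log(1.0 + (std / mean) ** 2)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)

# Example: heterogeneous membrane time constants (values purely illustrative)
random.seed(42)
mu, sigma = lognormal_from_mean_std(20.0, 5.0)  # tau_m ~ 20 +/- 5 ms (assumed)
tau_m = [random.lognormvariate(mu, sigma) for _ in range(100_000)]
```

The sample mean of `tau_m` recovers the requested mean of 20 ms, which is the point of parameterizing by (mean, std) rather than by (μ, σ) directly.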

https://doi.org/10.1371/journal.pcbi.1006781.t001

Due to the nature of the chosen neuronal formalism (see Neuronal dynamics), some model parameters have no direct proxy with experimental measures and were instead determined considering their relations to other variables for which direct experimental data exists. For example, Vreset is not, strictly speaking, a biophysically meaningful variable; its value was chosen considering the data for afterhyperpolarization potentials EAHP relative to the resting membrane potentials EL. Whenever such discrepancies occurred or when parameters had to be derived relative to others, we selected those values that better matched the experimental data. Overall, we observed a remarkable consistency in the ranges of parameter values considered across the different data sources.

Neuronal heterogeneity.

In the homogeneous condition, the parameters for all neurons of a given class are fixed and chosen as a representative example of that class (see left side of Table 1 and bold fI curves in Fig 1). To incorporate neuronal heterogeneity, the specific values of each parameter are independently drawn from a probability distribution, specific to each neuron class (see right side of Table 1 and individual fI curves in Fig 1).

Fig 1. Response properties of the three different neuronal types, E (left), I1 (middle) and I2 (right).

For each neuronal class, the central panels depict single neuron fI curves and the marginal panels give the corresponding distributions of the neuron’s rheobase currents (Irh [pA], bottom), minimum firing rates (νmin [Hz], left), maximum firing rates (νmax [Hz], right) and the slope of the fI curve (Slope [Hz/nA], top). The data was obtained from 1000 neurons of each class. The membrane potential traces depicted at the bottom correspond to the response of the homogeneous neurons (bold traces in the fI curves) to a stimulus step of amplitude Irh + 10 pA, for a duration of 1 second.

https://doi.org/10.1371/journal.pcbi.1006781.g001

To ensure comparability among conditions, we tuned the intrinsic adaptation parameters (a, b, see Neuronal dynamics) independently for each neuron type, in order to retain the following relations among neuronal classes:

  • Rheobase current (excitability)—Irh(E) ≈ Irh(I1) > Irh(I2)
  • Slope of f-I curve (gain)—g(I1) > g(I2) > g(E)
  • Minimum firing rate—νmin(I1) > νmin(I2) > νmin(E)
  • Maximum firing rate—νmax(I1) > νmax(I2) > νmax(E)

This led to the following values (a, b) for sub-threshold and spike-triggered adaptation, respectively: E = (4, 30), I1 = (0, 0), I2 = (2, 10).

One of the advantages of using the modelling approach presented in this study is the possibility to directly interpret the biophysical meaning of the different parameters. In this case, these results highlight the fact that fast spiking interneurons (I1) do not exhibit any form of intrinsic adaptation, which is a reasonable result for a neuron whose primary role is to respond quickly and provide dense, fast, feed-forward inhibition [103]. It should be noted that, after fixing all the neuronal parameters as described above, the absolute values of these four properties did not exactly match the corresponding experimental reports. For example, the values obtained for the rheobase currents were, on average, larger than the ranges reported experimentally, for all three neuron classes (see S2 Table). These discrepancies in absolute values and their potential impact were ameliorated by ensuring the above relations between the key response properties of the different neuronal classes were retained.
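The role of the (a, b) pairs can be illustrated with a forward-Euler sketch of an adaptive leaky integrate-and-fire neuron. The full model is specified in Methods [98, 99]; here, all membrane parameters (C, g_L, thresholds, τw) are illustrative assumptions, not the Table 1 values. Only the way a (sub-threshold) and b (spike-triggered) enter the dynamics follows the text:

```python
def simulate_adlif(I_ext, a, b, T=1000.0, dt=0.1, C=250.0, g_L=16.7,
                   E_L=-70.0, V_th=-50.0, V_reset=-60.0, tau_w=150.0):
    """Forward-Euler sketch of an adaptive LIF neuron driven by a constant
    current I_ext (pA). `a` (nS) sets sub-threshold adaptation, `b` (pA) the
    spike-triggered increment. Returns the spike count over T ms."""
    V, w, spikes = E_L, 0.0, 0
    for _ in range(int(T / dt)):
        V += dt * (-g_L * (V - E_L) - w + I_ext) / C
        w += dt * (a * (V - E_L) - w) / tau_w
        if V >= V_th:
            V = V_reset
            w += b          # spike-triggered adaptation
            spikes += 1
    return spikes

# With the (a, b) pairs above, the non-adapting I1 class fires more spikes
# than the adapting E class for the same suprathreshold drive:
spikes_I1 = simulate_adlif(500.0, a=0.0, b=0.0)   # fast-spiking, no adaptation
spikes_E  = simulate_adlif(500.0, a=4.0, b=30.0)  # adapting excitatory
```

This reproduces, qualitatively, the intended relation νmax(I1) > νmax(E): the accumulating adaptation current w throttles the firing of the E-type neuron while leaving the I1-type unaffected.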

Synaptic properties.

With three neuronal populations, as described above, there are nine possible types of synaptic connection, i.e. syn ∈ {EE, EI1, EI2, I1E, I2E, I1I1, I1I2, I2I1, I2I2}. Synapse types can be grouped by transmitter composition and/or post-synaptic effect, as excitatory synE = {EE, I1E, I2E} and inhibitory synI = {EI1, EI2, I1I1, I1I2, I2I1, I2I2}, as illustrated in Fig 2a. For simplicity, we consider all synaptic transmission as being mediated by either glutamate (excitatory synapses) or gamma-aminobutyric acid (GABA, inhibitory synapses), as illustrated in Fig 2b. This is a reasonable simplification given that these are, by far, the primary neurotransmitters used in the neocortex, as demonstrated by immunohistochemistry [104] and receptor autoradiography studies [39, 105, 106]. Additionally, this is a common assumption underlying the great majority of theoretical and computational studies.

Fig 2. Diversity of synaptic transmission in the microcircuit.

(a) Illustration of the three neuron and nine synapse types considered in this study. (b) Characteristics of synaptic transmission in excitatory (blue) and inhibitory (red) synapses, comprising fast and slow components. (c) Kinetics of spike-triggered PSPs, dependent on pre- and post-synaptic neuron type and determined by the specific receptor kinetics and composition in the postsynaptic neuron. The depicted traces are the PSPs at rest in E neurons (top row) and at a fixed holding potential of −55 mV for inhibitory neurons (I1: middle row, I2: bottom row), as a function of receptor composition, and correspond to the values reported in Table 2 (last column). (d) Distributions of PSP amplitudes and latencies after rescaling (by wsyn and dsyn, respectively, see Table 3) in the homogeneous (top arrows) and heterogeneous (distributions) conditions, for synapses onto E (top), I1 (middle) and I2 (bottom) neurons. Note that the latency distributions are discrete because, for technical reasons, they can only assume values that are multiples of the simulation resolution.

https://doi.org/10.1371/journal.pcbi.1006781.g002

To accommodate the wealth of data available regarding the phenomenology of synaptic transmission and to provide a significant step forward from the traditional approaches, we chose a relatively complex biophysical model [99, 107–109]; primarily due to its plausibility but also due to the availability of direct parameters in the experimental literature (e.g. [109]). The model, described fully in the Materials and Methods, captures the detailed kinetics of single receptor conductances (Fig 2b).

The use of this model allows us to specify different receptor parameters depending on the neuron type they are expressed in (see Table 2), in order to directly match empirical data on receptor kinetics and relative conductance ratios in different neuronal classes. In the absence of such direct data for specific synapses (particularly those involving I2 neurons), we tuned these parameters to match the resulting PSP/PSC kinetics, as well as the relative ratios of total charge elicited by the receptors that compose such synapses. For a detailed list of the data sources used to constrain these parameters, consult S1 Appendix.
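To give a flavour of how distinct rise and decay time constants shape the fast and slow conductance components in Fig 2b, a difference-of-exponentials waveform is a common phenomenological stand-in for single-receptor kinetics. This is a simplified sketch, not the model actually used here (which is specified in Materials and Methods), and the time constants below are assumed, not the fitted values of Table 2:

```python
import math

def double_exp_g(t, tau_rise, tau_decay, g_peak=1.0):
    """Difference-of-exponentials conductance waveform, normalized so that
    its maximum equals g_peak. tau_rise must be smaller than tau_decay."""
    assert tau_rise < tau_decay
    t_peak = (tau_rise * tau_decay) / (tau_decay - tau_rise) \
             * math.log(tau_decay / tau_rise)
    norm = math.exp(-t_peak / tau_decay) - math.exp(-t_peak / tau_rise)
    return g_peak * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise)) / norm

# Fast (AMPA-like) vs. slow (NMDA-like) components; time constants in ms, assumed
ts = [0.1 * k for k in range(1000)]            # 0-100 ms at 0.1 ms resolution
fast = [double_exp_g(t, 0.5, 2.5) for t in ts]
slow = [double_exp_g(t, 2.0, 100.0) for t in ts]
```

With equal peak conductances, the slow component carries far more total charge, which is why the relative charge ratios mentioned above are a useful tuning target when direct receptor data is missing.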

Table 2. Differential receptor expression in the different neuron types.

The kinetics and relative conductance of the different receptors that make up an inhibitory or excitatory synapse onto each neuron result in post-synaptic potentials with equally discernible kinetics. The parameters were chosen based on the corresponding receptor conductance data, where directly available, and/or on the characteristics of the resulting PSPs, resulting in a substantial diversity of postsynaptic responses (see Fig 2c). (*) The PSP values reported in this table were obtained by fitting a double exponential function to single, spike-triggered PSPs, recorded at rest for E neurons and at a fixed holding potential of −55 mV for I neurons.

https://doi.org/10.1371/journal.pcbi.1006781.t002

Having fixed the kinetics of post-synaptic responses according to neuron class (Table 2), we finally rescale the PSP amplitudes (wsyn) and latencies (dsyn) independently for each synapse type (see below), in order to account for the effects of different presynaptic neuron classes and to explicitly match the data reported in [93]. As a result of this parameter fitting process, the responses generated by the synaptic model are good matches to the responses experimentally observed in the nine types of biological synapses represented in this study.

Synaptic heterogeneity.

As all the receptor parameters are fixed and neuron-specific (Table 2), we introduce synaptic heterogeneity by simply distributing the individual values of weights and delays (Fig 2d and Table 3). Whereas in the homogeneous condition, synaptic efficacies (wsyn) and conduction delays (dsyn) are fixed and equal for all connections of a given type, in the heterogeneous condition, these values are randomly drawn from lognormal distributions, left-truncated at 0 for weight distributions and at 0.1 for delay distributions, and parameterized such that the distributions’ means are equal to the homogeneous value. For consistency among the various data sources, we fix the connectivity parameters, including not only structural aspects, but also synaptic weights and delays, to match the data reported in [93]. It is worth noting that variability in delay distributions could also be considered structural, since delays are primarily determined by morphological features (axonal length). However, the synaptic delays were chosen to match the observed latencies in postsynaptic responses. So, for convenience and simplicity, we treat variations in synaptic weights and delays as synaptic properties.
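The sampling scheme just described can be sketched as follows. The truncation bounds (0 for weights, 0.1 ms for delays) come from the text; the means and standard deviations used in the example are illustrative assumptions, not the Table 3 values:

```python
import math
import random

def truncated_lognormal(mean, std, lower, size, rng):
    """Draw `size` values from a lognormal parameterized by its (mean, std),
    redrawing any sample below `lower` (left truncation, as in the text)."""
    sigma2 = math.log(1.0 + (std / mean) ** 2)
    mu, sigma = math.log(mean) - 0.5 * sigma2, math.sqrt(sigma2)
    out = []
    while len(out) < size:
        x = rng.lognormvariate(mu, sigma)
        if x >= lower:
            out.append(x)
    return out

rng = random.Random(1)
# Example EE weights (mV) and delays (ms); means and spreads are illustrative.
# Weights are truncated at 0 (vacuous for a lognormal), delays at 0.1 ms.
w_EE = truncated_lognormal(mean=1.0, std=0.7, lower=0.0, size=50_000, rng=rng)
d_EE = truncated_lognormal(mean=1.5, std=0.5, lower=0.1, size=50_000, rng=rng)
```

As long as the truncated probability mass is negligible, the sample means stay at the homogeneous values, which is the comparability property the text requires.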

Table 3. Synaptic and structural parameters in the microcircuit.

Each of the nine connection types (which can be grouped as indicated) is characterized by a specific connection density, weight and delay. In the homogeneous condition, weights and delays are fixed and equal to the mean values (w̄syn, d̄syn) for all synapses of a given type, whereas in the heterogeneous condition they are independently drawn from lognormal distributions with the corresponding mean and standard deviation. The last two rows of the table are the connection-specific structural bias parameters, used to skew the network’s degree and weight distributions. The indicated values were taken directly from [110] and [71]. The cases marked with - or x correspond to connections that were either tested, revealing no significant effect (-), or untested due to missing data (x). In both cases, we set the corresponding values to 0.

https://doi.org/10.1371/journal.pcbi.1006781.t003

The mean weight values were chosen to rescale the PSP amplitude of each synapse type to the target value. Additionally, as described below, if both structural and synaptic heterogeneity conditions are considered simultaneously, the weight distributions are skewed in order to introduce structural weight correlations (see Generating structural heterogeneity in Materials and methods).

Structural properties.

The structure of the network is defined by the density parameter psyn, which is specific to each of the nine connection types. For consistency with the results and methods presented in [110] and [71], we set the connection densities (psyn) to the values reported in [93] (see Table 3). However, it is worth noting that the values reported in the literature can vary substantially, possibly reflecting methodological/technical limitations and/or the fact that connection density is a highly region- and species-specific feature. In fact, among all the complex parameter sets used throughout this study, the single parameter that was most difficult to reconcile across multiple sources was psyn. In the homogeneous case, random connections are created between neurons in a source population pre and a target population post (with pre, post ∈ {E, I1, I2}), each with probability psyn.
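The homogeneous (Bernoulli) connectivity rule can be sketched as follows; population sizes and the density value are illustrative placeholders:

```python
import numpy as np

def homogeneous_connectivity(n_pre, n_post, p_syn, autapses=False, rng=None):
    """Bernoulli connectivity matrix A[post, pre]: each pair is connected
    independently with probability p_syn (homogeneous condition)."""
    rng = np.random.default_rng(rng)
    A = rng.random((n_post, n_pre)) < p_syn
    if not autapses and n_pre == n_post:
        np.fill_diagonal(A, False)  # exclude self-connections within a population
    return A

# illustrative E->E block with p_syn = 0.1 (placeholder values)
A_EE = homogeneous_connectivity(1000, 1000, 0.1, rng=0)
```

With this rule, each neuron's in- and out-degree follows a binomial distribution concentrated around n·psyn, which is the narrow-degree baseline against which the structural heterogeneity condition below is contrasted.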

Structural heterogeneity.

In order to account for structural heterogeneity, we bias the network’s degree distributions by modifying the structure of the connectivity matrix Asyn, following the methods introduced in [71, 110] and validated against the same primary sources of experimental data used in this study.

By controlling the skewness of the out-/in-degree distributions (kout/in parameters, see Generating structural heterogeneity in Materials and methods), we can generate completely random, uniform connectivity (kout/in = 0, Fig 3a) or highly structured in-/out-degree distributions, with a larger variance in the number of connections per neuron (kout/in > 0, Fig 3b). For the structural heterogeneity condition, these parameters were fixed to the values shown to provide the best fit to the experimentally determined connectivity data (see [71, 110] and Table 3).
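The specific algorithm of [71, 110] is detailed in Materials and methods; as a generic sketch of the idea (an assumption for illustration, not the authors' procedure), one can skew in-degree distributions while preserving the overall density by drawing a per-target connection propensity with unit mean and variance k:

```python
import numpy as np

def skewed_connectivity(n_pre, n_post, p, k, rng=None):
    """Connectivity with skewed in-degrees: target neuron i receives each
    connection with probability p * a_i, where a_i is gamma-distributed
    with mean 1 and variance k. k = 0 recovers the uniform random case."""
    rng = np.random.default_rng(rng)
    if k > 0:
        a = rng.gamma(shape=1.0 / k, scale=k, size=n_post)
    else:
        a = np.ones(n_post)
    p_i = np.clip(p * a, 0.0, 1.0)  # per-target connection probability
    return rng.random((n_post, n_pre)) < p_i[:, None]

A_uniform = skewed_connectivity(500, 500, 0.1, k=0.0, rng=0)
A_skewed = skewed_connectivity(500, 500, 0.1, k=0.5, rng=0)
in_var_uniform = A_uniform.sum(axis=1).var()
in_var_skewed = A_skewed.sum(axis=1).var()
```

Increasing k broadens the in-degree distribution far beyond the binomial baseline while the mean connection density stays at p, mirroring the qualitative difference between Fig 3a and 3b.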

Fig 3. Microcircuit connectivity for (a) homogeneous and (b) heterogeneous network structures.

The large panels show the circuit’s complete connectivity matrix Asyn, comprising all connections among all neuron classes. The panels on the left sides show the corresponding out-degree (kout) distributions for the neuronal populations (E, I1 and I2 from top to bottom); the panels at the bottom show the corresponding in-degree (kin) distributions (E, I1 and I2 from left to right).

https://doi.org/10.1371/journal.pcbi.1006781.g003

Additionally, in conditions where synaptic heterogeneity is also present (fully heterogeneous circuit), structural heterogeneity is further expressed as a bias in the synaptic efficacies for all incoming and outgoing connections of a given neuron. Following [71, 110] and [72], this bias is implemented by rescaling individual synaptic efficacies in order to introduce correlations between them, see Table 3 and Materials and methods. It should be noted that structural heterogeneity only modifies connections from E neurons, as most other connections were shown to have negligible effects (marked with - in Table 3) or could not be tested (marked with x) due to technical constraints. Even though this is likely to be incomplete, it appears to be sufficient to capture the most significant structural effects and their impact on population dynamics, while explicitly accounting for the experimental data in [93]. So, for consistency, we implemented and parameterized structural heterogeneity in the same manner as that reported in [71] and [110].

Emergent population dynamics

Throughout this study, and in order to isolate the effects of different sources of heterogeneity, we consider five different microcircuits: fully homogeneous (Hom), structural (Str), neuronal (Neu) or synaptic (Syn) heterogeneity in isolation and a fully heterogeneous circuit (Het), accounting for the combined effects. In this section, we set out to quantify and evaluate the specific impact of the different forms of heterogeneity on the characteristics of population activity. To do so, we consider the circuit’s responses to an unspecific and stochastic external input, modelling cortical background / ongoing activity (see Input specifications in Materials and methods). We determine and compare the circuit’s responsiveness by looking at the population rate transfer functions, as exemplified in Fig 4a for I2 neurons (complete results are provided in S1 Fig), and summarize the results by the change in absolute gain (ΔGain) and offset (ΔOffset) introduced by each source of heterogeneity, relative to the homogeneous condition (Fig 4b).

Fig 4. Characteristics of population activity in the quiet state.

(a): Rate transfer function of the I2 neurons in the homogeneous (dots) and structurally heterogeneous (squares) conditions. (b): Change in absolute gain (ΔGain, expressed as a percentage of rate gain relative to the baseline homogeneous condition) and offset (ΔOffset), see annotations in (a), for the E (blue), I1 (red) and I2 (orange) populations, for each heterogeneous condition (structural, neuronal, synaptic, and all forms combined), relative to the homogeneous condition. Error bars indicate the standard deviation across ten simulations per condition. (c): Fraction of active E neurons in the different conditions as a function of input rate νin. (d): Distributions of absolute distances between neurons’ mean membrane potentials 〈Vm〉 and their firing threshold Vth, for the three neuronal populations considered, under the different heterogeneity conditions. The dashed lines indicate the corresponding values reported in [94]. For illustrative purposes, we depict Vm traces for a small set of randomly chosen neurons in the background. (e): Effective membrane time constants (τeff = Cm/〈Gtotal〉) in the noise-driven regime, for the different neuronal classes and conditions. Error bars correspond to the means and standard deviations across each population. The dashed lines indicate the baseline values (τ0 = Cm/gleak). All results depicted were calculated over an observation period of 10 s. The results depicted in (d, e) were acquired with a fixed input rate of νin = 10 spikes/s and correspond to the distributions across the population, for a single simulation per condition.

https://doi.org/10.1371/journal.pcbi.1006781.g004

All heterogeneous conditions, particularly neuronal and synaptic, cause a slight offset for all neuron types (most pronounced for I2 neurons), making them more responsive (firing at lower input rates), but the effect is not substantial (Fig 4b, top). In most of the conditions analysed, the E population is rather unresponsive, with less than 1% of the neurons active (Fig 4c) and firing at rates below 1 spike/s, regardless of the input rate. While structural and neuronal heterogeneity are incapable of circumventing this effect, synaptic heterogeneity appears to be important for the network to fire at more reasonable rates (albeit still very sparsely), resulting in a substantial modulation of the gain of the rate transfer function (Fig 4b, bottom).

It should be noted that the impact of structural heterogeneity alone is mitigated by the low E rates, since the structural bias exists only within excitatory synapses or between excitatory neurons and fast-spiking interneurons (i.e. the E→E and E→I1 connections, see Table 3). So, if the E population rarely fires, it is difficult to ascertain the effects of structural heterogeneity, suggesting either that its relevance pertains mostly to active states, when population activity is slightly higher, or that it is negligible at this scale.

The extremely sparse firing of E neurons that we observe is consistent with physiological measurements in layer 2/3 (e.g. [92, 94, 111–114]), but it significantly limits the degree to which we can quantify the effects of heterogeneity on population activity. So, in order to obtain greater insight, we look at the sub-threshold responses and the characteristics of membrane potential dynamics (Fig 4d and 4e). Excitatory neurons are always significantly hyperpolarized, with their mean membrane potentials kept far from threshold (Fig 4d, blue), and thus require much stronger depolarizing inputs to fire, compared with both inhibitory types. The inhibitory populations are, on average, much more depolarized and their membrane potentials fluctuate closer to their firing thresholds, particularly I1 (Fig 4d, red). Qualitatively, the ordering of the average degree of depolarization among the different populations is retained across all conditions, with I1 neurons being most strongly depolarized, followed by I2 and E, and is consistent with experimental reports for circuits in a state of quiet wakefulness (Fig 4d, dashed lines). This feature stems directly from the electrophysiological properties of the different neuronal classes and the interactions among the three populations (given that it is already observed in the homogeneous circuit). Both synaptic and neuronal heterogeneity greatly increase the variability in the distribution of mean membrane potentials across neurons and cause a slight overlap between the E and I2 populations, an effect that is also consistent with experimental evidence [94].

Active synapses contribute to the total membrane conductance and cause a deviation from the resting membrane time constant [115, 116]. This shunting effect may be mild in sparsely active circuits [117], but it provides a form of activity-dependent modulation of single neurons’ integrative properties [118], which constrains the circuit’s responsiveness. In the absence of synaptic input, I1 neurons have faster responses, characterized by a short baseline membrane time constant (τ0 = Cm/gleak ≈ 10.7 ms), whereas I2 and E neurons are slower (τ0 ≈ 22.3 and 25.1 ms, respectively) and can thus integrate their synaptic inputs over a larger time scale (dashed lines in Fig 4e). This relationship between the neuronal classes (τeff(I1) < τeff(I2) < τeff(E)) is a consequence of the neurons’ physiological properties and is consistent with empirical evidence [59, 118, 119]. However, when driven by external input, the ordering is modified and I2 neurons respond slowest, i.e. τeff(I2) > τeff(I1) > τeff(E). The presence of heterogeneous synapses is important in reducing the magnitude of this shunting effect (Fig 4e), which is very substantial in all conditions. It should be noted that, while the sparsity of recurrent activity (particularly that of E neurons) would prompt us to expect a very minor reduction in τeff, the observed results are caused by the large synaptic input provided as background.
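As a minimal worked example of the shunting effect, with hypothetical conductance values rather than those of the model:

```python
# effective membrane time constant under synaptic bombardment:
# tau_eff = C_m / <G_total>, where G_total = g_leak + mean synaptic conductance
C_m = 250.0      # membrane capacitance in pF (hypothetical)
g_leak = 10.0    # leak conductance in nS, giving tau0 = 25 ms
g_syn = 40.0     # mean total synaptic conductance in nS (hypothetical)

tau0 = C_m / g_leak               # resting time constant: 25 ms
tau_eff = C_m / (g_leak + g_syn)  # shunted time constant: 5 ms
```

Even a moderate level of synaptic conductance thus shortens the integration window several-fold, which is the mechanism behind the large gap between the dashed baselines and the measured τeff in Fig 4e.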

Excitation/inhibition balance.

The balance of excitation and inhibition is one of the most important and widely observed features in the neocortex. It plays a pervasive role in modulating and stabilizing circuit dynamics [120], shifting the population state [121–123], selectively gating and routing signals [124–126] and maintaining sparse, distributed dynamics [114, 117], critical for adequate processing and computation [127–129].

As demonstrated in the previous section (Fig 4), the different sources of heterogeneity significantly influence the circuit’s responsiveness, partially by modifying how the different neurons and neuronal populations receive and integrate their synaptic inputs. These differences can also be observed in the amplitude and time course of the total excitatory and inhibitory drive onto each neuron, as can be seen in Fig 5. The results shown are a compound effect of the kinetics of the specific receptors involved and the post-synaptic currents they elicit, the physiological properties of the different neuronal classes as well as the density of specific connections.

Fig 5. Balance of excitation and inhibition in excitatory neurons driven by background, Poissonian input at νin = 10 spikes/s.

(a): Distribution of mean amplitudes of excitatory (blue) and inhibitory (red) membrane currents onto E neurons in the different conditions, as well as the absolute difference between them (grey). For each condition, we show the single data points, where the currents onto a given neuron are summarized as a set of 3 points (〈Iexc〉 in blue, 〈Iinh〉 in red and |〈Iinh〉 − 〈Iexc〉| in grey). Overlaid on top of these data points are the distributions across all the neurons, summarized as box-plots: the box represents the first and third quartiles (IQR); the median is marked in red; the whiskers are placed at 1.5 IQR and the outliers can be seen in the underlying data points. The white markers in the middle display the mean difference of synaptic amplitudes across all neurons, for each condition. (b): zero-lag correlation coefficient between excitatory and inhibitory synaptic currents (mean and standard deviation across the E population). (c, d): Examples of the total excitatory and inhibitory synaptic currents received by a randomly chosen E neuron in the homogeneous and fully heterogeneous conditions, respectively. Results in (a) and (b) were gathered from 10 simulations per condition.

https://doi.org/10.1371/journal.pcbi.1006781.g005

In a globally balanced state, the amplitudes of excitatory and inhibitory synaptic currents cancel each other on average. This occurs in our microcircuit model only in the presence of neuronal heterogeneity (Fig 5a). The effect of variability in connectivity structure is indistinguishable from the homogeneous condition, whereas variability in synaptic weights and delays significantly increases the variance in the distribution of post-synaptic current amplitudes, but does not shift the mean. This results in an inhibition-dominated synaptic input, resembling that of the homogeneous condition (see also Fig 5c), despite the substantially different distributions.

Apart from being balanced on average, a condition of “detailed” balance [126, 130] is characterized by E and I currents that closely track each other and are strongly anti-correlated (Fig 5b). Of all conditions, the homogeneous circuit shows the weakest anti-correlation between excitatory and inhibitory currents (CC ≈ −0.63, see also Fig 5c). Synaptic heterogeneity causes a slight improvement, but the most important contribution to this effect comes from neuronal heterogeneity. In this condition, the mean correlation coefficient reaches CC ≈ −0.8, although neuronal heterogeneity also introduces a greater variance than synaptic or structural heterogeneity (see also Fig 5d).
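The zero-lag correlation measure of detailed balance can be illustrated on synthetic current traces (a toy sketch, not the model's actual currents), where inhibition tracks a shared drive with opposite sign:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# shared, slowly varying drive (smoothed noise with ~unit variance)
kernel = np.ones(50) / np.sqrt(50)
drive = np.convolve(rng.normal(size=n), kernel, mode="same")
# inhibition mirrors excitation; each trace carries independent noise
I_exc = drive + 0.3 * rng.normal(size=n)
I_inh = -drive + 0.3 * rng.normal(size=n)

cc = np.corrcoef(I_exc, I_inh)[0, 1]  # strongly negative: "detailed" balance
```

The more closely the inhibitory current tracks the excitatory one (i.e. the smaller the independent noise relative to the shared drive), the closer this coefficient approaches −1, which is the sense in which CC ≈ −0.8 indicates tighter tracking than CC ≈ −0.63.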

Both global and detailed balance thus appear to be emergent properties of heterogeneous microcircuits, primarily due to neuronal diversity, but encompassing also a clear influence of synaptic diversity. As is the case with all results presented so far, the fully heterogeneous circuit retains several key properties of interest, but appears to inherit them from different sources. As before, the effects of structural heterogeneity are mitigated by the very sparse firing of the E population, which renders them moot; no significant deviation from the homogeneous condition is observed.

Active processing and computation

In order to induce a functional state, engaging the circuit in active processing, we introduce an additional input signal, directly encoded as a piece-wise constant somatic current (see Input specifications in Materials and methods). We began by tuning the input amplitudes (of both the background input firing rate νin and the external input current ρu) independently for each condition, in order to approximate the relative ratio of mean firing rates among the different populations (see S2 Fig), i.e. we attempt to find a combination of input parameters that keeps the mean firing rates within realistic bounds (νE ∈ [0.5, 5] spikes/s, with corresponding bounds for the I1 and I2 populations, considering the values reported in [93, 94, 111, 114]).

We consider the circuit’s responses to this input signal as an active state, as opposed to the condition explored in the previous sections, where the circuit was driven solely by background, stochastic input (noise). It is worth noting, however, that the similarities between what we call quiet and active states and their biological counterparts are limited (see Discussion). In the following, we show that despite these limitations, the actively engaged circuit operates in similar dynamic regimes to its biological counterpart and maintains the key statistical features that are most likely to play a significant role in modulating the circuit’s processing capacity.

In this section, we assess the microcircuit’s capacity to compute complex functions of the input signal, as described in Materials and methods. Note that we purposefully removed any predetermined structure from the input signal, such that the measurements reflect the properties of the system and not the acquisition of structure present in the input; if we were to consider naturalistic sensory input as the driving signal, this would not be the case. Furthermore, we intentionally focus on generic information processing, i.e. the ability to perform arbitrary transformations of an input signal, and not on specific functions that might be performed by particular microcircuits.

Spiking activity in the active state.

Due to the extremely sparse firing observed in the quiet state, an adequate comparison of spiking statistics is only sensible in the active condition. While no explicit effort was taken to constrain the circuit’s operating point when tuning the input parameters (the focus was purely on average firing rates), all conditions operate in an asynchronous irregular regime (exemplified by the raster plots in Fig 6a). This regime is characterized by low pairwise correlations (CC < 0.03) and high coefficients of variation (CVISI ≈ 0.9), for all neuron classes, in all conditions. Each condition generates different spiking responses with slight variations in activity statistics. The profiles of the activity statistics for the three neuronal classes are summarized in Fig 6c–6g for the five different conditions. The complete results, displaying the specific profiles for the different classes and conditions separately, are provided in S3 Fig.
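The two statistics emphasized here can be computed from spike trains as follows (a minimal sketch; the bin size, rates and durations are illustrative, not the values used in the study):

```python
import numpy as np

def cv_isi(spike_times):
    """Coefficient of variation of the inter-spike intervals."""
    isi = np.diff(np.sort(spike_times))
    return isi.std() / isi.mean()

def mean_pairwise_cc(spike_trains, bin_size, t_max):
    """Mean zero-lag correlation of binned spike counts across all pairs."""
    bins = np.arange(0.0, t_max + bin_size, bin_size)
    counts = np.array([np.histogram(st, bins)[0] for st in spike_trains])
    C = np.corrcoef(counts)
    i, j = np.triu_indices(len(spike_trains), k=1)
    return C[i, j].mean()

# independent Poisson trains at 20 spikes/s for 100 s (illustrative)
rng = np.random.default_rng(3)
trains = [rng.exponential(1 / 20, 2500).cumsum() for _ in range(10)]
trains = [t[t < 100.0] for t in trains]
cv = cv_isi(trains[0])                                  # ~1 for Poisson
cc = mean_pairwise_cc(trains, bin_size=0.1, t_max=100.0)  # ~0 if independent
```

Independent Poisson firing yields CVISI ≈ 1 and CC ≈ 0, which is the reference point against which the reported values (CC < 0.03, CVISI ≈ 0.9) mark the asynchronous irregular regime.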

Fig 6. Statistical properties of spiking activity in the active state for the E (blue), I1 (red) and I2 (orange) populations.

(a, b): Example raster plots depicting the activity of a small, randomly chosen subset of neurons, over a recording period of 2.5s in a homogeneous and fully heterogeneous circuit, respectively. (c)-(g): Activity statistics profiles for the different populations (overlaid) in the different conditions. The radial axes represent: mean pairwise correlation coefficient (CC), coefficient of variation of the inter-spike intervals (CVISI), mean firing rate (ν[spikes/s]), entropy (HISI[bits]) and burstiness (ISI5%[s]) of the firing patterns. The values along the radial axes are normalized to the largest value across all neuron classes in each condition. The units have been removed for better legibility (see S3 Fig). (h)-(j): Distribution of firing rates ν[spikes/s], CVISI and CC for the E population in the different heterogeneity conditions (see legend in (j)). The results depicted are population averages, obtained from one simulation per condition.

https://doi.org/10.1371/journal.pcbi.1006781.g006

In the homogeneous condition (Fig 6a and 6c), I1 neurons exhibit the most distinctive profile, with the highest degree of synchrony and the largest firing rates, as well as very bursty and regular firing relative to any other neuronal class and any other condition. Consistent with empirical observations [94, 114, 131, 132], E neurons retain an extremely low firing rate (νE ≤ 1 spikes/s), even when stimulated by the extra input current that characterizes the active state. The main difference is that a larger fraction of the population is engaged and actively firing. This result demonstrates that sparse firing of E neurons is a stable characteristic of layer 2/3 microcircuits: it emerges from the strong and dense inhibition provided primarily by I1 neurons, which fire at very high rates and strongly inhibit the E population, regardless of the variations introduced by the different sources of heterogeneity or the addition of the extra excitatory input.

The various heterogeneity conditions particularly affect the degree of synchronization and burstiness across the different populations. Only the firing rates of I1 neurons are significantly modified by heterogeneity; for all other neuron classes, they remain consistently low. Irregularity and randomness in the firing patterns are mostly unaffected, as is clear from the similarity of the respective axes (HISI and CVISI in Fig 6c–6g).

The effects of structural heterogeneity (Fig 6d) are only noticeable on the neuronal classes that are directly affected (E and I1, see Table 3); no changes in activity statistics are observable for the I2 population (orange profiles in Fig 6c and 6d). Excitatory neurons fire less synchronously and exhibit a much lower tendency to fire in short spike bursts, compared with the homogeneous condition. On the other hand, I1 neurons show a slight decrease in synchrony and firing rate.

Diversity in neuronal parameters (Fig 6e) strongly affects the response properties of I2 neurons, slightly increasing their firing rates and correlation coefficients. The most noticeable effect, however, is a greatly increased tendency to fire in short bursts (ISI5% ≈ 1 s), which is the most significant deviation from the standard profile exhibited by this neuronal class in all other conditions (for a more complete comparison, consult S3 Fig). Heterogeneous E neurons have a higher tendency to fire synchronously (albeit still with very low CC), compared to any other condition. As for the I1 population, the most significant effect of neuronal heterogeneity is a reduction of the mean firing rate.

Synaptic heterogeneity (Fig 6f) causes a clear alteration of the firing profile of all neuron classes, particularly E and I1, resulting in a noticeable decrease in synchronization in all populations and thus having a marked decorrelating effect. It also produces a substantial reduction in the tendency for burst spiking in the E and I2 populations. The firing rates of both inhibitory populations are reduced (due to the chosen input parameters, see S2 Fig), which, consequently, leads to a slight increase in the firing rates of the E neurons, which also fire slightly more regularly.

Interestingly, the firing profile observed in the fully heterogeneous circuit (Fig 6b and 6g) exhibits some unique features, different from any of those created by the individual sources of heterogeneity in isolation. Particularly prominent is the complete abolition of any degree of synchronization in all neuronal populations, which show the smallest correlation coefficients of all the cases considered. This effect is likely inherited primarily from synaptic heterogeneity, but goes beyond the effect observed there. The firing profile of I2 neurons in the fully heterogeneous circuit retains all the features observed in the homogeneous circuit, indicating that the variations introduced by neuronal and synaptic heterogeneity are counteracted by the complex interactions between the different sources of heterogeneity.

Overall, the statistics of population activity clearly demonstrate that the fully heterogeneous circuit is more than the sum of its parts, i.e. the variations introduced by the combination of multiple sources of heterogeneity cannot be fully accounted for by their individual effects and lead to more complex interactions that strongly modulate the circuit’s operating point. In addition, all heterogeneity conditions give rise to similar distributions (Fig 6h–6j), i.e. lognormal distributions of firing rates (argued to be a beneficial feature [72, 133]) and correlation coefficients as well as a Gaussian distribution of CVISI, with mean close to 1. The different conditions simply modulate the parameters of the distributions: synaptic and neuronal heterogeneity broaden the firing rate distributions; synaptic heterogeneity alone is responsible for skewing the CCs to smaller values, an effect that is stronger and more pronounced in the fully heterogeneous circuit. It should be noted that the tails of the rate distributions in networks with heterogeneous neurons and/or synapses fall beyond the range typically observed in layer 2/3 microcircuits. The extra somatic current with which we emulate the active state drives the targeted sub-population to fire excessively in these conditions, which highlights a limitation of our approach (see Limitations and future work).

Temporal tuning and memory capacity.

Online processing requires the continuous acquisition and integration of temporally extended information arriving through multiple time-varying input streams. As such, cortical microcircuits need to retain information over time (fading memory) and combine it in meaningful ways. This generic operating principle thus constitutes an important feature underlying cortical information processing (see e.g. [4, 134–136]), and is primarily determined by architectural constraints that potentially modulate the circuit’s operating timescales. In this section, we investigate the effect of heterogeneity on the ability of our microcircuit model to retain information over long timescales, enabling it to operate over a broad dynamic range.

The baseline, homogeneous circuit already exhibits these properties (Fig 7a), with an average intrinsic time constant (measured as the decay of the membrane potential autocorrelation functions, see Materials and methods) of ≈ 126.75 ms in the quiet state and a relatively broad dynamic range. Thus, even when all the system’s components are homogeneous, neurons appear to operate at relatively long time scales. This can be partially attributed to the fact that each input elicits very small, sub-millivolt responses [137, 138] and these neurons are strongly hyperpolarized. Therefore, the microcircuit complexity, in combination with the nature of the sub-threshold dynamics, is inherently sufficient to allow it to operate over a large and long temporal range and to rapidly switch to the timescales of its primary driving input (Fig 7a, top).
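The intrinsic timescale measure (the decay of the membrane potential autocorrelation) can be sketched as below; we validate it on a discretized Ornstein–Uhlenbeck surrogate with a known 20 ms time constant, an assumption for illustration rather than the model's actual Vm dynamics:

```python
import numpy as np

def intrinsic_timescale(v, dt, max_lag):
    """Estimate tau_int as the lag at which the autocorrelation of v
    first drops below 1/e."""
    v = v - v.mean()
    ac = np.array([v[: v.size - k] @ v[k:] for k in range(max_lag)])
    ac = ac / ac[0]
    return np.flatnonzero(ac < 1.0 / np.e)[0] * dt

# Ornstein-Uhlenbeck surrogate: tau = 20 ms, dt = 0.1 ms
dt, tau, n = 0.1, 20.0, 100_000
rng = np.random.default_rng(1)
v = np.zeros(n)
for i in range(1, n):
    v[i] = v[i - 1] * (1 - dt / tau) + np.sqrt(2 * dt / tau) * rng.normal()

tau_est = intrinsic_timescale(v, dt, max_lag=600)  # close to 20 ms
```

For an exponentially decaying autocorrelation exp(−lag/τ), the 1/e crossing occurs exactly at lag = τ, so the estimator recovers the generating time constant up to sampling noise.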

Fig 7. Temporal tuning and linear memory capacity.

(a): distributions of intrinsic timescales (τint) in the quiet (Q, bottom) and active (A, top) states, for each of the different conditions. (b) Optimal stimulus resolution (Δt) that allows each circuit to perfectly track its input signal (see also red star markers in (a) and S4 Fig). (c): Fading memory functions for the different heterogeneity conditions, determined as the ability to reconstruct the input signal at different delays (k). Colours as in panel (d) below. (d): Total memory capacity, corresponding to the area under the curves in (c). Apart from the main conditions depicted in (c), pair-wise combinations among conditions are depicted in between. (e) Effects of synaptic heterogeneity on memory capacity, in conditions where the weight distributions are fixed, but re-shuffled such that the strongest weights are assigned to the connections among the input population (Shuffled1), from the input population to all excitatory neurons (Shuffled2), or from the input population to all neurons (Shuffled3). Results depicted in (b), (d) and (e) were gathered from 10 simulations per condition.

https://doi.org/10.1371/journal.pcbi.1006781.g007

In order to compute reliably, it is beneficial if the system’s high-dimensional dynamics become transiently ‘enslaved’ by the input [16]. This would correspond to a switch in the system’s intrinsic timescale to the dominant input timescale, allowing the intrinsic fluctuations to help the circuit track the input dynamics. The observed features of a broad dynamic range in the quiet state, along with the ability to rapidly switch to the shorter timescale of the driving input during active processing, appear to stem primarily from the microcircuit’s composition and dynamics (homogeneous condition). However, they are present to a greater extent in the presence of structural and neuronal heterogeneity (see Fig 7a). The same pattern holds for memory capacity, whereby neuronal heterogeneity causes a large extension of the circuit’s memory range, leading to a slowly fading and relatively long memory store (Fig 7c and 7d).

We express the memory functions as the ability to use the current state of the circuit x[n] to reconstruct the input sequence at different time lags, u[n − k] for k ∈ {0, 1, 2, …, T} (as depicted in Fig 7c). The total memory capacity CM (Fig 7d) is then the sum of the individual capacities at the different lags k and thus characterizes the maximum extent to which information about past inputs is retained in the current state. It is worth noting, however, that these results should be interpreted cautiously, as the different circuits exhibit different responsiveness to the input signal. For comparability, the time constant at which the input varies (Δt) was chosen independently for each condition as the value that maximizes the circuit’s ability to reconstruct its input, i.e. the value that allows optimal performance at zero time lag (see Fig 7b and S4 Fig).
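The memory capacity measure can be illustrated on a generic rate-based random network (an echo-state-style surrogate, not the spiking microcircuit of this study), with a linear ridge readout trained per lag and the capacity at each lag taken as the squared correlation between reconstruction and target:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, max_lag, washout = 100, 5000, 30, 100

u = rng.uniform(-1, 1, T)                      # random input sequence
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # set spectral radius to 0.9
w_in = rng.uniform(-1, 1, N)

x = np.zeros((T, N))                           # network states
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

X = x[washout:]                                # discard initial transient
G = X.T @ X + 1e-6 * np.eye(N)                 # ridge-regularized Gram matrix
caps = []
for k in range(max_lag):
    target = u[washout - k: T - k]             # input k steps in the past
    w_out = np.linalg.solve(G, X.T @ target)   # linear readout for lag k
    caps.append(np.corrcoef(X @ w_out, target)[0, 1] ** 2)

C_M = float(np.sum(caps))                      # total memory capacity
```

The per-lag capacities decay with k (the fading memory function of Fig 7c), and their sum plays the role of CM in Fig 7d.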

The memory range of structurally heterogeneous networks is similar to that of the fully heterogeneous circuit (CM ≈ 3.65 and 3.49, respectively) and both are markedly larger than in the corresponding homogeneous case (CM ≈ 2.67, Fig 7c and 7d). Thus, despite having a barely noticeable effect on microcircuit dynamics and state transitions, structural heterogeneity appears to have non-negligible functional effects. This is somewhat surprising in light of all the results discussed so far, but consistent with e.g. [139], which proposed that a heterogeneous network structure can give rise to broad and diverse temporal tuning.

The fact that physiological diversity in single neuron properties (neuronal heterogeneity) extends the dynamic range and memory capacity is to be expected, since it directly decreases redundancy and adds variability to the population responses. However, the magnitude of the effect is very significant, nearly doubling the memory capacity, making neuronal heterogeneity stand out as the most functionally relevant condition. In the presence of neuronal heterogeneity alone, the circuit becomes much more responsive, with broader temporal tuning and memory capacity (Fig 7), and capable of achieving optimal performance in reconstructing the input signal even when it varies at short timescales (peak reconstruction performance is achieved with Δt ≈ 3.9 ms, versus Δt ≈ 13.3 ms in the homogeneous circuit, see Fig 7b and S4 Fig).

Counter-intuitively, synaptic heterogeneity makes the circuit ‘sluggish’, in the sense that it appears less responsive and incapable of tracking fast fluctuations in the input signal (Δt ≈ 20 ms, Fig 7b and S4 Fig). These circuits are also endowed with a very short memory capacity (CM ≈ 2.7), similar to that of the homogeneous circuit. Accordingly, diversity in synaptic components enforces a very narrow temporal tuning, skewed towards short timescales in the quiet state (mean τint ≈ 30 ms), and reduces the circuit’s ability to acquire the input timescale (Fig 7a and 7b).

These apparently deleterious effects of synaptic heterogeneity hint at an important limitation of our implementation (further discussed in Limitations and future work): heterogeneous connection strengths in biological circuits are not randomly assigned, but result from learning and adaptation processes and sub-serve the development of a functional architecture (see, e.g. [185]) tailored to the circuit’s processing demands. For simplicity, however, we did not introduce any form of synaptic adaptation in our microcircuit model and distributed the connection strengths randomly across the network, which may have precluded us from capturing the functional relevance of synaptic heterogeneity.

In order to investigate whether non-random features of the distribution of synaptic weights play a role in the memory capacity, we performed an additional test whereby the weight distributions were retained as in the original implementation, but the individual values were re-shuffled according to three different assumptions about which connections are most likely to have the strongest weights (see Fig 7e): connections within the sub-population of E neurons that were directly stimulated (Shuffled1); connections between these input-driven neurons and other E neurons (Shuffled2); and connections from the input-driven neurons to every other neuron in the microcircuit (Shuffled3, i.e. also involving synapses onto the I1 and I2 populations). The results depicted in Fig 7e demonstrate that each of these modifications improves the circuit’s memory capacity relative to the random synaptic heterogeneity condition. These improvements are only marginal (reaching a maximum value of CM ≈ 2.78), but they substantiate the claim that the non-random nature of synaptic heterogeneity is functionally meaningful and ought to be carefully scrutinized. Among the conditions tested, strengthening the excitatory synapses connecting the input-driven neurons to the remaining E neurons (Shuffled2) appears to be the most beneficial.
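The re-shuffling procedure can be sketched as follows. This is a minimal illustration on a random weight matrix, not the full microcircuit; the population sizes, connection probability and lognormal parameters are placeholders, and only the Shuffled2-style condition (strongest weights reassigned to connections from input-driven to remaining E neurons) is shown:

```python
import numpy as np

rng = np.random.default_rng(4)
n_e, n_stim = 400, 80      # E population; the first n_stim neurons are input-driven
mask = rng.random((n_e, n_e)) < 0.1            # rows = targets, cols = sources
np.fill_diagonal(mask, False)
W = np.where(mask, rng.lognormal(0.0, 1.0, mask.shape), 0.0)

# connections from input-driven sources onto the remaining E neurons
target = mask & (np.arange(n_e)[None, :] < n_stim) & (np.arange(n_e)[:, None] >= n_stim)

# keep the weight *distribution*, but reassign the largest values to the
# target group (within-group order is irrelevant for this manipulation)
values = np.sort(W[mask])[::-1]                # all existing weights, strongest first
order = np.argsort(~target[mask], kind="stable")  # target-group entries first
flat = np.empty_like(values)
flat[order] = values
W_shuf = np.zeros_like(W)
W_shuf[mask] = flat
```

The connectivity pattern and the set of weight values are untouched; only the assignment of values to connections changes, which is exactly what distinguishes the shuffled conditions from the random one.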

Overall, these results demonstrate a clear relation between the system’s responsiveness to temporal fluctuations in the input signal, the intrinsic timescales that characterize the neurons’ activity, and the circuit’s memory capacity. There appears to be a ‘push-and-pull’ phenomenon caused by the interaction of neuronal and synaptic heterogeneity, whereby the former significantly boosts the circuit’s dynamic range and memory capacity, whereas the latter pulls it back to values similar to the homogeneous condition. Additionally, the independent sources of heterogeneity co-modulate each other’s effects in unexpected ways (Fig 7d). For example, structural and neuronal heterogeneity, which individually cause the most noticeable positive impact on the circuit’s memory capacity, fail to do so when combined. On the other hand, the negative impact of synaptic heterogeneity alone is ameliorated when it is combined with either neuronal or structural heterogeneity.

Processing capacity.

To complement the results of the previous sections and determine the microcircuit’s suitability for online processing with fading memory, we adopt the notion of information processing capacity introduced in [140], which allows us to quantify the system’s ability to employ different modes of information processing and, by combining them, determine the total computational capacity of the circuit (for a formal description, see Processing capacity in Materials and methods). By this definition, the memory capacity discussed in the previous section corresponds to the capacity to reconstruct a set of k different linear functions (degree 1 Legendre polynomials) of the input u, one for each specific time lag (see also [141]). As such, it corresponds to the fraction of the total capacity associated with linear functions (since no products are involved) and measures the circuit’s linear processing capacity (d = 1 in Fig 8). Accordingly, degrees d ≥ 2 correspond to larger and increasingly complex sets of non-linear basis functions (products of Legendre polynomials, see illustrative example in Fig 8a) and thus require increasingly more sophisticated computational capabilities.

Fig 8. Computational capacity in the different heterogeneity conditions.

(a) Illustration of the setup used to assess computational capacity. An input signal, u, is used to drive a sub-set of E neurons and the population responses, x, are recorded and gathered in state matrix X. These states are then used to reconstruct a set of time-dependent functions z = f(uk). These target functions vary in complexity (degree of nonlinearity, color-coded) and memory requirements. (b) Normalized capacity space, i.e. ability to reconstruct functions of u at different maximum delays (kmax, memory) and degrees (dmax, complexity/nonlinearity). For any given function, the capacity is normalized such that C[X, z] = 1 corresponds to perfect reconstruction of z. (c) Total processing capacity, expressed as the sum of all capacities for a given degree (the incremental color code in each bar corresponds to the maximum degree for each segment, varying from 1 to 4 and is also illustrated in (a)).

https://doi.org/10.1371/journal.pcbi.1006781.g008

We can thus distinguish between computational complexity / non-linearity (specified by the maximum degree of the basis functions used, dmax) and memory (specified by the maximum delay taken into consideration, kmax). By evaluating a very large set of functions of u, we can quantify the circuit’s information processing capacity over the space of basis functions (Fig 8b). The total capacity CT (Fig 8b), defined as the sum of the individual capacities C[X, z] for all the different functions z tested, thus quantifies the circuit’s ability to compute multiple transformations of u with variable degrees of complexity, and provides a summarized description of the system’s information processing capacity (see Processing capacity in Materials and methods).
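The capacity measure itself amounts to training a linear readout on the state matrix and scoring the variance of each target function it explains. The sketch below applies it to a toy linear ‘reservoir’ of leaky input traces rather than the spiking microcircuit; all sizes and parameters are illustrative. Summing the degree-1 (Legendre P1) capacities over delays yields a linear memory capacity:

```python
import numpy as np
from numpy.polynomial import legendre

def capacity(X, z):
    """C[X, z]: fraction of the variance of target z captured by an
    optimal linear readout of the states X (1 = perfect reconstruction)."""
    w, *_ = np.linalg.lstsq(X, z, rcond=None)
    return 1.0 - (z - X @ w).var() / z.var()

rng = np.random.default_rng(0)
T, N = 5000, 50
u = rng.uniform(-1.0, 1.0, T)              # i.i.d. input sequence in [-1, 1]

# toy linear reservoir: leaky traces of u with heterogeneous leaks and gains
leaks = rng.uniform(0.1, 0.9, N)
gains = rng.normal(0.0, 1.0, N)
X = np.zeros((T, N))
for t in range(1, T):
    X[t] = leaks * X[t - 1] + gains * u[t]

# degree-1 capacity per delay k (target = P1(u(t-k)) = u(t-k));
# the sum over delays approximates the linear memory capacity C_M
mem = [capacity(X[k:], legendre.legval(u[: T - k], [0, 1])) for k in range(1, 20)]
CM = sum(mem)
```

Higher degrees are handled identically, with targets given by products of Legendre polynomials of delayed inputs instead of `legval(·, [0, 1])`.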

In line with the results of the previous section, heterogeneity in neuronal parameters has the most significant effect, greatly extending the space of computable functions, both linear and nonlinear. By allowing the circuit to retain contextual information for longer (extended memory range), these circuits maintain a high capacity even for relatively large delays, as demonstrated by the slowly decaying memory curves in Fig 8b. As a consequence, the total capacity of microcircuits with heterogeneous neurons is the largest among all the conditions (Fig 8c).

Despite its very modest effects on population activity, structural heterogeneity has very interesting consequences for the microcircuit’s processing capacity, particularly for the ability to compute complex nonlinear functions. Although the memory functions decay abruptly (almost as abruptly as in the homogeneous condition, Fig 8b), the circuits achieve a larger capacity for more complex functions (Fig 8c), and the total capacity at d = 4 (the largest degree evaluated) is the largest among all conditions tested, as can be seen by comparing the top crimson bars in Fig 8c.

Also in line with the results for d = 1 (linear memory capacity), synaptic heterogeneity has a deleterious effect on processing capacity, reducing it to values smaller than the homogeneous case (CT ≈ 13.9 versus CT ≈ 14.4 for the homogeneous condition, Fig 8c). Consequently, the beneficial effects introduced by both structural and neuronal heterogeneity are counteracted by the negative effect of synaptic heterogeneity, as observed in the previous section. These cancelling effects result in the fully heterogeneous circuit having a total capacity that is only modestly superior to the homogeneous case (CT ≈ 16.9).

Under idealized conditions, the total capacity is bounded by the number of linearly independent state variables of the dynamical system, which, in the limit of T → ∞, equals N (the number of neurons) in systems that perfectly obey the fading memory property and whose neurons’ activity is linearly independent (for proofs, see [140]). In that respect, the values we have obtained for the total capacity are very modest, at only about 1% of the theoretical limit. On the one hand, this is due to methodological limitations (we could only investigate a short range of dmax); on the other, it suggests that this study may have neglected important aspects that significantly boost the total capacity and that would, at least partially, explain the difference (see Limitations and future work).

Nevertheless, the results demonstrate a pattern consistent with that observed for the memory capacity: the functional consequences of the different sources of heterogeneity are the same for the ability to compute both linear and non-linear functions with fading memory, as the circuits with the largest linear memory capacity are also the ones with the largest non-linear processing capacity (namely those with neuronal heterogeneity). However, these results indicate that neuronal heterogeneity has its main effect on memory (kmax), greatly extending the capacity to compute functions of uk for larger values of k (Fig 8b), while structural heterogeneity (which has the second largest functionally beneficial effect) boosts the ability to compute more complex functions, with a main effect on dmax.

Discussion

Heterogeneity and diversity in the cellular, biochemical and physiological properties seen within and across cortical regions and layers exert a significant influence on population dynamics. Although often disregarded by the reduced models commonly used in computational neuroscience, these features of the neural tissue may well be partially responsible for the high computational proficiency and functional properties of these systems. In order to understand the functional relevance of the different ‘building blocks’ and their inherent complexity and diversity, it is important to start from relatively simple formalisms and gradually account for the biological complexity, while maintaining coherence with the relevant empirical observations at multiple levels. The present study proposes a data-driven modelling approach as an exploratory strategy to systematically uncover the computational benefits of different microcircuit features, in an attempt to elucidate and quantify the biophysical substrates of neural computation.

We have focused on the composition of layer 2/3 cortical microcircuits, since their highly recurrent connectivity [142, 143] and sparse, asynchronous activity [93, 111, 113, 117, 142, 144] are ideally suited to study the nature of sparse distributed processing in cortical microcircuit models. The relatively small extent of the neuritic processes (in comparison with deeper layers) makes it acceptable to assume that the potential role of dendritic compartmentalization [145, 146] and other effects caused by the detailed neuronal morphology [147, 148] are negligible, allowing us to use simple point-neuron models with limited loss (which would not be the case if we accounted for the deeper layers). Additionally, the input/output relations and unique position of layer 2/3 in a cortical column suggest a particularly prominent computational role, as it must integrate and process multiple streams of information in meaningful ways.

In an attempt to disentangle the role played by heterogeneity in different components of the system, we tentatively partitioned it into neuronal, structural and synaptic components (see Introduction, Data-driven microcircuit model and Supplementary Materials). These different sources of heterogeneity differentially influence the characteristics of population responses: from introducing variability in how different neurons and neuronal classes respond to and integrate their synaptic inputs, to variations in the magnitude and distribution of those inputs, among other effects. These often subtle differences have complex effects at the population level and strongly condition the system’s operating point. The fully heterogeneous circuit provides the closest approximation to the biophysical reality, exhibits important commonalities with the empirical observations, and appears to inherit different features from different sources of heterogeneity. Naturally, given the simplifications required, our conclusions on the effects of the various heterogeneities are primarily qualitative. However, the extent to which the model responses differ from the empirical observations can also be informative about the potential impact of microcircuit features and processes that were not explicitly considered or were overlooked or oversimplified, as we discuss in the following section.

Collecting, validating and organizing experimental data relevant for this type of study is still a monumental challenge. Manual annotation and parameter extraction are cumbersome, error-prone strategies, only feasible for well-constrained systems and well-defined problems. The creation and active curation of stable and reliable large-scale databases (of which good examples exist: Allen Brain Atlas [149], NeuroMorpho [150], NeuroElectro [100], NMC [91], ICGenealogy [151], to name a few), along with standard and widely accepted registration and sharing practices [152, 153], are increasingly a priority in a community-driven effort to better constrain neuroscience models and integrate knowledge from multiple disciplines. In addition, automated parameter extraction and estimation, as well as model fitting and comparison, are strictly and increasingly necessary for studies in this direction. These are complex challenges, as they must meet the requirements of an ever-changing scientific field that, consequently, does not lend itself easily to standardization.

Apart from the considerable efforts to explicitly include and account for experimental data to constrain the microcircuit models and make use of publicly available datasets, we additionally emphasize the importance of ensuring transparency, openness and reproducibility. To this end, the complete materials for this study are publicly available through [154] (see S2 Appendix for details). Our efforts in that direction are a mere example and proof-of-concept, but demonstrate that the field is mature enough to embrace these practices, which should become a standard in computational neuroscience (see also [155]). Given the complexity of these studies, the ability to reproduce and verify results is paramount, not only to uphold the scientific ‘gold standards’, but to extend and build upon existing work.

Limitations and future work

Despite providing a significant step towards biological verisimilitude, our results reveal important limitations that ought to be addressed in future work. At the neuron level, even though we consider three different neuronal populations, including two separate inhibitory classes, further sub-divisions have been reported in neocortical layer 2/3 populations, both for glutamatergic [46] and, in particular, for GABAergic neurons [113, 144, 156–158]. It is possible that these reflect regional specializations particularly prominent in specific cortical areas (such as the prefrontal cortical regions; [34, 46, 159]), or that they represent separate instances of broader classes and can, for simplicity, be grouped together. Parameterized correctly, our choice of neuron model proved to be sufficient for the purposes of this study and allowed us to account for the most important physiological characteristics of the different neuronal classes and their relations (see Neuronal properties). Such simplifying assumptions, however, are bound to miss relevant structural and functional features, particularly when it comes to the specialization of inhibitory neurons and synapses [160–164], the effects of dendritic nonlinearities and active dendritic processes [145–148] and intrinsic adaptation processes [165], to name a few. It is also important, in future work along this direction, to consider the intricate relations between model parameters, i.e. to explicitly include not only the empirical variability but also the covariance across multiple parameters (as in e.g. [101]). In this context, it should be pointed out that the neuronal heterogeneity condition entailed the modification of a larger number of parameters than the other forms of heterogeneity, so further work is needed to disentangle their contributions and obtain a single-parameter-level comparison of their effects.
Overall, our results lead us to conclude that it is important to understand the role of multiple interacting populations (e.g. [166]), particularly including inhibitory sub-types and their different physiological properties and interactions, given their clearly distinct contributions.

When it comes to synaptic transmission, we have focused on the specificities of instantaneous response kinetics and its inherent diversity, disregarding any form of synaptic plasticity. However, in our model, synaptic heterogeneity was shown to severely constrain the microcircuit’s processing capacity and memory (Fig 8), counteracting the benefits introduced by neuronal and structural heterogeneity. Additionally, given that the total measured capacity is very modest even in the best-performing systems (only about 1% of the theoretical maximum), and considering the computational demands placed on these systems under ecological conditions, it is reasonable to assume that there are important aspects of synaptic transmission that we have failed to consider but that contribute significantly to the circuit’s processing capacity. Adaptation and plasticity are likely to be important missing components, due to their critical roles in learning and memory processes [55]. Furthermore, variability in synaptic parameters, being the result of adaptive processes, is bound to reflect the circuit’s functional architecture, as demonstrated in e.g. [167]. Failure to consider these specificities of cortical connectivity is partially responsible for the absence of a substantial functional impact of synaptic heterogeneity in this study.

Throughout this study, we have investigated the behaviour of our microcircuit model in two dynamic regimes, which we associated with the biological quiet and active states. However, the stimulation applied to bring the circuits into the active state was not biologically realistic, as we purposefully removed any spatiotemporal structure in order to measure the computational properties of the system rather than the acquisition of structural information present in the input signal. Thus, the degree to which we are able to account for and explicitly compare empirical observations with the model is restricted and only qualitative. Moreover, whilst measuring the capacity of the network, we significantly under-sampled the space, as the results clearly demonstrate (Fig 8a). A more complete set of basis functions would lead the capacity along both axes to decay to 0: as the complexity and memory requirements increase, the capacity to compute these functions decreases to negligible values in all systems. While accounting for delays of up to kmax = 100 allowed us to capture this effect (since the memory range in all conditions is shorter than that), we failed to account for a sufficiently large dmax. The primary reason for this was computational cost, as our current implementation is extremely time-consuming (see S2 Appendix). As a consequence, the capacity space is sub-normalized, incomplete and underestimated, due to the relatively small number of basis functions tested. Additionally, the limited sample size (T = 10^5) may bias the individual results.

While we have explored information processing capacity in a generic sense, future work along these lines would benefit from being more directed towards specific microcircuits engaged in specific computations. For example, the specific role of layer 2/3 microcircuits in primary and secondary visual cortices in long-range perceptual grouping has been systematically explored [14, 15, 168] and constitutes a fruitful avenue for future research.

Cortical states

The state of any given cortical microcircuit, both in terms of macroscopic spiking statistics and, particularly, membrane potential dynamics can differ dramatically between behavioural states [94, 114, 131, 132] given that they require different levels of active ‘engagement’. The three neuronal classes behave in very specific ways, with specialized response features providing differential contributions to the different circuit states. These neuron-class-specific contributions play an important role in the observed dynamics, providing a potential mechanism to support state modulations [123, 169].

Spontaneous cortical activity during states of quiet wakefulness (a quiescent state in which the animal is awake but the circuit is not directly engaged in active processing) is commonly characterized by short-lasting, large-amplitude depolarizations [132, 170, 171] that reflect the presence of strongly synchronized excitatory inputs and resemble the dynamics observed under light anaesthesia [114, 132, 138, 144]. Naturally, driven by a homogeneous Poisson process, our system does not exhibit such behaviour (see Results section on Emergent population dynamics), which indicates that such effects are partially inherited from the spatiotemporal structure of the background input [172], which in turn may reflect the structure of the sensory input [173]. Additionally, or alternatively, this may be a consequence of propagating waves of excitation [171], which are likely related to spatial connectivity features that were not taken into consideration in this study (see Limitations and future work).

Nevertheless, our quiet state, where the circuit is driven by background noise, highlights relevant features of population activity and their relations among different neuronal classes, emerging from the effects of the different sources of heterogeneity. The most prominent feature is the extremely sparse firing of E neurons (Fig 4c), which appears to stem directly from the circuit’s composition (homogeneous condition) and is a robust and replicable effect emerging as a direct consequence of dense, strong and fast inhibition. While structural heterogeneity has no measurable effects, synaptic heterogeneity makes the E population more responsive and places some of these neurons closer to their firing thresholds (Fig 4d). Neuronal heterogeneity, on the other hand, leads to more strongly hyperpolarized E and I1 populations, compared to all other conditions. This has the positive effect of shifting the distribution of membrane potentials in the I1 population to a range that overlaps with the empirical values in [94]. However, E neurons become excessively hyperpolarized and their membrane potentials are kept farther from threshold and farther from the corresponding experimental value (to which all other conditions provide a better match). Despite these differences, neuronal heterogeneity is responsible for placing all three neuronal populations within the range of values reported in the literature (dashed lines in Fig 4d).

Neocortical pyramidal neurons (particularly in layer 2/3) fire very sparsely and are never driven to saturation, despite a large and constant synaptic bombardment. For this to occur, excitatory and inhibitory input currents onto each neuron must be carefully balanced such that, on average, they cancel each other, allowing the net mean input to be small and the output rates moderate [95, 174]. Co-active and balanced excitation and inhibition thus stabilizes and shapes the circuit’s activity and must be actively maintained to allow the networks to operate in stable regimes [127, 175, 176]. Importantly, it also plays a critical role in active processing and computation, with the clearest experimental evidence coming from the development of input selectivity in visual and auditory cortices (see e.g. [138, 177, 178] and references therein). We demonstrate that such a balance condition is an emergent property of circuits with heterogeneous neurons, without the need to change any of the system’s parameters. This observation may also provide a complementary mechanism by which cortical circuits are able to achieve and retain this dynamic balance, despite the large, potentially disruptive, variations introduced by other sources of heterogeneity, without necessarily requiring specific compensatory mechanisms, as has recently been proposed by [86].

Heterogeneity and information processing

At any given point in time, the state of the circuit reflects a nonlinear combination of the current and past inputs, mediated by complex recurrent interactions. The state of each neuron is thus a nonlinear, fading memory function of the input (the characteristics of which are determined by the circuit’s specificities and input encoding), and the population state a set of N basis functions that can be linearly combined to approximate arbitrary nonlinear functions with fading memory. In that sense, these circuits are endowed with universal computing power on time-invariant functions [16–18, 134]. This is where complexity and heterogeneity play a particularly prominent role, as they can greatly extend the space of computable functions by diversifying population responses and, consequently, the richness of the circuit’s high-dimensional state-space.

With specific functions in mind, circuits can be “designed” to perform certain computations by explicitly solving the credit-assignment problem, i.e. determining how each neuron ought to contribute to the computation [179] in order to achieve the desired outcome. This is typically achieved by constraining the microcircuit connectivity [180, 181] and/or by postulating and building-in specific functionality (e.g. efficient coding; [128, 182]). The great majority of these approaches, however, assumes idealized conditions and neglects the complexities of real biophysics (but see, e.g. [183]), which limits their scope and generalizability.

Since we were not interested in specific functions, but in universal computational properties, instead of “designing” functional microcircuits or assuming specific computations, we sought to mimic fundamental design principles of the real neocortical microcircuitry and systematically evaluate how they affect the circuit’s computational capabilities. While this exploratory approach has its limitations, we were able to partially disentangle the computational role of complexity and heterogeneity in the microcircuit’s building blocks and pinpoint potential sources of functional specialization. Globally, the functional analyses of the computational benefits of the different sources of heterogeneity revealed a consistent effect: neuronal diversity, on its own, significantly boosts linear and nonlinear processing capacity and memory (see Results sections on memory and processing capacity) and dramatically increases the circuit’s dynamic range and sensitivity. Surprisingly, even though its effects on population activity were barely noticeable, structural heterogeneity has the second largest computational effect, particularly boosting the ability to compute highly nonlinear functions (capacity at d = 4 was much larger than in any other condition, see Fig 8).

The functional benefits introduced by neuronal and structural heterogeneity are not reflected in the fully heterogeneous circuit, as synaptic heterogeneity prevents this from happening. One would expect, and indeed desire, the computational benefits to combine in a way that dramatically increases the total capacity of the most realistic condition. As discussed above, these synaptic effects likely reflect important limitations in our ability to capture their real influence in the biological system. Nevertheless, some of these results are in line with recent work on the effects of heterogeneity and complexity. In particular, the impact of structural heterogeneity on both macro- and microscopic connectivity has been the subject of recent investigations, and such heterogeneity is increasingly recognized as a critical source of functional specialization, endowing a network with broad and diverse temporal tuning [139] and providing important contributions to efficient memory storage and robust recall in attractor networks [184, 185].

Despite the limitations of our study, discussed above, our results highlight the importance of developing new theories of cortical function and dynamics based on the complex interactions of multiple neuronal sub-populations, as different neuronal classes make non-negligible, differential contributions to the circuit’s dynamics. Additionally, the prominent functional roles of structural and neuronal heterogeneity suggest that they are part of a critical minimum necessary to account for computation in cortical microcircuit models, as their effects appear to underlie a variety of important phenomena.

Materials and methods

Neuronal dynamics

In all systems analysed, the neuronal dynamics is modelled using a common, simplified adaptive leaky integrate-and-fire scheme [98], where the total current flow across the membrane of neuron i is governed by:

\[ C_{m}\frac{dV_{i}}{dt} = g_{\mathrm{leak}}\left(E_{L} - V_{i}\right) - w_{i}(t) + I_{i}(t) \tag{1} \]

The spike times of neuron i are defined as the set \(F(i) = \{t^{f} \mid V_{i}(t^{f}) \geq V_{\mathrm{thresh}}\}\). At these times, the membrane potential is reset to the constant value \(V(t) = V_{\mathrm{reset}}\) for all times \(t \in (t^{f}, t^{f} + t_{\mathrm{refr}}]\), after which integration resumes as above. \(I_{i}(t)\) is the total synaptic current generated by the inputs from all presynaptic neurons j, mediated by the different synapse types k.

To provide greater control over neuronal excitability properties and a more realistic account of cortical neuronal dynamics, we model the intrinsic adaptation current \(w_{i}\) as proposed by [99]:

\[ \tau_{w}\frac{dw_{i}}{dt} = a\left(V_{i} - E_{L}\right) - w_{i}, \qquad w_{i} \rightarrow w_{i} + b \ \text{at each spike time} \tag{2} \]

where the parameters a and b determine the relative contribution of sub-threshold and spike-triggered adaptation processes, respectively.
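The adaptive LIF dynamics described above can be integrated with a simple forward-Euler scheme. The sketch below simulates a single neuron under constant current drive; all parameter values are illustrative placeholders, not the fitted values used in the paper:

```python
import numpy as np

# Forward-Euler integration of a single adaptive LIF neuron.
dt, t_sim = 0.1, 1000.0                    # time step, duration (ms)
C_m, g_leak, E_L = 250.0, 16.7, -70.0      # pF, nS, mV
V_th, V_reset, t_refr = -50.0, -60.0, 2.0  # mV, mV, ms
a, b, tau_w = 2.0, 40.0, 150.0             # nS, pA, ms (adaptation)
I_ext = 500.0                              # constant input current (pA)

V, w, ref = E_L, 0.0, 0.0
spikes = []
for step in range(int(t_sim / dt)):
    t = step * dt
    if ref > 0.0:                          # absolute refractory period
        ref -= dt
        V = V_reset
    else:
        V += dt * (g_leak * (E_L - V) - w + I_ext) / C_m
    # sub-threshold (a) and, at spike times, spike-triggered (b) adaptation
    w += dt * (a * (V - E_L) - w) / tau_w
    if V >= V_th:
        spikes.append(t)
        V, ref = V_reset, t_refr
        w += b

isis = np.diff(spikes)   # adaptation progressively lengthens the intervals
```

With these placeholder values the neuron fires tonically while the adaptation current builds up, so successive inter-spike intervals grow until a steady rate is reached.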

Synaptic dynamics

The synaptic current \(I_{ij}^{k}\) elicited by each spike from presynaptic neuron j is determined by the conductivity \(G_{\mathrm{rec}}^{k}\) of the corresponding, responsive receptors (each synapse type being composed of a pre-determined set of receptors, see below):

\[ I_{ij}^{k}(t) = \bar{w}_{ij}\, G_{\mathrm{rec}}^{k}(t)\left(E_{\mathrm{rev}}^{k} - V_{i}(t)\right) \tag{3} \]

The amplitude of the post-synaptic currents is rescaled by the dimensionless weight parameter \(\bar{w}_{ij}\), specific to each connection type and whose value was chosen such that the PSP amplitudes matched the data reported in [92, 93] (see Table 3). The synaptic conductivity in Eq 3 models the response of receptor k to spike arrival from presynaptic neuron j with a total conduction delay of \(d_{ij}\):

\[ G_{\mathrm{rec}}^{k}(t) = \sum_{t^{f} \in F(j)} g_{\mathrm{rec}}\left(t - t^{f} - d_{ij}\right) \tag{4} \]

The conductance transient elicited by a single presynaptic event on a single postsynaptic receptor is then modelled as [99, 107–109]:

\[ g_{\mathrm{rec}}(t) = \bar{g}_{\mathrm{rec}}\, n_{\mathrm{rec}}(V)\left[ r_{\mathrm{rec}}\, e^{-t/\tau_{\mathrm{d,fast}}} + \left(1 - r_{\mathrm{rec}}\right) e^{-t/\tau_{\mathrm{d,slow}}} - e^{-t/\tau_{\mathrm{rise}}} \right] \tag{5} \]

where \(\bar{g}_{\mathrm{rec}}\) is the peak conductance of the corresponding receptor and nrec(V) is a voltage-gating function assuming a constant value of 1 for all receptor types, except NMDA, in which case [186]:

\[ n_{\mathrm{NMDA}}(V) = \left(1 + \frac{[\mathrm{Mg}^{2+}]}{3.57}\, e^{-0.062\, V}\right)^{-1} \tag{6} \]

This gating function is thus used to model the voltage-dependent magnesium block at NMDA receptors. For simplicity, we assume a fixed [Mg2+] = 1 mM. The remaining parameters in Eq 5 correspond to the receptors’ characteristic time constants, namely the rise, fast and slow decay times, as well as the relative balance between fast and slow decay (rrec). In order to account for the differential receptor composition and expression across different neuronal classes, all these parameters are specific for each receptor, synapse and neuron type.
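The conductance transient and the NMDA voltage gating can be evaluated directly. In the sketch below, the peak normalization and all time constants are illustrative assumptions, not the receptor-specific values from the paper:

```python
import numpy as np

def g_syn(t, g_peak, tau_rise, tau_fast, tau_slow, r):
    """Tri-exponential conductance transient after a spike at t = 0:
    one rise time constant plus fast and slow decay components mixed
    by r. Normalizing the peak to g_peak is a simplifying assumption."""
    t = np.asarray(t, dtype=float)
    shape = (r * np.exp(-t / tau_fast)
             + (1.0 - r) * np.exp(-t / tau_slow)
             - np.exp(-t / tau_rise))
    return g_peak * shape / shape.max()

def nmda_gate(V, mg=1.0):
    """Jahr-Stevens voltage dependence of the NMDA receptor, modelling
    the Mg2+ block; V in mV, [Mg2+] in mM."""
    return 1.0 / (1.0 + mg / 3.57 * np.exp(-0.062 * V))

t = np.arange(0.0, 100.0, 0.1)                 # ms
g_ampa = g_syn(t, 1.0, 0.2, 1.7, 10.0, 0.9)    # fast rise, mostly fast decay
g_nmda = g_syn(t, 1.0, 2.0, 50.0, 200.0, 0.5)  # slow rise and decay
```

At rest (around -70 mV) the gating function suppresses most of the NMDA conductance, while near 0 mV the block is largely relieved, reproducing the voltage dependence of Eq 6.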

Generating structural heterogeneity

Consider the sparse adjacency matrix Asyn, specifying the anatomical connectivity between all neurons in source population pre and target population post (with pre, post ∈ {E, I1, I2}). The indices i, j of the nonzero entries in Asyn are independently drawn from normalized, truncated exponential distributions, with probability:

\[ p(i) \propto \exp\left(-k_{\mathrm{out}}\, i / N_{\mathrm{pre}}\right) \tag{7} \]

\[ p(j) \propto \exp\left(-k_{\mathrm{in}}\, j / N_{\mathrm{post}}\right) \tag{8} \]

for pre- and postsynaptic neuron indices, respectively, normalized over the valid index ranges. Npre/post is the total number of pre-/postsynaptic neurons and kout/in are the parameters used to define the skewness of the out-/in-degree distributions, respectively. Setting kout/in = 0 corresponds to random, uniform connectivity, whereas values > 0 generate structured out-/in-degree distributions, with a larger variance in the number of connections per neuron.
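This sampling scheme can be sketched as follows, for the presynaptic (out-degree) side; population size, number of connections and the skewness parameter are placeholder values:

```python
import numpy as np

rng = np.random.default_rng(1)

def degree_skewed_indices(n_neurons, n_connections, k):
    """Draw neuron indices with probability proportional to
    exp(-k * i / N), a normalized, truncated exponential over the
    index. k = 0 recovers uniform random connectivity; k > 0 skews
    the degree distribution towards the low-index neurons."""
    idx = np.arange(n_neurons)
    p = np.exp(-k * idx / n_neurons)
    return rng.choice(idx, size=n_connections, p=p / p.sum())

N, M = 200, 20000
uniform_pre = degree_skewed_indices(N, M, k=0.0)
skewed_pre = degree_skewed_indices(N, M, k=4.0)

# out-degree of each neuron = number of times it was drawn as a source
deg_uniform = np.bincount(uniform_pre, minlength=N)
deg_skewed = np.bincount(skewed_pre, minlength=N)
```

Both conditions place the same total number of connections; the skewed draw simply concentrates them on a subset of neurons, broadening the degree distribution.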

Weight correlations.

For each existing connection, the individual synaptic efficacies can be equal to a fixed scalar value (homogeneous synaptic condition) or randomly drawn from a lognormal distribution (heterogeneous condition). To introduce weight correlations, we specify additional scaling variables ζ for each pre- or postsynaptic neuron (ζj, ζi), whose values are independently drawn from . The parameters determine the strength of the induced correlations, which we fix to the values reported in [110] and [71]. The original weight values are then re-scaled by the ζ of the corresponding pre- and postsynaptic partners, i.e.: (9)
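The rescaling in Eq 9 can be sketched as follows. The lognormal parameters used here are illustrative placeholders, not the values reported in [110] and [71]:

```python
import numpy as np

def correlated_weights(w, sigma_pre=0.5, sigma_post=0.5, seed=0):
    """Rescale a weight matrix w (post x pre) by per-neuron lognormal
    factors zeta, inducing correlations among weights that share a
    pre- or postsynaptic neuron (cf. Eq 9). sigma_* control the
    correlation strength (illustrative values)."""
    rng = np.random.default_rng(seed)
    n_post, n_pre = w.shape
    zeta_pre = rng.lognormal(mean=0.0, sigma=sigma_pre, size=n_pre)
    zeta_post = rng.lognormal(mean=0.0, sigma=sigma_post, size=n_post)
    # each weight is multiplied by the zeta of both its partners
    return w * np.outer(zeta_post, zeta_pre)

# heterogeneous baseline: i.i.d. lognormal efficacies
rng = np.random.default_rng(1)
w0 = rng.lognormal(mean=0.0, sigma=1.0, size=(200, 200))
w = correlated_weights(w0)
```

Note that with sigma_pre = 0 the rescaling reduces to a per-row (postsynaptic) multiplication, so each row of the result is perfectly correlated with the corresponding row of the original matrix.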

For completeness, and given that lognormal distributions are widely employed throughout this study, it is worth noting that the lognormal probability density function has the form: f(x; μ, σ) = (xσ√(2π))⁻¹ exp(−(ln x − μ)² / (2σ²)), for x > 0, (10) parameterized by scale (μ) and shape (σ) parameters.

Profiling the microcircuits

To adequately quantify the relevant functional properties of the microcircuits and the impact of the different features analysed, we employ metrics that are system-agnostic, i.e. independent of the specificities of the circuit analysed, and, preferably, parameter-free, so that the choice of metric parameters does not influence the measured outcome and the results objectively reflect the circuit’s properties. Of particular interest for the purposes of this study is an adequate quantification of the characteristics of population dynamics under active synaptic bombardment, as well as of the circuit’s capacity for stimulus processing and computation, in order to establish links between the features of population dynamics, the circuit’s composition and complexity, and its ability to perform complex computations.

Input specifications.

We model cortical background/ongoing activity as an unspecific, stochastic external input driving the circuit, considered to be excitatory, i.e. mediated by glutamatergic synapses, and consisting of independent Poissonian spike trains at a fixed rate νin spikes/s. Given the small network size, and in order to compensate for the relatively small number of synapses involved, we rescale the input rates by a constant factor, Kin = 1000, such that each neuron receives background input through, on average, Kin synapses, with each presynaptic source firing at a fixed rate of νin spikes/s. The postsynaptic neuron’s responsiveness to this background input is then determined by its specific receptor composition (the kinetics of AMPA and NMDA receptors), with synaptic weights and delays equal to any other excitatory synapse onto that neuron, i.e. and , where α refers to the postsynaptic neuron class (see Table 3).
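Because the superposition of Kin independent Poisson processes, each of rate νin, is itself a Poisson process of rate Kin·νin, the background drive to a single neuron can be generated by one equivalent process. A minimal sketch (illustrative, not the NEST implementation used in the study):

```python
import numpy as np

def background_spikes(nu_in, k_in=1000, t_sim=1.0, seed=0):
    """Background drive to one neuron: the superposition of k_in
    independent Poisson sources at rate nu_in [spikes/s] is Poisson
    with rate k_in * nu_in, so a single generator suffices.
    Returns sorted spike times in seconds over [0, t_sim)."""
    rng = np.random.default_rng(seed)
    rate = k_in * nu_in
    n_spikes = rng.poisson(rate * t_sim)       # total count over t_sim
    return np.sort(rng.uniform(0.0, t_sim, size=n_spikes))
```

For νin = 5 spikes/s and Kin = 1000, this yields on the order of 5000 input spikes per second per neuron.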

In addition, to emulate an active state and evaluate the microcircuit’s processing capacity, we introduce an additional input signal, directly encoded as a somatic input current (Iin(t)), which changes every Δt ms, and deliver it to a randomly chosen subset (25%) of the E population. We choose this direct encoding strategy to ensure that the input signal has a direct influence on the membrane dynamics, making it easier to decode [187–189]. As further specified below, the values of this piece-wise constant input current were independently and identically distributed (i.i.d.), drawn from a uniform distribution over the interval [0, 1] and scaled by a constant factor ρu, chosen independently for each condition (see S2 Fig).
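Generating such a piece-wise constant current is straightforward; the function below is an illustrative sketch, with the step duration Δt, the scaling factor ρu and the simulation resolution as parameters:

```python
import numpy as np

def stepwise_input(n_steps, dt=1.0, rho_u=1.0, resolution=0.1, seed=0):
    """Piece-wise constant input current: values i.i.d. uniform on
    [0, 1], scaled by rho_u, each held for dt ms and sampled on the
    simulation grid (resolution in ms). Returns (u, current)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 1.0, size=n_steps)           # the signal u[n]
    samples_per_step = int(round(dt / resolution))
    current = np.repeat(rho_u * u, samples_per_step)  # I_in(t)
    return u, current
```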

Population dynamics.

To quantify and characterize the population responses to different input conditions and assess how the different sources of heterogeneity modulate those responses, we look at the statistics of spiking activity as well as the relevant sub-threshold dynamics across the different neuronal populations.

The circuit’s state or operating point is determined primarily by the firing rate and the degree of population-wide synchrony and regularity [78], as measured by the following statistics:

Synchrony.—average spike count correlation coefficient (CC), computed pair-wise for npairs = 500 randomly sampled, disjoint neuronal pairs (see, e.g. [115]).

Regularity.—degree of dispersion of the inter-spike interval (ISI) distribution, as measured by the coefficient of variation (CVISI), averaged across the population. A value of 0 indicates a clock-like, regular firing pattern, whereas a completely irregular, Poisson process has a value of 1. Values larger than 1 are obtained for very bursty firing patterns.

Burstiness.—degree of burstiness in the firing patterns, captured by the 5th percentile of the ISI distribution (ISI5%), averaged across the population. This measure has been successfully used to classify neuronal firing patterns in identified populations [190, 191]. For a given rate, a lower value indicates higher burstiness.

Randomness.—The entropy of the log-ISI distribution is an important metric that captures the randomness in a spike train [192]. It was recently demonstrated that this metric was one of the key features of cerebellar neurons’ spike trains, allowing their accurate identification [190]. This metric is defined on the probability density of the natural logarithm of the ISIs: (11) This last metric is added for completeness, and can be seen as an additional measure of regularity. The higher the entropy value, the more irregular the firing pattern.
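The ISI-based statistics above can be computed directly from a spike train. The sketch below uses a simple binned estimate of the log-ISI entropy (reported in bits); the bin count is an implementation choice, not a value taken from [192]:

```python
import numpy as np

def isi_statistics(spike_times, n_bins=50):
    """Summary statistics of a single spike train: coefficient of
    variation of the ISIs (CV_ISI), the 5th ISI percentile (a
    burstiness proxy), and the entropy of the log-ISI distribution
    (cf. Eq 11), estimated from a binned histogram."""
    isi = np.diff(np.sort(spike_times))
    cv = isi.std() / isi.mean()
    isi5 = np.percentile(isi, 5)
    hist, edges = np.histogram(np.log(isi), bins=n_bins, density=True)
    p = hist * np.diff(edges)            # probability mass per bin
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))    # in bits
    return cv, isi5, entropy
```

As a sanity check, a Poisson train yields CV_ISI near 1 and a broad log-ISI distribution, whereas a clock-like train yields CV_ISI near 0.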

On the sub-threshold level, we assess and summarize the characteristics of membrane potential dynamics, synaptic currents and conductances. We analyse the distributions of mean membrane potentials (〈Vm〉) and their variances (σ2(Vm)), as well as the mean excitatory and inhibitory synaptic currents (〈Isyn〉) onto each neuron. For the computational analyses, we consider the dynamics of the membrane potentials as the main state variable [189].

Intrinsic timescale (τint).—to quantify the characteristic timescale of population activity, we look at the autocorrelation function of each neuron’s membrane potentials: (12)

For each neuron, we fit the autocorrelation with an exponential function, A(t) ∝ exp(−t/τint) (13), where τint specifies the decay time constant, characterizing the neuron’s intrinsic timescale.
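A sketch of this fitting procedure, computing the autocorrelation via FFT and fitting the exponential decay with a standard least-squares routine (an illustrative implementation, not the one used in the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def intrinsic_timescale(v_m, dt=0.1, max_lag_ms=100.0):
    """Intrinsic timescale of one neuron (cf. Eqs 12-13): fit an
    exponential decay exp(-t/tau) to the normalized autocorrelation
    of its membrane potential trace. dt is the sampling step in ms."""
    x = v_m - v_m.mean()
    n = x.size
    # linear (zero-padded) autocorrelation via FFT
    f = np.fft.rfft(x, 2 * n)
    ac = np.fft.irfft(f * np.conj(f))[:n]
    ac /= ac[0]                              # normalize to A(0) = 1
    lags = np.arange(n) * dt
    keep = lags <= max_lag_ms
    (tau,), _ = curve_fit(lambda t, tau: np.exp(-t / tau),
                          lags[keep], ac[keep], p0=[10.0])
    return tau
```

Applied to a first-order autoregressive trace with known correlation time, the fit recovers the underlying timescale.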

Processing capacity.

To analyse the processing capacity of the networks, following [141], we begin by defining an input sequence u[n], of finite total length T, comprising values independently and identically distributed (i.i.d.), drawn from a pre-determined probability distribution p(u). Since we are interested in measuring the circuit’s generic processing properties rather than specific transformations of specific inputs, treating the input as a random variable ensures that it has no pre-imposed structure; any structure measured in the responses thus reflects the system’s intrinsic properties, rather than structural relations acquired from the input. We set p(u) to be the uniform distribution over the interval [0, 1]. This input sequence is then directly encoded into the circuit as explained above.

The circuit’s initial states are randomized and the circuit is driven by the input for a total simulation time of T × Δt. In order to obtain accurate results and diminish potential errors and biases, we use a large sample size of T = 10⁵. The circuit state in response to the input is sampled every Δt ms, resulting in a collection of state vectors x[n], each corresponding to a sample of the circuit state at time point t* = n × Δt ms. The resulting state matrix and the corresponding input are then used to estimate the capacity.

The aim of the analysis is to quantify the system’s ability to carry out computations on u. For that purpose, we measure the capacity C to reconstruct time-dependent functions z on finite sequences of k inputs, z[n] = z(uk[n]), from the state of the system, using a simple linear estimator: (14)

The numerator in the capacity measure corresponds to the linear estimator that minimizes the quadratic error between the target function to be reconstructed z and its linear estimate : (15) where (16)

For any given function z and observed states X, C[X, z] is normalized such that, in a system that allows perfect reconstruction of z, C[X, z] = 1, meaning that there exists a linear combination of x[n] that equals z[n] for all n. Conversely, a capacity of 0 indicates that it is not possible to even partially reconstruct the target function. Evaluating C[X, z] for large sets of target functions y{l} = {z1, …, zL} allows us to gain insight into the information processing capacity of the system. If the evaluated functions are sufficiently distinct (preferably orthogonal), their corresponding capacities measure independent properties and provide independent information about how the system computes. As such, we systematically probe the capacity space by evaluating the complete set of orthonormal basis functions of u, using finite products of normalized Legendre polynomials: (17) where is the Legendre polynomial of degree dk ≥ 0: (18) and is a function of the input u delayed by k steps. The total capacity then corresponds to the sum of the individual capacities for a given set of target functions y{dk}: (19) Naturally, we use finite data and a finite set of indices d to evaluate the capacities, leading to an unavoidable underestimation of the total capacity.
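The capacity measure and the Legendre target functions can be sketched as follows. The capacity reduces to the fraction of the target's variance captured by the optimal linear readout of the states; for the exact estimator and normalization used in the study, see [141].

```python
import numpy as np
from numpy.polynomial import legendre

def capacity(states, target):
    """Normalized capacity C[X, z] (cf. Eq 14): fraction of the
    target's variance captured by the best linear readout of the
    state matrix (T x N). 1 = perfect reconstruction, 0 = none."""
    z = target - target.mean()
    x = states - states.mean(axis=0)
    z_hat = x @ np.linalg.lstsq(x, z, rcond=None)[0]
    return (z_hat @ z) / (z @ z)

def legendre_target(u, degrees):
    """Target function (cf. Eq 17): product of normalized Legendre
    polynomials of delayed inputs, P_{d_k}(2*u[n-k] - 1) for u in
    [0, 1]; `degrees` maps delay k -> polynomial degree d_k."""
    y = np.ones(u.size)
    for k, d in degrees.items():
        coeffs = np.zeros(d + 1)
        coeffs[d] = 1.0                       # select P_d
        shifted = np.roll(2.0 * u - 1.0, k)   # delay input by k steps
        y *= legendre.legval(shifted, coeffs) * np.sqrt(2 * d + 1)
    return y
```

States that encode the input linearly reconstruct a degree-1 target almost perfectly, but carry essentially no capacity for a degree-2 target, illustrating how orthogonal basis functions probe independent properties.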

Linear memory and nonlinearity.

Using the notations introduced above, and following [140, 141], we can consider the linear memory capacity as the total capacity associated with linear functions: (20) for a maximum polynomial degree of d = 1, i.e. each of the functions tested corresponds to a delayed version of the input: (21)

Accordingly, the capacity associated with nonlinear functions corresponds to d ≥ 2. For more details on the implementation, consult [141] and the code we provide in the Supplementary Materials.
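The linear memory capacity of Eq 20 can be sketched as the sum of linear-readout capacities over delayed copies of the input. The sketch below is self-contained and illustrative, not the implementation of [141]:

```python
import numpy as np

def memory_capacity(states, u, max_delay=50):
    """Total linear memory capacity (cf. Eq 20): sum over delays k of
    the capacity to reconstruct u[n-k] from the state at step n, for
    degree d = 1 targets only."""
    def cap(x, z):
        # fraction of target variance captured by the linear readout
        z = z - z.mean()
        x = x - x.mean(axis=0)
        z_hat = x @ np.linalg.lstsq(x, z, rcond=None)[0]
        return (z_hat @ z) / (z @ z)
    # align state at step n with the input k steps in the past
    return sum(cap(states[k:], u[:u.size - k])
               for k in range(max_delay + 1))
```

As a sanity check, a perfect 5-step delay line has a memory capacity of approximately 5: one unit of capacity per stored delay, and near-zero capacity beyond.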

Numerical simulations, implementation and data analysis

All the work presented in this manuscript was implemented using the Neural Microcircuit Simulation and Analysis Toolkit (NMSAT) [193], a python package designed to provide the first steps towards complex microcircuit benchmarking, as suggested and exemplified in this study. The core simulation engine running all the numerical simulations is NEST. Due to the specificities of this project, we used a modified version of NEST 2.10.0 [194], which includes all the models used in this manuscript (some of which are not available in the main release). A complete code package is provided in the supplementary materials that adds project-specific functionality to the framework, allowing the reproduction of all the numerical experiments presented in this manuscript. Computing resources were provided by the JARA-HPC Vergabegremium on the supercomputer JURECA [195] at Forschungszentrum Jülich. All numerical simulations were performed at a resolution of 0.1 ms, using the GSL implementation of the adaptive fourth-order Runge-Kutta method.

Supporting information

S1 Table. Tabular description of network model after [196].

https://doi.org/10.1371/journal.pcbi.1006781.s001

(PDF)

S2 Table. Discrepancies between electrophysiological parameters reported in the literature and model results.

The results obtained after careful choice of the individual parameters for the different neuronal classes did not exactly match the experimental reports, but the relative relations between classes are retained. (*) Note that the ranges reported in this table are a rough approximation to the range of mean values reported in different studies (see below). Naturally, values like the maximum rate (νmax[Hz]) depend entirely on the range of input current considered in a given experiment, so in this case, only the relative ratio is pertinent.

https://doi.org/10.1371/journal.pcbi.1006781.s002

(PDF)

S1 Appendix. Primary data sources.

List of the main references used to constrain model parameters.

https://doi.org/10.1371/journal.pcbi.1006781.s003

(PDF)

S1 Fig. Population rate transfer functions.

Characteristics of population spiking activity in response to background, Poissonian input (quiet state) in the various conditions analysed and for the 3 different population types (E, blue; I1, red; I2, orange), as a function of the input rate νin, for a total simulation time of 10 seconds. The top row corresponds to the population rate transfer functions, showing that E neurons fire extremely sparsely and that synaptic heterogeneity is strictly required to obtain an active E population. The middle and bottom rows depict the measured irregularity (CVISI) and synchrony (CC) in all conditions analysed. Note that in many conditions the spiking activity in the E population is so sparse that it is not possible to compute these metrics, since the total number of spikes is insufficient.

https://doi.org/10.1371/journal.pcbi.1006781.s005

(TIF)

S2 Fig. Tuning the input parameters for the active state.

To emulate an active processing condition, an extra input current of maximum amplitude ρu is given to a randomly chosen subset of 25% of the excitatory neurons. The circuits in the different conditions exhibit different degrees of sensitivity to their inputs. To achieve adequate and comparable responses, we attempt to find a combination of input parameters that keeps the mean firing rates within realistic bounds (νE ∈ [0.5, 5], , ).

https://doi.org/10.1371/journal.pcbi.1006781.s006

(TIF)

S3 Fig. Population spiking activity in the active state.

Complete statistics of population spiking activity in the active state for the different neuron classes: E (top, blue), I1 (middle, red) and I2 (bottom, orange) and for the different conditions (columns). The radial axes in each plot correspond to: regularity (CVISI), synchrony (CC), burstiness (ISI5%), entropy of the ISI distribution (HISI), and the mean firing rate (ν). All statistics were computed for an observation period of 10s, in a single realization for each condition, with all input parameters fixed and set to the values determined in S2 Fig.

https://doi.org/10.1371/journal.pcbi.1006781.s007

(TIF)

S4 Fig. Temporal receptivity of the microcircuits analysed.

(a) capacity to reconstruct the original input signal at zero lag (i.e. maximum polynomial degree d = 1, maximum delay k = 0, Cd=1,k=0) as a function of the signal resolution (Δt). Since the capacity values converge asymptotically to 1, we determine the optimal resolution as the minimum Δt at which Cd=1,k=0 ≥ 0.99. (b) Decoding capacity at minimum resolution Δt = 0.1 ms (equal to the simulation resolution). (c) Capacity at the maximum resolution tested (Δt = 20 ms). (d) Optimal resolution for each condition. All results correspond to the mean and standard deviations for 10 simulations per condition.

https://doi.org/10.1371/journal.pcbi.1006781.s008

(TIF)

Acknowledgments

The authors would like to acknowledge the contributions of Alexander Seeholzer in the initial conception and implementation of the neuron model used throughout this study. The authors would like to thank Maximilian Schmidt, Sven Goedecke and, in particular, Jannis Schuecker, for important suggestions and discussions at various stages of the project. The authors gratefully acknowledge the computing time granted by the JARA-HPC Vergabegremium on the supercomputer JURECA at Forschungszentrum Jülich, as well as the Neurobiology of Language group at the Max Planck Institute for Psycholinguistics, for valuable and useful discussions throughout the project.

References

  1. 1. Koch C. Complexity and the Nervous System. Science. 1999;284(5411):96–98. pmid:10102826
  2. 2. Duarte R, Seeholzer A, Zilles K, Morrison A. Synaptic patterning and the timescales of cortical dynamics. Current Opinion in Neurobiology. 2017;43:156–165. pmid:28407562
  3. 3. Gjorgjieva J, Drion G, Marder E. Computational implications of biophysical diversity and multiple timescales in neurons and synapses for circuit performance. Current Opinion in Neurobiology. 2016;37(Table 1):44–52. pmid:26774694
  4. 4. Singer W. Complexity as Substrate for Neuronal Computations. Complexity and Analogy in Science: Theoretical, Methodological and Epistemological Aspects. 2015;22:209–218.
  5. 5. Otopalik AG, Sutton AC, Banghart M, Marder E. When complex neuronal structures may not matter. eLife. 2017;6:e23508. pmid:28165322
  6. 6. Bélanger M, Allaman I, Magistretti PJ. Brain energy metabolism: Focus on Astrocyte-neuron metabolic cooperation; 2011. Available from: http://www.sciencedirect.com/science/article/pii/S1550413111004207.
  7. 7. Mappes J, Lindstrom L. How Did the Cuckoo Get Its Polymorphic Plumage? Science. 2012;337(6094):532–533. pmid:22859476
  8. 8. Edelman GM, Gally JA. Degeneracy and complexity in biological systems. Proceedings of the National Academy of Sciences. 2001;98(24):13763–13768.
  9. 9. Price CJ, Friston KJ. Degeneracy and cognitive anatomy. Trends in Cognitive Sciences. 2002;6(10):416–421. pmid:12413574
  10. 10. Krakauer JW, Ghazanfar AA, Gomez-Marin A, MacIver MA, Poeppel D. Neuroscience Needs Behavior: Correcting a Reductionist Bias. Neuron. 2017;93(3):480–490. pmid:28182904
  11. 11. Marom S. On the Precarious Path of Reverse Neuro-Engineering. Frontiers in Computational Neuroscience. 2009;3(May):3–6.
  12. 12. Hopfield JJ. Physics, Computation, and Why Biology Looks so Different; 1994. Available from: http://www.sciencedirect.com/science/article/pii/S0022519384712112.
  13. 13. Getting P. Emerging Principles Governing The Operation Of Neural Networks. Annual Review of Neuroscience. 1989;12(1):185–204. pmid:2648949
  14. 14. Grossberg S, Mingolla E. Neural dynamics of perceptual grouping: Textures, boundaries, and emergent segmentations. Perception & Psychophysics. 1985;38(2):141–171.
  15. 15. Léveillé J, Versace M, Grossberg S. Running as fast as it can: How spiking dynamics form object groupings in the laminar circuits of visual cortex. Journal of Computational Neuroscience. 2010;28(2):323–346. pmid:20111896
  16. 16. Thalmeier D, Uhlmann M, Kappen HJ, Memmesheimer RM. Learning Universal Computations with Spikes. PLoS Computational Biology. 2016;12(6):1–29.
  17. 17. Maass W, Natschlager T, Markram H. Fading memory and kernel properties of generic cortical microcircuit models. Journal of Physiology Paris. 2004;98(4-6 SPEC. ISS.):315–330.
  18. 18. Maass W, Natschlager T, Markram H. Real-time computing without stable states: A new framework for neural computation based on perturbations. Neural computation. 2002;14(11):2531–2560. pmid:12433288
  19. 19. Mountcastle VB. The columnar organization of the neocortex. Brain. 1997;120(4):701–722. pmid:9153131
  20. 20. Mountcastle V. An organizing principle for cerebral function: the unit model and the distributed system. In: Edelman GM, Mountcastle V, editors. The Mindful Brain. MIT Press; 1978. p. 7–50. Available from: http://www.citeulike.org/group/8299/article/4545635.
  21. 21. Park HJ, Friston K. Structural and functional brain networks: from connections to cognition. Science (New York, NY). 2013;342(6158):1238411.
  22. 22. Meunier D, Lambiotte R, Bullmore ET. Modular and hierarchically modular organization of brain networks. Frontiers in Neuroscience. 2010;4(DEC):1–11.
  23. 23. Friston K. A theory of cortical responses. Philosophical transactions of the Royal Society of London Series B, Biological sciences. 2005;360(1456):815–36. pmid:15937014
  24. 24. VanEssen DC. Cartography and connectomes. Neuron. 2013;80(3):775–790.
  25. 25. Shinomoto S, Shima K, Tanji J. Differences in spiking patterns among cortical neurons. Neural computation. 2003;15(12):2823–2842. pmid:14629869
  26. 26. Pletikos M, Sousa A, Sedmak G, Meyer K, Zhu Y, Cheng F, et al. Temporal specification and bilaterality of human neocortical topographic gene expression. Neuron. 2014;81(2):321–332. pmid:24373884
  27. 27. Kang HJ, Kawasawa YI, Cheng F, Zhu Y, Xu X, Li M, et al. Spatio-temporal transcriptome of the human brain. Nature. 2011;478(7370):483–9. pmid:22031440
  28. 28. Hawrylycz MJ, Lein ES, Guillozet-Bongaarts AL, Shen EH, Ng L, Miller JA, et al. An anatomically comprehensive atlas of the adult human brain transcriptome. Nature. 2012;489(7416):391–9. pmid:22996553
  29. 29. Zilles K, Palomero-Gallagher N, Schleicher A. Transmitter receptors and functional anatomy of the cerebral cortex. Journal of Anatomy. 2004;205(6):417–432. pmid:15610391
  30. 30. O’Rourke NA, Weiler NC, Micheva KD, Smith SJ. Deep molecular diversity of mammalian synapses: why it matters and how to measure it. Nature reviews Neuroscience. 2012;13(6):365–79. pmid:22573027
  31. 31. Mueller S, Wang D, Fox MD, Yeo BTT, Sepulcre J, Sabuncu MR, et al. Individual Variability in Functional Connectivity Architecture of the Human Brain. Neuron. 2013;77(3):586–595. pmid:23395382
  32. 32. Yeo BTT, Krienen FM, Sepulcre J, Sabuncu MR, Lashkari D, Hollinshead M, et al. The organization of the human cerebral cortex estimated by intrinsic functional connectivity. Journal of neurophysiology. 2011;106:1125–1165. pmid:21653723
  33. 33. Power JD, Cohen AL, Nelson SM, Wig GS, Barnes KA, Church JA, et al. Functional Network Organization of the Human Brain. Neuron. 2011;72(4):665–678. pmid:22099467
  34. 34. Harris KD, Shepherd GMG. The neocortical circuit: themes and variations. Nature Neuroscience. 2015;18(2):170–181. pmid:25622573
  35. 35. Shinomoto S, Kim H, Shimokawa T, Matsuno N, Funahashi S, Shima K, et al. Relating neuronal firing patterns to functional differentiation of cerebral cortex. PLoS Computational Biology. 2009;5(7). pmid:19593378
  36. 36. Tripathy SJ, Burton SD, Geramita M, Gerkin RC, Urban NN. Brain-wide analysis of electrophysiological diversity yields novel categorization of mammalian neuron types. Journal of Neurophysiology. 2015;113(10):3474–3489. pmid:25810482
  37. 37. Poulin JF, Tasic B, Hjerling-Leffler J, Trimarchi JM, Awatramani R. Disentangling neural cell diversity using single-cell transcriptomics. Nature Neuroscience. 2016;19(9):1131–1141. pmid:27571192
  38. 38. Palomero-Gallagher N, Zilles K. Cortical layers: Cyto-, myelo-, receptor- and synaptic architecture in human cortical areas. NeuroImage. 2017. pmid:28811255
  39. 39. Zilles K, Palomero-Gallagher N, Grefkes C, Scheperjans F, Boy C, Amunts K, et al. Architectonics of the human cerebral cortex and transmitter receptor fingerprints: Reconciling functional neuroanatomy and neurochemistry. European Neuropsychopharmacology. 2002;12(6):587–599. pmid:12468022
  40. 40. Zilles K, Amunts K. Receptor mapping: architecture of the human cerebral cortex. Current Opinion in Neurology. 2009;22(4):331–339. pmid:19512925
  41. 41. Wagstyl K, Ronan L, Goodyer IM, Fletcher PC. Cortical thickness gradients in structural hierarchies. NeuroImage. 2015;111:241–250. pmid:25725468
  42. 42. Finlay BL, Uchiyama R. Developmental mechanisms channeling cortical evolution. Trends in Neurosciences. 2015;38(2):69–76. pmid:25497421
  43. 43. Collins CE, Airey DC, Young NA, Leitch DB, Kaas JH. Neuron densities vary across and within cortical areas in primates. Proceedings of the National Academy of Sciences of the United States of America. 2010;107(36):15927–32. pmid:20798050
  44. 44. Brown SP, Hestrin S. Intracortical circuits of pyramidal neurons reflect their long-range axonal targets. Nature. 2009;457(7233):1133–1136. pmid:19151698
  45. 45. Stepanyants A, Chklovskii DB. Neurogeometry and potential synaptic connectivity; 2005. Available from: http://linkinghub.elsevier.com/retrieve/pii/S0166223605001311.
  46. 46. Zaitsev AV, Povysheva NV, Gonzalez-Burgos G, Lewis DA. Electrophysiological classes of layer 2/3 pyramidal cells in monkey prefrontal cortex. Journal of Neurophysiology. 2012;108(2):595–609. pmid:22496534
  47. 47. Mensi S, Naud R, Pozzorini C, Avermann M, Petersen CCH, Gerstner W. Parameter extraction and classification of three cortical neuron types reveals two distinct adaptation mechanisms. Journal of Neurophysiology. 2012;107(6):1756–1775. pmid:22157113
  48. 48. Van Aerde KI, Feldmeyer D. Morphological and physiological characterization of pyramidal neuron subtypes in rat medial prefrontal cortex. Cerebral Cortex. 2015;25(3):788–805.
  49. 49. Lisman JE, Raghavachari S, Tsien RW. The sequence of events that underlie quantal transmission at central glutamatergic synapses. Nature Reviews Neuroscience. 2007;8(8):597–609. pmid:17637801
  50. 50. Südhof TC, Malenka RC. Understanding Synapses: Past, Present, and Future. Neuron. 2008;60(3):469–476. pmid:18995821
  51. 51. Marx V. A deep look at synaptic dynamics. Nature. 2014;515(7526):293–297. pmid:25391965
  52. 52. Sabatini BL, Regehr WG. Timing of Synaptic Transmission. Annual Review of Physiology. 1999;61(1):521–542. pmid:10099700
  53. 53. Greengard P. The Neurobiology of Slow Synaptic Transmission. Science. 2001;294(5544):1024–1030. pmid:11691979
  54. 54. Südhof TC. Neurotransmitter release: The last millisecond in the life of a synaptic vesicle. Neuron. 2013;80(3):675–690. pmid:24183019
  55. 55. Abbott LF, Regehr WG. Synaptic computation. Nature. 2004;431(7010):796–803. pmid:15483601
  56. 56. Voglis G, Tavernarakis N. The role of synaptic ion channels in synaptic plasticity. EMBO reports. 2006;7(11):1104–1110. pmid:17077866
  57. 57. Hestrin S. Different glutamate receptor channels mediate fast excitatory synaptic currents in inhibitory and excitatory cortical neurons. Neuron. 1993;11(6):1083–1091. pmid:7506044
  58. 58. Moreau AW, Kullmann DM. NMDA receptor-dependent function and plasticity in inhibitory circuits. Neuropharmacology. 2013;74:23–31. pmid:23537500
  59. 59. Angulo MC, Rossier J, Audinat E. Postsynaptic glutamate receptors and integrative properties of fast-spiking interneurons in the rat neocortex. Journal of neurophysiology. 1999;82(3):1295–1302. pmid:10482748
  60. 60. Nissen W, Szabo A, Somogyi J, Somogyi P, Lamsa KP. Cell Type-Specific Long-Term Plasticity at Glutamatergic Synapses onto Hippocampal Interneurons Expressing either Parvalbumin or CB1 Cannabinoid Receptor. Journal of Neuroscience. 2010;30(4):1337–1347. pmid:20107060
  61. 61. Destexhe A, Mainen ZF, Sejnowski TJ. Kinetic models of synaptic transmission. In: Koch C, Segev I, editors. Methods in Neuronal Modeling. 2nd ed. Cambridge, MA: MIT Press; 1998. p. 1–25. Available from: http://cns.iaf.cnrs-gif.fr/abstracts/KSchap96.html.
  62. 62. Destexhe A, Mainen ZF, Sejnowski TJ. Synthesis of models for excitable membranes, synaptic transmission and neuromodulation using a common kinetic formalism. Journal of Computational Neuroscience. 1994;1(3):195–230. pmid:8792231
  63. 63. Kubota Y, Karube F, Nomura M, Kawaguchi Y. The Diversity of Cortical Inhibitory Synapses. Frontiers in Neural Circuits. 2016;10:27. pmid:27199670
  64. 64. Markram H, Lübke J, Frotscher M, Roth A, Sakmann B. Physiology and anatomy of synaptic connections between thick tufted pyramidal neurones in the developing rat neocortex. Journal of Physiology. 1997;500(2):409–440. pmid:9147328
  65. 65. Song S, Sjostrom PJ, Reigl M, Nelson S, Chklovskii DB. Highly nonrandom features of synaptic connectivity in local cortical circuits. PLoS Biology. 2005;3(3):0507–0519.
  66. 66. Thomson AM. Synaptic Connections and Small Circuits Involving Excitatory and Inhibitory Neurons in Layers 2-5 of Adult Rat and Cat Neocortex: Triple Intracellular Recordings and Biocytin Labelling In Vitro. Cerebral Cortex. 2002;12(9):936–953. pmid:12183393
  67. 67. Perin R, Berger TK, Markram H. A synaptic organizing principle for cortical neuronal groups. Proceedings of the National Academy of Sciences. 2011;108(13):5419–5424.
  68. 68. Yoshimura Y, Dantzker JLM, Callaway EM. Excitatory cortical neurons form fine-scale functional networks. Nature. 2005;433(7028):868–873. pmid:15729343
  69. 69. Yoshimura Y, Callaway EM. Fine-scale specificity of cortical networks depends on inhibitory cell type and connectivity. Nature Neuroscience. 2005;8(11):1552–1559. pmid:16222228
  70. 70. Shimono M, Beggs JM. Functional clusters, hubs, and communities in the cortical microconnectome. Cerebral Cortex. 2015;25(10):3743–3757. pmid:25336598
  71. 71. Tomm C, Avermann M, Petersen C, Gerstner W, Vogels TP. Connection-type-specific biases make uniform random network models consistent with cortical recordings. Journal of Neurophysiology. 2014;112(8):1801–1814. pmid:24944218
  72. 72. Koulakov AA, Hromadka T, Zador AM. Correlated Connectivity and the Distribution of Firing Rates in the Neocortex. Journal of Neuroscience. 2009;29(12):3685–3694. pmid:19321765
  73. 73. Roxin A. The Role of Degree Distribution in Shaping the Dynamics in Networks of Sparsely Connected Spiking Neurons. Frontiers in Computational Neuroscience. 2011;5:8. pmid:21556129
  74. 74. Pernice V, Deger M, Cardanobile S, Rotter S. The relevance of network micro-structure for neural dynamics. Frontiers in computational neuroscience. 2013;7(June):72. pmid:23761758
  75. 75. Litwin-Kumar A, Doiron B. Slow dynamics and high variability in balanced cortical networks with clustered connections. Nature Neuroscience. 2012;15(11):1498–1505. pmid:23001062
  76. 76. Harris KD, Mrsic-Flogel TD. Cortical connectivity and sensory coding. Nature. 2013;503(7474):51–58. pmid:24201278
  77. 77. Hoffmann FZ, Triesch J. Nonrandom network connectivity comes in pairs. Network Neuroscience. 2017;1(1):31–41. pmid:29601066
  78. 78. Tsotsos JK, Culhane SM, Wai WYK, Lai Y, Davis N, Nuflo F. Dynamics of Sparsely Conntected Networks of Excitatory and Inhibitory Spiking Neurons. Journal of Computational Neuroscience. 2000;8:183–208.
  79. 79. van Vreeswijk C, Sompolinsky H. Chaos in neuronal networks with balanced excitatory and inhibitory activity. Science (New York, NY). 1996;274(5293):1724–6.
  80. 80. van Vreeswijk C, Sompolinsky H. Chaotic Balanced State in a Model of Cortical Circuits. Neural Computation. 1998;10(6):1321–1371. pmid:9698348
  81. 81. Amit DJ, Brunel N. Model of global spontaneous activity and local structured activity during delay periods in the cerebral cortex. Cerebral Cortex. 1997;7(3):237–252. pmid:9143444
  82. Potjans TC, Diesmann M. The cell-type specific cortical microcircuit: Relating structure and activity in a full-scale spiking network model. Cerebral Cortex. 2014;24(3):785–806. pmid:23203991
  83. Schmidt M, Bakker R, Shen K, Bezgin G, Diesmann M, van Albada SJ. Full-density multi-scale account of structure and dynamics of macaque visual cortex. 2015. https://doi.org/10.1371/journal.pcbi.1006359
  84. Cain N, Iyer R, Koch C, Mihalas S. The Computational Properties of a Simplified Cortical Column Model. PLoS Computational Biology. 2016;12(9):e1005045. pmid:27617444
  85. Haeusler S, Maass W. A statistical analysis of information-processing properties of lamina-specific cortical microcircuit models. Cerebral Cortex. 2007;17(1):149–162. pmid:16481565
  86. Landau ID, Egger R, Dercksen VJ, Oberlaender M, Sompolinsky H. The Impact of Structural Heterogeneity on Excitation-Inhibition Balance in Cortical Networks. Neuron. 2016;92(5):1106–1121. pmid:27866797
  87. Chelaru MI, Dragoi V. Efficient coding in heterogeneous neuronal populations. Proceedings of the National Academy of Sciences of the United States of America. 2008;105(42):16344–16349. pmid:18854413
  88. Markram H, Muller E, Ramaswamy S, Reimann MW, Abdellah M, Sanchez CA, et al. Reconstruction and Simulation of Neocortical Microcircuitry. Cell. 2015;163(2):456–492. pmid:26451489
  89. Markram H. The blue brain project. Nature Reviews Neuroscience. 2006;7(2):153–160. pmid:16429124
  90. Helmstaedter M, de Kock CPJ, Feldmeyer D, Bruno RM, Sakmann B. Reconstruction of an average cortical column in silico. Brain Research Reviews. 2007;55(2):193–203. pmid:17822776
  91. Ramaswamy S, Courcol JD, Abdellah M, Adaszewski SR, Antille N, Arsever S, et al. The neocortical microcircuit collaboration portal: a resource for rat somatosensory cortex. Frontiers in Neural Circuits. 2015;9:44. pmid:26500503
  92. Lefort S, Tomm C, Floyd Sarria JC, Petersen CCH. The Excitatory Neuronal Network of the C2 Barrel Column in Mouse Primary Somatosensory Cortex. Neuron. 2009;61(2):301–316. pmid:19186171
  93. Avermann M, Tomm C, Mateo C, Gerstner W, Petersen CCH. Microcircuits of excitatory and inhibitory neurons in layer 2/3 of mouse barrel cortex. Journal of Neurophysiology. 2012;107(11):3116–3134. pmid:22402650
  94. Gentet LJ, Avermann M, Matyas F, Staiger JF, Petersen CCH. Membrane Potential Dynamics of GABAergic Neurons in the Barrel Cortex of Behaving Mice. Neuron. 2010;65(3):422–435. pmid:20159454
  95. Shadlen MN, Newsome WT. The variable discharge of cortical neurons: implications for connectivity, computation, and information coding. Journal of Neuroscience. 1998;18(10):3870–3896.
  96. Ecker AS, Berens P, Keliris GA, Bethge M, Logothetis NK, Tolias AS. Decorrelated Neuronal Firing in Cortical Microcircuits. Science. 2010;327(5965):584–587. pmid:20110506
  97. Renart A, de la Rocha J, Bartho P, Hollender L, Parga N, Reyes A, et al. The Asynchronous State in Cortical Circuits. Science. 2010;327(5965):587–590. pmid:20110507
  98. Koch C. Biophysics of Computation: Information Processing in Single Neurons. Oxford University Press, USA; 2004. Available from: http://www.amazon.de/Biophysics-Computation-Information-Computational-Neuroscience/dp/0195181999.
  99. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press; 2014. Available from: https://books.google.de/books?id=D4j2AwAAQBAJ.
  100. Tripathy SJ, Savitskaya J, Burton SD, Urban NN, Gerkin RC. NeuroElectro: a window to the world’s neuron electrophysiology data. Frontiers in Neuroinformatics. 2014;8:40. pmid:24808858
  101. Harrison PM, Badel L, Wall MJ, Richardson MJE. Experimentally Verified Parameter Sets for Modelling Heterogeneous Neocortical Pyramidal-Cell Populations. PLoS Computational Biology. 2015;11(8):e1004165. pmid:26291316
  102. Lu JT, Li CY, Zhao JP, Poo MM, Zhang XH. Spike-Timing-Dependent Plasticity of Neocortical Excitatory Synapses on Inhibitory Interneurons Depends on Target Cell Type. Journal of Neuroscience. 2007;27(36):9711–9720. pmid:17804631
  103. Szabadics J. Excitatory Effect of GABAergic Axo-Axonic Cells in Cortical Microcircuits. Science. 2006;311(5758):233–235. pmid:16410524
  104. Hill E, Kalloniatis M, Tan SS. Glutamate, GABA and precursor amino acids in adult mouse neocortex: cellular diversity revealed by quantitative immunocytochemistry. Cerebral Cortex. 2000;10(11):1132–1142.
  105. Palomero-Gallagher N, Amunts K, Zilles K. Transmitter Receptor Distribution in the Human Brain. In: Toga AW, editor. Brain Mapping. San Diego: Elsevier Academic Press; 2015. p. 261–275.
  106. Zilles K, Schleicher A, Palomero-Gallagher N, Amunts K. Quantitative Analysis of Cyto- and Receptor Architecture of the Human Brain; 2002. Available from: http://linkinghub.elsevier.com/retrieve/pii/B978012693019150023X.
  107. McCormick DA, Wang Z, Huguenard J. Neurotransmitter control of neocortical neuronal activity and excitability. Cerebral Cortex. 1993;3(5):387–398. pmid:7903176
  108. Gerstner W, Kistler WM. Spiking Neuron Models. Cambridge University Press; 2002. Available from: http://ebooks.cambridge.org/ref/id/CBO9780511815706.
  109. Hoffmann JHO, Meyer HS, Schmitt AC, Straehle J, Weitbrecht T, Sakmann B, et al. Synaptic conductance estimates of the connection between local inhibitor interneurons and pyramidal neurons in layer 2/3 of a cortical column. Cerebral Cortex. 2015;25(11):4415–4429. pmid:25761638
  110. Tomm C. Analysing Neuronal Network Architectures: From Weight Distributions to Structure and Back. École Polytechnique Fédérale de Lausanne; 2012. Available from: https://infoscience.epfl.ch/record/174669/files/EPFL_TH5302.pdf.
  111. O’Connor DH, Peron SP, Huber D, Svoboda K. Neural activity in barrel cortex underlying vibrissa-based object localization in mice. Neuron. 2010;67(6):1048–1061. pmid:20869600
  112. Benedetti BL, Takashima Y, Wen JA, Urban-Ciecko J, Barth AL. Differential wiring of layer 2/3 neurons drives sparse and reliable firing during neocortical development. Cerebral Cortex. 2013;23(11):2690–2699. pmid:22918982
  113. Petersen CCH, Crochet S. Synaptic Computation and Sensory Processing in Neocortical Layer 2/3. Neuron. 2013;78(1):28–48. pmid:23583106
  114. Crochet S, Poulet JFA, Kremer Y, Petersen CCH. Synaptic mechanisms underlying sparse coding of active touch. Neuron. 2011;69(6):1160–1175. pmid:21435560
  115. Kumar A, Schrader S, Aertsen A, Rotter S. The High-Conductance State of Cortical Networks. Neural Computation. 2008;20(1):1–43. pmid:18044999
  116. Destexhe A, Rudolph M, Paré D. The high-conductance state of neocortical neurons in vivo. Nature Reviews Neuroscience. 2003;4:739–751.
  117. Waters J, Helmchen F. Background Synaptic Activity Is Sparse in Neocortex. Journal of Neuroscience. 2006;26(32):8267–8277. pmid:16899721
  118. Léger J, Stern E, Aertsen A, Heck D. Synaptic integration in rat frontal cortex shaped by network activity. Journal of Neurophysiology. 2005;93:281–293. pmid:15306631
  119. Destexhe A, Paré D. Impact of network activity on the integrative properties of neocortical pyramidal neurons in vivo. Journal of Neurophysiology. 1999;81(4):1531–1547. pmid:10200189
  120. Humphries MD. The Goldilocks zone in neural circuits. eLife. 2016;5. pmid:27911259
  121. Tsodyks M, Sejnowski T. Rapid state switching in balanced cortical network models. Network: Computation in Neural Systems. 1995;6(2):111–124.
  122. Zucca S, D’Urso G, Pasquale V, Vecchia D, Pica G, Bovetti S, et al. An inhibitory gate for state transition in cortex. eLife. 2017;6:e26177. pmid:28509666
  123. Poulet JFA. Keeping an Eye on Cortical States. Neuron. 2014;84(2):246–248. pmid:25374350
  124. Kremkow J, Aertsen A, Kumar A. Gating of Signal Propagation in Spiking Neural Networks by Balanced and Correlated Excitation and Inhibition. Journal of Neuroscience. 2010;30(47):15760–15768. pmid:21106815
  125. Vogels TP. Signal Propagation and Logic Gating in Networks of Integrate-and-Fire Neurons. Journal of Neuroscience. 2005;25(46):10786–10795. pmid:16291952
  126. Vogels TP, Abbott LF. Gating multiple signals through detailed balance of excitation and inhibition in spiking networks. Nature Neuroscience. 2009;12(4):483–491. pmid:19305402
  127. Duarte R, Morrison A. Dynamic stability of sequential stimulus representations in adapting neuronal networks. Frontiers in Computational Neuroscience. 2014;8:124. pmid:25374534
  128. Denève S, Machens CK. Efficient codes and balanced networks. Nature Neuroscience. 2016;19(3):375–382. pmid:26906504
  129. Rubin R, Abbott LF, Sompolinsky H. Balanced Excitation and Inhibition are Required for High-Capacity, Noise-Robust Neuronal Selectivity. 2017.
  130. Vogels TP, Sprekeler H, Zenke F, Clopath C, Gerstner W. Inhibitory Plasticity Balances Excitation and Inhibition in Sensory Pathways and Memory Networks. Science. 2011;334(6062):1569–1573. pmid:22075724
  131. Crochet S, Petersen CCH. Correlating whisker behavior with membrane potential in barrel cortex of awake mice. Nature Neuroscience. 2006;9(5):608–610. pmid:16617340
  132. Poulet JFA, Petersen CCH. Internal brain state regulates membrane potential synchrony in barrel cortex of behaving mice. Nature. 2008;454(7206):881–885. pmid:18633351
  133. Buzsáki G, Mizuseki K. The log-dynamic brain: how skewed distributions affect network operations. Nature Reviews Neuroscience. 2014;15(4):264–278. pmid:24569488
  134. Enel P, Procyk E, Quilodran R, Dominey PF. Reservoir Computing Properties of Neural Dynamics in Prefrontal Cortex. PLoS Computational Biology. 2016;12(6):e1004967. pmid:27286251
  135. Nikolić D, Häusler S, Singer W, Maass W. Distributed fading memory for stimulus properties in the primary visual cortex. PLoS Biology. 2009;7(12):e1000260. pmid:20027205
  136. Maass W. Searching for Principles of Brain Computation. Current Opinion in Behavioral Sciences. 2016;11:81–92.
  137. Bruno RM. Cortex Is Driven by Weak but Synchronously Active Thalamocortical Synapses. Science. 2006;312(5780):1622–1627. pmid:16778049
  138. Okun M, Lampl I. Balance of excitation and inhibition. Scholarpedia. 2009;4(8):7467.
  139. Chaudhuri R, Bernacchia A, Wang XJ. A diversity of localized timescales in network activity. eLife. 2014;3:e01239. pmid:24448407
  140. Dambre J, Verstraeten D, Schrauwen B, Massar S. Information Processing Capacity of Dynamical Systems. Scientific Reports. 2012;2(1):514. pmid:22816038
  141. Jaeger H. Short term memory in echo state networks. GMD Report 152. 2002.
  142. Lewis DA, Gonzalez-Burgos G. Intrinsic excitatory connections in the prefrontal cortex and the pathophysiology of schizophrenia. Brain Research Bulletin. 2000;52(5):309–317. pmid:10922508
  143. Feldmeyer D, Lübke J, Sakmann B. Efficacy and connectivity of intracolumnar pairs of layer 2/3 pyramidal cells in the barrel cortex of juvenile rats. The Journal of Physiology. 2006;575(2):583–602. pmid:16793907
  144. Neske GT, Patrick SL, Connors BW. Contributions of Diverse Excitatory and Inhibitory Neurons to Recurrent Network Activity in Cerebral Cortex. Journal of Neuroscience. 2015;35(3):1089–1105. pmid:25609625
  145. Branco T, Häusser M. The single dendritic branch as a fundamental functional unit in the nervous system. Current Opinion in Neurobiology. 2010;20(4):494–502. pmid:20800473
  146. Morita K. Possible Role of Dendritic Compartmentalization in the Spatial Working Memory Circuit. Journal of Neuroscience. 2008;28(30):7699–7724. pmid:18650346
  147. Spruston N. Pyramidal neurons: dendritic structure and synaptic integration. Nature Reviews Neuroscience. 2008;9(3):206–221. pmid:18270515
  148. Kubota Y, Kondo S, Nomura M, Hatada S, Yamaguchi N, Mohamed AA, et al. Functional effects of distinct innervation styles of pyramidal cells by fast spiking cortical interneurons. eLife. 2015;4:1–27.
  149. Sunkin SM, Ng L, Lau C, Dolbeare T, Gilbert TL, Thompson CL, et al. Allen Brain Atlas: an integrated spatio-temporal portal for exploring the central nervous system. Nucleic Acids Research. 2013;41(Database issue):D996–D1008. pmid:23193282
  150. Ascoli GA, Donohue DE, Halavi M. NeuroMorpho.Org: a central resource for neuronal morphologies. Journal of Neuroscience. 2007;27(35):9247–9251.
  151. Podlaski WF, Seeholzer A, Groschner LN, Miesenboeck G, Ranjan R, Vogels TP. ICGenealogy: Mapping the function of neuronal ion channels in model and experiment. bioRxiv. 2016:058685.
  152. Zehl L, Jaillet F, Stoewer A, Grewe J, Sobolev A, Wachtler T, et al. Handling Metadata in a Neurophysiology Laboratory. Frontiers in Neuroinformatics. 2016;10:26. pmid:27486397
  153. Peng RD. Reproducible Research in Computational Science. Science. 2011;334(6060):1226–1227. pmid:22144613
  154. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349(6251):aac4716. pmid:26315443
  155. Pauli R, Weidel P, Kunkel S, Morrison A. Reproducing Polychronization: A Guide to Maximizing the Reproducibility of Spiking Network Models. Frontiers in Neuroinformatics. 2018;12:46.
  156. Ascoli GA, Alonso-Nanclares L, Anderson SA, Barrionuevo G, Benavides-Piccione R, Burkhalter A, et al. Petilla terminology: nomenclature of features of GABAergic interneurons of the cerebral cortex. Nature Reviews Neuroscience. 2008;9(7):557–568. pmid:18568015
  157. Helmstaedter M, Sakmann B, Feldmeyer D. L2/3 Interneuron groups defined by multiparameter analysis of axonal projection, dendritic geometry, and electrical excitability. Cerebral Cortex. 2009;19(4):951–962. pmid:18802122
  158. Jiang X, Shen S, Sinz F, Reimer J, Cadwell CR, Berens P, et al. Response to Comment on “Principles of connectivity among morphologically defined cell types in adult neocortex”. Science. 2016;353(6304):1108. pmid:27609883
  159. Shepherd GM, Grillner S. Handbook of Brain Microcircuits. Oxford University Press; 2010. Available from: http://oxfordmedicine.com/view/10.1093/med/9780195389883.001.0001/med-9780195389883.
  160. Isaacson JS, Scanziani M. How inhibition shapes cortical activity. Neuron. 2011;72(2):231–243. pmid:22017986
  161. Fino E, Packer AM, Yuste R. The Logic of Inhibitory Connectivity in the Neocortex. The Neuroscientist. 2013;19(3):228–237. pmid:22922685
  162. Wilson NR, Runyan CA, Wang FL, Sur M. Division and subtraction by distinct cortical inhibitory networks in vivo. Nature. 2012;488(7411):343–348. pmid:22878717
  163. Pi HJ, Hangya B, Kvitsiani D, Sanders JI, Huang ZJ, Kepecs A. Cortical interneurons that specialize in disinhibitory control. Nature. 2013;503(7477):521–524. pmid:24097352
  164. Gupta A. Organizing Principles for a Diversity of GABAergic Interneurons and Synapses in the Neocortex. Science. 2000;287(5451):273–278. pmid:10634775
  165. Pozzorini C, Naud R, Mensi S, Gerstner W. Temporal whitening by power-law adaptation in neocortical neurons. Nature Neuroscience. 2013;16(7):942–948. pmid:23749146
  166. Lagzi F, Rotter S. Dynamics of competition between subnetworks of spiking neuronal networks in the balanced state. PLoS ONE. 2015;10(9). pmid:26407178
  167. Cossell L, Iacaruso MF, Muir DR, Houlton R, Sader EN, Ko H, et al. Functional organization of excitatory synaptic strength in primary visual cortex. Nature. 2015;518(7539):399–403. pmid:25652823
  168. von der Heydt R, Peterhans E, Baumgartner G. Illusory contours and cortical neuron responses. Science. 1984;224(4654):1260–1262.
  169. Pachitariu M, Stringer C, Okun M, Bartho P, Harris K, Latham P, et al. Inhibitory control of shared variability in cortical networks. bioRxiv. 2016:041103.
  170. DeWeese MR, Zador AM. Non-Gaussian Membrane Potential Dynamics Imply Sparse, Synchronous Activity in Auditory Cortex. Journal of Neuroscience. 2006;26(47):12206–12218. pmid:17122045
  171. Petersen CCH, Hahn TTG, Mehta M, Grinvald A, Sakmann B. Interaction of sensory responses with spontaneous depolarization in layer 2/3 barrel cortex. Proceedings of the National Academy of Sciences. 2003;100(23):13638–13643.
  172. Poulet JFA, Fernandez LMJ, Crochet S, Petersen CCH. Thalamic control of cortical states. Nature Neuroscience. 2012;15(3):370–372. pmid:22267163
  173. Luczak A, Bartho P, Marguet SL, Buzsáki G, Harris KD. Sequential structure of neocortical spontaneous activity in vivo. Proceedings of the National Academy of Sciences. 2007;104(1):347–352.
  174. Shadlen MN, Newsome WT. Noise, neural codes and cortical organization. Current Opinion in Neurobiology. 1994;4(4):569–579. pmid:7812147
  175. Vogels TP, Rajan K, Abbott LF. Neural Network Dynamics. Annual Review of Neuroscience. 2005;28(1):357–376. pmid:16022600
  176. Ernst U, Pawelzik K. Sensible Balance. Science. 2011;334(6062):1507–1508. pmid:22174239
  177. Mariño J, Schummers J, Lyon DC, Schwabe L, Beck O, Wiesing P, et al. Invariant computations in local cortical networks with balanced excitation and inhibition. Nature Neuroscience. 2005;8(2):194–201. pmid:15665876
  178. Dorrn AL, Yuan K, Barker AJ, Schreiner CE, Froemke RC. Developmental sensory experience balances cortical excitation and inhibition. Nature. 2010;465(7300):932–936. pmid:20559387
  179. Abbott LF, DePasquale B, Memmesheimer RM. Building functional networks of spiking model neurons. Nature Neuroscience. 2016;19(3):350–355. pmid:26906501
  180. Memmesheimer RM, Timme M. Designing complex networks. Physica D: Nonlinear Phenomena. 2006;224(1-2):182–201.
  181. Memmesheimer RM, Timme M. Designing the dynamics of spiking neural networks. Physical Review Letters. 2006;97(18):188101.
  182. Boerlin M, Machens CK, Denève S. Predictive Coding of Dynamical Variables in Balanced Spiking Networks. PLoS Computational Biology. 2013;9(11). pmid:24244113
  183. Schwemmer MA, Fairhall AL, Denève S, Shea-Brown ET. Constructing precisely computing networks with biophysical spiking neurons. The Journal of Neuroscience. 2014;32(28):10112–10134.
  184. Guzman SJ, Schlögl A, Frotscher M, Jonas P. Synaptic mechanisms of pattern completion in the hippocampal CA3 network. Science. 2016;353(6304):1117–1123. pmid:27609885
  185. Brunel N. Is cortical connectivity optimized for storing information? Nature Neuroscience. 2016;19(5):749–755. pmid:27065365
  186. Jahr CE, Stevens CF. Voltage dependence of NMDA-activated macroscopic conductances predicted by single-channel kinetics. The Journal of Neuroscience. 1990;10(9):3178–3182. pmid:1697902
  187. Eliasmith C, Anderson CH. Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems. MIT Press; 2003.
  188. Weidel P, Djurfeldt M, Duarte R, Morrison A. Closed loop interactions between spiking neural network and robotic simulators based on MUSIC and ROS. Frontiers in Neuroinformatics. 2016;10:31.
  189. van den Broek D, Uhlmann M, Fitz H, Duarte R, Hagoort P, Petersson KM. The best spike filter kernel is a neuron; 2017.
  190. van Dijck G, van Hulle MM, Heiney SA, Blazquez PM, Meng H, Angelaki DE, et al. Probabilistic Identification of Cerebellar Cortical Neurones across Species. PLoS ONE. 2013;8(3). pmid:23469215
  191. Ruigrok TJH, Hensbroek RA, Simpson JI. Spontaneous Activity Signatures of Morphologically Identified Interneurons in the Vestibulocerebellum. Journal of Neuroscience. 2011;31(2):712–724. pmid:21228180
  192. Dorval AD. Probability distributions of the logarithm of inter-spike intervals yield accurate entropy estimates from small datasets. Journal of Neuroscience Methods. 2008;173(1):129–139. pmid:18620755
  193. Duarte R, Zajzon B, Morrison A. Neural Microcircuit Simulation and Analysis Toolkit. Zenodo. 2017.
  194. Bos H, Morrison A, Peyser A, Hahne J, Helias M, Kunkel S, Ippen T, Eppler JM, et al. NEST 2.10.0. Zenodo. 2015. https://doi.org/10.5281/zenodo.44222
  195. Krause D, Thörnig P. JURECA: General-purpose supercomputer at Jülich Supercomputing Centre. Journal of large-scale research facilities JLSRF. 2016;2:A62.
  196. Nordlie E, Gewaltig MO, Plesser HE. Towards reproducible descriptions of neuronal network models. PLoS Computational Biology. 2009;5(8):e1000456. pmid:19662159