  • Poster presentation
  • Open access

Artificial grammar recognition using spiking neural networks

Introduction

A biologically inspired neocortical model consisting of spiking neurons is designed to perform artificial grammar processing. Building on work in [1], the model is designed to categorize symbol strings as belonging to a Reber grammar [2]. Columnar organization of the cortex is used as the general inspiration of the network [3, 4].

The model consists of an input layer and a recognition layer. The input layer has six DC generator nodes, each representing the presence of a specific input symbol; the symbols are those the Reber grammar can produce (Figure 1). When a symbol is presented to the input layer, the corresponding node generates a direct current that is fed into the recognition layer.

The recognition layer consists of 20 nodes connected so as to sustain activity at around 50 Hz, called the recognition level, when a correct sequence of symbols occurs. Each node is a minicolumn model of 100 neurons, 80 excitatory and 20 inhibitory. Excitatory connections within a node sustain its activity, indicating that input has been received. Longer sequences are indicated by combining inputs from nodes representing symbols that can follow one another in the grammar; the recognition layer thus consists of the following 20 nodes: {#, M, T, R, V, X, MT, MV, RM, VT, VX, XT, XV, XR, XM, VXT, VXV, VXR, VXM, OUT}. The nodes are organized in three levels: the first receives input directly from the DC generators and effectively recognizes single symbols; the second is activated by two-symbol substrings of the grammar; and the third is activated by three-symbol substrings. When the activity of the OUT node exceeds 50 Hz, the network is said to recognize the string. Each symbol is presented for 500 ms, with the next symbol presented immediately afterwards. A node sustains its activity for 1000 ms after receiving input; after that, local inhibition overcomes the excitatory activity and silences the node.
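The presentation timing described above can be sketched as a simple calculation (a minimal, non-spiking illustration using the 500 ms presentation time and 1000 ms sustain time from the text): because a node stays active twice as long as a symbol is shown, the nodes for two consecutive symbols are coactive for 500 ms, which is what allows a second-level node to combine their inputs.

```python
# Assumed values taken from the text: 500 ms per symbol, 1000 ms sustained activity.
SYMBOL_DURATION_MS = 500   # presentation time of each symbol
SUSTAIN_MS = 1000          # how long a node's activity persists after input onset

def activity_window(symbol_index):
    """Interval (ms) during which the node for the i-th symbol is active."""
    onset = symbol_index * SYMBOL_DURATION_MS
    return (onset, onset + SUSTAIN_MS)

def overlap_ms(i, j):
    """Duration (ms) for which the nodes of symbols i and j are coactive."""
    a0, a1 = activity_window(i)
    b0, b1 = activity_window(j)
    return max(0, min(a1, b1) - max(a0, b0))

# Consecutive symbols overlap for 500 ms; symbols two steps apart do not overlap.
print(overlap_ms(0, 1))  # 500
print(overlap_ms(0, 2))  # 0
```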

Figure 1

Finite State Machine (FSM) defining Reber grammar. Minicolumn model activity.

Discussion

Results for two strings are presented here: one belonging to the grammar, '#MTVT#', and one not belonging to the grammar, '#MTRT#'. For the grammar string, the OUT node passes the 50 Hz recognition level, so the string is recognized (Figure 2). The other string is recognized as not belonging to the grammar (Figure 3). Strings of between 4 and 15 symbols, representing all paths through the grammar FSM, produce promising results. Further work will explore different network paradigms and their ability to recognize strings from the grammar.
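The behaviour on the two example strings can be illustrated with a plain substring check over the node set listed in the model description. This is only a sketch of which recognition-layer nodes receive matching input for a given string; it does not model the spiking dynamics or the OUT node's threshold behaviour.

```python
# Symbol, bigram, and trigram nodes from the text (OUT is omitted, as it is
# driven by the other nodes rather than by a substring of the input).
NODES = ["#", "M", "T", "R", "V", "X",
         "MT", "MV", "RM", "VT", "VX",
         "XT", "XV", "XR", "XM",
         "VXT", "VXV", "VXR", "VXM"]

def active_nodes(string):
    """Return the nodes whose symbol or substring occurs in the input string."""
    return [n for n in NODES if n in string]

print(active_nodes("#MTVT#"))  # ['#', 'M', 'T', 'V', 'MT', 'VT']
print(active_nodes("#MTRT#"))  # ['#', 'M', 'T', 'R', 'MT']
```

The grammar string activates nodes at the second level ('MT', 'VT'), whereas the non-grammar string's illegal transition leaves fewer higher-level nodes active, consistent with the OUT node staying below the recognition level in Figure 3.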

Figure 2

Grammar string is recognized by the network.

Figure 3

A random string does not excite the OUT node above the recognition level.

References

  1. Petersson KM: Artificial grammar learning and neural networks. Proc Cogn Sci Soc. 2005, 1726-1731.


  2. Reber AS: Implicit learning of artificial grammars. J Verb Learn Verb Behav. 1967, 6: 855-863. 10.1016/S0022-5371(67)80149-X.


  3. Çürüklü B: A Canonical Model of the Primary Visual Cortex. 2005, Mälardalen University Press


  4. Mountcastle VB: The columnar organization of the neocortex. Brain. 1997, 120: 701-722. 10.1093/brain/120.4.701.


  5. Gewaltig MO, Diesmann M: NEST. Scholarpedia. 2007, 2: 1430.



Author information

Corresponding author

Correspondence to Philip Cavaco.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution 2.0 License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Cavaco, P., Çürüklü, B. & Petersson, K.M. Artificial grammar recognition using spiking neural networks. BMC Neurosci 10 (Suppl 1), P352 (2009). https://0-doi-org.brum.beds.ac.uk/10.1186/1471-2202-10-S1-P352


  • DOI: https://0-doi-org.brum.beds.ac.uk/10.1186/1471-2202-10-S1-P352
