  • Poster presentation
  • Open access

Quantifying the complexity of neural network output using entropy measures

Introduction

Countless methods exist to quantify neurophysiological signals, yet determining the most appropriate tool for extracting features is a daunting task. A growing body of literature has investigated the use of entropy for measuring the "complexity" of signals. We present the application of a suite of entropy measures to neural network outputs and compare and contrast their ability to identify signal characteristics not captured by variance-based measures of regularity. Our previous work [1] has shown that modifications to existing algorithms may be necessary to accurately capture nonlinear signal components. We have built upon this work, revealed interesting features in a commonly used preparation, and propose that these entropy tools will be useful to a wide variety of scientists.

Methods

We used the in vitro respiratory slice preparation from neonatal rats [2] and an in silico model (NEURON) of this system. In brief, a brainstem slice containing the preBötzinger complex, premotoneurons, and hypoglossal (XII) motoneurons is surgically removed, placed in a chamber with artificial cerebrospinal fluid, and recorded from electrophysiologically. This slice contains the neural circuitry necessary and sufficient to generate spontaneous rhythmic activity. To test changes in network complexity, network excitability was altered by changing extracellular [K+].
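As a rough illustration of how network excitability might be manipulated in silico (a minimal sketch, not the authors' preBötzinger network model), the code below uses NEURON's Python interface to step extracellular [K+] around a single Hodgkin-Huxley compartment and update the potassium reversal potential from the Nernst equation. The single-compartment morphology, the [K+] values, the intracellular concentration, and the direct assignment to ek are illustrative assumptions.

```python
# Minimal sketch (not the published model): vary extracellular [K+] for a
# single Hodgkin-Huxley compartment in NEURON and recompute E_K from Nernst.
import numpy as np
from neuron import h

h.load_file("stdrun.hoc")

soma = h.Section(name="soma")
soma.insert("hh")                         # Na+, K+ and leak channels

def nernst_ek(ko_mM, ki_mM=140.0, temp_C=27.0):
    """Potassium reversal potential (mV) from the Nernst equation."""
    rt_over_f = 8.314 * (273.15 + temp_C) / 96485.0 * 1000.0   # in mV
    return rt_over_f * np.log(ko_mM / ki_mM)

t = h.Vector().record(h._ref_t)           # time (ms)
v = h.Vector().record(soma(0.5)._ref_v)   # membrane potential (mV)

traces = {}
for ko in (3.0, 5.0, 7.0, 9.0):           # illustrative [K+] steps, in mM
    soma.ek = nernst_ek(ko)               # higher [K+]o depolarizes E_K
    h.finitialize(-65)                    # restart state and the recordings
    h.continuerun(500)                    # simulate 500 ms per condition
    traces[ko] = np.array(v)              # copy this condition's voltage trace
```

Changes in burst timing and shape across such conditions are what the entropy measures described next are meant to quantify.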

Our entropy work focused on three measures: Approximate Entropy (ApEn), Sample Entropy (SampEn), and the entropy of interburst intervals (EnInt). We interpret larger entropy values as indicating less predictability (ApEn and SampEn) or greater information density (EnInt). ApEn and SampEn were calculated for the fictive respiratory "bursts" (in vitro), and EnInt was applied to the interburst intervals (in vitro and in silico).
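To make the calculations concrete, here is a minimal sketch (not the laboratory's released code) of the two kinds of measure: Sample Entropy with template length m and tolerance r expressed as a fraction of the signal's standard deviation, and a histogram-based Shannon entropy of interburst intervals standing in for EnInt. The default parameters, the binning scheme, and the EnInt formulation are assumptions made for illustration.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample Entropy (SampEn) of a 1-D signal.

    m: template length; r: tolerance as a fraction of the signal's SD.
    Larger values indicate a less predictable signal.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    tol = r * np.std(x)

    def match_count(dim):
        # All overlapping templates of length `dim`.
        templates = np.array([x[i:i + dim] for i in range(n - dim)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to every later template (self-matches excluded).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.count_nonzero(dist <= tol)
        return count

    b = match_count(m)        # template matches of length m
    a = match_count(m + 1)    # template matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def interburst_interval_entropy(burst_times, bins=20):
    """Shannon entropy (bits) of the interburst-interval distribution.

    The exact EnInt formulation is not given here; a histogram-based
    Shannon entropy is assumed purely for illustration.
    """
    intervals = np.diff(np.sort(np.asarray(burst_times, dtype=float)))
    counts, _ = np.histogram(intervals, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))
```

As a sanity check, a pure sinusoid gives a SampEn near zero while white noise of the same length gives a much larger value, which is the sense in which larger entropy means less predictability.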

Results

Our in vitro entropy measures showed a significant change as network excitability was increased, and they identified peaks in complexity at 5–7 mM [K+] that were not observed with linear measures. The in vitro peak in complexity occurred at different [K+] levels for the timing component (EnInt) than for the burst dynamics (ApEn and SampEn). We are currently incorporating these observations into our in silico model.

Discussion

These results suggest that entropy measures can quantify additional aspects of a neural signal. Specifically, changing excitability (a common manipulation) influences the complexity of the bursting patterns and may control a bifurcation point in in vitro network activity. These changes may provide further insight into respiratory instabilities in humans. We envision these tools (freely available from our laboratory) as useful for improving feature detection in neural networks and for providing an additional data dimension.

References

  1. Kaffashi F, Foglyano R, Wilson CG, Loparo KA: The effect of time delay on approximate and sample entropy calculations. Physica D: Nonlinear Phenomena. 2008, 237: 3069-3074. doi:10.1016/j.physd.2008.06.005.


  2. Smith JC, Ellenberger HH, Ballanyi K, Richter DW, Feldman JL: PreBötzinger complex: a brainstem region that may generate respiratory rhythm in mammals. Science. 1991, 254: 726-729. doi:10.1126/science.1683005.



Author information

Corresponding author

Correspondence to Ryan M Foglyano.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. It is distributed under the terms of the Creative Commons Attribution 2.0 License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Foglyano, R.M., Kaffashi, F., Dick, T.E. et al. Quantifying the complexity of neural network output using entropy measures. BMC Neurosci 10 (Suppl 1), P322 (2009). https://doi.org/10.1186/1471-2202-10-S1-P322
