  • Poster presentation
  • Open access

EEG processing with TESPAR for depth of anesthesia detection

Introduction

Adequate anesthesia is crucial to the success of surgical interventions and to subsequent recovery. Neuroscientists, surgeons, and engineers have sought to understand the impact of anesthetics on information processing in the brain and to assess the level of anesthesia in a non-invasive manner. Studies have indicated that depth of anesthesia (DOA) detection is more reliable when multiple parameters are employed; indeed, commercial DOA monitors (BIS, Narcotrend, M-Entropy, and A-line ARX) use more than one feature extraction method. Here we propose TESPAR (Time Encoded Signal Processing And Recognition), a time-domain signal processing technique that is novel to EEG-based DOA assessment and could enhance existing monitoring devices.

Methods

We developed an artificial system that combines TESPAR descriptors computed from the EEG with multilayer perceptron (MLP) artificial neural networks to discriminate between five DOA levels. The system was trained and tested on DOA mappings produced by two expert anesthesiologists, based on morphologically different features, namely the mid-latency auditory evoked potentials (MLAEP), which are known to correlate with DOA. Sixty-two patients undergoing surgery were enrolled in this study after providing informed consent. The patients were sedated with a cocktail of substances chosen by the attending anesthesiologist. The cleaned EEG, with a bandwidth of 0.5 to 600 Hz, was divided into segments of about 100 seconds, which were categorized into five DOA classes based on the notes recorded during surgery and the shape of the corresponding MLAEP.
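To illustrate the kind of descriptor TESPAR produces, the sketch below (Python with NumPy, assumed here since the abstract names no tooling) splits a signal into epochs between successive zero-crossings, describes each epoch by its duration and shape, and accumulates these into a fixed-size, normalized histogram suitable as input to an MLP. This is a simplified sketch of the general TESPAR idea, not the authors' implementation; the bin counts, the 1200 Hz sampling rate, and the use of a plain 2-D histogram in place of a standard TESPAR alphabet are assumptions made for illustration only.

    import numpy as np

    def tespar_descriptor(x, d_bins=20, s_bins=5, d_max=200):
        """Simplified TESPAR-style descriptor (illustrative sketch).

        Splits the signal into epochs between successive zero-crossings,
        describes each epoch by its duration D (in samples) and shape S
        (number of internal slope reversals), and returns a flattened,
        normalized 2-D histogram of (D, S) pairs as a fixed-size vector.
        """
        x = np.asarray(x, dtype=float)
        signs = np.sign(x)
        signs[signs == 0] = 1
        # epoch boundaries at sign changes of the signal
        boundaries = np.where(np.diff(signs) != 0)[0] + 1
        edges = np.concatenate(([0], boundaries, [len(x)]))

        durations, shapes = [], []
        for a, b in zip(edges[:-1], edges[1:]):
            epoch = x[a:b]
            durations.append(len(epoch))
            if len(epoch) < 3:
                shapes.append(0)
                continue
            d = np.diff(epoch)
            # count internal extrema, i.e. points where the slope changes sign
            shapes.append(int(np.sum(d[:-1] * d[1:] < 0)))

        # fixed-size 2-D histogram over quantized (D, S) pairs
        hist, _, _ = np.histogram2d(
            np.clip(durations, 0, d_max), np.clip(shapes, 0, s_bins - 1),
            bins=[d_bins, s_bins], range=[[0, d_max], [0, s_bins]])
        hist = hist.flatten()
        return hist / max(hist.sum(), 1.0)

    # Example: descriptor for one ~100 s EEG segment at an assumed 1200 Hz
    segment = np.random.randn(120_000)          # placeholder, not real EEG
    features = tespar_descriptor(segment)
    print(features.shape)                       # (100,) input vector for an MLP

The point of such a descriptor is that its size is fixed regardless of segment length, so segments of roughly 100 seconds all map to feature vectors of identical dimension for the classifier.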

Results

The highest self-consistency, achieved by one of the experts who classified the same data on two occasions, was 70.77%. This was regarded as the upper limit of classification performance that the artificial system could achieve. Indeed, the artificial system reached a classification performance of 69.05%. Moreover, we found that the human expert and the artificial system had similar confusion matrices and thus similar mapping patterns.
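The comparison between the expert's self-consistency and the system's performance can be made concrete with a short sketch (again Python with NumPy, an assumption) that computes the percentage agreement between two labelings of the same segments together with their confusion matrix; the label arrays below are random placeholders, not the study's data.

    import numpy as np

    def agreement_and_confusion(labels_a, labels_b, n_classes=5):
        """Percent agreement and confusion matrix between two labelings of
        the same segments (e.g. an expert's two passes, or expert labels
        versus the artificial system's output)."""
        labels_a = np.asarray(labels_a)
        labels_b = np.asarray(labels_b)
        agreement = 100.0 * np.mean(labels_a == labels_b)
        confusion = np.zeros((n_classes, n_classes), dtype=int)
        for a, b in zip(labels_a, labels_b):
            confusion[a, b] += 1
        return agreement, confusion

    # Hypothetical example with five DOA classes (0-4); values are placeholders
    pass1 = np.random.randint(0, 5, size=200)
    pass2 = np.random.randint(0, 5, size=200)
    consistency, cm = agreement_and_confusion(pass1, pass2)
    print(f"self-consistency: {consistency:.2f}%")
    print(cm)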

Discussion

TESPAR-processed EEG showed a close relation to the states of patients undergoing general anesthesia. This relation allowed the artificial system to achieve good DOA classification performance, despite the limits imposed by learning from imperfect human experts. TESPAR offers compact, fixed-size, and highly informative descriptors that could be used to enhance existing DOA monitors. TESPAR is also a promising option in settings where only light computational resources are available.

Acknowledgements

We gratefully acknowledge the financial support from the Hertie Foundation, three grants of the Romanian government (Human Resources Program RP-5/2007 contract 1/01.10.2007 and Ideas Program ID 48/2007 contract 204/01.10.2007, both financed by MECT/UEFISCSU, and Partnerships Program contract 11039/18.09.2007 financed by MECT/ANCS), a grant for the "Max Planck – Coneural Partner Group," and the EU (EU project GABA-FP6-2005-NEST-Path-043309). We thank Prof. Wolf Singer, Diek Wheeler, and Ovidiu Jurjut for useful discussions and comments on the manuscript.

Author information

Corresponding author

Correspondence to Vasile V Moca.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Moca, V.V., Scheller, B., Mureşan, R.C. et al. EEG processing with TESPAR for depth of anesthesia detection. BMC Neurosci 10 (Suppl 1), P68 (2009). https://0-doi-org.brum.beds.ac.uk/10.1186/1471-2202-10-S1-P68
