A duplex cerebral function analyzer using the processed EEG and middle latency auditory evoked potential (ml-AEP) for anaesthesia and sedation monitoring.

Presentation transcript:

A duplex cerebral function analyzer using the processed EEG and middle latency auditory evoked potential (ml-AEP) for anaesthesia and sedation monitoring

A.C. Fisher*, A.F.G. Taktak, A.G. Jones#, S.M. Mostafa#, G. Sidaras#
Depts. of Clinical Engineering and #Anaesthesia & Theatres (* liv.ac.uk)
Clinical Engineering, Royal Liverpool University Hospital

Summary
Measuring the depth of anaesthesia during surgery is not as straightforward as one might think. In addition to the main anaesthetic agent, the patient often receives other medication (for example, neuromuscular blockers) that can hide the signs of pain or discomfort if the anaesthesia is too weak. Similarly, the responses that might indicate that the anaesthesia is too strong can also be masked. The most commonly proposed methods analyse the electrical activity of the brain (the EEG: electroencephalogram). In Liverpool, a novel system which uses two EEG analyses simultaneously has been designed and used to monitor patients undergoing a variety of surgical procedures. This approach is made possible by the availability of fast computers and recently developed mathematical techniques.

RL-CeFAM: The Royal Liverpool Cerebral Function Anaesthesia Monitor
Figure 1: The headbox: 2 channels of EEG monitoring, OR 1 channel of EEG and 1 channel of mlAEP.
Figure 2: The headbox electronics schematic: switching network and electrodes, instrumentation amplifiers, filters, programmable gain and A-to-D converters, embedded PIC microcontrollers (control and data flow), impedance test signal, click output to headphones, 5 kHz isolation barrier, and serial communication to a PC running Visual BASIC, ActiveX and MATLAB with C-code CMEX routines.
Figure 3: RL-CeFAM monitor display. a: right channel, EEG amplitude mode; b: left channel, AEP mode, showing the Liv-LAS formed from the mlAEP, the OAAS level and the propofol plasma (target) concentration (μg.ml-1) against time (min).

How are data processed in RL-CeFAM?
At the lowest level, data acquisition and first-order noise rejection are performed by algorithms implemented in C at the microcontroller level. At the higher levels, in the PC, data are processed using MATLAB, viz:
– signal recovery by finite impulse response (FIR) bandpass filtering
– mains noise elimination by adaptive filtering (LS adaptive cancellation)
– non-linear artefact rejection using mathematical-morphology operations
– frequency analysis (e.g. S95, S50) via the Fast Fourier Transform
– graphical user interfaces (GUIs) constructed in MS Visual BASIC

What exactly is the Liv-LAS Index for Depth of Anaesthesia?
The mlAEP is the EEG response to a 'click' of sound presented to the patient via headphones. The Liv-LAS is the vector sum of the Laplacian difference vector of the recovered AEP over the period 20 to 125 ms; it is a measure of 'curviness' (sic). See the AEP displayed in Figure 3b, and the sketch below for one possible reading of this measure.
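The slides do not give a formula for the Liv-LAS, so the MATLAB fragment below is only a minimal sketch of one plausible reading of the description above: the summed magnitude of the second difference (a discrete Laplacian) of the recovered mlAEP over the 20-125 ms window. The sampling rate, the synthetic waveform and the absence of any scaling are assumptions; RL-CeFAM's own recovery and scaling of the AEP are not reproduced here.

    % Minimal sketch -- not the authors' implementation. One plausible reading
    % of the Liv-LAS 'curviness' measure: the summed magnitude of the second
    % difference (discrete Laplacian) of the recovered mlAEP over 20-125 ms.
    % The sampling rate and the synthetic AEP below are assumptions.
    fs   = 1000;                              % assumed sampling rate, Hz
    t_ms = (0:249) * 1000 / fs;               % 0-249 ms time axis
    aep  = 2.0*exp(-((t_ms - 35)/10).^2) ...  % synthetic mlAEP-like waveform (uV)
         - 1.5*exp(-((t_ms - 70)/15).^2);

    win     = aep(t_ms >= 20 & t_ms <= 125);  % 20-125 ms analysis window
    lap     = diff(win, 2);                   % discrete Laplacian (2nd difference)
    liv_las = sum(abs(lap));                  % 'vector sum' -> curviness index
    fprintf('Liv-LAS (arbitrary units): %.3f\n', liv_las);

In RL-CeFAM the index is then tracked over time and passed through the first-order and non-linear recovery stages shown in Figures 4 and 5.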
RL-CeFAM in action
Figures 4 and 5 show the Liv-LAS throughout 2 hours of abdominal surgery with anaesthesia maintained by propofol TIVA. The index is computed about 500 times per hour. The upper trace (Figure 4) shows the raw and first-order-filtered index. Figure 5 shows the index recovered after non-linear processing. Induction, maintenance and wake-up phases are clearly discernible.

What is measured in RL-CeFAM?
– amplitude/time domain: continuous EEG (linear and logarithmic representation); mean and median integrated amplitude
– amplitude/frequency/time domain: compressed spectral array; S90 and S95 spectral edges; S50, the 50th percentile (median) frequency
– sampled time: auditory evoked potential (AEP), middle latency, 25 to 125 ms
– specialised indices: Liv-LAS, an AEP-based depth of anaesthesia index (modified Glasgow LAS level-of-arousal score)
– continuous electrode impedance monitoring
The two cornerstones are the S50 median frequency and the Liv-LAS index derived from the mlAEP (see above and Figures 4 and 5); a sketch of how S50 and S95 can be computed is given at the end of this transcript.

The acid test: how does RL-CeFAM relate to conventional scoring of anaesthetic depth?
There is no gold standard by which the performance of a new depth-of-anaesthesia monitor can be assessed. The frequently accepted comparative test uses the Observer's Assessment of Alertness/Sedation (OAAS) scale, levels 1 to 5:
1. Awake
2. Slow response: slurred speech
3. Eyes closed: response to commands
4. Response to commands only after several attempts and mild prodding
5. No response to commands or shaking
The performance of RL-CeFAM with respect to OAAS is illustrated below.

Figure 6: Liv-LAS course during 30 minutes of propofol TIVA. The OAAS staging (here superimposed a posteriori) is consistent with the depth-of-anaesthesia index derived in real time by RL-CeFAM.
Figure 4: Liv-LAS first-order time course (raw and low-pass Liv-LAS, in arbitrary units, against time base [samples/2]).
Figure 5: Liv-LAS non-linear recovery (voted, smoothed voted and voted-voted Liv-LAS, in arbitrary units, against time base [samples/2]).
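For reference, the fragment below is a minimal MATLAB sketch (not the RL-CeFAM implementation) of how the S50 median frequency and the S95 spectral edge mentioned above can be derived from a single EEG epoch via the FFT. The sampling rate, epoch length, windowing and test signal are all assumptions.

    % Minimal sketch -- not the RL-CeFAM code. S50 (median) and S95 (95%
    % spectral-edge) frequencies of one EEG epoch via the FFT.
    % Sampling rate, epoch length, window and test signal are assumptions.
    fs  = 256;                                  % assumed sampling rate, Hz
    t   = (0:fs*4-1) / fs;                      % 4 s epoch
    eeg = sin(2*pi*4*t) + 0.5*sin(2*pi*10*t) + 0.1*randn(size(t));  % test signal

    N = numel(eeg);
    w = 0.5 - 0.5*cos(2*pi*(0:N-1)/(N-1));      % Hann window (hand-rolled)
    X = fft(eeg .* w);                          % windowed FFT
    P = abs(X(1:N/2)).^2;                       % one-sided power spectrum
    f = (0:N/2-1) * fs / N;                     % frequency axis, Hz
    % (in practice the spectrum would be restricted to the EEG band of interest)

    cumP = cumsum(P) / sum(P);                  % cumulative normalised power
    S50  = f(find(cumP >= 0.50, 1));            % median (50th percentile) frequency
    S95  = f(find(cumP >= 0.95, 1));            % 95% spectral-edge frequency
    fprintf('S50 = %.1f Hz, S95 = %.1f Hz\n', S50, S95);

The same cumulative-power construction gives the S90 spectral edge by reading off the 0.90 point.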