Measures of Complexity
Complexity can be estimated with information-theoretic measures, algorithmic complexity, or hidden Markov models.
Approximate entropy: the probability that two sequences which are similar for m points remain similar at the next point (sketched in the code after the applications below).
Applications: discrimination of anaesthesia depth from EEG data (deeper anaesthesia means lower complexity); characterization of mentally pathological and healthy groups from EEG recordings; discrimination of different stages of sleep from EEG and respiratory motion (lower complexity during deep sleep).
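A minimal sketch of approximate entropy following this definition, assuming a 1-D NumPy signal, embedding dimension m = 2, and the common tolerance choice r = 0.2 times the signal's standard deviation; the function name and example signals are illustrative, not taken from the source.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D signal x (a sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # common heuristic for the tolerance

    def phi(m):
        # All length-m templates x[i:i+m]
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches are kept in ApEn)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    # Drop in similarity when templates are extended from m to m + 1 points
    return phi(m) - phi(m + 1)

# A regular sine wave should score lower than white noise
rng = np.random.default_rng(0)
print(approximate_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))
print(approximate_entropy(rng.standard_normal(500)))
```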
Sample entropy: a modification of approximate entropy that excludes self-matches.
Application: prediction of neonatal sepsis from HRV data.
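A sketch of sample entropy along the same lines, with self-matches excluded as stated above; m, r, and the test signal are assumptions, and the template bookkeeping is slightly simplified relative to the canonical definition.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D signal x (a sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(mm):
        # All length-mm templates and their pairwise Chebyshev distances
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Count matching pairs, excluding self-matches on the diagonal
        return np.sum(dist <= r) - len(templates)

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return -np.log(a / b)

print(sample_entropy(np.random.default_rng(1).standard_normal(400)))
```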
Fourier (spectral) entropy: compute the PSD of a time series, normalize the spectrum to obtain a probability-like distribution, and calculate the entropy of the normalized spectrum.
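A sketch of these three steps, assuming SciPy's Welch estimator for the PSD; the sampling rate, segment length, and test signals are illustrative. A narrow-band signal concentrates its power in a few bins and should score lower than broadband noise.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs=1.0):
    """Fourier (spectral) entropy: entropy of the normalized PSD (a sketch)."""
    # 1. Power spectral density of the time series
    freqs, psd = welch(x, fs=fs, nperseg=min(256, len(x)))
    # 2. Normalize the spectrum into a probability-like distribution
    p = psd / np.sum(psd)
    # 3. Shannon entropy of the normalized spectrum (zero bins dropped)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 2000)
print(spectral_entropy(np.sin(2 * np.pi * 5 * t), fs=200))   # narrow spectrum, low entropy
print(spectral_entropy(rng.standard_normal(2000), fs=200))   # flat spectrum, high entropy
```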
Wavelet entropy: compute the wavelet spectrum of a time series and calculate its entropy (sketched after the application below).
Application: wavelet entropy of pig ECG.
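A sketch of wavelet entropy under the common convention of taking the Shannon entropy of the relative energies of the wavelet decomposition levels, assuming the PyWavelets package; the wavelet ('db4'), number of levels, and test signals are illustrative choices, not taken from the source.

```python
import numpy as np
import pywt

def wavelet_entropy(x, wavelet="db4", level=5):
    """Wavelet entropy: Shannon entropy of the relative wavelet energies (a sketch)."""
    # Multilevel discrete wavelet decomposition
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Energy carried by each decomposition level
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    # Relative (normalized) energies behave like a probability distribution
    p = energies / np.sum(energies)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
print(wavelet_entropy(np.sin(np.linspace(0, 40 * np.pi, 4096))))  # energy concentrated, low entropy
print(wavelet_entropy(rng.standard_normal(4096)))                 # energy spread out, high entropy
```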
Renyi entropy: compute a time-frequency representation of a time series and count the connected regions above some threshold. One peak in time-frequency space represents an elementary event, so counting peaks gives an estimate of complexity.
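A sketch of the peak-counting step described above, assuming a plain spectrogram as the time-frequency representation and an illustrative relative threshold; the order-3 Renyi entropy of the normalized distribution is included as a related estimate, and all names and parameters are assumptions.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import label

def tf_complexity(x, fs=1.0, rel_threshold=0.1):
    """Count connected time-frequency regions and compute an order-3 Renyi entropy (a sketch)."""
    # Time-frequency representation (here a plain spectrogram)
    f, t, Sxx = spectrogram(x, fs=fs, nperseg=128)
    # Threshold relative to the strongest time-frequency component
    mask = Sxx > rel_threshold * Sxx.max()
    # Each connected above-threshold region is treated as one elementary event
    _, n_components = label(mask)
    # Order-3 Renyi entropy of the normalized TF distribution, a related estimate
    p = Sxx / Sxx.sum()
    renyi = np.log2(np.sum(p ** 3)) / (1 - 3)
    return n_components, renyi

rng = np.random.default_rng(4)
t = np.linspace(0, 5, 5000)
tone = np.sin(2 * np.pi * 50 * t)            # a single ridge -> few components
noisy = tone + rng.standard_normal(5000)     # scattered peaks -> more components
print(tf_complexity(tone, fs=1000))
print(tf_complexity(noisy, fs=1000))
```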
Higher-order methods: break a time series into parts, compute the entropy of each part, then treat the entropy values as a new time series and compute the entropy of that sequence.
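A sketch of this two-level procedure under illustrative assumptions: equal-length parts, a histogram-based Shannon entropy, and arbitrary bin counts; none of these choices come from the source.

```python
import numpy as np

def higher_order_entropy(x, n_parts=20, bins=16):
    """Entropy of the entropies of consecutive segments of x (a sketch)."""

    def shannon_entropy(v, bins):
        # Histogram-based Shannon entropy of a sequence of values
        counts, _ = np.histogram(v, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # 1. Break the series into equal parts and compute the entropy of each part
    parts = np.array_split(np.asarray(x, dtype=float), n_parts)
    part_entropies = np.array([shannon_entropy(part, bins) for part in parts])
    # 2. Treat the entropy values as a new time series and compute its entropy
    return shannon_entropy(part_entropies, bins=8)

print(higher_order_entropy(np.random.default_rng(5).standard_normal(4000)))
```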
Multiscale methods: downsample (coarse-grain) a time series and calculate the entropy of the downsampled data at each scale.
Application: multiscale entropy (MSE) of an HRV signal.
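A sketch of multiscale entropy: coarse-grain the series at each scale and apply sample entropy (restated compactly from the sample-entropy sketch above). The scales, m, the tolerance convention, and the synthetic stand-in for an HRV series are assumptions; a real analysis would use RR intervals.

```python
import numpy as np

def _sampen(x, m, r):
    """Compact sample entropy, as in the sample-entropy sketch above."""
    n = len(x)

    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(n - mm)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= r) - len(t)   # exclude self-matches

    return -np.log(matches(m + 1) / matches(m))

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (the downsampling step)."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def multiscale_entropy(x, scales=range(1, 6), m=2):
    """Sample entropy of the coarse-grained series at each scale (a sketch)."""
    r = 0.2 * np.std(x)   # tolerance fixed from the original series, a common choice
    return [_sampen(coarse_grain(x, s), m, r) for s in scales]

# Synthetic stand-in for an HRV series (illustrative only)
rng = np.random.default_rng(6)
print(multiscale_entropy(rng.standard_normal(1500)))
```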