Truncation thresholds based empirical mode decomposition approach for classification performance of motor imagery BCI systems


CHAOS SOLITONS & FRACTALS, vol.152, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 152
  • Publication Date: 2021
  • Doi Number: 10.1016/j.chaos.2021.111450
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, INSPEC, zbMATH
  • Keywords: Motor imagery, EEG, BCI, Signal processing, Classification performance, Component analysis, Feature extraction, Neural network, EEG signals, Patterns, Removal
  • Kayseri University Affiliated: Yes


Classification of electroencephalogram (EEG) signals, which is important for brain-computer interface (BCI) systems, is extremely difficult due to the signals' inherent complexity and susceptibility to artifacts. In this paper, a novel methodology combining a Truncation Thresholds (TT) based Empirical Mode Decomposition (EMD) with statistical Common Spatial Pattern (CSP) feature extraction is proposed to classify left- and right-hand imaginary movements from EEG signals. The TT method modifies the selected local maximum and minimum points during EMD so that motor-imagery information hidden in the frequency-domain sub-bands is distinguished more accurately, and blink-related electrooculography (EOG) artefacts are removed. The TT method is applied to the raw EEG signals. Then, statistical spatial features are extracted with the CSP method from each Intrinsic Mode Function (IMF) obtained by decomposing the EEG signals with EMD. Finally, the extracted features are fed to three different classifiers: SVM, KNN, and LDA. The proposed methodology is applied to our own dataset and to the public BCI Competition IV-2b dataset. The results show that the proposed methodology achieves accuracies of 97% on our dataset with the LDA classifier and 94% on the BCI Competition IV-2b dataset with the KNN classifier. (c) 2021 Elsevier Ltd. All rights reserved.
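The abstract describes a decompose-then-extract pipeline: EMD splits each EEG channel into IMFs, CSP turns multi-channel trials into discriminative spatial features, and a standard classifier separates the two imagery classes. The sketch below illustrates that generic pipeline only; it does not implement the paper's TT modification of extrema, and the simplified sifting loop, synthetic trial generator, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def emd_imfs(x, n_imfs=2, n_sift=8):
    """Minimal EMD sketch: sift by subtracting the mean of cubic-spline
    envelopes through local maxima and minima (no stopping criterion)."""
    imfs, residue = [], x.astype(float).copy()
    t = np.arange(len(x))
    for _ in range(n_imfs):
        h = residue.copy()
        for _ in range(n_sift):
            mx = argrelextrema(h, np.greater)[0]
            mn = argrelextrema(h, np.less)[0]
            if len(mx) < 4 or len(mn) < 4:   # too few extrema to envelope
                break
            upper = CubicSpline(mx, h[mx])(t)
            lower = CubicSpline(mn, h[mn])(t)
            h = h - (upper + lower) / 2.0    # remove local envelope mean
        imfs.append(h)
        residue = residue - h
    return imfs

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Textbook CSP: generalized eigendecomposition of the class-average
    covariance matrices; keep the filters at both spectrum ends."""
    avg_cov = lambda trials: np.mean([np.cov(tr) for tr in trials], axis=0)
    Ca, Cb = avg_cov(trials_a), avg_cov(trials_b)
    _, vecs = eigh(Ca, Ca + Cb)              # eigenvalues ascending
    idx = np.concatenate([np.arange(n_pairs), np.arange(-n_pairs, 0)])
    return vecs[:, idx].T                    # (2*n_pairs, n_channels)

def log_var_features(trials, W):
    """Classic CSP feature: log of normalized variance of filtered signals."""
    feats = []
    for tr in trials:
        v = np.var(W @ tr, axis=1)
        feats.append(np.log(v / v.sum()))
    return np.array(feats)

# Synthetic demo: class A has high power on channel 0, class B on channel 1.
rng = np.random.default_rng(0)
def make_trials(strong_ch, n=20):
    out = []
    for _ in range(n):
        tr = rng.standard_normal((4, 256))
        tr[strong_ch] *= 5.0
        out.append(tr)
    return out

A, B = make_trials(0), make_trials(1)
W = csp_filters(A, B)
X = np.vstack([log_var_features(A, W), log_var_features(B, W)])
y = np.array([0] * 20 + [1] * 20)
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In the paper's scheme the CSP features would be computed per IMF (after TT-adjusted EMD) and concatenated before classification; here a single CSP stage on raw synthetic trials is shown to keep the example short.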