Decoding micro-electrocorticographic signals by using explainable 3D convolutional neural network to predict finger movements

Chao Hung Kuo, Guan Tze Liu, Chi En Lee, Jing Wu, Kaitlyn Casimo, Kurt E. Weaver, Yu Chun Lo, You Yin Chen, Wen Cheng Huang, Jeffrey G. Ojemann

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Electroencephalography (EEG) and electrocorticography (ECoG) recordings have been used to decode finger movements by analyzing brain activity. Traditional methods focused on single bandpass power changes for movement decoding, utilizing machine learning models that require manual feature extraction.

New method: This study introduces a 3D convolutional neural network (3D-CNN) model to decode finger movements using ECoG data. The model employs adaptive, explainable AI (xAI) techniques to interpret the physiological relevance of brain signals. ECoG signals recorded from epilepsy patients during awake craniotomy were processed to extract power spectral density across multiple frequency bands. These data formed a 3D matrix used to train the 3D-CNN to predict finger trajectories.

Results: The 3D-CNN model predicted finger movements accurately, with root-mean-square error (RMSE) values of 0.26–0.38 for single-finger movements and 0.20–0.24 for combined movements. Explainable AI techniques, Grad-CAM and SHAP, identified the high gamma (HG) band as crucial for movement prediction and revealed the specific cortical regions involved in different finger movements. These findings highlight the physiological significance of the HG band in motor control.

Comparison with existing methods: The 3D-CNN model outperformed traditional machine learning approaches by effectively capturing spatial and temporal patterns in ECoG data. The xAI techniques provided clearer insights into the model's decision-making process, in contrast to the “black box” nature of standard deep learning models.

Conclusions: The proposed 3D-CNN model, combined with xAI methods, enhances the decoding accuracy of finger movements from ECoG data. This approach offers a more efficient and interpretable solution for brain-computer interface (BCI) applications, emphasizing the HG band's role in motor control.
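The preprocessing pipeline described above (per-channel power spectral density across frequency bands, stacked into a 3D matrix, with RMSE as the evaluation metric) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, window length, band edges, and array dimensions are all assumptions, and a simple FFT periodogram stands in for whatever spectral estimator the study used.

```python
import numpy as np

# Assumed parameters -- not taken from the paper.
FS = 1000  # sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30),
         "low_gamma": (30, 70), "high_gamma": (70, 170)}

def band_powers(window, fs=FS):
    """Mean power in each frequency band for one channel window (FFT periodogram)."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2 / len(window)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

def build_feature_matrix(ecog, win=250):
    """ecog: (n_channels, n_samples) -> 3D matrix (n_channels, n_bands, n_windows)."""
    n_ch, n_samp = ecog.shape
    n_win = n_samp // win
    feats = np.empty((n_ch, len(BANDS), n_win))
    for c in range(n_ch):
        for w in range(n_win):
            feats[c, :, w] = band_powers(ecog[c, w * win:(w + 1) * win])
    return feats

def rmse(pred, target):
    """Root-mean-square error between predicted and actual finger trajectories."""
    pred, target = np.asarray(pred), np.asarray(target)
    return float(np.sqrt(np.mean((pred - target) ** 2)))

# Synthetic example: 16 channels, 2 s of noise in place of real ECoG.
rng = np.random.default_rng(0)
ecog = rng.standard_normal((16, 2000))
X = build_feature_matrix(ecog)
print(X.shape)  # (16, 5, 8): channels x bands x time windows
print(rmse([0.1, 0.2], [0.3, 0.0]))  # 0.2
```

A tensor of this shape (channels × bands × time) is the kind of input a 3D convolution can scan jointly across space, frequency, and time, which is the property the abstract credits for the model's advantage over hand-engineered single-band features.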

Original language: English
Article number: 110251
Journal: Journal of Neuroscience Methods
Volume: 411
DOIs
Publication status: Published - Nov 2024

Keywords

  • Convolutional neural network
  • ECoG
  • Electrocorticography
  • Finger movement
  • Grad-CAM
  • High gamma
  • SHAP value

ASJC Scopus subject areas

  • General Neuroscience

