Accurate classification of electroencephalography (EEG) signals is essential for diagnosing brain disorders such as epilepsy. While deep learning models such as convolutional neural networks (CNNs) and long short-term memory (LSTM) networks have improved EEG classification over traditional methods, existing attention mechanisms such as Additive, Luong, and Multihead attention struggle to capture EEG's complex temporal dependencies. This study proposes scaled custom attention (SCA), a mechanism for modeling temporal dependencies in EEG signal classification. Unlike traditional QKV-based similarity-scoring attention mechanisms, which apply semantic cross-token weighting, SCA replaces these operations with a direct feature-weighting strategy tailored to the temporal structure of EEG signals and incorporates a scaling mechanism to improve computational stability. To validate the approach, experiments were conducted on the TUH EEG Epilepsy Corpus (TUEP), where SCA achieved improved classification performance (accuracy: 98.07%, F1-score: 98.06%), marginally higher than the Additive (97.60%, 97.61%), Multihead (97.66%, 97.66%), and Luong (97.68%, 97.66%) baseline attention mechanisms when integrated into the LConvNet EEG classification model. SCA also achieves a balanced performance profile, with competitive inference time (2.83 vs. 1.32–3.89 for the baselines), parameter efficiency (58.5 params/sample vs. 58.5–63.7), and comparable generalization, with an average training-validation difference of 0.0191, making it a promising enhancement for EEG-based DL models. In further performance comparisons against state-of-the-art (SOTA) EEG classification models, including EEGNet, DeepConvNet, and ShallowConvNet, the proposed LConvNet + SCA model demonstrates superior performance.
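To make the contrast with QKV attention concrete, the following is a minimal NumPy sketch of a direct feature-weighting attention with a scaling factor, in the spirit of the SCA description above. The function name, the learned score parameters `w` and `b`, and the exact use of a sqrt-of-feature-dimension scale are illustrative assumptions, not the paper's exact SCA formulation.

```python
import numpy as np

def scaled_feature_attention(x, w, b):
    """Hypothetical sketch of direct feature-weighting attention.

    Instead of QKV similarity scoring between tokens, each time step
    receives a single learned score, scaled for numerical stability,
    and the input features are pooled by the resulting weights.

    x: (batch, time, features) EEG feature sequence
    w: (features, 1) learned scoring weights (assumed parameterization)
    b: scalar bias
    """
    # Per-timestep scores, scaled by sqrt(feature_dim) for stability
    scores = (x @ w + b) / np.sqrt(x.shape[-1])          # (B, T, 1)
    # Numerically stable softmax over the time axis
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = e / e.sum(axis=1, keepdims=True)           # (B, T, 1)
    # Weight the features directly and pool over time
    return (weights * x).sum(axis=1)                     # (B, features)
```

In a model such as LConvNet, a layer like this would sit between the recurrent/convolutional feature extractor and the classifier head, pooling the time dimension; the scaling term plays the same stabilizing role as the 1/sqrt(d) factor in scaled dot-product attention.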