Electronic health records (EHRs) are crucial to modern medical practice, providing digital storage of patient health information. However, accurately identifying diseases that lead to mortality within EHRs remains challenging. This study introduces a novel hybrid intra-multi-head attention with a bidirectional gated recurrent unit (Hybrid-IMHA-BiGRU) model to detect critical diseases associated with higher patient mortality. Our main objective is to capture critical contextual features in text sequences. To achieve this, we propose both a two-channeled attention mechanism (TC-AM) and a single-channeled AM (SC-AM). The TC-AM focuses on symptoms, diseases, and mortality outcomes, while the SC-AM discerns intricate patterns in long text sequences. The novelty of our approach lies in bidirectional processing, which enhances the model's ability to comprehend entire sequences by capturing contextual information from both preceding and following tokens. Existing studies struggle to capture relevant contexts in variable-length EHRs because of long-term dependencies, dynamic sequences, and variations in patient data representation. To address these challenges, we propose a medical context-based robustly optimized bidirectional encoder representations from transformers pretraining approach (Med-RoBERTa) for embedding computation, which efficiently captures nuanced relationships and semantic dependencies within variable-length EHRs. Additionally, our model incorporates a cross-AM to adaptively weigh parts of the input sequence, which is beneficial for handling sequences of varying lengths. Our findings indicate that the proposed model outperforms state-of-the-art models, achieving accuracies of 98.10% and 97.90% and F1 scores of 94.72% and 94.45% on two EHR datasets. These results underscore the effectiveness of our approach in precisely identifying diseases associated with patient mortality in EHRs.
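The bidirectional processing described above can be illustrated with a minimal sketch: each token's representation combines a forward pass (context from preceding tokens) and a backward pass (context from following tokens). This is a deliberately simplified scalar GRU in pure Python, not the paper's Hybrid-IMHA-BiGRU; the weight tuples and the `gru_step`/`bigru` names are illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, Wz, Wr, Wh):
    # Toy scalar GRU cell: each W* is a (input_weight, hidden_weight) pair.
    z = sigmoid(Wz[0] * x + Wz[1] * h)          # update gate
    r = sigmoid(Wr[0] * x + Wr[1] * h)          # reset gate
    h_tilde = math.tanh(Wh[0] * x + Wh[1] * (r * h))  # candidate state
    return (1.0 - z) * h + z * h_tilde

def bigru(seq, W):
    # Forward pass: hidden state accumulates context from preceding tokens.
    fwd, h = [], 0.0
    for x in seq:
        h = gru_step(x, h, *W)
        fwd.append(h)
    # Backward pass: hidden state accumulates context from following tokens.
    bwd, h = [], 0.0
    for x in reversed(seq):
        h = gru_step(x, h, *W)
        bwd.append(h)
    bwd.reverse()
    # Each position is represented by (forward, backward) context features.
    return list(zip(fwd, bwd))
```

In the full model, these concatenated forward/backward states would then feed the attention mechanisms, which weigh positions in the sequence adaptively.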