Publications & Research
Deep Learning for Stroke Mortality Prediction in eICU: A Dual-Tower Transformer Framework
Zhengrong Jia*, Kwong-Cheong Wong*
The 6th International Conference on Computer Communication and Artificial Intelligence (CCAI), 2026
Abstract
We propose a Dual-Tower Transformer (DT-Transformer) for stroke mortality prediction on the multicenter eICU Collaborative Research Database. The decoupled architecture routes categorical demographics and numerical vitals through separate tower pathways, each with its own self-attention layers, before fusing the representations for the final prediction. The model achieves an AUPRC of 0.6171 — a 14.4% improvement over the strongest neural baseline. An Adaptive Runtime Safeguard protects inference against physiological outliers, and attention-map visualizations provide clinical interpretability.
- Dual-Tower Design — Separate encoding pathways for categorical (demographics) and numerical (vitals) features
- Self-Attention Layers — Each tower applies multi-head Self-Attention for intra-modality feature interaction
- Late Fusion — Tower outputs concatenated before classification head
- Adaptive Runtime Safeguard — Detects out-of-distribution inputs at inference time for clinical safety
- Attention Visualization — Heatmaps over input features for clinical interpretability
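The dual-tower flow above — per-modality self-attention followed by late fusion — can be sketched in a few lines. This is a minimal NumPy illustration under assumed toy dimensions (4 demographic tokens, 6 vital tokens, `d_model = 8`), not the paper's PyTorch implementation:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    # Scaled dot-product self-attention over a tower's feature tokens.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ v, attn  # attn is what the heatmaps visualize

rng = np.random.default_rng(0)
d_model = 8

def make_w():
    return rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)

# Toy batch of 2: 4 categorical (demographics) and 6 numerical (vitals) tokens.
cat_tokens = rng.normal(size=(2, 4, d_model))
num_tokens = rng.normal(size=(2, 6, d_model))

# Decoupled pathways: each tower has its own attention parameters.
cat_out, cat_attn = self_attention(cat_tokens, make_w(), make_w(), make_w())
num_out, num_attn = self_attention(num_tokens, make_w(), make_w(), make_w())

# Late fusion: pool each tower, then concatenate for the classification head.
fused = np.concatenate([cat_out.mean(axis=1), num_out.mean(axis=1)], axis=-1)
print(fused.shape)  # (2, 16)
```

Keeping the towers separate until fusion means demographics and vitals only interact after each modality has attended over its own features.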
eICU Collaborative Research Database
- Multicenter critical care database (200,859 admissions, 208 hospitals)
- Stroke cohort extracted with ICD-9 codes 430–438
- Features: demographics, vitals, lab values, GCS scores
- Access via PhysioNet credentialed access
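Cohort extraction by ICD-9 range 430–438 (cerebrovascular disease) amounts to matching the three-digit category prefix of each diagnosis code. A hedged sketch — the record and field names below are illustrative, not the actual eICU schema:

```python
STROKE_RANGE = range(430, 439)  # ICD-9 430-438: cerebrovascular disease

def is_stroke_code(icd9: str) -> bool:
    # ICD-9 codes look like "434.91"; compare the 3-digit category prefix.
    try:
        return int(icd9.split(".")[0]) in STROKE_RANGE
    except ValueError:
        return False

# Illustrative admission records (not real eICU rows).
admissions = [
    {"patientunitstayid": 1, "icd9code": "434.91"},  # cerebral artery occlusion
    {"patientunitstayid": 2, "icd9code": "410.0"},   # MI, not in cohort
    {"patientunitstayid": 3, "icd9code": "430"},     # subarachnoid hemorrhage
]
cohort = [a for a in admissions if is_stroke_code(a["icd9code"])]
print([a["patientunitstayid"] for a in cohort])  # [1, 3]
```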
Code Repository
An open-source implementation is available on GitHub. The repository includes:
- PyTorch model implementation
- Data preprocessing pipeline
- 5-fold cross-validation scripts
- Attention visualization tools
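One simple way to realize the runtime safeguard idea is a z-score range check against training statistics, deferring out-of-distribution inputs rather than scoring them. The statistics and threshold below are made-up illustrations, not values from the paper:

```python
import numpy as np

# Illustrative training-set statistics for, e.g., HR, SBP, SpO2.
train_mean = np.array([80.0, 120.0, 98.0])
train_std = np.array([15.0, 20.0, 2.0])

def safeguard(x: np.ndarray, z_max: float = 4.0) -> bool:
    """Return True if any vital is a physiological outlier (defer to clinician)."""
    z = np.abs((x - train_mean) / train_std)
    return bool((z > z_max).any())

print(safeguard(np.array([82.0, 118.0, 97.0])))   # False: in-distribution
print(safeguard(np.array([300.0, 118.0, 97.0])))  # True: HR far out of range
```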
Citation
If you find this work useful, please cite:
@inproceedings{jia2026dttransformer,
  title     = {Deep Learning for Stroke Mortality Prediction in eICU: A Dual-Tower Transformer Framework},
  author    = {Jia, Zhengrong and Wong, Kwong-Cheong},
  booktitle = {Proceedings of the 6th International Conference on Computer Communication and Artificial Intelligence (CCAI)},
  year      = {2026},
  note      = {Accepted; to appear in May 2026}
}
Results
5-fold cross-validation results (mean ± SD shown where reported):
| Model | AUROC | AUPRC |
|---|---|---|
| DT-Transformer | 0.8848 ± 0.0034 | 0.6171 ± 0.0058 |
| XGBoost | 0.8908 | 0.6467 |
| Random Forest | 0.8806 | 0.6236 |
| NN (Neural Network) | 0.8582 ± 0.0018 | 0.5394 ± 0.0054 |
| Standard Transformer | 0.8457 ± 0.0129 | 0.5279 ± 0.0195 |
| Standard MLP | 0.8534 ± 0.0058 | 0.5170 ± 0.0081 |
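The AUPRC values above can be computed as average precision: the mean of the precision at each positive example when predictions are ranked by decreasing score. A minimal pure-Python version with a made-up toy example:

```python
def average_precision(y_true, scores):
    """AUPRC as average precision over a ranked list of predictions."""
    ranked = sorted(zip(scores, y_true), reverse=True)
    tp, ap = 0, 0.0
    for rank, (_, y) in enumerate(ranked, start=1):
        if y == 1:
            tp += 1
            ap += tp / rank  # precision at this positive's rank
    return ap / sum(y_true)

y = [1, 0, 1, 0, 0]
s = [0.9, 0.8, 0.7, 0.4, 0.2]
print(round(average_precision(y, s), 4))  # 0.8333
```

For an imbalanced outcome like ICU mortality, AUPRC is more informative than AUROC because it ignores the abundant true negatives, which is why the paper reports both.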