Intracranial aneurysms (IAs) are pathological dilations of cerebral arteries affecting 3-5% of the population. Their rupture can cause subarachnoid hemorrhage, a severe form of stroke with high mortality and morbidity. Endovascular treatments, such as coiling and flow diversion, aim to induce thrombosis within the aneurysmal sac. Clinicians face challenges in predicting clot formation due to the complex interplay of blood flow, platelet transport, and biochemical reactions. Computational Fluid Dynamics models can provide valuable hemodynamic insights but are computationally prohibitive for real-time clinical decision-making. Machine Learning offers a promising alternative to conventional Finite Element solvers by learning from Computational Fluid Dynamics simulation data. Early approaches based on Convolutional Neural Networks were limited by their reliance on structured grids, while Graph Neural Networks can naturally operate on the unstructured meshes used in Computational Fluid Dynamics simulations. Recent advances in Transformer-based Graph Neural Network architectures have demonstrated superior accuracy and efficiency compared to classical message-passing models. Yet, their application to multiphysics problems remains unexplored. In this work, we investigate the use of a Transformer-based Graph Neural Network to predict thrombus formation in patient-specific IA geometries. Our framework, trained on multiphysics simulations, provides valuable insights into multitask learning in the context of Graph Neural Networks for physical simulations. Our results show that training separate models for each physical field yields better performance and enables parallelization. The Transformer-based Graph Neural Network achieves state-of-the-art accuracy while significantly reducing computational cost. The model demonstrates robustness to unseen inflow boundary conditions and cardiac cycles, highlighting its potential for clinical applications.
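The abstract does not specify the architecture in detail. As a rough illustration of the core idea it mentions, that attention-based Graph Neural Networks operate directly on unstructured simulation meshes, here is a minimal sketch of one Transformer-style attention step restricted to mesh-edge neighborhoods. The single-head, projection-free form and all names are simplifying assumptions, not the authors' implementation:

```python
import math

def graph_attention_step(x, edges):
    """One self-attention update in which each mesh node attends only to
    its edge-connected neighbors (and itself).

    x:     dict mapping node id -> feature vector (list of floats)
    edges: list of (u, v) undirected mesh edges
    """
    # Build neighbor sets with self-loops so a node keeps its own state.
    nbrs = {u: {u} for u in x}
    for u, v in edges:
        nbrs[u].add(v)
        nbrs[v].add(u)

    d = len(next(iter(x.values())))  # feature dimension
    out = {}
    for u in x:
        # Scaled dot-product scores against each neighbor.
        scores = {v: sum(a * b for a, b in zip(x[u], x[v])) / math.sqrt(d)
                  for v in nbrs[u]}
        # Numerically stable softmax over the neighborhood.
        m = max(scores.values())
        w = {v: math.exp(s - m) for v, s in scores.items()}
        z = sum(w.values())
        # Attention-weighted average of neighbor features.
        out[u] = [sum(w[v] * x[v][k] for v in nbrs[u]) / z for k in range(d)]
    return out
```

Because attention is confined to mesh neighborhoods, the update respects the irregular connectivity of a patient-specific aneurysm mesh, which structured-grid Convolutional Neural Networks cannot do; production graph transformers add learned query/key/value projections, multiple heads, and stacked layers on top of this pattern.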
Published in: Computers in Biology and Medicine
Volume 208, Article 111649