Graduate Seminar in Fluid Mechanics
4:00 pm Presentation
“Computational Modeling of Hemodynamics and Blood Washout in the Patient-Specific Left Atrial Appendages”
Presented by CHUANXIN NI (Advisers: Dr. Jung-Hee Seo & Prof. Mittal)
The left atrial appendage (LAA) is a small chamber-like structure connected to the left atrium (LA). Studies have shown that this structure is implicated in thrombus formation and thromboembolic events in patients with atrial fibrillation. However, due to its highly complex and variable shape, the blood flow patterns and the mechanism of thrombogenesis inside the LAA are poorly understood. The aim of this study is to analyze the hemodynamics inside patient-specific LA/LAAs via computational fluid dynamics (CFD) modeling and to assess the potential for thrombus formation in the LAA. Patient-specific LA/LAA geometries are derived from high-resolution CT scans, and the blood flow-rate profiles at the mitral annulus are obtained from ultrasound Doppler measurements. Direct numerical simulation is carried out using a sharp-interface immersed boundary method. An Eulerian transport equation for the blood residence time is also solved in the fluid domain to investigate blood transport and coagulation potential in the LAA. In this study, several patient-specific cases with different LAA shapes and heart conditions are considered, and the blood flow patterns and washout in the LAA are compared across these cases.
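For readers unfamiliar with the residence-time approach mentioned above, a commonly used form of the Eulerian transport equation for blood residence time is sketched below; the exact formulation used in this study may differ (e.g., in the treatment of diffusion):

```latex
% Eulerian blood residence time \tau_R, advected by the velocity field \mathbf{u}
% with a unit source term (each fluid particle ages one unit per unit time);
% D is an optional (possibly numerical) diffusivity.
\frac{\partial \tau_R}{\partial t}
  + \mathbf{u} \cdot \nabla \tau_R
  = 1 + D \, \nabla^2 \tau_R ,
\qquad \tau_R = 0 \ \text{at inflow boundaries.}
```

Regions of the LAA where the solved field τ_R grows large indicate poor washout and hence elevated coagulation potential.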
4:25 pm Presentation
“Compressing DNS Datasets using a Sub-Domain Re-Simulation Algorithm”
Presented by ZHAO WU (Adviser: Prof. Meneveau)
With the growth of computing power, researchers are able to perform ever-larger direct numerical simulations (DNS). In 2013, Lee et al. (SC'13) used 260 million CPU hours to run the then world-record DNS of channel flow at Reτ = 5200 with 121 billion grid points. In 2015, Yeung et al. (PNAS 2015) performed an isotropic-turbulence DNS on an 8192³ grid. A simple calculation shows that a single snapshot of velocity and pressure would require 2 TB and 8.8 TB for these two DNS, respectively, in single precision. Clearly, storing the time-resolved results is not feasible with current storage approaches. To overcome this difficulty, we are experimenting with a novel algorithm for storing the dataset. Once the original DNS (termed the big DNS below) is performed, a spatiotemporal subsampling of its data is stored. When a user requests values at a particular grid point, a re-simulation of the sub-domain containing that point (termed the re-simulation below) is performed, with boundary conditions taken from the subsampled data. We have found that the numerical scheme of the re-simulation must exactly match that of the big DNS in order to reproduce the original data to machine precision. The re-simulation errors are linearly proportional to any errors introduced in the re-simulation, for example by using a different numerical scheme or incorrect boundary conditions.
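The subsample-and-resimulate idea can be illustrated on a toy problem. The sketch below (not the authors' code; problem size, scheme, and variable names are illustrative) runs a "big" 1D diffusion simulation, archives only the sub-domain's initial condition and boundary values, then re-simulates the sub-domain with the identical scheme and checks that the result matches to machine precision:

```python
import numpy as np

def step(u, nu, dt, dx):
    """One explicit finite-difference step of 1D diffusion (interior points only)."""
    un = u.copy()
    un[1:-1] = u[1:-1] + nu * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return un

# --- "Big" simulation on the full domain (sizes are illustrative) ---
N, steps = 256, 200
nu, dx = 0.01, 1.0 / N
dt = 0.2 * dx**2 / nu                        # stable explicit time step
x = np.linspace(0.0, 1.0, N)
u = np.exp(-100.0 * (x - 0.5)**2)            # initial condition

lo, hi = 96, 160                             # sub-domain to archive
archive = {"init": u[lo:hi + 1].copy(), "bc": []}
for _ in range(steps):
    u = step(u, nu, dt, dx)
    archive["bc"].append((u[lo], u[hi]))     # store only boundary values per step

# --- Re-simulation of the sub-domain from the archived data ---
v = archive["init"].copy()
for n in range(steps):
    v = step(v, nu, dt, dx)
    v[0], v[-1] = archive["bc"][n]           # impose archived boundary conditions

# Identical scheme + exact boundary data => machine-precision agreement.
err = np.max(np.abs(v - u[lo:hi + 1]))
```

The storage saving comes from archiving O(boundary × time) values instead of O(volume × time); in 3D the ratio is far more favorable. Using a different scheme or perturbed boundary data in the re-simulation loop makes `err` grow in proportion to that perturbation, consistent with the linear error behavior noted above.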