A GPU-based Flood Simulation Framework

We present a multi-core, GPU-based framework for the simulation and visualization of two-dimensional floods, based on a full implementation of the Saint-Venant equations. A validated CPU-based flood model was ported to NVIDIA's CUDA architecture and run on two NVIDIA graphics cards, a GeForce 8400 GS and a Tesla T10, using two case study applications. The GPU version yielded speedups ranging from 3x to 6x on the GeForce 8400 GS and from 50x to 135x on the Tesla T10, and its performance scaled well on both cards compared to the CPU version. For the dam-break case study, the flood event was reproduced and the simulation run time was reduced from 2.9 hours to 2 minutes.
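For reference, the two-dimensional Saint-Venant (shallow water) equations that such a model solves can be written in their standard conservative form (this is the textbook formulation, not an equation reproduced from the paper):

\[
\frac{\partial U}{\partial t} + \frac{\partial F(U)}{\partial x} + \frac{\partial G(U)}{\partial y} = S(U),
\]
\[
U = \begin{pmatrix} h \\ hu \\ hv \end{pmatrix},\quad
F = \begin{pmatrix} hu \\ hu^2 + \tfrac{1}{2}gh^2 \\ huv \end{pmatrix},\quad
G = \begin{pmatrix} hv \\ huv \\ hv^2 + \tfrac{1}{2}gh^2 \end{pmatrix},\quad
S = \begin{pmatrix} 0 \\ gh\,(S_{0x} - S_{fx}) \\ gh\,(S_{0y} - S_{fy}) \end{pmatrix},
\]

where $h$ is the water depth, $u$ and $v$ are the depth-averaged velocities, $g$ is gravitational acceleration, and $S_0$ and $S_f$ are the bed and friction slopes in each direction.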
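The abstract does not show the kernels themselves; as a minimal sketch of the one-thread-per-cell pattern a CUDA port of such an explicit 2D model typically uses, consider the following update of the continuity equation on a structured grid. All names here (update_depth, qx, qy, etc.) are hypothetical illustrations, not identifiers from the paper.

    #include <cuda_runtime.h>

    // Hypothetical kernel: one thread per interior grid cell, explicit
    // finite-difference update of water depth h from unit discharges qx, qy.
    __global__ void update_depth(const float* h, const float* qx, const float* qy,
                                 float* h_new, int nx, int ny,
                                 float dt, float dx, float dy)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        int j = blockIdx.y * blockDim.y + threadIdx.y;
        if (i < 1 || j < 1 || i >= nx - 1 || j >= ny - 1) return; // interior cells only

        int c = j * nx + i;
        // Continuity: dh/dt = -(d(qx)/dx + d(qy)/dy), central differences
        float dqx = (qx[c + 1]  - qx[c - 1])  / (2.0f * dx);
        float dqy = (qy[c + nx] - qy[c - nx]) / (2.0f * dy);
        h_new[c] = h[c] - dt * (dqx + dqy);
    }

    // Host-side launch over an nx-by-ny grid:
    //   dim3 block(16, 16);
    //   dim3 grid((nx + block.x - 1) / block.x, (ny + block.y - 1) / block.y);
    //   update_depth<<<grid, block>>>(d_h, d_qx, d_qy, d_h_new, nx, ny, dt, dx, dy);

Writing the result into a separate buffer h_new (double buffering, with the pointers swapped between time steps) avoids read-write races between neighboring threads; this per-cell data parallelism is what makes explicit shallow-water solvers map so well onto GPUs.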