Parallel Numerical Solution for Flood Modeling Systems

Simulation of water flood problems often leads to large sparse systems of partial differential equations. For such systems, numerical solution can be very CPU-time consuming. Parallel simulation therefore benefits water flood studies while providing satisfactory accuracy. In this paper, we present an approach to parallelizing water flood modeling systems and report experimental results obtained on Linux clusters.
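As a minimal illustration of the kind of distributed computation such parallel simulations rely on (not the solver used in this paper), the sketch below performs Jacobi sweeps for a 1-D model problem with MPI, exchanging ghost values between neighbouring ranks on each iteration; the problem size, sweep count, and right-hand side are assumptions chosen only for the example.

```c
/* Illustrative sketch only: a distributed Jacobi sweep for a 1-D Poisson
 * model problem, assuming MPI. It shows the halo-exchange pattern typical
 * of parallel sparse solvers on Linux clusters; it is not the paper's method. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int n_local = 1000;                 /* unknowns owned by this rank (assumed) */
    const double h = 1.0 / (n_local * size + 1);

    /* local vectors with one ghost cell on each side; calloc gives zero
     * initial guess and homogeneous Dirichlet boundary values */
    double *u  = calloc(n_local + 2, sizeof(double));
    double *un = calloc(n_local + 2, sizeof(double));

    int left  = (rank == 0)        ? MPI_PROC_NULL : rank - 1;
    int right = (rank == size - 1) ? MPI_PROC_NULL : rank + 1;

    for (int it = 0; it < 100; ++it) {
        /* exchange ghost values with neighbouring ranks */
        MPI_Sendrecv(&u[1], 1, MPI_DOUBLE, left, 0,
                     &u[n_local + 1], 1, MPI_DOUBLE, right, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&u[n_local], 1, MPI_DOUBLE, right, 1,
                     &u[0], 1, MPI_DOUBLE, left, 1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* Jacobi update for -u'' = 1 on the locally owned unknowns */
        for (int i = 1; i <= n_local; ++i)
            un[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * 1.0);

        double *tmp = u; u = un; un = tmp;   /* swap old and new iterates */
    }

    if (rank == 0) printf("finished 100 Jacobi sweeps on %d ranks\n", size);
    free(u);
    free(un);
    MPI_Finalize();
    return 0;
}
```

In practice, production flood simulators replace the simple Jacobi sweep with more robust preconditioned iterative solvers, but the communication structure, local computation followed by boundary exchange between subdomains, remains the same.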