Synchronization and communication in algorithmic structures

In high-performance computing, parallel and distributed systems enhance performance by devoting multiple processors to a single problem. However, the performance of such systems is often less than optimal due to synchronization problems, communication overhead, I/O delays, and inefficient algorithms. This paper examines the effects of synchronization and communication on execution times for different categories of algorithmic structures. Our results show that the algorithmic structure used can have an impact on program execution time even when communication times are assumed to be zero. With non-zero communication times, the asynchronous and nearest-neighbor structures show relatively little performance degradation, while the synchronous and asynchronous master-slave structures demonstrate a large decrease in performance. Finally, we present and evaluate theoretical bounds on execution time for these structures.
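As an illustrative sketch (not taken from the paper), the following model shows why a synchronous structure can cost more than an asynchronous one even when communication time is zero: under a barrier, every iteration lasts as long as the slowest worker's step, whereas without barriers the finish time is only the slowest worker's total work. The per-iteration times below are hypothetical.

```python
# Hypothetical per-iteration compute times (seconds) for three workers.
times = [
    [1.0, 2.0, 1.0],  # worker 0
    [2.0, 1.0, 1.0],  # worker 1
    [1.0, 1.0, 2.0],  # worker 2
]

def synchronous_time(times):
    """Barrier after each iteration: each iteration lasts max step time."""
    return sum(max(step) for step in zip(*times))

def asynchronous_time(times):
    """No barriers: finish time is the slowest worker's total work."""
    return max(sum(worker) for worker in times)

print(synchronous_time(times))   # 6.0 (every iteration takes 2.0)
print(asynchronous_time(times))  # 4.0 (each worker totals 4.0)
```

In this toy example the barrier inflates total time from 4.0 to 6.0 units with zero communication cost, which is the kind of structure-dependent effect the paper's measurements quantify.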