Using Simulation to Predict Safety and Operational Impacts of Increasing Traffic Signal Density

The use of simulation models to predict the safety and operational impacts of increased traffic signal density in suburban corridors is described. Drawing on 10 years of data from two major arterials in Virginia, the study compared actual crash rates with operational performance measures simulated by the Synchro/SimTraffic model. As expected, crash rates were positively correlated with stops and delay per main-line vehicle and negatively correlated with main-line speed. Three findings are noteworthy. First, the correlation between crash rates and performance measures (main-line delay, speed, and stops) was relatively strong despite the inherent variability in crash rates: R² values ranged from 0.54 to 0.89. Second, three distinct regimes relate stops per vehicle to signal density: installing the first few signals causes a drastic increase in stops, adding the next set of signals causes a moderate increase, and adding a third set does not significantly affect the number of stops per vehicle. Third, multiple-regime models also relate total delay per vehicle to signal density. Two practical applications are suggested, one for safety and one for operations. To the extent that these performance measures correlate with crashes, simulation modeling may be used to estimate the safety impacts of additional signals, an appealing prospect because simulation packages are becoming easier to apply; this safety-oriented application was the primary motivation for the research. A secondary, operational benefit is that the three-regime models can suggest when, in the timeline of corridor development, the addition of a traffic signal is likely to degrade corridor performance significantly, thereby allowing decision makers to expend political capital when it is most beneficial.
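The three-regime relationship between stops per vehicle and signal density can be illustrated with a simple piecewise least-squares fit. The sketch below is a minimal, hypothetical example: the breakpoints, data values, and the `fit_three_regime` helper are invented for illustration and are not taken from the study's calibrated models.

```python
import numpy as np

def fit_three_regime(x, y, b1, b2):
    """Fit a separate least-squares line within each of three regimes
    defined by breakpoints b1 < b2 on signal density x.
    Returns a list of (slope, intercept) pairs, one per regime."""
    masks = [x <= b1, (x > b1) & (x <= b2), x > b2]
    fits = []
    for m in masks:
        slope, intercept = np.polyfit(x[m], y[m], 1)  # degree-1 OLS fit
        fits.append((slope, intercept))
    return fits

# Hypothetical illustrative data: stops per vehicle vs. signals per mile.
# Shaped to mimic the three regimes described in the abstract
# (steep rise, moderate rise, then a near-flat plateau).
density = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0])
stops   = np.array([0.3, 0.9, 1.5, 2.0, 2.3, 2.6, 2.8, 3.0, 3.05, 3.1, 3.1, 3.15])

fits = fit_three_regime(density, stops, b1=2.0, b2=4.0)
for (slope, _), label in zip(fits, ("low", "mid", "high")):
    print(f"{label}-density regime slope: {slope:.2f} stops per (signal/mile)")
```

Under these assumed data, the fitted slope shrinks from regime to regime, which is how a multi-regime model can flag the point at which an additional signal stops having a large marginal effect on stops per vehicle.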