An HDL Simulation of the Effects of Single Event Upsets on Microprocessor Program Flow

Simulation experiments for determining the effects of single event upsets (SEUs) on microprocessor program flow are described. A 16-bit microprocessor is modeled in a hardware description language. SEUs are injected into the microprocessor model by pseudorandomly selecting both the event time and the affected flip-flop. Upset detectors are modeled alongside the microprocessor to determine the fault coverage of several candidate fault detection techniques.
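The injection scheme the abstract describes can be sketched in a few lines: a pseudorandom generator picks an event time and a target flip-flop, and the stored bit is inverted at that point in the simulation. This is a minimal illustrative model in Python, not the authors' HDL simulator; the function name, the flat list of flip-flop bits, and the cycle-count parameter are all assumptions made for the sketch.

```python
import random

def inject_seu(flip_flops, sim_cycles, rng=None):
    """Model a single event upset: pseudorandomly choose an event
    time and a target flip-flop, then invert the stored bit.
    `flip_flops` is a mutable list of 0/1 bits (hypothetical
    stand-in for the HDL model's state elements)."""
    rng = rng or random.Random()
    event_time = rng.randrange(sim_cycles)    # clock cycle of the upset
    target = rng.randrange(len(flip_flops))   # which flip-flop is hit
    flip_flops[target] ^= 1                   # the bit-flip itself
    return event_time, target

# Usage: a toy bank of 16 flip-flops, all initially 0.
state = [0] * 16
t, ff = inject_seu(state, sim_cycles=1000, rng=random.Random(42))
# Exactly one bit of state is now inverted.
```

In the experiments described above, each such injection would be followed by running the modeled program to completion and checking whether the candidate upset detectors flag the resulting program-flow error.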
