Using a Layer Recurrent Neural Network to Generate Pseudo Random Number Sequences

Pseudo random numbers (PRNs) are required for many cryptographic applications. This paper proposes a new method for generating PRNs using a Layer Recurrent Neural Network (LRNN). The proposed technique generates PRNs from the weight matrix obtained from the layer weights of the LRNN. The LRNN random number generator (RNG) uses a short keyword as a seed and generates a long PRN sequence. The number of bits in the generated sequence depends on the number of neurons in the input layer of the LRNN. The generated PRN sequence changes with a change in the training function of the LRNN; the sequences are thus a function of the keyword, the initial state of the network, and the training function. In our implementation the PRN sequences have been generated using three training functions: 1) scaled gradient descent, 2) Levenberg-Marquardt (TRAINLM), and 3) BFGS quasi-Newton (TRAINBFG). The generated sequences are tested for randomness using the ENT and NIST test suites. The ENT suite can be applied to sequences of small size; the NIST suite comprises 16 tests. When subjected to the NIST suite, the LRNN-generated PRNs pass 11 tests, yield no observations for 4 tests, and fail 1 test. This paper presents the test results for random number sequences ranging from 25 bits to 1000 bits, generated using the LRNN.
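As a concrete illustration of the pipeline described above, the following is a minimal Python sketch, not the paper's MATLAB implementation: a keyword deterministically seeds the weights of a small Elman-style recurrent layer, the network is briefly trained (a plain one-step gradient update stands in for the MATLAB training functions named above), and bits are then extracted from the trained recurrent weight matrix. The network size, the training targets, and the fractional-part bit-extraction rule are illustrative assumptions.

import hashlib
import numpy as np

def lrnn_prn_bits(keyword: str, n_hidden: int = 32, steps: int = 200) -> str:
    # Derive a deterministic RNG seed from the keyword.
    seed = int.from_bytes(hashlib.sha256(keyword.encode()).digest()[:4], "big")
    rng = np.random.default_rng(seed)

    # Elman-style recurrent layer: hidden state fed back through W_rec.
    W_in = rng.standard_normal((n_hidden, 1))
    W_rec = rng.standard_normal((n_hidden, n_hidden)) / np.sqrt(n_hidden)
    W_out = rng.standard_normal((1, n_hidden))
    h = np.zeros((n_hidden, 1))

    # Keyword-derived inputs/targets; trained with a truncated (one-step)
    # gradient update as a stand-in for the MATLAB training functions.
    xs = rng.standard_normal((steps, 1, 1))
    ys = rng.standard_normal((steps, 1, 1))
    lr = 0.01
    for x, y in zip(xs, ys):
        h_prev = h
        h = np.tanh(W_in @ x + W_rec @ h_prev)
        err = (W_out @ h) - y
        g_h = (W_out.T @ err) * (1.0 - h**2)   # backprop through tanh
        W_out -= lr * (err @ h.T)
        W_rec -= lr * (g_h @ h_prev.T)
        W_in -= lr * (g_h @ x.T)

    # Extract one bit per recurrent weight by thresholding the
    # fractional part of the scaled weight (assumed extraction rule).
    frac = np.abs(W_rec.ravel() * 1e4) % 1.0
    return "".join("1" if f >= 0.5 else "0" for f in frac)

print(lrnn_prn_bits("secret")[:64])

With n_hidden = 32 the sketch yields 32 x 32 = 1024 bits, echoing the point above that the length of the generated sequence is tied to the neuron count; changing the keyword, the initial weights, or the training rule changes the extracted bit sequence.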