Characterisation of the standard deviation of a time-series signal has specialised yet widespread applications. Representing the signal standard deviation in real time typically demands high computation speed. A method based on a field-programmable gate array (FPGA) implementation is presented. The technique is benchmarked against conventional computational approaches, which show that a single windowed standard deviation update for a 16-bit sample can be achieved in 11 ns on a modern CPU. The FPGA implementation is found to be superior to all other approaches examined, with an operation time below 10 ns, and thus provides a useful tool for the real-time measurement of the standard deviation of signals above 100 MHz.
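As a point of reference for the CPU benchmark, the following is a minimal sketch of one common way to realise a constant-time windowed standard deviation update: a running sum and sum of squares maintained over a circular buffer of 16-bit samples. The abstract does not specify the paper's actual update rule, window length, or data types, so the window size N, the structure win_std_t, and the function win_std_update below are illustrative assumptions rather than the authors' implementation.

```c
/* Sketch of an O(1) sliding-window standard deviation update for
 * 16-bit samples, assuming a running-sum / sum-of-squares formulation
 * over a circular buffer. Window length N and all names are
 * illustrative; zero-initialise the state before use. */
#include <stdint.h>
#include <math.h>

#define N 64                      /* window length (assumed) */

typedef struct {
    int16_t  buf[N];              /* circular buffer of samples */
    uint32_t head;                /* index of the oldest sample */
    int64_t  sum;                 /* running sum of samples     */
    int64_t  sumsq;               /* running sum of squares     */
} win_std_t;

/* Replace the oldest sample with the newest and return the population
 * standard deviation of the current window. */
static double win_std_update(win_std_t *w, int16_t x)
{
    int16_t old = w->buf[w->head];
    w->buf[w->head] = x;
    w->head = (w->head + 1) % N;

    /* Incremental update: only the entering and leaving samples change
     * the running sums, so the cost is independent of N. */
    w->sum   += (int64_t)x - old;
    w->sumsq += (int64_t)x * x - (int64_t)old * old;

    double mean = (double)w->sum / N;
    double var  = (double)w->sumsq / N - mean * mean;
    return var > 0.0 ? sqrt(var) : 0.0;   /* guard against rounding */
}
```

Because only the incoming and outgoing samples touch the running sums, the per-sample cost is a handful of integer operations plus one square root, which is the kind of update that fits in the nanosecond-scale budgets quoted above; an FPGA can additionally pipeline these operations so that one update completes per clock cycle.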