Most satellite communications monitoring tools use simple thresholding of univariate measurements to alert the operator to unusual events [1], [2]. This approach suffers from frequent false alarms and is moreover unable to detect sequence or multivariate anomalies [3]. Here we consider the problem of detecting outliers in high-dimensional time-series data, such as transponder frequency spectra. Long Short-Term Memory (LSTM) networks are able to form sophisticated representations of such multivariate temporal data and can be used to predict future sequences when presented with sufficient context. We report here on the utility of LSTM prediction error as a de facto measure for detecting outliers. We show that this approach significantly improves on simple threshold models, as well as on moving-average and static predictors, the latter of which simply assume that the next trace will be equal to the previous trace. The advantages of using an LSTM network for anomaly detection are twofold. Firstly, the training data do not need to be labelled, which alleviates the need to provide the model with specific examples of anomalies. Secondly, the trained model is able to detect previously unseen anomalies, since such anomalies have a degree of unpredictability that makes them stand out. LSTM networks can furthermore potentially detect more nuanced sequence and multivariate anomalies, which occur when all values are within normal tolerances but the sequence or combination of values is itself unusual. The technique we describe could be used in practice to alert satellite network operators to unusual conditions requiring their attention.
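To make the approach concrete, the following is a minimal sketch (not the authors' code) of LSTM prediction-error anomaly detection on multivariate traces, with the "next trace equals previous trace" static predictor as a baseline. It assumes TensorFlow/Keras is available; the window length, layer sizes, and the error threshold (mean plus three standard deviations of errors on normal data) are illustrative choices, and the random training array stands in for real spectra.

```python
# Sketch: flag anomalies when LSTM prediction error on the next trace
# exceeds a threshold calibrated on normal (unlabelled) data.
import numpy as np
import tensorflow as tf

def make_windows(traces, context):
    """Split a (T, D) series into (context, D) input windows and next-trace targets."""
    X = np.stack([traces[i:i + context] for i in range(len(traces) - context)])
    y = traces[context:]
    return X, y

context, n_bins = 32, 128                                  # e.g. 128-bin spectra, 32-step context
train = np.random.rand(2000, n_bins).astype("float32")     # placeholder for normal training traces
X, y = make_windows(train, context)

# Illustrative architecture; dropout is one common regularisation choice [6].
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(context, n_bins)),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(n_bins),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=64, verbose=0)

# Per-trace prediction error on normal data sets the alert threshold.
val_err = np.mean((model.predict(X, verbose=0) - y) ** 2, axis=1)
threshold = val_err.mean() + 3.0 * val_err.std()

def score(stream):
    """Return LSTM errors, static-predictor errors, and anomaly flags for a (T, D) stream."""
    Xs, ys = make_windows(stream, context)
    lstm_err = np.mean((model.predict(Xs, verbose=0) - ys) ** 2, axis=1)
    static_err = np.mean((stream[context - 1:-1] - ys) ** 2, axis=1)   # "next = previous" baseline
    return lstm_err, static_err, lstm_err > threshold
```

Because the network is trained only on normal traffic, no anomaly labels are required; unusual sequences stand out simply because the model predicts them poorly.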
[1] Fredric C. Gey, et al., "The Relationship between Recall and Precision," J. Am. Soc. Inf. Sci., 1994.
[2] Timothy J. O'Shea, et al., "Recurrent Neural Radio Anomaly Detection," arXiv, 2016.
[3] Lovekesh Vig, et al., "Long Short Term Memory Networks for Anomaly Detection in Time Series," ESANN, 2015.
[4] Edward Arbon, et al., "Anomaly Detection in Satellite Communications Networks using Support Vector Machines," 2015.
[5] Guigang Zhang, et al., "Deep Learning," Int. J. Semantic Comput., 2016.
[6] Nitish Srivastava, et al., "Dropout: a simple way to prevent neural networks from overfitting," J. Mach. Learn. Res., 2014.
[7] Varun Chandola, et al., "Anomaly detection: A survey," ACM Comput. Surv. (CSUR), 2009.
[8] Subutai Ahmad, et al., "Unsupervised real-time anomaly detection for streaming data," Neurocomputing, 2017.
[9] Jürgen Schmidhuber, et al., "LSTM: A Search Space Odyssey," IEEE Transactions on Neural Networks and Learning Systems, 2015.
[10] Lovekesh Vig, et al., "LSTM-based Encoder-Decoder for Multi-sensor Anomaly Detection," arXiv, 2016.