Context-dependent image quality assessment of JPEG compressed Mars Science Laboratory Mastcam images using convolutional neural networks

Abstract
The Mastcam color imaging system on the Mars Science Laboratory Curiosity rover acquires images that are often JPEG compressed before being downlinked to Earth. Depending on the context of the observation, this compression can introduce artifacts that complicate the scientific interpretation of the data and may require the image to be retransmitted losslessly. We propose to streamline the tedious process of manually reviewing these images with context-dependent image quality assessment, in which the context and intent behind the observation determine the acceptable image quality threshold. We propose a convolutional neural network that estimates the probability that a Mastcam user would find the quality of a compressed image acceptable for science analysis. We also propose an automatic labeling method that avoids the need for domain experts to label thousands of training examples. We performed multiple experiments to evaluate the ability of our model to assess context-dependent image quality, the efficiency a user might gain by incorporating our model, and the uncertainty of the model on different types of input images. We compare our approach to the state of the art in no-reference image quality assessment. Our model correlates well with the perceptions of scientists assessing context-dependent image quality and could yield significant time savings when included in the current Mastcam image review process.
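The abstract describes a convolutional network that maps a JPEG-compressed Mastcam image to the probability that a scientist would find its quality acceptable, and that also reports model uncertainty on different inputs. The paper's actual architecture, patch size, and training details are not given here, so the sketch below is only a minimal illustration under assumed choices: a small PyTorch CNN over 64x64 grayscale patches, with Monte Carlo dropout kept active at inference to approximate predictive uncertainty. The layer sizes and the `mc_dropout_predict` helper are hypothetical, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the published model): a small CNN that scores
# a compressed image patch with the probability that its quality is acceptable,
# plus Monte Carlo dropout to get a rough per-image uncertainty estimate.
import torch
import torch.nn as nn


class AcceptabilityCNN(nn.Module):
    """Hypothetical patch classifier: 64x64 grayscale patch -> P(acceptable)."""

    def __init__(self, dropout_p: float = 0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
            nn.Dropout(dropout_p),                 # kept active at test time below
            nn.Linear(128, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.classifier(self.features(x)))


def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 20):
    """Average several stochastic forward passes with dropout enabled; the
    spread across samples serves as an approximate uncertainty estimate."""
    model.train()  # enables dropout (a real pipeline would keep any batch-norm in eval mode)
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)


if __name__ == "__main__":
    model = AcceptabilityCNN()
    patch = torch.rand(1, 1, 64, 64)  # stand-in for a compressed Mastcam patch
    p_acceptable, uncertainty = mc_dropout_predict(model, patch)
    print(float(p_acceptable), float(uncertainty))
```

In the workflow the abstract outlines, the training labels for such a classifier would come from the proposed automatic labeling method rather than from expert annotation; the random tensor above merely stands in for real compressed Mastcam data.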
