Effect of Annotation and Loss Function on Epiphyte Identification using Conditional Generative Adversarial Network

Deep neural networks can process large amounts of data for image-processing applications such as classification, segmentation, and identification. The Conditional Generative Adversarial Network (C-GAN) is a Deep Learning (DL)-based image-processing algorithm widely used for identifying targets in images. In a DL-based image-processing application, the quality of the input images plays an important role: the quality of, and variation within, the dataset helps the network's filters derive the best features for classifying and identifying the target in an image. Many deep learning approaches therefore include a preprocessing pipeline that improves the quality of the input training data. The objective of this study was to evaluate the effect of preprocessing methods on C-GAN's ability to detect epiphytes in images acquired by Unmanned Aerial Vehicles (UAVs). These methods include 1) trimming the training images so that they contain mostly the target plant, 2) generating annotation images with a thresholding technique, and 3) incorporating a binary cross-entropy loss function. The results of this study show that the percentage of the input image occupied by the target and the choice of annotation method both play an important role in identifying the target plant.
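Two of the preprocessing steps named above can be illustrated with a minimal NumPy-only sketch: producing a binary annotation mask from a grayscale image via thresholding (Otsu's method is used here as one common thresholding choice; the abstract does not specify which technique the authors applied), and computing the pixel-wise binary cross-entropy between a predicted probability map and that mask. The function names `threshold_annotation` and `binary_cross_entropy` are hypothetical, chosen for this illustration.

```python
import numpy as np

def threshold_annotation(gray, thresh=None):
    """Generate a binary annotation mask from an 8-bit grayscale image.

    If no threshold is given, Otsu's method is used: pick the threshold
    that maximizes the between-class variance of foreground/background.
    (One common thresholding choice; the paper's exact method may differ.)
    """
    if thresh is None:
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        total = gray.size
        sum_all = float(np.dot(np.arange(256), hist))
        w0, sum0 = 0, 0.0
        best_t, best_var = 0, -1.0
        for t in range(256):
            w0 += hist[t]           # pixels at or below t (background weight)
            if w0 == 0:
                continue
            w1 = total - w0         # foreground weight
            if w1 == 0:
                break
            sum0 += t * hist[t]
            mu0 = sum0 / w0
            mu1 = (sum_all - sum0) / w1
            var = w0 * w1 * (mu0 - mu1) ** 2
            if var > best_var:
                best_var, best_t = var, t
        thresh = best_t
    return (gray > thresh).astype(np.uint8)

def binary_cross_entropy(pred, target, eps=1e-7):
    """Mean pixel-wise BCE between predicted probabilities and a binary mask."""
    pred = np.clip(pred, eps, 1.0 - eps)  # avoid log(0)
    return float(-np.mean(target * np.log(pred)
                          + (1 - target) * np.log(1 - pred)))
```

In a C-GAN training setup such as the one described, the thresholded mask would serve as the target annotation image, and the BCE term would be added to the generator's objective alongside the adversarial loss.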
