Classification of Biotic and Abiotic Stresses on Grape Berries Using Transfer Learning


A Gurjar
DS Yadav
SD Sawant
AK Upadhyay
S Saha

Abstract

Grape farming is one of the most lucrative agricultural enterprises in India. Several biotic and abiotic stress conditions may adversely affect the yield if not tackled at the right time. It is therefore crucial that the farmer correctly identify and monitor the type of stress so that steps can be taken to prevent undesirable outcomes. We gathered a dataset of these stress conditions on grape berries and categorized it into eight classes: necrosis, shriveling, honeydew by mealybug, mealybug incidence, spray injury, thrips scarring, pink berry, and powdery mildew. Transfer learning was used to test the performance of six major deep learning image classification architectures (namely MobileNet-v2, Inception-v3, Inception-ResNet-v2, ResNet-v2, NASNet, and PNASNet) under varying training conditions and hyperparameters. The results were compared to determine the most feasible and accurate deep learning architecture, and its hyperparameters, for the given problem statement. The experiments show that Inception-ResNet-v2 obtained the maximum classification accuracy of 88.75% when a learning rate of 0.035 and a mini-batch size of 10 were applied over 8000 training steps. This result will serve as a prerequisite for the development of an application that maps vineyard stress conditions on berries and gives automated advisories.
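
As a concrete illustration of the training setup described above, the following is a minimal transfer-learning sketch in TensorFlow/Keras. It retrains only a new eight-class softmax head on top of an ImageNet-pretrained Inception-ResNet-v2 feature extractor, using the learning rate, mini-batch size, and step count reported in the abstract. The dataset path, directory layout, input resolution, and the choice of plain SGD are assumptions for illustration, not details confirmed by the paper.

```python
# Transfer-learning sketch (TensorFlow/Keras), assuming a directory layout
# of one sub-folder per stress class, e.g. "berries/powdery_mildew/...".
# The path "berries/" is hypothetical; learning rate 0.035, mini-batch
# size 10, and 8000 steps are the values reported in the abstract.
import tensorflow as tf

IMG_SIZE = (299, 299)   # Inception-ResNet-v2's native input resolution
BATCH = 10              # mini-batch size from the abstract
STEPS = 8000            # training steps from the abstract

# Load the eight-class berry-stress images from disk (integer labels).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "berries/", image_size=IMG_SIZE, batch_size=BATCH)

# Scale pixels the way the pretrained network expects ([-1, 1]).
preprocess = tf.keras.applications.inception_resnet_v2.preprocess_input
train_ds = train_ds.map(lambda x, y: (preprocess(x), y))

# ImageNet-pretrained feature extractor with its classifier head removed.
base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=IMG_SIZE + (3,))
base.trainable = False  # transfer learning: only the new head is trained

# New softmax head for the eight stress classes.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(8, activation="softmax"),
])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.035),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"])

# Run a fixed number of gradient steps rather than whole epochs.
model.fit(train_ds.repeat(), steps_per_epoch=STEPS, epochs=1)
```

Since the convolutional base is frozen, training updates only the final dense layer; comparing the other five architectures, as the paper does, amounts to swapping in a different pretrained model from `tf.keras.applications` while keeping the rest of the pipeline fixed.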


How to Cite
Gurjar, A., Yadav, D., Sawant, S., Upadhyay, A., & Saha, S. (2023). Classification of Biotic and Abiotic Stresses on Grape Berries Using Transfer Learning. Grape Insight, 1(1), 37–47. https://doi.org/10.59904/gi.v1.i1.2023.10
