Volume: 1 Issue: 1
Year: 2024, Pages: 20-25
Received: Feb. 15, 2024 Accepted: May 18, 2024 Published: May 22, 2024
Agriculture relies heavily on the prompt detection of pests. Numerous technologies exist for identifying pests, but almost all of them are prone to misclassification caused by inadequate lighting, background clutter, and the diversity of image-collection techniques, as well as by pests that are only partially visible or differently oriented. Such misclassification can result in significant yield loss. To alleviate this problem, we present an architecture that uses skeletonization together with neural networks as classifiers to deliver high classification accuracy under the aforementioned conditions. The paper compares the performance of a plain CNN, a CNN with VGG16, and the proposed system on the accuracy metric, with results reported on disoriented images. The findings show that, on the data-augmented dataset, the CNN achieves 80% accuracy, the CNN with VGG16 achieves 95%, and the CNN combined with the skeletonization-based feature extractor achieves up to 98%. A web app and an Android app have also been developed to classify pests, helping farmers identify the name of a pest without going into technical details. This framework will help farmers identify pest names instantly, which in turn aids in identifying the appropriate pesticide and its quantity.
Keywords: Pest Classification using Morphological Processing in Deep Learning
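The pipeline described in the abstract (skeletonize each pest image, then classify the skeleton with a CNN) can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: it assumes scikit-image for Otsu binarization and skeletonization and Keras for the classifier, and the image size, layer widths, and number of pest classes (NUM_CLASSES) are placeholder values.

```python
# Minimal sketch of skeletonization as a feature extractor feeding a CNN classifier.
# Assumptions (not from the paper): scikit-image preprocessing, Keras CNN,
# 128x128 inputs, and NUM_CLASSES pest categories.
import numpy as np
from skimage.color import rgb2gray
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize
from skimage.transform import resize
from tensorflow.keras import layers, models

NUM_CLASSES = 9          # assumed number of pest classes
IMG_SIZE = (128, 128)    # assumed input resolution

def skeleton_feature(rgb_image: np.ndarray) -> np.ndarray:
    """Binarize the image with Otsu thresholding and reduce it to its skeleton."""
    gray = rgb2gray(resize(rgb_image, IMG_SIZE))
    binary = gray < threshold_otsu(gray)          # assumes the pest is darker than the background
    return skeletonize(binary).astype("float32")[..., np.newaxis]

def build_classifier() -> models.Model:
    """Small CNN that classifies skeletonized pest images."""
    return models.Sequential([
        layers.Input(shape=(*IMG_SIZE, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_classifier()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train_skeletons, y_train, validation_split=0.2, epochs=30)
```

In this sketch the skeleton acts as the orientation- and illumination-tolerant shape cue; in practice it would be computed for every training and test image before the network is fitted.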
© 2024 Thuse & Chavan. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Thuse S, Chavan M. (2024). Pest Classification using Morphological Processing in Deep Learning. International Journal of Electronics and Computer Applications. 1(1): 20-25.