1. Comparison of Deep Learning Models for Multi-Crop Leaf Disease Detection with Enhanced Vegetative Feature Isolation and Definition of a New Hybrid Architecture
- Authors
Sajjad Saleem, Muhammad Irfan Sharif, Muhammad Imran Sharif, Muhammad Zaheer Sajid, and Francesco Marinello
- Subjects
potatoes, tomatoes, mango, convolutional neural networks, deep learning, machine learning, agriculture
- Abstract
Agricultural productivity is a critical factor in ensuring food security across the globe. However, major crops such as potato, tomato, and mango are frequently affected by leaf diseases, which considerably lower yield and quality. The traditional practice of diagnosing disease through visual inspection is labor-intensive, time-consuming, and error-prone. To address these challenges, this study introduces AgirLeafNet, a deep learning-based solution that combines NASNetMobile for feature extraction with Few-Shot Learning (FSL) for classification. The Excess Green Index (ExG), a vegetation index, is incorporated to strengthen the model's ability to isolate and detect vegetative features even in scenarios with minimal labeled data. AgirLeafNet achieves 100% accuracy on potato leaves, 92% on tomato, and 99.8% on mango, outperforming the models described in the literature. By demonstrating the viability of a combined deep learning/IoT system architecture, the study advances the state of multi-crop disease detection and provides a practical, efficient deep-learning solution for sustainable agricultural production systems. The model's contributions are its multi-crop capability, its accuracy, and its use of ExG for more robust disease detection, setting a new baseline for future research.
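The abstract's vegetative feature isolation relies on the Excess Green Index. As a minimal sketch, assuming the standard ExG formulation from the vegetation-index literature, ExG = 2g − r − b over sum-normalized RGB channels (the paper's exact preprocessing pipeline is not described in the abstract and may differ):

```python
import numpy as np

def excess_green_index(rgb: np.ndarray) -> np.ndarray:
    """Compute the Excess Green Index (ExG) for an RGB image.

    Uses the common definition ExG = 2g - r - b, where r, g, b are the
    per-pixel channel values normalized by their sum. This is a generic
    formulation, not necessarily the paper's exact implementation.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2.0 * g - r - b

# A pure-green pixel scores the maximum ExG of 2.0; gray pixels score 0.0,
# so thresholding ExG separates vegetation from neutral background.
img = np.array([[[0, 255, 0], [128, 128, 128]]], dtype=np.uint8)
exg = excess_green_index(img)
vegetation_mask = exg > 0.2  # threshold chosen for illustration only
```

In a pipeline like the one described, such a mask would typically be applied before feature extraction so the network sees leaf pixels rather than soil or background.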
- Published
- 2024