This project uses EfficientNet, a state-of-the-art deep learning model, to help farmers and gardeners quickly and accurately identify plant diseases from images of plant leaves. When a disease is detected, the application provides detailed information, including the disease's name, cause, and symptoms.
EfficientNet is a convolutional neural network (CNN) designed to balance accuracy and efficiency. Unlike traditional models that scale only in width, depth, or resolution, EfficientNet scales all three dimensions simultaneously, leading to better performance with fewer computations.
- **Higher Accuracy** – outperforms ResNet, MobileNet, and Inception models.
- **Fewer Parameters** – uses about 5x fewer parameters than ResNet-50 while achieving higher accuracy.
- **Optimized for Mobile & Edge AI** – enables deployment on smartphones, drones, and Raspberry Pi.
- **Faster Inference** – runs efficiently in real-time agricultural applications.
| Model | Parameters | Accuracy (ImageNet) | FLOPs (B) |
|---|---|---|---|
| ResNet-50 | 25.6M | 76.6% | 4.1 |
| MobileNetV2 | 3.4M | 72.0% | 0.3 |
| EfficientNet-B0 | 5.3M | 77.1% | 0.4 |
| EfficientNet-B7 | 66M | 84.4% | 37.0 |
**Key Takeaways:**
- EfficientNet-B0 achieves similar accuracy to ResNet-50 but with 5x fewer parameters.
- EfficientNet-B7 outperforms most CNN architectures with 84.4% top-1 accuracy.
- Lower FLOPs (floating-point operations) mean faster inference on low-power devices.
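The ratios quoted in the takeaways follow directly from the table above; a quick sanity check in Python (numbers copied from the table):

```python
# Sanity check of the comparison table: ResNet-50 vs. EfficientNet-B0.
resnet50_params, resnet50_flops = 25.6e6, 4.1e9
effnet_b0_params, effnet_b0_flops = 5.3e6, 0.4e9

param_ratio = resnet50_params / effnet_b0_params  # ~4.8x fewer parameters
flop_ratio = resnet50_flops / effnet_b0_flops     # ~10x fewer FLOPs

print(f"Parameter ratio: {param_ratio:.1f}x")
print(f"FLOP ratio: {flop_ratio:.1f}x")
```

So "5x fewer parameters" is a slight rounding up of roughly 4.8x, and the FLOP gap is even larger, which is what drives the faster inference on low-power devices.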
EfficientNet scales a model in three dimensions:
1. **Depth** – more layers for complex pattern recognition.
2. **Width** – wider layers to retain fine-grained details.
3. **Resolution** – larger input size to detect subtle disease features.
By combining these factors, EfficientNet achieves higher accuracy while using fewer resources.
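A sketch of this compound-scaling rule, using the base coefficients α ≈ 1.2, β ≈ 1.1, γ ≈ 1.15 reported by Tan & Le; the exact per-variant rounding used in the official implementation differs, so treat this as an illustration rather than the production scaling code:

```python
# Compound scaling: depth, width, and resolution all grow together with a
# single compound coefficient phi (EfficientNet-B0 corresponds to phi = 0).
# alpha, beta, gamma are the grid-searched constants from Tan & Le (2019),
# chosen so that alpha * beta**2 * gamma**2 ~= 2, i.e. FLOPs roughly double
# for each unit increase in phi.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15

def compound_scale(phi: int) -> dict:
    """Return the three scaling multipliers for a given compound coefficient."""
    return {
        "depth_multiplier": ALPHA ** phi,       # scales the number of layers
        "width_multiplier": BETA ** phi,        # scales the number of channels
        "resolution_multiplier": GAMMA ** phi,  # scales the input image size
    }

for phi in range(3):
    s = compound_scale(phi)
    print(phi, {k: round(v, 3) for k, v in s.items()})
```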
- **Smartphone Apps** – farmers can use mobile apps to detect diseases instantly.
- **IoT & Edge AI** – EfficientNet models can run on Raspberry Pi & Jetson Nano.
- **Cloud-Based APIs** – agricultural platforms can integrate real-time plant disease detection.
- **Drone & Smart Camera Systems** – automate plant health monitoring in large-scale farms.
**Reference Paper:** Mingxing Tan, Quoc V. Le. *EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks*. [Read on arXiv](https://arxiv.org/abs/1905.11946)
This project consists of three main components:
1. **Training Pipeline** – train multiple models on the plant disease dataset.
2. **Server** – build a FastAPI server to serve predictions.
3. **Client** – create a React-based web application for user interaction.
The dataset is collected from various Kaggle sources and includes 22 plant types with multiple disease classes.
- 22 plant types
- Multiple diseases per plant
- Dataset split: `Train`, `Validation`, and `Test`
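The split ratios are not specified above. Assuming a common 80/10/10 split, the three subsets could be produced with a sketch like this (the ratios, seed, and file names are illustrative, not the project's documented procedure):

```python
import random

def split_dataset(files, train_frac=0.8, val_frac=0.1, seed=42):
    """Shuffle file paths and split them into Train/Validation/Test lists.

    The 80/10/10 ratios here are an assumption for illustration only.
    """
    rng = random.Random(seed)         # fixed seed so the split is reproducible
    files = sorted(files)
    rng.shuffle(files)
    n_train = int(len(files) * train_frac)
    n_val = int(len(files) * val_frac)
    return {
        "Train": files[:n_train],
        "Validation": files[n_train:n_train + n_val],
        "Test": files[n_train + n_val:],
    }

splits = split_dataset([f"leaf_{i:03d}.jpg" for i in range(100)])
print({k: len(v) for k, v in splits.items()})  # {'Train': 80, 'Validation': 10, 'Test': 10}
```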
| Plant Type | Disease Classes |
|---|---|
| Apple | 4 (e.g., brown spot, gray spot) |
| Bell Pepper | 2 (e.g., bacterial spot, healthy) |
| Potato | 3 (e.g., early blight, late blight, healthy) |
| Tomato | 8 (e.g., bacterial spot, leaf mold, late blight) |
| ... | ... |
**Download Dataset:** [Kaggle Dataset](https://www.kaggle.com/datasets/nguyenchitinh/plantdisease-with-20-plant)
```bash
git clone git@github.com:tinh2044/PlantDisease_classification.git
cd PlantDisease_classification
conda create --name plantDisease python=3.9
conda activate plantDisease
pip install -r requirements.txt
```
Download the dataset manually from the [Kaggle dataset page](https://www.kaggle.com/datasets/nguyenchitinh/plantdisease-with-20-plant), or use the Kaggle CLI:

```bash
kaggle datasets download -d nguyenchitinh/plantdisease-with-20-plant
```
```bash
python train_multiple_model.py --epoch 100 --batch_size 32 --root_dir ./Datasets --img_size 224 --export_dir ./SavedModels --h5_dir ./Models
```
After training, model weights will be saved in:

- `./Models/` – trained `.h5` model weights.
- `./SavedModels/` – TensorFlow SavedModel format.
```bash
python evaluate.py --root_dir ./Datasets --h5_dir ./Models
```
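Under the hood, turning a model's raw outputs into a human-readable result means applying softmax and picking the top class. A minimal, dependency-free sketch of that post-processing step (the class names and the `top_prediction` helper are illustrative, not the project's actual code or label list):

```python
import math

# Illustrative labels only; the real list comes from the dataset directories.
CLASS_NAMES = ["Apple___brown_spot", "Apple___gray_spot", "Apple___healthy"]

def softmax(logits):
    """Numerically stable softmax over a list of raw model outputs."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_prediction(logits, class_names=CLASS_NAMES):
    """Map raw logits to the most likely class name plus a confidence score."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return {"class": class_names[best], "confidence": round(probs[best], 4)}

print(top_prediction([0.2, 2.9, 0.1]))
```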
```bash
python convert_tflite.py
```

Ensure that all TFLite models are copied to `server/ModelLight/`.
```bash
cd server
uvicorn app.main:app --host 127.0.0.1 --port 5000
```

The API will be accessible at `http://127.0.0.1:5000`.
Alternatively, start the server with Docker Compose:

```bash
docker compose up
```
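Once the server is running (via `uvicorn` or Docker), it can be queried from Python. This stdlib-only sketch assumes a `POST /predict` endpoint that accepts a multipart image upload under the field name `file`; check the routes in `server/app/main.py` for the actual contract before relying on it:

```python
import json
import mimetypes
import urllib.request
import uuid

API_URL = "http://127.0.0.1:5000"  # matches the server address above

def build_multipart(field, filename, payload):
    """Hand-build a multipart/form-data body; returns (body, content_type)."""
    boundary = uuid.uuid4().hex
    ctype = mimetypes.guess_type(filename)[0] or "application/octet-stream"
    head = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f"Content-Type: {ctype}\r\n\r\n"
    ).encode()
    tail = f"\r\n--{boundary}--\r\n".encode()
    return head + payload + tail, f"multipart/form-data; boundary={boundary}"

def classify_leaf(image_path, api_url=API_URL):
    """POST a leaf image to the assumed /predict endpoint and parse the JSON reply."""
    with open(image_path, "rb") as f:
        body, content_type = build_multipart("file", image_path, f.read())
    req = urllib.request.Request(
        f"{api_url}/predict",
        data=body,
        headers={"Content-Type": content_type},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The multipart body is built by hand so the example needs nothing outside the standard library; with `requests` installed, `requests.post(url, files={"file": open(path, "rb")})` would do the same thing.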
Move to the client directory, install dependencies, and start the development server:

```bash
cd client
npm install
npm start
```
Create a `.env` file in the `client/` folder and add:

```env
REACT_APP_API_URL=http://127.0.0.1:5000
```