Agriculture Robot for Weed Detection, Classification and Spray using Deep Neural Network
2025-06-28 16:30:09 - Adil Khan
Project Area of Specialization: Artificial Intelligence

Project Summary
Precision Crop Management (PCM) demands that fields be protected and weeds removed using inexpensive resources, keeping the economic cost low while maximising plant growth. The main challenges lie in environmental effects, hidden weeds, and maintaining precise crop management and growth rate. Our project detects, classifies and eliminates weeds while sparing the rest of the field from poisonous spray; as a result, plant growth increases, less herbicidal spray is used, and accuracy improves. Our autonomous robot uses a central processing unit, a Raspberry Pi, interlinked with a camera that photographs the field in segments, and a GPU (an external NVIDIA GeForce 750M or above, for higher accuracy) that classifies weeds from plants using a deep neural network and uses the detected weed locations for targeted spraying. The movement of the robot is also controlled by the CPU, and the level of the sprayer tank is shown to the user via IoT (alternatively, an alarm can alert the user when the spray in the tank runs out). In this way, not only is manpower replaced, but plant growth is increased.
Movement of the robot and the sprayer system are two different tasks. The Raspberry Pi is the main controller: it holds the weed-processing data and drives the 4-wheel chassis. Once processing is done, the picture is mapped into coordinates, which the Raspberry Pi sends to an Arduino UNO; the Arduino then controls the mechanical sprayer system for fast processing. Chassis movement remains with the Raspberry Pi: once the Arduino UNO signals completion of its command, the Raspberry Pi moves the robot to the next step. Our main focus is line-controlled movement of the robot along a single row.
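The picture-to-coordinates hand-off described above can be sketched in a few lines. This is a minimal illustration only: the frame size, the ground area covered by one frame, and the "SPRAY x y" serial message format are all assumptions, not values taken from the project.

```python
# Hypothetical sketch: map a weed's pixel location in the camera frame to
# sprayer-gantry coordinates, then build the text command the Raspberry Pi
# could send to the Arduino UNO over serial. All constants are assumed.

FRAME_W, FRAME_H = 640, 480          # camera resolution in pixels (assumed)
GROUND_W_MM, GROUND_H_MM = 400, 300  # ground area seen by one frame (assumed)

def pixel_to_gantry_mm(px, py):
    """Scale a pixel coordinate to millimetres on the sprayer's 2-D gantry."""
    x_mm = px * GROUND_W_MM / FRAME_W
    y_mm = py * GROUND_H_MM / FRAME_H
    return x_mm, y_mm

def spray_command(px, py):
    """Build a simple 'SPRAY x y' line for the Arduino to parse."""
    x_mm, y_mm = pixel_to_gantry_mm(px, py)
    return "SPRAY %.1f %.1f\n" % (x_mm, y_mm)

print(spray_command(320, 240))  # weed at the centre of the frame
```

In this sketch the Arduino would read one line per weed, move the nozzle to (x, y), spray, and acknowledge back to the Raspberry Pi before the chassis advances.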
We use a CNC-style 2-D movement technique to position the mechanical sprayer at the provided location. The whole system is controlled by an Arduino UNO. Stepper motors 1 and 2 move at the same time: both vertical rods start rolling, which drives the horizontal rod forward and backward (the y-axis direction), while stepper motor 3 turns the middle red rod, which moves the servo motor fixed to the sprayer nozzle left and right (the x-axis direction). In this way a 2-D field area is covered at the provided x-y coordinates. For this purpose we use stepper motors with rotary encoders and A4988 micro-stepping drivers. Stepper motors are normally used in open-loop position control; we designed a closed-loop position control to overcome the open-loop system's lack of feedback (lost stepper steps). The incremental rotary encoders provide a pair of digital signals that allow the Arduino UNO to determine the speed and direction of each stepper motor's rotation.
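The closed-loop idea above, counting encoder pulses to detect and cancel lost stepper steps, can be illustrated with a short sketch. The quadrature transition table is the standard decoding for incremental A/B encoders; the step counts and function names are hypothetical.

```python
# Illustrative sketch of closed-loop step checking with a quadrature (A/B)
# incremental encoder; counts and names are hypothetical, not project code.

# Gray-code transition table: (previous AB state, current AB state) -> delta
QUAD_DELTA = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_edges(states):
    """Accumulate encoder counts from a sequence of 2-bit AB states."""
    count = 0
    for prev, curr in zip(states, states[1:]):
        count += QUAD_DELTA.get((prev, curr), 0)  # ignore illegal jumps
    return count

def correction_steps(commanded, measured):
    """Extra steps the controller would issue to cancel lost steps."""
    return commanded - measured

# One full forward quadrature cycle yields 4 counts
forward = [0b00, 0b01, 0b11, 0b10, 0b00]
print(count_edges(forward))          # 4
print(correction_steps(200, 196))    # 4 steps were lost; re-issue them
```

The sign of each transition gives the direction of rotation, which is how the Arduino distinguishes forward from backward motion on the same encoder.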
Project Objectives
In agriculture, ground detection and elimination of weeds is a challenging task for human beings. The main challenge is the accurate detection and classification of weeds. Moreover, a herbicidal sprayer usually sprays the whole field along with the weeds, which not only makes the field poisonous but is also harmful to human health. To overcome these issues, we have developed an autonomous robot that photographs the whole field area, detects the green segments in it, classifies the weed portion, and finally removes the weeds by spraying them with high accuracy.
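The "detects green segments" step can be sketched with the widely used excess-green index, ExG = 2G − R − B, which highlights vegetation against soil. The threshold below is an assumption for illustration, not the project's tuned value.

```python
# Minimal vegetation-segmentation sketch using the excess-green index
# ExG = 2G - R - B on RGB pixels; the threshold value is assumed.

def excess_green(pixel):
    """Excess-green index of one (R, G, B) pixel."""
    r, g, b = pixel
    return 2 * g - r - b

def vegetation_mask(image, threshold=40):
    """Return a boolean mask: True where a pixel looks like vegetation."""
    return [[excess_green(p) > threshold for p in row] for row in image]

row = [(120, 200, 90), (130, 125, 128)]   # a leafy pixel vs. a soil-like pixel
print(vegetation_mask([row]))             # [[True, False]]
```

In the full pipeline, the pixels flagged by such a mask would be cropped out and passed to the CNN, which decides whether each green segment is crop or weed.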
Project Implementation Method
This project has two main parts.
- Software Part (only for image processing)
- Hardware Part (all hardware plus its control code)
1: Software Part
The software has five further parts:
- Vegetation Detection System
- Plant Classification System
- Sprayer Movement Control System
- Robot Movement Control System
- Herbicide Level Indicator System
Software parts for FYP-I
- Vegetation Detection System
- Plant Classification System
Data Set Acquisition:
- In the first stage, we took datasets for two leaf species from the internet. The dataset information is given in Table 1 below.
Table 1: Information of dataset taken from the internet
| Leaf Name | No. of Pictures | Total Pics Generated |
| --- | --- | --- |
| Quercus | 72 | 5000 |
| Salix alba (Sericea) | 72 | 5000 |
- Then, we trained our network with different combinations of dataset sizes and parameter values. The results are given in Table 2.
Table 2: Results of the trained and tested CNN
| No. of CNN Layers | Filters (CL1, CL2, CL3) | UnRotated Data, Class1 | UnRotated Data, Class2 | Rotated Data, Class1 | Rotated Data, Class2 | Validation Acc. | Acc. on UnRotated Pics, Class1 | Acc. on UnRotated Pics, Class2 | Acc. on Rotated Pics, Class1 | Acc. on Rotated Pics, Class2 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 15 | 16, 32, 64 | 5000 | 5000 | 0 | 0 | 100% | 100% | 99.9% | 8% | 91.9% |
| 15 | 16, 32, 64 | 3000 | 3000 | 2000 | 2000 | 99.9% | 99% | 100% | 98% | 96% |
| 15 | 16, 64, 64 | 3000 | 3000 | 2000 | 2000 | 99.8% | 100% | 100% | 99% | 97% |
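The "Rotated Data" columns in Table 2 come from augmenting the training set with rotated copies of the source pictures, which is what rescues the rotated-picture accuracy in rows 2 and 3. A toy sketch of such augmentation, using 90-degree rotations on a tiny row-major grid (the project's actual angles and counts are not specified here):

```python
# Sketch of rotation-based data augmentation: generate extra training
# pictures by rotating each source image. Angles/counts are assumptions.

def rotate90(image):
    """Rotate a row-major image (list of rows) 90 degrees clockwise."""
    return [list(col) for col in zip(*image[::-1])]

def augment(image, copies=3):
    """Return the original image plus successive 90-degree rotations."""
    out, cur = [image], image
    for _ in range(copies):
        cur = rotate90(cur)
        out.append(cur)
    return out

img = [[1, 2],
       [3, 4]]
print(rotate90(img))        # [[3, 1], [4, 2]]
print(len(augment(img)))    # 4 variants per source picture
```

Training only on unrotated pictures (Table 2, row 1) makes the network orientation-sensitive, which is consistent with the 8% Class1 accuracy on rotated test pictures in that row.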
Benefits of the Project
This project is unique, innovative and contemporary, and has many benefits for the agricultural sector of Pakistan:
- It is human friendly: it removes the ingredients that are dangerous to human health. Being autonomous, it detects weeds in the field and, after separating them from the crop, simply removes them using a precise spray positioner.
- It fills the labour gap in agriculture as workers move towards industry.
- It consumes a very limited spray quantity, which is economical and keeps the crop from becoming hazardous.
- Being a mechanical, autonomous machine, it needs no supervision beyond one person to turn it on or off.
- It replaces the manual spray system, which carries greater health risks.
- Its datasets can be varied according to the target field, so it can cover a variety of crops.
Technical Details of Final Deliverable
Our project controls the autonomous robot in real time using a Raspberry Pi interfaced with a digital camera, a stepper-motor system that drives the sprayer, a herbicide tank level indicator, and a 4-wheel chassis.
Our project has two main phases:
1: Software work
2: Hardware work
- Software work includes coding the CNN algorithm using MATLAB software.
- Coding of the CNN algorithm for image processing
- Coding to control the stepper motors from the required weed coordinates
- Coding to control the 4-wheel chassis
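The project codes its CNN in MATLAB; purely to make the convolution operation concrete, here is a toy pure-Python "valid" 2-D convolution of the kind each Conv layer in Table 2 performs. The image and kernel values are made up for illustration.

```python
# Toy single-channel 'valid' 2-D convolution (correlation), illustrating
# the operation a CNN Conv layer applies; values are illustrative only.

def conv2d_valid(image, kernel):
    """Slide the kernel over the image; no padding, stride 1."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0
            for a in range(kh):
                for b in range(kw):
                    acc += image[i + a][j + b] * kernel[a][b]
            row.append(acc)
        out.append(row)
    return out

# A 1x2 edge kernel responds only at the vertical edge in this tiny image
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1]]
print(conv2d_valid(image, kernel))
```

A real Conv layer applies many such kernels at once (16, 32 and 64 per layer in Table 2), learns their values during training, and follows them with a nonlinearity and pooling.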
- Hardware work includes assembly of the complete chassis.
- Chassis skeleton
- 12V, 30 RPM DC motors with metal gears
- Rubber tires, 4.5-inch diameter
- 8 aluminum rods, 1.5 feet each
- 12 metal brackets, 1.25 x 1.25 inch
- 2 acrylic sheets, 1.5 x 1.5
- 14.4V 3500mAh Ni-MH battery pack
- 11.1V 2800mAh LiPo battery
- 2 x 7805 and 1 x 7809 DC regulators (DC bank management)
- 2 BTS7960 DC motor drivers
- Raspberry Pi 3 B+
- 5MP camera
- 1-liter herbicide tank
- 1 HC-SR04 ultrasonic sensor
- 3 revolving rods for the stepper-motor system
- 3 x 17HS4401 stepper motors
- 2 x A4988 stepper drivers
- 2 limit switches
Components detailed specifications are as follows,
- Raspberry Pi 3 B+
- Broadcom BCM2837B0, Cortex-A53 (ARMv8) 64-bit SoC @ 1.4GHz
- 1GB LPDDR2 SDRAM
- 2.4GHz and 5GHz IEEE 802.11 b/g/n/ac wireless LAN, Bluetooth 4.2, BLE
- Gigabit Ethernet over USB 2.0 (maximum throughput 300 Mbps)
- Extended 40-pin GPIO header
- Full-size HDMI
- NEMA 17HS4401
- Type: Hybrid
- Model Number: NEMA17
- Phases: 2
- Holding Torque: 4000 g·cm
- Current/Phase: 1.7A
- Step Angle: 1.8°
- A4988 Stepper Driver
- Type: Hybrid
- Model Number: 42BYG45034
- Step Angle: 1.8°
- Step Accuracy: 0.05
- Rated Current/Phase: 0.4A
- Phase Resistance: 30 Ω
- Rated Voltage: 12V
- HC-SR04 Ultrasonic Sensor
- Voltage: 5V DC
- Static current: less than 2mA
- Output level: high 5V, low 0V
- Sensor angle: not more than 15 degrees
- Detection distance: 2cm–450cm
- Precision: up to 0.3cm
- Connections: VCC, Trig (trigger input), Echo (echo output), GND
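The ultrasonic sensor reports distance indirectly: it raises its echo pin for the round-trip time of the sound burst, so distance is (echo time × speed of sound) / 2. A short sketch of that conversion, assuming ~343 m/s for the speed of sound at room temperature:

```python
# Convert an HC-SR04 echo pulse width to distance. The speed-of-sound
# constant assumes roughly 20 °C air; real firmware may compensate for it.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s expressed in cm per microsecond

def echo_to_distance_cm(echo_us):
    """One-way distance in cm from an echo pulse width in microseconds."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2  # halve the round trip

# A 580 µs echo corresponds to roughly 10 cm
print(round(echo_to_distance_cm(580), 2))
```

On the robot, a reading below some safety threshold from this calculation would tell the Raspberry Pi to halt the chassis before an obstacle.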
- 5MP Raspberry Pi Compatible Camera
- Raspberry Pi HD camera
- High-definition video camera for Raspberry Pi Model A or B
- OmniVision OV5647 sensor in a fixed-focus module with replaceable lens
- Lens holder: M12x0.5 or CS mount
- 5-megapixel sensor
- BTS7960 Single H-Bridge
- Power voltage: 6V–27V
- Maximum current: 43A
- Input voltage: 3.3V–5V
- Control mode: PWM or logic level
- LiPo Battery
- Battery parameter: ZOP Power 11.1V 2800mAh 30C
- Capacity: 2800mAh
- Size: 28*34*116mm
- Plug Style: XT60 Plug
- Weight: 209g
- Ni-MH Battery
- Brand: CJC
- Capacity: 3500 mAh
- Type: 14.4V Ni-MH
- Weight: 600g
- Size: 138mm x 46mm x 44mm
- The 14.4V 3500mAh Ni-MH battery powers the two 12V 30RPM DC motors through the two BTS7960 high-current H-bridge drivers.
- The second battery, the 11.1V 2800mAh LiPo, feeds the DC bank that powers the other DC components.
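A back-of-the-envelope runtime check for the two packs listed above; the current draws used here are assumptions for illustration, not measured values from the robot.

```python
# Ideal battery runtime estimate: capacity (mAh) / average draw (mA).
# Draw figures below are assumed, and real runtime will be shorter
# because of discharge losses and cutoff voltage.

def runtime_hours(capacity_mah, draw_ma):
    """Ideal runtime in hours, ignoring losses and cutoff voltage."""
    return capacity_mah / draw_ma

# 3500 mAh Ni-MH pack driving the two DC motors at an assumed 1750 mA total
print(runtime_hours(3500, 1750))   # 2.0 hours
# 2800 mAh LiPo feeding the DC bank at an assumed 700 mA
print(runtime_hours(2800, 700))    # 4.0 hours
```

Dividing capacity by draw this way gives the upper bound used when sizing packs; a practical design would derate it by 20–30%.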