Aerial vision-based proximal sensing for weed and pest damage estimation
Remote- and proximal-sensing technologies help identify the needs of a farm through their respective sensors. This study uses vision-based indicators captured by a low-altitude unmanned aerial vehicle (UAV) to estimate weed presence and pest damage. Coverage path planning is employed to automate data acquisition by the UAV. The gathered data are processed on a ground workstation using the proposed methods for estimating vegetation fraction, weed presence, and pest damage. Processing includes sub-image-level classification with a hybrid ResNet-SVM model and the normalized triangular greenness index. Sub-image-level classification for isolating crops from the rest of the image achieved an F1-score of 97.73%, pest damage detection achieved an average accuracy of 86.37%, and the weed estimate achieved a true error of 5.72%.
Materials and methods
Vision-based monitoring of crop health used RGB aerial images acquired with a Mavic Air quadrotor drone, which is equipped with a 4K-resolution camera, collision and altitude sensors, GPS, and programmable mission-path execution software. A ground workstation (a personal computer or laptop) running the Java-based application developed specifically for this research performs all required analyses on the acquired data. The processes performed on the ground workstation include coverage path planning, data sorting, weed estimation, vegetation-fraction calculation, and pest damage detection.
Coverage Path Planning
The area of interest (AOI) is defined by the user either through an online geographical mapping tool or through an actual farm visit in which the drone is manually flown along the extents of the AOI while recording the coordinates of the path it traverses. The ground sampling distance (GSD) is the equivalent ground distance between the centers of two adjacent pixels in a UAV image. In an agricultural field, the visual content is generally homogeneous (mostly vegetation); hence, the flight plans follow a grid-like pattern that covers all required regions, as recommended in the joint guidelines of the FAO and the International Telecommunication Union (ITU) on the agricultural use of drones. The mission-planning algorithm generates the geographical coordinates for the drone's flight missions, which are then uploaded to cloud storage for mobile use; whenever a flight mission is scheduled, these waypoints are downloaded to the drone's mobile controller. To evaluate the accuracy of the generated waypoints, ground markers are used to compare the GPS coordinates against the drone's corresponding footprint. The error is the measured distance from a marker to the drone's position, taken as the center of the image it captures.
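The grid-pattern mission described above can be sketched as boustrophedon ("lawnmower") waypoint generation over a rectangular AOI, with pass spacing derived from the camera footprint, which in turn follows from the GSD. The function names, the local east/north metre grid, and the overlap parameter below are illustrative assumptions, not the authors' implementation:

```python
# Sketch of boustrophedon waypoint generation over a rectangular AOI.
# Spacing between passes comes from the per-image ground footprint, which
# is derived from the ground sampling distance (GSD). All names and the
# flat local (east, north) metre grid are assumptions for illustration.

def camera_footprint(gsd_m_per_px, image_w_px, image_h_px):
    """Ground footprint (width, height) in metres covered by one image."""
    return gsd_m_per_px * image_w_px, gsd_m_per_px * image_h_px

def boustrophedon_waypoints(aoi_w_m, aoi_h_m, footprint_w_m, footprint_h_m,
                            overlap=0.2):
    """Grid waypoints in local (east, north) metres, alternating row direction."""
    step_x = footprint_w_m * (1.0 - overlap)   # spacing along a pass
    step_y = footprint_h_m * (1.0 - overlap)   # spacing between passes
    xs = [x * step_x for x in range(int(aoi_w_m // step_x) + 1)]
    ys = [y * step_y for y in range(int(aoi_h_m // step_y) + 1)]
    waypoints = []
    for row, y in enumerate(ys):
        ordered = xs if row % 2 == 0 else list(reversed(xs))  # snake pattern
        waypoints.extend((x, y) for x in ordered)
    return waypoints

# Example: a 4K frame (3840 x 2160 px) at a GSD of 1 cm/px gives a
# footprint of roughly 38 m x 22 m; cover a 100 m x 60 m field.
fw, fh = camera_footprint(0.01, 3840, 2160)
wps = boustrophedon_waypoints(100.0, 60.0, fw, fh)
```

In a real mission these local metre offsets would still need to be converted to geographical coordinates around the AOI origin before upload to the controller.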
Derivation of indicators for estimating weed and pest damage
Pre-processing operations reduce the UAV-acquired crop images to sub-images that each cover roughly the extent of an individual leaf, enabling localized processing. Pre-processing comprises leaf segmentation and vegetation-fraction calculation. Leaf segmentation employs superpixelation using the SLIC algorithm; the clusters SLIC generates mostly cover the extent of a single leaf. Vegetation fraction is estimated by pixel-wise classification based on the normalized triangular greenness index (TGI). The TGI was originally devised to identify vegetative regions in an image, but the original equation requires the peak wavelength responses of the CMOS sensor used to capture the image. The normalized TGI is a modification that can be used in the absence of such information.
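The pixel-wise vegetation test can be illustrated with the original TGI; as an assumption, the sketch below substitutes nominal band centres (red 670 nm, green 550 nm, blue 480 nm) for the sensor-specific peak wavelengths that the normalized variant is designed to avoid, and uses a simple zero threshold:

```python
# Sketch of pixel-wise vegetation masking with the triangular greenness
# index (TGI) and the vegetation-fraction computation. NOTE: the paper's
# normalized TGI removes the need for sensor peak wavelengths; here we
# fall back to the original TGI with nominal band centres as an assumption.
import numpy as np

def tgi(rgb):
    """Per-pixel TGI: -0.5*[(lr-lb)(R-G) - (lr-lg)(R-B)], nominal wavelengths."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    lr, lg, lb = 670.0, 550.0, 480.0   # nominal band centres in nm (assumed)
    return -0.5 * ((lr - lb) * (r - g) - (lr - lg) * (r - b))

def vegetation_fraction(rgb, threshold=0.0):
    """Fraction of pixels whose TGI exceeds the threshold (vegetative pixels)."""
    return (tgi(rgb) > threshold).mean()

# Toy 2x2 image: two green pixels (vegetation), one grey, one red (soil-like).
img = np.array([[[40, 180, 30], [50, 200, 60]],
                [[120, 120, 120], [200, 60, 40]]], dtype=np.uint8)
vf = vegetation_fraction(img)   # 2 of 4 pixels are vegetative
```

In the full pipeline this mask would be computed per SLIC superpixel (e.g. via `skimage.segmentation.slic`) rather than over the whole frame.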
Weed detection and estimation
Each sub-image containing any form of vegetation (weed or crop) is fed to a cascaded classification process. A deep neural network, specifically ResNet-18, determines whether a sub-image is "with crop" or "without crop": the final output map from the last layer of the network is fed to a two-neuron layer in which each neuron corresponds to one class label. Once a sub-image is classified as "with crop," the gray-level co-occurrence matrix (GLCM) of the original input sub-image is fed to an SVM classifier, which determines from these texture features whether the sub-image is purely crop or not.
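The texture-feature step feeding the SVM stage can be sketched as follows. This is a minimal pure-NumPy GLCM for a single horizontal pixel offset, plus two classic Haralick-style statistics (contrast and homogeneity); library equivalents exist (e.g. `skimage.feature.graycomatrix`), and this hand-rolled version is only an illustration, not the authors' code:

```python
# Minimal GLCM texture features of the kind fed to the SVM stage.
# Assumption: a single offset (horizontally adjacent pixels) and two
# statistics; the paper does not specify these choices.
import numpy as np

def glcm(gray, levels=8):
    """Normalized co-occurrence matrix for horizontally adjacent pixels."""
    q = np.clip(gray.astype(int) * levels // 256, 0, levels - 1)  # quantize
    m = np.zeros((levels, levels))
    for i, j in np.stack([q[:, :-1].ravel(), q[:, 1:].ravel()], axis=1):
        m[i, j] += 1
    return m / m.sum()

def texture_features(gray, levels=8):
    """Contrast and homogeneity: GLCM statistics usable as SVM inputs."""
    p = glcm(gray, levels)
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)            # high for busy textures
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))  # high for smooth textures
    return contrast, homogeneity

flat = np.full((8, 8), 100, dtype=np.uint8)                       # uniform patch
noisy = np.tile(np.array([[0, 255], [255, 0]], dtype=np.uint8), (4, 4))  # checkerboard
c_flat, h_flat = texture_features(flat)
c_noisy, h_noisy = texture_features(noisy)
```

In the cascade, feature vectors like `(contrast, homogeneity)` for each "with crop" sub-image would then be passed to a trained SVM (e.g. `sklearn.svm.SVC`) for the purely-crop decision.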
Pest damage detection
The primary symptom used to detect the presence of insect pests in the crop field is the existence of holes in leaves. If a sub-image R_k is classified as "purely crop," a hole-detection algorithm is applied to it to isolate the portions of the leaf that have been consumed by insects. The approach used in hole detection is similar to that of leaf segmentation, except that it is performed at the sub-image level.
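One common way to realize hole detection is morphological hole filling: binarize the leaf, fill every background region not reachable from the image border, and subtract the original mask so only enclosed gaps remain. The sketch below uses a hand-rolled BFS flood fill (a library route would be `scipy.ndimage.binary_fill_holes`) and a toy leaf mask; in the pipeline the mask would come from the greenness-based segmentation described earlier. This is an assumed reconstruction, not the authors' exact algorithm:

```python
# Sketch of hole detection on a "purely crop" sub-image via hole filling.
# Assumption: 4-connected BFS flood fill from the border background.
import numpy as np
from collections import deque

def fill_holes(mask):
    """Return mask with interior holes filled (background BFS from border)."""
    h, w = mask.shape
    outside = np.zeros_like(mask, dtype=bool)
    q = deque((y, x) for y in range(h) for x in range(w)
              if (y in (0, h - 1) or x in (0, w - 1)) and not mask[y, x])
    for y, x in q:
        outside[y, x] = True
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] and not outside[ny, nx]:
                outside[ny, nx] = True
                q.append((ny, nx))
    return ~outside  # everything not connected to the border background

def hole_mask(leaf_mask):
    """Pixels enclosed by the leaf but not leaf: candidate pest-damage holes."""
    return fill_holes(leaf_mask) & ~leaf_mask

# Toy 7x7 leaf with a single 1-pixel hole at its centre.
leaf = np.zeros((7, 7), dtype=bool)
leaf[1:6, 1:6] = True
leaf[3, 3] = False
holes = hole_mask(leaf)   # exactly the centre pixel
```

The ratio of hole pixels to leaf pixels then gives a per-leaf damage estimate that can be aggregated over the field.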
Conclusion
The machine-vision monitoring and detection system used a UAV equipped with an RGB camera for data collection. The collected visual data are used to estimate what fraction of the total vegetation is weeds and to detect leaf damage caused by insect pest attacks. The system is easily deployable, requiring only three pieces of equipment: the UAV, a smartphone, and a laptop. The generalized path plan follows an approximate cellular decomposition with a boustrophedon waypoint pattern to cover the AOI, and a flight altitude of 3 m is used for weed estimation, vegetation-fraction calculation, and pest damage detection. The normalized triangular greenness index extracted from the RGB images acquired by the camera on board the UAV is used to calculate the vegetation fraction and the weed estimate. Crop damage due to insect attack is also monitored: holes are detected and localized so that farmers can respond immediately, which helps mitigate the spread of pests. Vegetation indices used in conjunction with machine-learning techniques for weed mapping and health monitoring were successfully employed in the system using commercially available equipment with no installation required. Although a functional monitoring system has been developed, part of the crop health monitor, specifically pest damage detection, is limited to Solanum melongena. It is therefore recommended that an intensive collection of UAV images of various crops be conducted to extend the detection capability to other crops as well.
Citation:
de Ocampo, A.L.P. and Dadios, E.P., 2021. Integrated weed estimation and pest damage detection in Solanum melongena plantation via aerial vision-based proximal sensing. Philippine Journal of Science, 150(4), pp.677-688.