
Integrated Weed Estimation and Pest Damage Detection in Solanum melongena Plantation

via Aerial Vision-based Proximal Sensing

Anton Louise P. de Ocampo1* and Elmer P. Dadios2

1 Electronics, Instrumentation, and Mechatronics Engineering Department, Batangas State University, Batangas City 4217, Philippines

2 Manufacturing Engineering and Management Program, De La Salle University, Manila 1003, Philippines

The Philippine government’s effort to elevate agriculture as an industry requires precision agriculture. Remote- and proximal-sensing technologies help identify what is needed in the farm, and when and where it is needed. This paper proposes the use of vision-based indicators captured by a low-altitude unmanned aerial vehicle (UAV) to estimate weed presence and pest damage. Coverage path planning is employed for automated data acquisition via UAV. The gathered data are processed on a ground workstation using the proposed methods for estimating vegetation fraction, weed presence, and pest damage. The data processing includes sub-image level classification using a hybrid ResNet-SVM model and the normalized triangular greenness index. Sub-image level classification for isolating crops from the rest of the image achieved an F1-score of 97.73%, while pest damage detection was performed with an average accuracy of 86.37%. Also, the weed estimate achieved a true error of 5.72%.

Keywords: crop health, proximal sensing, UAV-based monitoring, vegetation fraction, weed estimate

*Corresponding Author: [email protected]

Philippine Journal of Science 150 (3): 1039-1050, June 2021. ISSN 0031-7683. Date Received: 17 Dec 2020.

INTRODUCTION

The world food demand is expected to increase steadily even amidst the uncertainties brought by the COVID-19 pandemic. The Food and Agriculture Organization (FAO) Food Price Index rebounded to 93.2 points as of June 2020 – 2.2 points higher than in May – which manifests the returning demand (FAO 2020). The rising demand puts pressure on farmers to increase agricultural yield. To cope with the increasing demand for food, farms worldwide are becoming smart. Precision agriculture, the third wave of agricultural evolution after mechanization (first) and the green revolution (second), is the application of what is needed to where and when it is needed in the farm (Saiz-Rubio and Rovira-Más 2020). The accuracy of such a method relies mainly on how the farming parameters are observed and measured. Managing inputs and outputs based on the measured indicators can propel the farming industry's efficiency and productivity. Despite the rapid development in farming technologies, intelligent monitoring systems to aid in agricultural activities account for only a fraction of the world’s total farming undertakings. According to the report of Bhutani and Wadhwani (2018), the main challenge in the global market of smart farming (precision agriculture) is the low adoption rate in developing countries.


This is indeed the case in the Philippines. Although the Philippines is known as an agricultural country, very few farm owners use technology in managing their farms. Only on 02 Mar 2019 did the Philippine government acknowledge the development of the first smart farm in the country, which will serve as a model for smart farming (NICE 2019). Moreover, the program is government-owned – initiated by the Department of Science and Technology (DOST) – and not by any local farmer. Fortunately, the government now sees the importance of precision agriculture in boosting the industry. This initiative is manifested by government support for Project SARAI (Smarter Approaches to Reinvigorate Agriculture as an Industry in the Philippines), which employs technology to assist local farmers in farm operations and management (SARAI n.d.). However, most local farmers still rely on traditional methods of observing crop field conditions, which are highly dependent on the skills and experience of the farmers. Two of the indicators that farmers look into are the presence of weeds and insect pests in the crop field.

Weed management is a critical aspect of crop production. Weed removal timing greatly influences crop yield losses (Kumar and Jha 2017). Losses due to the presence of weeds in crop fields can be direct or indirect. Direct loss occurs because weeds compete with crops for nutrients and water; because weeds can adapt to limited water and nutrients, they can win this competition under scarce conditions. Indirect losses include hosting pests in the field, disrupting farm operations, and the cost incurred in weed removal and control (Mawandha et al. 2020). Weed estimation plays an important role in assessing the adverse effect of weeds on crops (Beckie et al. 2020). Weeds are considered pests and impose serious challenges in crop production. Currently, farm managers perform weed estimation through manual scouting, which requires great time and effort and is deemed inefficient and less economical (Sapkota et al. 2020). Aside from weeds, insect pests also degrade the quality and yield of crops. Eggplant (Solanum melongena) is one of the most important crops in Asia and is treated heavily with insecticides to maintain fruit quality (Hautea et al. 2016).

Specific to eggplants, competition with weeds can decrease production by as much as 67% and fruit quality by as much as 96% (Marques et al. 2016). The interference from weeds happens because they can harbor pests, diseases, and nematodes, and compete for nutrients. Integrated pest management for eggplants often includes weed mitigation as a core component (FAO 2003).

Detection of weeds on croplands can significantly help in their mitigation or removal. Emerging methods in automated weed detection employ deep neural networks for weed vs. crop classification (Liu and Bruch 2020; Sharpe et al. 2018; Khan et al. 2020), while traditional techniques use feature engineering and machine learning models (Achasov et al. 2018; Julia and Bennet 2020; Johnson et al. 2020).

In this paper, a novel method for estimating weeds and detecting pest damage in aerial imagery of crops using hybrid machine learning and deep learning techniques is proposed. The proposed method also takes a novel approach in utilizing the vegetation fraction derived from the normalized triangular greenness index.

MATERIALS AND METHODS

Vision-based monitoring of crop health utilized RGB aerial images acquired using a Mavic Air quadrotor drone equipped with a 4K-resolution camera, collision and altitude sensors, GPS, and programmable mission path execution software. The ground workstation (a personal computer or laptop) running the Java-based application developed specifically for this research performs all the required analyses on the acquired data. The processes performed on the ground workstation include coverage path planning, data sorting, weed estimation, vegetation fraction calculation, and pest damage detection. The implementation set-up is illustrated in Figure 1.

Coverage Path Planning
The area-of-interest (AOI) is defined by the user either through an online geographical mapping tool or through actual farm visitation, where the drone is manually flown to the extents of the AOI while recording the coordinates of the path it traverses. In general, farms are irregular in shape, and the boundaries are often lined by trees or fences. To address this irregularity, the bounds of the target area $\mathbb{A}_w$ are transformed to the nearest quadrilateral equivalent, the modified AOI $\mathbb{A}_m$, such that $\mathbb{A}_w \subseteq \mathbb{A}_m$. This allows a more efficient implementation of a grid-like pattern of waypoints, as recommended in Pix4D (2020). We calculate the drone’s coverage $A_d$, which is used to determine the waypoints that the UAV must traverse to cover $\mathbb{A}_m$, based on the ground distance covered by the camera’s field of view and the mission-specified flight altitude $H_f$.


The ground sampling distance (GSD) is the equivalent ground distance between the centers of two adjacent pixels in a UAV image. Suppose that $W_S$ is the actual width of the camera sensor and $F_R$ is the real focal length; then, based on the flight altitude $H_f$, the distance on the ground $D_G$ covered by the camera can be calculated using Equation 1:

$D_G = \frac{W_S \times H_f}{F_R}$ (1)

However, in some cases, the real focal length is not provided by the camera manufacturer and only an equivalent focal length is given. In such cases, $F_R$ must first be calculated using Equation 2:

$F_R = \frac{F_E}{c_f}$ (2)

where $F_E$ is the equivalent focal length at $n$ mm, and $c_f$ is the crop factor. Hence, the GSD can be calculated using Equation 3:

$GSD = \frac{D_G}{W_I}$ (3)

where $W_I$ is the image width in pixels. The drone area coverage $A_d$ can now be computed using Equation 4:

$A_d = D_{G_x} \times D_{G_y}$ (4)

where $D_{G_x}$ and $D_{G_y}$ are the ground distances along the image width and height, respectively.

Since the drone coverage is very small compared to the AOI ($A_d \ll \mathbb{A}_m$), it is necessary to divide $\mathbb{A}_m$ into smaller regions. The number of regions along the horizontal direction is given by $N_x$, while $N_y$ is for the vertical (Equation 5):

$N_x = \left\lceil \frac{W_{\mathbb{A}_m}}{D_{G_x}} \right\rceil, \qquad N_y = \left\lceil \frac{H_{\mathbb{A}_m}}{D_{G_y}} \right\rceil$ (5)

where $W_{\mathbb{A}_m}$ and $H_{\mathbb{A}_m}$ are the width and height of the modified AOI.
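
To make the geometry concrete, the following is a minimal Python sketch of Equations 1–5. The camera constants (equivalent focal length, crop factor, sensor width, image width) and the AOI dimensions are illustrative assumptions, not values taken from the paper.

```python
import math

# Minimal sketch of the coverage geometry in Equations 1-5.
# All camera and AOI constants below are illustrative assumptions.

def real_focal_length(f_equiv_mm: float, crop_factor: float) -> float:
    """Equation 2: recover the real focal length from the equivalent one."""
    return f_equiv_mm / crop_factor

def ground_distance(sensor_width_mm: float, altitude_m: float, f_real_mm: float) -> float:
    """Equation 1: ground distance covered by the camera footprint."""
    return sensor_width_mm * altitude_m / f_real_mm

def gsd(d_ground_m: float, image_width_px: int) -> float:
    """Equation 3: ground sampling distance in meters per pixel."""
    return d_ground_m / image_width_px

def grid_divisions(aoi_w_m: float, aoi_h_m: float, dgx_m: float, dgy_m: float):
    """Equation 5: number of regions along the horizontal and vertical axes."""
    return math.ceil(aoi_w_m / dgx_m), math.ceil(aoi_h_m / dgy_m)

f_r = real_focal_length(f_equiv_mm=24.0, crop_factor=5.64)      # assumed camera
d_gx = ground_distance(sensor_width_mm=6.17, altitude_m=3.0, f_real_mm=f_r)
d_gy = d_gx * 3 / 4                                              # 4:3 aspect assumed
a_d = d_gx * d_gy                                                # Equation 4
n_x, n_y = grid_divisions(aoi_w_m=60.0, aoi_h_m=40.0, dgx_m=d_gx, dgy_m=d_gy)
print(f"GSD: {gsd(d_gx, 3840) * 100:.2f} cm/px, coverage: {a_d:.1f} m^2, grid: {n_x} x {n_y}")
```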

Figure 1. Implementation set-up for mission path planning and farm data acquisition.


Specific to the subject farm used in this study, $\mathbb{A}_m$ is divided into 768 regions in a 3-m altitude flight plan. The 768 regions are distributed uniformly across 12 different flight plans covering only 64 regions each. In an agricultural field, the visual content is generally homogeneous (mostly vegetation); hence, the flight plans follow a grid-like pattern to cover all required regions, as the FAO and the International Telecommunication Union recommend (Sylvester 2018).

The algorithm for mission planning generates the geographical coordinates for the drone’s flight missions, which are then uploaded to cloud storage for mobile use. Every time a flight mission is scheduled, these waypoints are downloaded to the drone’s mobile controller. To evaluate the accuracy of the generated waypoints, ground markers are used to compare the GPS coordinates to the corresponding drone footprint. The error is determined by measuring the distance from the marker to the position of the drone, denoted by the center of the image it captures.
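
A grid-like, boustrophedon-style waypoint generator consistent with the description above can be sketched as follows. The planar-grid simplification and the function name are hypothetical; the batching into 12 missions of 64 waypoints mirrors the division used in the paper.

```python
# Sketch of boustrophedon (serpentine) waypoint generation over the grid of
# N_x x N_y regions. Coordinates are local planar offsets from one corner of
# the AOI; a real mission would convert them to latitude/longitude.

def boustrophedon_waypoints(n_x: int, n_y: int, dgx: float, dgy: float):
    """Yield region-center waypoints row by row, alternating direction."""
    for row in range(n_y):
        cols = range(n_x) if row % 2 == 0 else range(n_x - 1, -1, -1)
        for col in cols:
            yield ((col + 0.5) * dgx, (row + 0.5) * dgy)

# 32 x 24 = 768 regions, matching the subject farm's 3-m altitude plan.
waypoints = list(boustrophedon_waypoints(n_x=32, n_y=24, dgx=4.35, dgy=3.26))
# Split into batches of 64 waypoints, mirroring the 12-flight division above.
missions = [waypoints[i:i + 64] for i in range(0, len(waypoints), 64)]
```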

Derivation of Indicators for Estimating Weed and Pest Damage
Pre-processing operations are carried out to reduce the UAV-acquired image samples of the crops to sub-images that cover only the extent of an individual leaf for localized processing. The pre-processing includes leaf segmentation and vegetation fraction calculation.

The leaf segmentation employs superpixelation using the SLIC algorithm (Achanta et al. 2012). Clusters generated by SLIC superpixelation mostly cover the extent of each leaf. Each cluster becomes the segmented image of a leaf, i.e. cluster $c_k$ maps to the $k$-th leaf image, which is cropped from the parent image. The new images of clusters formed during superpixelation are denoted by Equation 6:

$R_k = I\left(x_k - \tfrac{w_k}{2} : x_k + \tfrac{w_k}{2},\ y_k - \tfrac{h_k}{2} : y_k + \tfrac{h_k}{2}\right)$ (6)

where $x_k$ and $y_k$ are the center coordinates of the $k$-th cluster, and $w_k$ and $h_k$ are the width and height of the cropped image $R_k$.
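
As an illustration of this segmentation step, the sketch below applies SLIC superpixelation via scikit-image and crops each cluster's bounding box as in Equation 6; the file name and segment count are assumptions.

```python
from skimage import io, measure, segmentation

# Sketch of the leaf segmentation step: SLIC superpixelation followed by
# cropping each cluster's bounding box (Equation 6). The file name and the
# segment count are assumptions.

image = io.imread("uav_frame.jpg")
labels = segmentation.slic(image, n_segments=400, compactness=10, start_label=1)

sub_images = []
for region in measure.regionprops(labels):
    # Bounding box of the k-th cluster: (min_row, min_col, max_row, max_col),
    # equivalent to the center/width/height crop R_k of Equation 6.
    r0, c0, r1, c1 = region.bbox
    sub_images.append(image[r0:r1, c0:c1])
```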

Vegetation fraction is estimated using pixel-wise classification based on the normalized triangular greenness index (TGI). The normalized TGI is a modification of the triangular greenness index (Equation 7) defined by Hunt et al. (2011), which is the area of the reflectance-wavelength triangle formed by the three vertices $(\lambda_R, R_R)$, $(\lambda_G, R_G)$, and $(\lambda_B, R_B)$, where $\lambda$ and $R$ are the wavelength and reflectance, respectively.

$TGI = -\tfrac{1}{2}\left[(\lambda_R - \lambda_B)(R_R - R_G) - (\lambda_R - \lambda_G)(R_R - R_B)\right]$ (7)

The challenge in using the original TGI equation is that the user needs to know the peak wavelength responses of the CMOS sensors used to capture the image. In the absence of such information, the normalized TGI can be used instead. To normalize the TGI, it is divided by the maximum possible area of the triangle. The normalized TGI based on de Ocampo et al. (2019) is defined as $nTGI = TGI / TGI_{max}$, where a parameter sweep can be done for the peak wavelengths to compensate for the unknown sensor response.

The TGI is originally used to identify vegetative regions in an image. Hence, by using $nTGI$, the vegetation fraction can be calculated as the ratio of vegetative pixels over the total pixels in an image. Suppose an RGB image $I(x, y) = [I_R, I_G, I_B]$. The reflectance $R$ is proportional to the channel intensity, i.e. $R_R \propto I_R$, $R_G \propto I_G$, and $R_B \propto I_B$. The $nTGI$ of each pixel forms the $nTGI$ map. By counting the number of pixels in the $nTGI$ map having a value greater than the assigned threshold $TGI_{th}$ and marking such pixels as vegetative, the vegetation fraction is calculated using Equation 8:

$VF = \frac{1}{N} \sum_{k=1}^{N} vf(k), \qquad vf(k) = \begin{cases} 1, & nTGI(k) > TGI_{th} \\ 0, & \text{otherwise} \end{cases}$ (8)

where $VF$ is the vegetation fraction, $vf(k)$ is the vegetation-status marker of pixel $k$, and $N$ is the total number of pixels in the image.
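
A compact sketch of the $nTGI$ map and Equation 8 follows. The wavelength constants are the nominal band peaks used by Hunt et al. (2011), and the threshold value is an assumed placeholder, since the paper sweeps such parameters when the sensor response is unknown.

```python
import numpy as np

# Sketch of the per-pixel nTGI map and the vegetation fraction of Equation 8.
# The wavelengths are nominal band peaks; the threshold is an assumed value.

LAMBDA_R, LAMBDA_G, LAMBDA_B = 670.0, 550.0, 480.0

def ntgi_map(rgb: np.ndarray) -> np.ndarray:
    """TGI per pixel (Equation 7) divided by its maximum attainable value."""
    r, g, b = (rgb[..., i].astype(float) / 255.0 for i in range(3))
    tgi = -0.5 * ((LAMBDA_R - LAMBDA_B) * (r - g) - (LAMBDA_R - LAMBDA_G) * (r - b))
    # With reflectances in [0, 1], TGI peaks at 0.5 * (LAMBDA_R - LAMBDA_B)
    # for a pure-green pixel (R = B = 0, G = 1).
    return tgi / (0.5 * (LAMBDA_R - LAMBDA_B))

def vegetation_fraction(rgb: np.ndarray, tgi_th: float = 0.1) -> float:
    """Equation 8: fraction of pixels whose nTGI exceeds the threshold."""
    vf = ntgi_map(rgb) > tgi_th
    return float(vf.sum()) / vf.size
```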

Weed Detection and Estimation
The sub-images that contain any form of vegetation (weed and crop) are fed to a cascaded classification process. A deep neural network, specifically ResNet-18, is used to determine whether a sub-image is “with crop” or “without crop.”


Once a sub-image is identified to contain a crop, it is forwarded to an SVM classifier, which determines whether the sub-image is purely crop or not based on the texture features of the sub-image. The final output map generated from the last layer of the deep neural network is fed to a two-neuron layer where each neuron corresponds to a class label: “with crop” or “without crop.” Once classified as “with crop,” the gray-level co-occurrence matrix (GLCM) of the original input sub-image is fed to the SVM model.

The GLCM for any given position is expressed in Equation 9 as:

$P(i, j) = \sum_{x=1}^{W} \sum_{y=1}^{H} \begin{cases} 1, & \text{if } I(x, y) = i \text{ and } I(x + \Delta x, y + \Delta y) = j \\ 0, & \text{otherwise} \end{cases}$ (9)

where $i$ and $j$ are the pixel values, and $x$ and $y$ are the spatial coordinates in the image. The receptors of the SVM model standardize the input by reshaping it to $1 \times N$, where $N$ is the total number of elements in the GLCM. The SVM model for “purely crop” classification is defined as $f(\mathbf{g}) = \operatorname{sign}\!\left(\mathbf{w}^{\top}\phi(\mathbf{g}) + b\right)$, where $\mathbf{w}$ is the matrix of the weight parameters of the model, $b$ is the intercept, and $\xi$ is the slack restriction of the soft margin. Finally, the kernel is defined as $K(\mathbf{g}_i, \mathbf{g}_j) = \phi(\mathbf{g}_i)^{\top}\phi(\mathbf{g}_j)$, where $\phi$ is the feature mapping and $\mathbf{g}$ is the reshaped GLCM vector.
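
The cascade can be sketched as below: a two-class ResNet-18 screens each sub-image, and an SVM over the flattened GLCM refines the “purely crop” decision. Training is omitted; the RBF kernel and the class-index convention are assumptions (the paper leaves the kernel generic), and both models must be trained/fitted on labeled sub-images before use.

```python
import numpy as np
import torch
import torchvision
from skimage.color import rgb2gray
from skimage.feature import graycomatrix
from sklearn.svm import SVC

# Sketch of the cascaded classifier. The ResNet-18 head is reduced to two
# neurons ("with crop" / "without crop"); sub-images that pass are described
# by their flattened GLCM and judged by the SVM. Both models are untrained
# here and must be trained/fitted before use.

resnet = torchvision.models.resnet18(weights=None)
resnet.fc = torch.nn.Linear(resnet.fc.in_features, 2)
resnet.eval()

svm = SVC(kernel="rbf")  # kernel choice is an assumption; fit before predicting

def classify_subimage(rgb: np.ndarray) -> str:
    x = torch.from_numpy(rgb).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        label = resnet(x).argmax(dim=1).item()
    if label != 0:  # index 0 assumed to mean "with crop"
        return "without crop"
    gray = (rgb2gray(rgb) * 255).astype(np.uint8)
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256)
    features = glcm.reshape(1, -1)  # standardized to 1 x N, as described above
    return "purely crop" if svm.predict(features)[0] == 1 else "not purely crop"
```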

Given that the image segmentation process yields $N$ sub-images, the amount of weed can be estimated using Equation 10:

$W_e = \frac{N_v - N_c}{N}$ (10)

where $N_v$ is the number of vegetative sub-images, $N_c$ is the number of sub-images classified as “purely crop,” and $N$ is the total number of sub-images.
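
As a direct transcription of Equation 10 as reconstructed above:

```python
def weed_estimate(n_vegetative: int, n_purely_crop: int, n_total: int) -> float:
    """Equation 10: share of sub-images that are vegetative but not purely crop."""
    return (n_vegetative - n_purely_crop) / n_total
```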

Pest Damage Detection
The primary symptom used to detect the presence of insect pests in the crop field is the existence of holes in leaves. If a sub-image $R_k$ is classified as “purely crop,” a hole-detection algorithm is performed on the sub-image to isolate the portions of the leaf that have been consumed by insects. The approach used in hole detection is similar to that of leaf segmentation, except that the former is done at the sub-image level. The $k$-th sub-image $R_k$ is posterized by clustering the image into $M$ cells and then averaging the color content of each. The $M$ cells of $R_k$ are then converted to grayscale and finally to a binary image $I_b(\mathbf{x}) = \mathbb{1}[I_{gray}(\mathbf{x}) > T]$, where $T$ is the threshold for binarization and $\mathbf{x} = [x, y]^{\top}$ denotes the spatial coordinates. This procedure emphasizes the holes present in the leaf, if there are any. Detection of blobs is performed on the binarized image $I_b$ using the Laplacian of Gaussian ($LoG$) as the detector of interest points. Given the 2D binarized image $I_b(\mathbf{x})$, the scale-space representation of the image at scale $\sigma$ is denoted $I_b(\mathbf{x}; \sigma)$ and computed in Equation 11 as:

$I_b(\mathbf{x}; \sigma) = g(\mathbf{x}; \sigma) * I_b(\mathbf{x})$ (11)

where $*$ denotes convolution and $g(\mathbf{x}; \sigma)$ is the Gaussian kernel expressed as $g(\mathbf{x}; \sigma) = \frac{1}{2\pi\sigma^2} e^{-(x^2 + y^2)/(2\sigma^2)}$. Given the Gaussian kernel, the Laplacian of Gaussian ($LoG$) kernel is written as in Equation 12:

$\nabla^2 g(\mathbf{x}; \sigma) = \frac{\partial^2 g}{\partial x^2} + \frac{\partial^2 g}{\partial y^2}$ (12)

The final response of $LoG$ filtering can be derived using $L(\mathbf{x}; \sigma) = \nabla^2 g(\mathbf{x}; \sigma) * I_b(\mathbf{x})$, where $\sigma \in \{\sigma_1, \ldots, \sigma_n\}$ is the scale set used to describe the characteristic scale. By zero-thresholding the final response $L(\mathbf{x}; \sigma)$, the points of interest, i.e. the centers of the blobs, can be derived. Although the resulting coordinates mark blobs, not all of them have a size that manifests a hole: some are too small and others too large to be considered a hole left by insects. Some measurements are extracted to confirm the feasibility of a blob being a hole, including its area and perimeter. To be considered a hole, a blob must satisfy $A_{min} < A_b < A_{max}$ and $C_b = \frac{4\pi A_b}{P_b^2} < C_{th}$, where $A_b$, $P_b$, and $C_b$ are the area, perimeter, and circularity of the blob, respectively.
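
The hole-detection chain can be sketched as follows. The binarization polarity, the LoG scale bounds, and the circularity cutoff are assumptions; the area bounds follow the 1–10% screening reported in the Results, and scikit-image's blob_log and regionprops stand in for the LoG response and the blob measurements.

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import blob_log
from skimage.measure import label, regionprops

# Sketch of the hole-detection chain on a "purely crop" sub-image. The
# binarization threshold/polarity, the LoG scale set, and the circularity
# cutoff are assumptions; the 1-10% area screen follows the Results section.

def detect_holes(rgb: np.ndarray, t: float = 0.35, c_max: float = 0.9):
    gray = rgb2gray(rgb)
    binary = gray < t  # holes assumed darker than the surrounding leaf tissue
    # LoG interest points over a small scale set (candidate blob centers).
    centers = blob_log(binary.astype(float), min_sigma=2, max_sigma=15, threshold=0.1)

    holes = []
    total = binary.size
    for region in regionprops(label(binary)):
        a, p = region.area, region.perimeter
        if p == 0:
            continue
        circularity = 4 * np.pi * a / p ** 2  # a perfect circle scores 1
        # Keep blobs occupying 1-10% of the sub-image whose circularity
        # stays below the cutoff, per the screening conditions above.
        if 0.01 * total < a < 0.10 * total and circularity < c_max:
            holes.append(region.bbox)
    return centers, holes
```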


RESULTS AND DISCUSSION

Coverage Path Planning
The flight mission is a critical component of the assessment of general crop health. For illustration, only a segment of the farm is discussed here. Suppose that the extents of the AOI are defined by the user as four coordinates that impose the boundaries of the intended coverage (Figure 2). From the defined corners, a regular quadrilateral that covers the maximum possible area is identified. The generalized path plan follows an approximate cellular decomposition and a boustrophedon waypoint pattern to cover the AOI. Specific to the new AOI, the 3-m altitude flight plan for crop health monitoring requires 12 different missions with 64 waypoints each to cover the desired area. The 12 flight missions are interleaved in such a way that the whole AOI can be covered. In cases where limited flight time is available, any of the 12 flight missions can be used to represent the whole AOI. The generated waypoints are compared to the corresponding ground footprint of the drone. The error between the waypoints’ geographical coordinates and the actual position of the drone is measured in meters. The actual drone position is marked to its footprint using the center of the images it captures and several ground markers positioned using the controller’s GPS (Figure 3). The calculated GSD is 0.94 cm/px based on the camera FOV of 85°. However, in consideration of environmental disturbances, manual measurements using ground markers were employed. The actual GSD is derived by dividing the distance by the number of pixels between the ground markers, which yields 0.82 cm/px (Figure 4).

The mean error between the geographical coordinates provided by the mission path plan and the actual position of the drone is approximately 10.14% of the defined flight altitude. The flight altitude used during the test is 21 m from the ground. Table 1 summarizes the results of distance measurements from the markers and the geographical coordinates specified in the drone’s flight plan. The maximum error recorded is 3.9 m from the marker, while the minimum is just 0.39 m.

Figure 2. Defining the boundary coordinates for the AOI. The purple pins mark the geographical coordinates that define the extent of the mission path plan to be generated.

Figure 3. The center of the image is compared to the position of the ground marker: the difference between the GPS coordinates of a waypoint (white marker at point A) and the geographical position of the camera (center of the image at point B).


Vegetation Fraction and Weed Estimate
The general indicators used in this work include the estimated amount of vegetation and weeds on the farm and the presence of leaf damage in crops. As an illustration, consider an RGB image from the dataset (Figure 5). Although the image contains mostly crops, the weeds are starting to cover much of the area; the situation is similar across the farm. The method employed for isolating the crops from the weeds uses sub-image classification that determines whether each sub-image is part of a crop or not. Although the segmentation cannot fully separate each leaf, the size of each sub-image is small enough to define a portion of the crop and adequately large to include multiple weed strands, as presented in Figure 6, where some sub-images of crops and non-crops are sampled. The performance of the sub-image level classification, which isolates crop from non-crop, is summarized in a confusion matrix from a classification test on an image dataset containing 450 and 552 samples of crops and weeds, respectively. The hybrid ResNet-SVM model achieved an accuracy of 98.00% and an F1-score of 97.73% (Table 2).

By employing the normalized TGI, the vegetation part can be isolated from non-vegetation. In Figure 7, a heat map is used to show the vegetative portion of the region captured in the sample image. The increasing level of vegetation is denoted by darkening red, while non-vegetation is shown in dark blue.

Figure 4. GSD estimation equals the ground distance divided by the image distance in pixels (15.4 m / 1893 px = 0.8 cm/px).

Table 1. Comparison between geographical coordinates of the flight plan waypoints and the actual position of the UAV during the mission.

Marker no. | Marker pixel coordinates ($I_x$, $I_y$) | Pixel distance from center (px) | Actual metric distance (m) | Calculated metric distance (m) | Percent error
1 | (1882, 1140) | 146 | 1.39 | 1.19 | 14.39%
2 | (2260, 960) | 294 | 2.79 | 2.39 | 14.34%
3 | (1941, 1169) | 92 | 0.87 | 0.75 | 13.79%
4 | (2001, 1550) | 411 | 3.90 | 3.34 | 14.36%
5 | (1929, 1039) | 141 | 1.34 | 1.15 | 14.18%
6 | (1979, 922) | 223 | 2.12 | 1.82 | 14.15%
7 | (1996, 856) | 286 | 2.71 | 2.33 | 14.02%
8 | (2069, 1133) | 42 | 0.39 | 0.34 | 12.82%
9 | (2099, 1476) | 343 | 3.26 | 2.79 | 14.42%
10 | (2011, 1412) | 273 | 2.59 | 2.22 | 14.29%


On the other hand, the weed part of the sample image is annotated with red pixels, while the non-weed part is retained as in the original image (Figure 8). The result of the weed estimation is compared to the ground truth data (Figure 9). The average weed estimate over the 22 regions is 53.91%. Compared to the ground truth of the same regions, with an average weed content of 59.63%, the algorithm achieved a true error of 5.72% (Figure 10).

Table 2. Performance of the sub-image level classification for isolating crops from non-crops.

Confusion matrix | Actual crop | Actual non-crop
Predicted crop | 430 | 0
Predicted non-crop | 20 | 552

Accuracy: 98.00% | Precision: 100.00% | Recall: 95.56% | F1-score: 97.73%

The most defining feature that helped isolate the sub-images of crops from the rest is the texture defined by their GLCMs. A sub-image containing mostly weeds has a coarser texture than one containing mostly crop; sub-images containing mostly crops are more homogeneous than non-crop ones. To visualize the comparison between the GLCMs of different image classes, statistical measures such as contrast, correlation, energy, and homogeneity can be used (Figure 11). The difference in contrast and energy between the non-crop (weed) and crop sub-images is quite high: higher contrast is present in the non-crop sub-image, which is evident in the pixel intensities, while the GLCM of the crop sub-image contains more energy than that of the non-crop.
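
The four statistics compared in Figure 11 can be computed from a sub-image's GLCM as sketched below, assuming an 8-bit grayscale input.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Sketch of the texture comparison in Figure 11: contrast, correlation,
# energy, and homogeneity derived from a sub-image's GLCM. An 8-bit
# grayscale input is assumed.

def texture_profile(gray: np.ndarray) -> dict:
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return {prop: float(graycoprops(glcm, prop)[0, 0])
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
```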

Pest Damage Detection
The final component of crop health monitoring is the detection of the presence of insect pests on the farm, indicated mainly by the damage (holes) left in the leaves of the crop. This indicator is an effective measure since the insects attack mostly at night, while UAV-based monitoring is best implemented during the daytime. The detection of holes is performed only on sub-images classified as crops (Figure 12). Only the blobs with areas greater than 1% but less than 10% of the sub-image area are considered holes.

Figure 5. Sample UAV-captured image of the plant canopy.

Figure 6. Sample segments containing weeds (top row) and crop leaves (bottom row).

Figure 7. The normalized triangular greenness index is used for calculating vegetation fraction and weed estimate, and for separating crops from the rest.

Figure 8. Result of separation of weeds (in red) from the crops.


Figure 9. Ground truth used for verification of the results in weed estimate. There are 22 regions covered by the ground truth image: a) original; b) manually annotated.

Figure 10. Actual weed vs. the result of the weed estimation algorithm. The weed estimates in the 22 regions covered by the area of the ground truth data used in verification are averaged and compared to the actual weed content.

Figure 11. Gray-level co-occurrence matrix of two sample segments: non-crop or weed (top) and crop (bottom). The difference between weed and crop is highly emphasized in the contrast and energy measurements but less so in correlation and homogeneity.


Also, the circularity of the holes is measured: the higher the circularity (the circularity of a circle is 1), the lower the likelihood of a blob being classified as a hole. The detection system can identify holes with an average accuracy of 86.37%. Figure 13 shows 10 randomly picked holes from those detected.

Figure 12. Hole detection was performed on each leaf segment that was classified as part of the vegetation: A) the image segment in RGB; B) result of superpixelation; C) posterization of the superpixels; D) binary image; E) annotated image.

Figure 13. Ten randomly picked segments from damaged leaves identified using hole detection.

CONCLUSION

The machine vision monitoring and detection system used a UAV equipped with an RGB camera for data collection. The collected visual data are used to estimate how much of the total vegetation is weeds and to detect the leaf damage caused by insect pest attacks. The monitoring system is easily deployable, for it only requires three pieces of equipment: the UAV, a smartphone, and a laptop.

The generalized path plan follows an approximate cellular decomposition and a boustrophedon waypoint pattern to cover the AOI. A flight altitude of 3 m is used for the weed estimate, vegetation fraction calculation, and pest damage detection. The normalized triangular greenness index extracted from the RGB images acquired by the camera onboard the UAV is used to calculate the vegetation fraction and the weed estimate. Damage to the crops due to insect attacks is also monitored by the system: holes are detected and localized for an immediate response from the farmers, which helps mitigate the spread of pests. The use of vegetation indices in conjunction with machine learning techniques for weed mapping and health monitoring is successfully employed in the system using commercially available equipment with no installation required.


Although a functional monitoring system has been successfully developed, part of the crop health monitor – specifically pest damage detection – is limited to Solanum melongena. It is therefore recommended that intensive collection of UAV images of various crops be conducted to expand the detection capability to other crops as well.

ACKNOWLEDGMENTS

The authors would like to acknowledge the DOST Engineering Research and Development for Technology Consortium, the De La Salle University Gokongwei College of Engineering, and Batangas State University for funding the research project, and Mr. Romeo V. Bustamante for allowing the use of his farm for data gathering and validation.

REFERENCES

ACHASOV AB, SEDOV AO, ACHASOVA AO. 2018. Methodological Basis of the UAVs Use for the Weed Detection. Man and Environment 29(1). https://doi.org/10.26565/1992-4224-2018-29-02

ACHANTA R, SHAJI A, SMITH K, LUCCHI A, FUA P, SÜSSTRUNK S. 2012. SLIC superpixels compared to state-of-the-art superpixel methods. IEEE Trans Pattern Anal Mach Intell 34(11): 2274–2282. https://doi.org/10.1109/TPAMI.2012.120

BECKIE HJ, OWEN MJ, BORGER CPD, GILL GS, WIDDERICK MJ. 2020. Agricultural Weed Assessment Calculator: An Australian Evaluation. Plants 9(12): 1737. https://doi.org/10.3390/plants9121737. PMID: 33316952; PMCID: PMC7763356.

BHUTANI A, WADHWANI P. 2018. Precision farming: market size worldwide 2020. In: Statista. Report ID: GMI227.

DE OCAMPO ALP, BANDALA AA, DADIOS EP. 2019. Estimation of Triangular Greenness Index for Unknown Peak Wavelength Sensitivity of CMOS-acquired Crop Images. Proc IEEE HNICEM Conf. https://doi.org/10.1109/hnicem48295.2019.9072796

[FAO] Food and Agriculture Organization of the United Nations. 2020. World Food Situation: FAO Food Price Index. Retrieved on 06 Jul 2020 from http://www.fao.org/worldfoodsituation/foodpricesindex/en/

[FAO] Food and Agriculture Organization of the United Nations. 2003. Eggplant Integrated Pest Management: An Ecological Guide Training Resource Text on Crop Development, Major Agronomic Practices, Disease, and Insect Ecology, Insect Pests, Natural Enemies and Diseases of Eggplant. In: Vegetables in South and Southeast Asia. FAO Inter-Country Programme for Integrated Pest Management. Retrieved on 09 Feb 2021 from http://www.fao.org/3/ca8262en/ca8262en.pdf

HAUTEA DM, TAYLO LD, MASANGA APL, SISON MLJ, NARCISO JO et al. 2016. Field Performance of Bt Eggplants (Solanum melongena L.) in the Philippines: Cry1Ac Expression and Control of the Eggplant Fruit and Shoot Borer (Leucinodes orbonalis Guenée). PLOS ONE 11(6): e0157498. https://doi.org/10.1371/journal.pone.0157498

HUNT R, DAUGHTRY CS, EITEL J, LONG DS. 2011. Remote sensing leaf chlorophyll content using a visible band index. Agron J 103(4): 1090–1099. https://doi.org/10.2134/agronj2010.0395

JOHNSON R, MOHAN T, PAU S. 2020. Weed Detection and Removal Based on Image Processing. Int J Recent Technol Eng 8(6): 347–352. https://doi.org/10.35940/ijrte.B3679.038620

JULIA JPM, BENNET D. 2020. Weed Detection and Classification using ICA Based SVM Classifier. Int J Recent Technol Eng 8(5): 1557–1560. https://doi.org/10.35940/ijrte.C5410.018520

KHAN A, ILYAS T, UMRAIZ M, MANNAN ZI, KIM H. 2020. CED-Net: Crops and Weeds Segmentation for Smart Farming Using a Small Cascaded Encoder-Decoder Architecture. Electronics 9(10): 1602. https://doi.org/10.3390/electronics9101602


KUMAR V, JHA P. 2017. Influence of Nitrogen Rate, Seeding Rate, and Weed Removal Timing on Weed Interference in Barley and Effect of Nitrogen on Weed Response to Herbicides. Weed Sci 65(1): 189–201. https://doi.org/10.1614/ws-d-16-00047.1

LIU B, BRUCH R. 2020. Weed Detection for Selective Spraying: A Review. Agriculture Robotics 1: 19–26. https://doi.org/10.1007/s43154-020-00001-w

MARQUES LJP, BIANCO S, FILHO ABC, BIANCO MS. 2016. Phytosociological Survey and Weed Interference in Eggplants Cultivation. Planta Daninha 34(2): 309–318. https://doi.org/10.1590/S0100-83582016340200012

MAWANDHA HG, SUPARYANTO T, PARDAMEAN B. 2020. Weeds e-Catalog as a Tool for Identification of Weeds in Plantation. Int Conf Ecol Eng Dev.

[NICE] Network for Information and Collaboration Exchange. 2019. Philippines’ first “Smart Farm.” Retrieved on 09 Jul 2020 from http://lbf-nice.org/philippines-first-smart-farm/

[Pix4D] Pix4Dmapper. 2020. Pix4Dmapper. Retrieved on 10 Aug 2020 from https://support.pix4d.com/hc/en-us/articles/204272989-Offline-Getting-Started-and-Manual-pdf

[SARAI] Project SARAI. n.d. Smarter Approaches to Reinvigorate Agriculture as an Industry in the Philippines. Retrieved on 01 Jun 2020 from https://sarai.ph/

SAIZ-RUBIO V, ROVIRA-MÁS F. 2020. From smart farming towards agriculture 5.0: a review on crop data management. Agronomy 10(2). https://doi.org/10.3390/agronomy10020207

SAPKOTA B, SINGH V, COPE D, VALASEK J, BAGAVATHIANNAN M. 2020. Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-borne Imagery. AgriEngineering 2(2): 350–366. https://doi.org/10.3390/agriengineering2020024

SHARPE SM, SCHUMANN AW, BOYD NS. 2018. Detection of Carolina Geranium (Geranium carolinianum) Growing in Competition with Strawberry Using Convolutional Neural Networks. Weed Sci. https://doi.org/10.1017/wsc.2018.66

SYLVESTER G. 2018. E-Agriculture in Action: Drones for Agriculture. Bangkok, Thailand: Food and Agriculture Organization of the United Nations – International Telecommunication Union. 112p.
