1. Introduction
Peanuts are an important crop in the United States, providing both food and oil. USDA estimated domestic peanut production in 2017 at 5.57 billion pounds [1]. U.S. peanuts fall into four basic types: Runner, Virginia, Spanish, and Valencia [2], each distinctive in size and flavor. USDA estimated the economic value of peanut production in 2022 at $1.6 billion [1]. Peanut pods grow underground and often mature unevenly, so choosing the correct time to harvest peanuts is more complicated than for other crops. Determining when to dig is important because peanut maturity determines flavor, grade, and yield [3].
Peanut maturity can be predicted with four different methodologies: the days after planting (DAP) method, the growing degree day (GDD) method, the shell-out method, and the hull scrape method (mesocarp classification method) [4]-[6]. Peanuts can reach harvest maturity from 120 to more than 150 days after planting, and 1900 - 2500 GDDs, depending on variety and growing season [7]. The DAP and GDD methods provide good guidelines for when to check fields and can be used in combination with other methods, but neither is recommended as the sole basis for determining digging date because soil moisture and ambient temperature conditions vary across fields [6].
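For reference, daily GDD accumulation under the common averaging method can be sketched as follows. The 56 °F base temperature is a typical value cited for peanut, not a figure from this paper, and should be confirmed for the variety and region in question.

```python
# Sketch of growing degree day (GDD) accumulation for peanut.
# BASE_TEMP_F is an assumed base temperature, not taken from this study.

BASE_TEMP_F = 56.0  # assumed base temperature for peanut, in degrees F

def daily_gdd(t_max, t_min, base=BASE_TEMP_F):
    """Average-method GDD for one day; never negative."""
    return max((t_max + t_min) / 2.0 - base, 0.0)

def accumulated_gdd(daily_temps):
    """Sum daily GDDs over a list of (t_max, t_min) pairs."""
    return sum(daily_gdd(t_max, t_min) for t_max, t_min in daily_temps)

# Example: three warm days contribute toward the 1900 - 2500 GDD range
temps = [(92, 72), (88, 70), (95, 74)]
print(accumulated_gdd(temps))  # 26.0 + 23.0 + 28.5 = 77.5
```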
In the shell-out method, pod samples are collected and opened to expose the inner shell lining, known as the endocarp, along with the seed coats [6]. The color of these components is then assessed to predict the maturity of the peanuts. As the peanuts mature, the seed coat color transitions from white to dark pink [8]. Despite its effectiveness, the shell-out method is quite subjective and time-consuming, requiring careful visual examination and assessment of the color changes in the seed coats and endocarp. This subjectivity and the detailed nature of the process can lead to variability in results and extended evaluation times.
The most popular method of assessing peanut pod maturity is the hull scrape method, also known as the mesocarp classification method [6] [7] [9]. Pod maturity can be determined by scraping away the outer hull layer with a knife or blasting with a pressure washer at 1300 - 1600 psi to remove the outer skin of the pod (the exocarp) and expose the color of the middle layer (the mesocarp) [6] [9]. To avoid damaging the pods, the operator should hold the pressure washer about one meter away from the pods [5]. This ensures that the exocarp is effectively removed without harming the fruit inside. By examining the exposed mesocarp, researchers can determine the maturity of the peanuts more accurately. This method, while widely used and more objective than the shell-out method, still requires careful handling and attention to detail to ensure the integrity of the pods during the assessment process.
The color of the mesocarp changes as the peanut matures, transitioning from white to pale yellow, deep yellow, orange, brown, and finally black [10]. The presence of orange, brown, and black mesocarp colors indicates that the kernels are mature enough for harvest [6]. Two key parameters are used to determine the maturity target: the percentage of pods in the orange, brown, and black categories combined (OBB), and the percentage of pods that are black (BL) [6]. For Virginia peanut types, the target is to have 70% of the pods in the OBB category and 1% - 2% in the BL category [6]. For runner types, the target is to have 75% - 80% of the pods in the OBB category and 5% in the BL category [6]. While this method provides a clear framework for assessing peanut maturity, it does have its limitations. A significant drawback is that it relies heavily on the assessor’s ability to accurately determine the color of the mesocarp through visual inspection, which can introduce variability and potential for error. Proper training and experience are crucial to mitigate this limitation and ensure consistent and accurate maturity assessments.
In 1981, Williams and Drexler proposed a manual method for classifying peanut development stages that is widely used by farmers today. About 150 - 200 pods are collected, blasted to reveal the mesocarp color, and placed onto a color chart called the peanut maturity profile board [10] to predict days to maturity. The profile board displays colors associated with 7 maturity classes, with each class divided into 4 subclasses [10] [11] (Figure 1). The pods are placed in columns according to the colors on the profile board that most closely match the mesocarp of each pod [10]. Based on the shape of the resulting distribution of pods, guidelines are provided to estimate the number of days remaining until the peanuts should be harvested. Color separation is highly dependent on an individual's ability to determine which color column each pod should be placed in on the profile board, and that skill is gained only through experience. The method therefore remains subjective and time-consuming.
Estimating peanut maturity using the profile board or counting manually can be a frustrating and time-consuming process; accurate estimation of maturity would therefore save producers time and improve the accuracy of deciding when to dig. It would be beneficial to develop and apply image processing to assess features of the peanut mesocarp to improve the measurement and estimation of maturity.
Many studies have classified peanut pod maturity using image analysis techniques. Ghate et al. [8] classified peanuts into immature, mid-mature, and mature categories by analyzing the visual texture of the peanut pod's outer seed coat. Bolder et al. [12] classified peanut mesocarp color into 5 color categories: yellow, orange A, orange B, brown, and black. Colvin et al. [13] also classified mesocarp color based on the proportion of brown and black pods in a sample. Bindlish et al. [14] determined pod maturity based on 10 mesocarp colors and 7 different pod sizes. These studies used the seven maturity categories proposed by Williams and Drexler [10] as the basic color categories to develop maturity models.
Figure 1. Peanut maturity profile board developed by Williams and Drexler [10].
Recent advancements in image analysis and sensing technologies have significantly improved the accuracy and efficiency of agricultural monitoring and yield estimation. Studies have demonstrated the potential of RGB and hyperspectral imaging for various crop assessments, including disease detection, yield calculation, and phenotyping [15] [16]. Hyperspectral imaging has become a popular method for estimating peanut maturity. Zou et al. [17] used hyperspectral imaging to determine pod maturity, leveraging spectral differences between mature and immature pods within a classification algorithm, but their method still required pod blasting. In contrast, Balasubramaniyan and Navaneethan [18] developed a method that combines Hyper Spectral Invariant Scaled Feature Selection (HSISFS) and an Adaptive Dense Net Recurrent Neural Network (ADNRNN), achieving a classification accuracy of 91% without the need for pod blasting.
Hyperspectral cameras, while effective, are expensive compared to digital cameras or cell phones. This makes it challenging for county educators and peanut growers in southern US states to afford such equipment for determining peanut pod maturity. Therefore, our study aims to develop a fast, robust, and inexpensive methodology for straightforward image processing and interpretation using Red-Green-Blue (RGB) imagery captured with digital cameras or cell phones in the field. This approach will allow for accurate definition of the mesocarp area and estimation of peanut maturity.
Color classification techniques in the RGB color space can be used to distinguish peanuts from the background (e.g., cardboard and shadow) in images. Several statistical measures of similarity between groups, in terms of multiple characteristics, have been proposed, such as Kolmogorov's variation distance, Bhattacharyya distance, and Mahalanobis distance [19]. Mahalanobis distance classification is widely used for pattern recognition and data analysis when groups have different means but similar standard deviations [19] and is well suited to image processing for precision agriculture [20]-[22]. Chena et al. [20] extracted 28 color features from imagery for identifying corn varieties at a success rate of 90%. Diago et al. [21] extracted 40 colors in 7 groups to characterize grapevines, leaves, and background, and the segmentation results showed a performance of 92% for leaves and 98% for grapes. Liang et al. [22] extracted 20 - 25 colors in 8 groups to characterize soybean leaves and background, and showed 96% correct classification for leaves and background.
Estimation of peanut maturity using image analysis depends strongly on the completeness of the pod blasting process. If pod blasting time is too short, the yellow exocarp cannot be removed completely, resulting in underestimation of peanut maturity. If pod blasting time is too long, maturity is also underestimated because the endocarp and fruit are damaged. However, no image analysis studies have evaluated peanut maturity (OBB percentage) at different pod blasting times with different peanut types. Therefore, our study also addresses the impact of pod blasting time on the maturity model and develops efficient methods for color classification, areas that have not been previously explored.
Our research aims to develop a practical solution for peanut maturity estimation using RGB imagery and color classification. The objectives of this paper are to: 1) compare mesocarp color classification with and without the Mahalanobis distance algorithm, 2) develop a maturity model to estimate the percentage of peanut maturity in the OBB and BL categories, 3) validate the proposed maturity model using additional peanut pod images and compare the results, and 4) compare peanut maturity at different pod blasting times for different peanut types and varieties.
2. Materials and Methods
Our methodology employs RGB imagery, which is more accessible and cost-effective than hyperspectral imaging, combined with the Mahalanobis distance classification method. This approach aligns with recent trends in precision agriculture that favor non-destructive, high-accuracy techniques for crop monitoring [16]. In our developed software, the area of interest can be drawn manually as a rectangle by the user for peanut management and assessment. The image digitization performed by the software scans each pixel of an image taken from the field and outputs pixel brightness; red, green, and blue color space values; the percentages of different mesocarp colors; and estimates of OBB (OBBest) and BL (BLest).
To correlate the estimated mesocarp maturity with real pod counts, a detailed experimental setup for image acquisition was developed based on individual photos of peanut pods taken from the field. Calculations were used to correlate the estimated OBBest and BLest colors (%) with actual pod counts.
2.1. Experimental Site and Image Acquisition
During 2014, 2016, and 2017, images were acquired from Virginia and Georgia runner types planted at the Edisto Research and Education Center of Clemson University. Pod sample size was roughly 100 pods. The pods were blasted with a pressure washer at 1500 psi and then counted manually to determine the observed percentages OBBobs and BLobs. A total of 323 peanut samples were randomly selected for photography with a conventional RGB camera (MOTOROLA XT1030) at a distance of approximately 30 cm from the cardboard. The images were captured at a resolution of 6000 × 4000 pixels. The peanut maturity profile board (Figure 1) and the twenty-eight representative peanut pod images proposed by Williams and Drexler [10] (Figure 2) were selected to classify 6 maturity color groups and train the software for estimating the percentages of different mesocarp colors, color saturation, brightness, and the two maturity targets OBB and BL (%) (Figure 1 and Figure 2). The 323 images were used to develop the peanut maturity model for OBBest and BLest (%). The images taken in 2017 were used to test the peanut maturity model and compare the estimated OBBest and BLest with the actual pod counts of OBB and BL.
Figure 2. Mesocarps of 7 maturity classes developed by Williams and Drexler [10].
2.2. Pod Colors Determination
Mahalanobis distance is a suitable classification method for analyzing pod color. The Mahalanobis distance (Equation (1)) measures the similarity between an unknown sample group and a known one. It accounts for the fact that the variances in each direction differ, as well as for the covariance between variables [23]. The Mahalanobis distance between two random vectors (X, Y) with the same distribution and covariance matrix S can be defined as:
$D(X, Y) = \sqrt{(X - Y)^{T} S^{-1} (X - Y)}$ (1)
The Mahalanobis color distance standardizes the influence of the distribution of each feature, considering the correlation between each pair of terms. In the case of RGB color images, S is computed as (Equation (2)):

$S = \begin{pmatrix} S_{RR} & S_{RG} & S_{RB} \\ S_{GR} & S_{GG} & S_{GB} \\ S_{BR} & S_{BG} & S_{BB} \end{pmatrix}$ (2)

and each element of S is calculated as:

$S_{RG} = \frac{1}{n - 1}\sum_{i=1}^{n}(R_i - \bar{R})(G_i - \bar{G})$ (3)

where $R_i$, $G_i$, $B_i$ are the values of the ith pixel ($i = 1, \ldots, n$), and $\bar{R}$, $\bar{G}$, $\bar{B}$ are the mean color values for R, G, B in the given images, respectively.

X was a three-dimensional vector (R, G, B), which represented pixels from the image to be processed. Y was also a three-dimensional vector $(\bar{R}, \bar{G}, \bar{B})$, which represented the reference pixels (reference group) for each class to be identified.
In the proposed methodology of this work, 9 reference groups of pixels were selected to generate the classification, in which every group represented relevant characteristics of peanut pods and background classes. The 9 groups identified were: peanut pods (white, pale yellow, deep yellow, orange, brown, and black pods groups) and background (red, blue, and green cardboard). If any group was not present, or a new color group appeared on the image, the number and/or the group labels could be modified in the program.
Each reference group of peanut pods was manually selected from the peanut maturity profile board (Figure 1) and the set of 28 representative images (Figure 2) proposed by Williams and Drexler [10], and a set of 20 colors was chosen for each reference group. Mahalanobis distance was computed over the set of 323 images in our developed software to obtain the percentage of each color group, OBBest, and BLest. To implement the classification and provide a graphical interface to the user, the software was developed in Visual Basic 2017. These images were used to train the model for OBBest and BLest. Identified peanut pods were shown in their original colors in the output figures, while the background was transformed to pink (Figure 3). All selected group colors, including pod and background classes, were also analyzed for the images without using the classification method.
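The classification step described above can be sketched in a few lines. This is a minimal illustration with made-up reference pixels for two of the nine groups, not the actual training data or the Visual Basic implementation.

```python
# Minimal sketch of Mahalanobis-distance pixel classification.
# Reference groups are given as lists of sampled (R, G, B) pixels;
# the two groups and their sample values below are illustrative only.
import numpy as np

def group_stats(ref_pixels):
    """Mean vector Y and covariance matrix S of one reference group."""
    ref = np.asarray(ref_pixels, dtype=float)
    return ref.mean(axis=0), np.cov(ref, rowvar=False)

def mahalanobis(x, mean, cov):
    """d(X, Y) = sqrt((X - Y)^T S^-1 (X - Y)); the pseudo-inverse
    guards against singular covariances from small reference sets."""
    diff = np.asarray(x, dtype=float) - mean
    return float(np.sqrt(diff @ np.linalg.pinv(cov) @ diff))

def classify_pixel(pixel, groups):
    """Assign the pixel to the reference group at smallest distance."""
    stats = {name: group_stats(refs) for name, refs in groups.items()}
    return min(stats, key=lambda name: mahalanobis(pixel, *stats[name]))

groups = {
    "black_pod": [(30, 26, 20), (42, 35, 31), (25, 18, 16),
                  (36, 29, 27), (33, 24, 22)],
    "green_cardboard": [(60, 160, 70), (55, 148, 66), (72, 171, 80),
                        (65, 152, 74), (58, 166, 69)],
}
print(classify_pixel((33, 24, 22), groups))  # black_pod
```

In practice one reference group would be built for each pod color class and each background color, and every pixel of the 6000 × 4000 image would be classified the same way.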
Figure 3. Example performed over a peanut pod image: (a) Original image; (b) Processed image to obtain complete pod area.
2.3. Peanut Pods Maturity Determination
Two parameters are used to determine the maturity target: one is the percentage of pods in the orange, brown, and black categories combined (OBB); the other is the percentage of pods in black (BL). The representative images mentioned above were selected to classify 6 maturity color groups (white, pale yellow, deep yellow, orange, brown, black) and train the software for estimating the percentages of different mesocarp colors and the two maturity targets, OBB and BL (%). The selected pod colors were defined as C1 (white), C2 (pale yellow), C3 (deep yellow), C4 (orange), C5 (brown), and C6 (black); each image was analyzed by the software program and pixels were classified accordingly. After analysis, the pixel count for each color class (C1 through C6) was calculated. Equations (4) and (5) were used to compute the OBB and BL pod percentages for the 323 images. The actual numbers of OBBobs and BLobs pods were also counted for the 323 images. These actual pod counts were performed by experienced county educators and peanut specialists.
$OBB_{est} = \frac{C_4 + C_5 + C_6}{\sum_{k=1}^{6} C_k} \times 100\%$ (4)

$BL_{est} = \frac{C_6}{\sum_{k=1}^{6} C_k} \times 100\%$ (5)

where $C_k$ denotes the total pixel count assigned to color class k.
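As a minimal sketch, assuming Equations (4) and (5) are normalized by the total pod-pixel count over all six classes, the computation is (the counts below are illustrative):

```python
# Hedged sketch of Equations (4) and (5): OBB and BL percentages from
# per-class pixel counts C1..C6 produced by the color classifier. The
# normalization by total pod-pixel count is an assumption.

def obb_bl_percentages(counts):
    """counts: dict with keys 'C1'..'C6' (pixel counts per color class)."""
    total = sum(counts.values())
    obb = 100.0 * (counts["C4"] + counts["C5"] + counts["C6"]) / total
    bl = 100.0 * counts["C6"] / total
    return obb, bl

# Illustrative counts, not data from the study
counts = {"C1": 500, "C2": 1500, "C3": 2000, "C4": 3000, "C5": 2500, "C6": 500}
obb, bl = obb_bl_percentages(counts)
print(round(obb, 1), round(bl, 1))  # 60.0 5.0
```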
A linear relationship was found between the OBBest percentage obtained from image analysis and the observed OBBobs percentage. Empirical models for OBBest were developed to evaluate statistical trends in the data. The trends were studied using the percentages of the 6 color groups and OBBobs in SAS (SAS Institute Inc., Cary, NC, 2017). The correlations between OBBest and OBBobs were analyzed, and the R2, error, and root mean square error (RMSE) were calculated (Equations (6) and (7)).
$R^2 = 1 - \frac{\sum_{i=1}^{n}(r_{obs,i} - r_{est,i})^2}{\sum_{i=1}^{n}(r_{obs,i} - \bar{r}_{obs})^2}$ (6)

$RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(r_{est,i} - r_{obs,i})^2}$ (7)
where n is the number of photos, and $r_{est}$ and $r_{obs}$ are the modeled and observed OBB percentages, respectively. The hypothesis is that the total OBB pixel count can be used to estimate the actual OBB percentage.
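Equations (6) and (7) can be sketched directly; the example values below are illustrative, not data from the study.

```python
# Sketch of Equations (6) and (7): coefficient of determination and
# root mean square error between modeled (r_est) and observed (r_obs)
# OBB percentages over n photos.
import math

def r_squared(r_est, r_obs):
    """1 - SS_residual / SS_total."""
    mean_obs = sum(r_obs) / len(r_obs)
    ss_res = sum((o - e) ** 2 for o, e in zip(r_obs, r_est))
    ss_tot = sum((o - mean_obs) ** 2 for o in r_obs)
    return 1.0 - ss_res / ss_tot

def rmse(r_est, r_obs):
    """Root mean square error over n photos."""
    n = len(r_obs)
    return math.sqrt(sum((e - o) ** 2 for e, o in zip(r_est, r_obs)) / n)

est = [55.7, 89.0, 87.9, 73.4]  # illustrative modeled OBB (%)
obs = [58.7, 89.0, 85.2, 71.6]  # illustrative observed OBB (%)
print(round(rmse(est, obs), 2))  # 2.21
```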
2.4. Peanut Maturity Model Validation
Based on the data and images collected in 2014, 2016, and 2017, a comprehensive and predictive OBB and BL model was developed for estimating peanut maturity. To validate this model, additional data were collected in 2017 from studies conducted at the Edisto Research Station of Clemson University on August 28th, and in Orange County, SC on September 14th and October 20th. Twenty-seven pod samples were randomly selected for photography using a conventional RGB camera or cell phone camera at a distance of approximately 30 cm on a green cardboard background. The number of pods in each color class was then counted. The data and images obtained in 2017 were used to test the peanut maturity model and compare the results with the actual pods categorized as OBBobs and BLobs. These actual pod counts were performed by experienced county educators and peanut specialists.
2.5. Peanut Blasting Time Determination
Six peanut cultivars, Bell, Emery, Sullivan, GA-06G, GA-16H0, and AU-17, were planted at the Edisto Research Station of Clemson University. To determine the optimal pod blasting time for image analysis, 100-pod samples were randomly selected with three replicates and poured into a picker basket, and the basket was placed in a five-gallon bucket. The pods were blasted with a lightweight pressure washer at 1500 psi in one-minute increments, then laid on the cardboard and photographed. Each image was taken with a conventional RGB camera (MOTOROLA XT1030) at a distance of approximately 30 cm on green cardboard. Pod numbers were also counted according to color classes.
3. Results and Discussion
3.1. Pods Area Performance
The reference pixels of the 6 pod color groups selected from Williams and Drexler [10] for calculating Y and S (Equations (1)-(3)) are shown in Figure 1 and Figure 2, and were used to generate the classification (Figure 4). The Mahalanobis distance method identified more pod surface area from an image (6000 × 4000 pixels) when the algorithms were evaluated for pattern recognition (Figure 4). For unknown color pixels (not selected in the reference groups), the Mahalanobis distance was computed for each reference group to determine whether the pixel should be classified as pod color or background. The classifiers for the 9 reference groups performed well without any adjustments to contrast, brightness, or color (Figure 5(c)). Manual validation showed 73% and 94% correct classification of pods and background for the unclassified image and the Mahalanobis distance processing, respectively (Figure 5(b), Figure 5(c)). Most of the misclassifications were due to the shadows of black pods, which exhibited a color similar to the black pods group. Figure 5(a) was taken from the authors' experiment.
Figure 4. Selection of 6 reference color pixel groups proposed by Williams and Drexler [10].
Figure 5. (a) Example of original peanut pod image; (b) Area of identified peanut pod without classification; (c) Area of identified peanut pod using Mahalanobis distance.
3.2. Peanut Pods Maturity Estimation
The OBBobs (Figure 3(a)) and a processed photo (Figure 3(b)) using Mahalanobis distance showed that OBBest was positively influenced by the sum of the orange (r = 0.30, p < 0.001), brown (r = 0.61, p < 0.001), and black (r = 0.69, p < 0.001) percentages based on image analysis (Figure 6). Actual counts showed negative relationships with brightness (r = −0.23, p < 0.001) and saturation (r = −0.21, p < 0.001). The most dominant direct effects on OBBest were brightness, saturation, and the 6 color pixel percentages from image analysis. Multiple linear regression analysis (SAS procedure PROC REG) was used to generate a linear equation (Equation (8)). However, saturation and brightness exhibited multicollinearity based on a variance inflation factor (VIF) test. Therefore, only brightness, which had the greater effect on OBB, was retained as an independent parameter. Adding saturation on top of brightness did not improve the model based on Cp, AIC, BIC, and adjusted R2. The residual plot showed no violation of linearity or constant variance, and the QQ plot indicated that the data are normally distributed. Statistical interactions were also introduced (Equation (8)) to improve the estimation.
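This screening step can be sketched on synthetic data (the paper's actual measurements and coefficients are not reproduced here): fit OBB by ordinary least squares and flag near-collinear predictors with variance inflation factors.

```python
# Hedged sketch of the model-building step: OLS fit of OBBest on color
# percentages plus brightness, with a VIF screen for multicollinearity.
# All data below are synthetic; ranges loosely follow the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 60
brightness = rng.uniform(311, 730, n)
orange = rng.uniform(6.6, 47.6, n)
black = rng.uniform(0, 48.6, n)
# saturation deliberately made nearly collinear with brightness
saturation = 0.5 * brightness + rng.normal(0, 1.0, n)
obb = 0.05 * brightness + 0.8 * orange + 0.9 * black + rng.normal(0, 2, n)

def vif(X):
    """VIF_j = 1 / (1 - R^2_j), regressing column j on the others."""
    out = []
    for j in range(X.shape[1]):
        y, others = X[:, j], np.delete(X, j, axis=1)
        A = np.column_stack([others, np.ones(len(y))])
        resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

X = np.column_stack([brightness, saturation, orange, black])
print([round(v, 1) for v in vif(X)])  # brightness & saturation inflate

# After dropping saturation, fit the reduced model by least squares:
Xr = np.column_stack([brightness, orange, black, np.ones(n)])
coef, *_ = np.linalg.lstsq(Xr, obb, rcond=None)
```

A common rule of thumb is to drop or combine predictors with VIF above 5 - 10, which mirrors the paper's decision to keep brightness and drop saturation.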
Figure 6. Influence of sum OBB from image analysis on actual OBB. Note: n = 315 (p < 0.001).
(8)
where 15.2 < OBBest < 86.2%; 311 < Brightness < 730; 0 < PaleY < 38.2%; 3.8 < DeepY < 44.6%; 6.6 < Orange < 47.6%; 8.7 < Brown < 36.9%; and 0 < Black < 48.6%.
The validation curves for the estimated and observed OBB using linear equations indicated that the correlation between the observed and predicted OBB was close to the 1:1 line, and the R2 of the linear regression was 0.93 (Figure 7). The correlation plot of OBBest against OBBobs (Figure 7) showed an RMSE of 4.1% for the linear equation.
Figure 7. Linear correlation between the actual OBB percentage (%) and OBBest from images. Note: n = 315 (R2 = 0.93, RMSE = 4.1%, p < 0.001). Dotted line is 1:1 line.
The BLest (Figure 3(a)) and a processed photo (Figure 3(b)) using Mahalanobis distance showed that BLest was influenced by the orange (r = −0.42, p < 0.001) and black (r = 0.86, p < 0.001) percentages based on image analysis. The most dominant direct effects on BLest were the orange and black pixel percentages from image analysis. Multiple linear regression analysis (SAS procedure PROC REG) was also used to generate a linear equation (Equation (9)). The residual plot again showed no violation of linearity or constant variance, and the QQ plot indicated that the data are normally distributed. Statistical interactions were also introduced (Equation (9)) to improve the estimation.
(9)
where 6.6 < Orange < 47.6%, and 0 < Black < 48.6%.
The validation curves for BLest and BLobs using linear equations indicated that the correlation between the observed and predicted values was close to the 1:1 line, and the R2 of the linear regression was 0.88 (Figure 8). The correlation plot of BLest against BLobs (Figure 8) showed an RMSE of 1.8% for the linear equation.
Figure 8. Linear correlation between the BLobs (%) and BLest (%) from images. Note: n = 132 (R2 = 0.88, RMSE = 1.8%, p < 0.001). Dotted line is 1:1 line.
3.3. Validation of Peanut Pod Maturity Model
To validate the maturity model, images taken on August 28th, September 4th, and October 15th, 2017 with a conventional RGB camera (MOTOROLA XT1030) at a distance of approximately 30 cm were selected. The ability of the developed maturity model to predict OBB and BL from the 2017 images was evaluated and compared (Table 1). The validation of OBB using additional field images provided reasonable estimation (R2 = 0.98 and RMSE = 2.73%). The average error between OBBest and OBBobs was 2.37%, and the average error between BLest and BLobs was 0.76% (Table 1). The peanut maturity model thus provided reasonable estimation compared with the OBBobs and BLobs obtained from the 2017 validation images. The maturity model can currently only be used on images with a resolution of 1000 × 750 pixels or above, which provide reasonable estimation. When the resolution is below 1000 × 750 pixels, OBBest is underestimated because color detection becomes insensitive. To make the model more suitable for field use, future work can focus on making the proposed software more robust to different weather conditions during pod color classification.
Table 1. Error analysis of peanut pod maturity model based on 27 images from the 2017 harvest season in South Carolina.
Type | Variety | OBBest (%) | BLest (%) | OBB (#) | BL (#) | Other (#) | OBBobs (%) | BLobs (%) | Error1 (%) | Error2 (%)
Virginia | N/A | 55.65 | 0.12 | 168 | 2 | 118 | 58.74 | 0.69 | −3.09 | −0.57
Virginia | Bell | 88.99 | 12.31 | 89 | 12 | 11 | 89.00 | 10.71 | −0.01 | 1.60
Virginia | Bell | 87.86 | 10.33 | 98 | 11 | 17 | 85.22 | 8.73 | 2.64 | 1.60
Virginia | Bell | 95.27 | 11.4 | 100 | 14 | 9 | 91.74 | 11.38 | 3.53 | 0.02
Virginia | Emery | 73.42 | 2.21 | 78 | 5 | 31 | 71.56 | 4.39 | 1.86 | −2.18
Virginia | Emery | 78.39 | 3.52 | 84 | 6 | 20 | 80.77 | 5.45 | −2.38 | −1.93
Virginia | Emery | 69.04 | 4.44 | 82 | 5 | 30 | 73.21 | 4.27 | −4.17 | 0.17
Virginia | Sullivan | 56.05 | 0.32 | 64 | 2 | 46 | 58.18 | 1.79 | −2.13 | −1.47
Virginia | Sullivan | 60.90 | 2.08 | 72 | 3 | 44 | 62.07 | 2.52 | −1.17 | −0.44
Virginia | Sullivan | 64.06 | 2.32 | 77 | 4 | 39 | 66.38 | 3.33 | −2.32 | −1.01
Runner | N/A | 53.05 | 0.21 | 177 | 2 | 137 | 56.37 | 0.63 | −3.32 | −0.42
Runner | N/A | 20.55 | 0.01 | 65 | 0 | 231 | 21.96 | 0.00 | −1.41 | 0.01
Runner | N/A | 33.74 | 0.03 | 90 | 0 | 178 | 33.58 | 0.00 | 0.16 | 0.03
Runner | N/A | 55.89 | 0.57 | 115 | 1 | 87 | 56.93 | 0.49 | −1.04 | 0.08
Runner | N/A | 39.95 | 0.03 | 177 | 0 | 225 | 44.02 | 0.00 | −4.07 | 0.03
Runner | N/A | 65.84 | 1.52 | 155 | 3 | 75 | 67.39 | 1.29 | −1.55 | 0.23
Runner | N/A | 64.03 | 1.18 | 169 | 3 | 78 | 68.42 | 1.20 | −4.39 | −0.02
Runner | N/A | 63.03 | 2.3 | 155 | 4 | 81 | 65.68 | 1.67 | −2.65 | 0.63
Runner | AU-17 | 77.02 | 6.88 | 86 | 6 | 27 | 76.11 | 5.04 | 0.91 | 1.84
Runner | AU-17 | 81.04 | 7.21 | 85 | 7 | 25 | 77.27 | 5.98 | 3.77 | 1.23
Runner | AU-17 | 86.87 | 5.66 | 94 | 7 | 20 | 82.46 | 5.79 | 4.41 | −0.13
Runner | GA-06G | 73.73 | 5.21 | 79 | 5 | 30 | 72.48 | 4.39 | 1.25 | 0.82
Runner | GA-06G | 84.23 | 8.93 | 88 | 8 | 17 | 83.81 | 7.08 | 0.42 | 1.85
Runner | GA-06G | 72.43 | 4.53 | 78 | 4 | 35 | 69.03 | 3.42 | 3.4 | 1.11
Runner | GA-16H0 | 78.27 | 4.66 | 85 | 5 | 23 | 78.70 | 4.42 | −0.43 | 0.24
Runner | GA-16H0 | 58.60 | 1.93 | 53 | 2 | 43 | 55.21 | 2.04 | 3.39 | −0.11
Runner | GA-16H0 | 77.74 | 6.02 | 78 | 6 | 28 | 73.58 | 5.36 | 4.16 | 0.66
Note: Error1 = OBBest − OBBobs; Error2 = BLest − BLobs.
3.4. Pod Blasting Time for Peanut Maturity Estimation
To determine the optimal pod blasting time for image analysis, six peanut cultivars (Bell, Emery, Sullivan, GA-06G, GA-16H0, and AU-17) were selected. Approximately 100 pods from each cultivar were randomly chosen and photographed at one-minute intervals during pod blasting. Figure 9 illustrates the peanut pod images of the AU-17 variety at different blasting times. The OBBest values increased with pod blasting time, eventually plateauing for several minutes before decreasing due to mesocarp damage from the pressure washer, as shown in Figure 10 and Figure 11. The red frame in Figure 9(i) highlights the white kernels. The optimal pod blasting times for each cultivar are summarized in Table 2. On average, Runner-type peanuts required 1 - 2 minutes longer than Virginia-type peanuts to achieve thorough pod blasting. When the optimal pod blasting time was reached, the average error between the estimated OBBest and the observed OBBobs was only 1.18%.
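The plateau-then-decline pattern suggests a simple rule for picking the optimal minute from a per-minute OBBest series: stop at the first minute after which an extra minute of blasting adds little. The series and tolerance below are hypothetical, not the study's measurements.

```python
# Illustrative sketch: choose an optimal blasting time as the last
# minute before the OBBest curve plateaus (gain below a tolerance).
# The series and the 1-point tolerance are hypothetical.

def optimal_blasting_minute(obb_by_minute, tol=1.0):
    """obb_by_minute maps minute -> OBBest (%); returns the first
    minute after which an extra minute adds less than `tol` points."""
    minutes = sorted(obb_by_minute)
    for prev, cur in zip(minutes, minutes[1:]):
        if obb_by_minute[cur] - obb_by_minute[prev] < tol:
            return prev
    return minutes[-1]

# Hypothetical per-minute OBBest series: rise, plateau, then decline
series = {1: 20.1, 2: 41.3, 3: 58.7, 4: 69.5, 5: 75.8,
          6: 76.4, 7: 76.6, 8: 74.9, 9: 70.2}
print(optimal_blasting_minute(series))  # 5
```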
Figure 9. Example of original peanut pod image (AU-17) at (a) 1 min, (b) 2 min, (c) 3 min, (d) 4 min, (e) 5 min, (f) 6 min, (g) 7 min, (h) 8 min, and (i) 9 min.
Figure 10. Pod blasting time for selected variety of Virginia type.
Figure 11. Pod blasting time for selected variety of Runner type.
Table 2. Optimal pod blasting time for different peanut varieties.
Type | Variety | Optimal blasting time (min) | OBBest (%) | OBBobs (%) | Error (%)
Virginia | Bell | 6 | 88.99 | 89.00 | −0.01
Virginia | Emery | 5 | 73.92 | 71.56 | 2.36
Virginia | Sullivan | 6 | 56.05 | 58.18 | −2.13
Runner | AU-17 | 7 | 77.02 | 76.11 | 0.91
Runner | GA-06G | 8 | 73.73 | 72.48 | 1.25
Runner | GA-16H0 | 7 | 78.27 | 78.70 | −0.43
Although various studies have estimated peanut pod maturity [8] [10] [12] [17] [18] [24], they employed a range of methods, most of which were manual or relied on scanners or hyperspectral sensors. These methods can be time-consuming and costly for growers to implement. Therefore, this study developed an accurate, fast, and non-destructive method for estimating peanut maturity, which can significantly enhance the efficiency of maturity scouting based on color classification, making it more practical for widespread use.
4. Conclusions
This paper described the methodology that calculates OBBest and BLest of peanut maturity. The following conclusions are drawn from this paper:
1) This paper has presented a new automated approach for classifying peanut pod maturity. The Mahalanobis distance methodology for estimating peanut maturity is highly adaptable and robust, and the classification methodology discriminates 2 classes, corresponding to pods and background. The classifier's effectiveness in identifying the mesocarp was 94%.
2) The two empirical equations have the potential to estimate OBBest and BLest (%) from images of peanut pods taken from the field. The average error between OBBest and OBBobs was 2.37%. The average error between BLest and BLobs was 0.76%.
3) Pod blasting time affected image analysis results. The optimal pod blasting time for Virginia type and Runner type were 5 - 6 min and 7 - 8 min, respectively. The average error between OBBest and OBBobs was only 1.18% when optimal pod blasting time was reached.
4) The proposed software offers a simple, inexpensive, and non-destructive method for image acquisition, as only a commercial camera or smartphone is needed. Accurate estimation of peanut maturity during the harvest season will provide valuable information and help reduce the subjectivity associated with human assessment of peanut pod maturity.
Acknowledgements
This project is currently funded by the Public Service and Agriculture of Clemson University.