Progress report for GNE22-285
Green fruit thinning is the process of removing young fruit from trees to increase the size of the remaining fruit. Previous applications of robotic technologies in agriculture suggest the feasibility of an automated green fruit thinning system, which could mitigate drawbacks of more traditional thinning methods, such as non-selective removal and high labor requirements. In the proposed research project, a robotic green fruit thinning system will be developed and tested. The project will consist of three main objectives: 1) to develop a stem-cutting end-effector prototype based on garden snippers; 2) to develop a 3D vision system for detecting coordinates of target fruit and boundaries in point-cloud form in an orchard environment; and 3) to integrate the components developed in objectives 1 and 2, along with those developed prior to the project start date, to create a robotic green fruit thinning system. The expected outcome of this research project is a robotic green fruit thinning system that satisfies the following criteria: 1) the machine vision system detects green fruit with high accuracy; 2) the end-effector removes green fruit in an optimal manner; 3) the end-effector does not remove incorrect fruit; 4) the robotic manipulator moves to its desired position in a timely manner; and 5) the tree is not damaged during thinning. It is expected that the developed system will serve as a proof of concept, showing the feasibility of robotic green fruit thinning, motivating further development toward commercial implementation, and ultimately improving the profitability of orchard management.
The primary goal of this project is the development of a robotic system for green fruit thinning. The following are the objectives for the project:
Objective #1: Green Fruit Removal End-Effector Development
An end-effector for green fruit thinning will be developed. Based on the results of prior green fruit removal dynamics experiments conducted by the project team members, a stem-cutting mechanism will be used for simplicity and effectiveness of design. The end-effector design will be based on garden snippers and will be designed such that no unintended damage is caused to trees and their structures. The performance of the end-effector will then be evaluated in orchard tests and compared to that of a design developed previously by the project team members. It is expected that the end-effector will thin green fruit with a high success rate, while causing no damage to the tree and no unintended removal of non-target fruit.
Objective #2: 3D Vision for Robotic Green Fruit Thinning
A 3D vision system will be developed for this project. The primary purpose of the 3D vision system will be to determine the 3D coordinates of target green fruit and a boundary map for collision-free path planning. The 3D vision system will build on results obtained from deep learning-based computer vision algorithms implemented for robotic green fruit thinning by the project team members. Data obtained during Spring 2022, including RGB-D (color-depth) images, will be used to aid in developing and implementing algorithms for 3D vision. A 3D vision system that can determine boundary points and fruit locations with high accuracy is expected.
Objective #3: Robotic Green Fruit Thinning System
A completed green fruit thinning system will be developed. The system will utilize the components developed for objectives #1 and #2, as well as components developed before the proposed project period. The end-effector will be mounted onto the robotic manipulator present in the lab facilities used for this project. The vision system will be used, as described in Objective #2, to determine target fruit locations and boundaries for the robotic manipulator to navigate. Path planning algorithms developed before the project start date will be used to determine the shortest path between multiple fruit. The performance of the system will be evaluated in orchard tests. A system that thins green fruit from a tree with high accuracy and a high success rate, with no collisions with obstacles, is expected.
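The multi-fruit sequencing problem mentioned above can be illustrated with a minimal sketch. This is not the project's actual planner (that was developed in prior work); it is a hypothetical greedy nearest-neighbour ordering of fruit locations, shown only to convey the flavor of visiting multiple targets along a short path:

```python
import math

def visit_order(start, fruit):
    """Greedy nearest-neighbour ordering of 3D fruit locations.

    A simple stand-in for shortest-path sequencing between multiple
    targets; `start` and each fruit are (x, y, z) tuples.
    """
    order, here, remaining = [], start, list(fruit)
    while remaining:
        nxt = min(remaining, key=lambda f: math.dist(here, f))  # closest unvisited fruit
        order.append(nxt)
        remaining.remove(nxt)
        here = nxt
    return order
```

For example, starting at the manipulator home position, the fruit closest to the current pose is always visited next, which keeps total travel short without solving the full traveling-salesman problem.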
Green fruit thinning is the process in which the number of fruits is reduced early in the growing season. It is the last opportunity to set the final fruit amount and placement on a tree limb (Lewis, 2018). The three main purposes of thinning, in order of importance, are the following: 1) to improve the size and quality of the remaining fruits; 2) to reduce the risk of biennial bearing, i.e., the overproduction of a crop one year, and the underproduction of the crop the next year; and 3) to reduce the occurrence of limb breakage that results from leaving too much fruit on a limb. Also, the thinning of fruit reduces the risk of the spread of pests and diseases (Vanheems, 2015).
Several methods for green fruit thinning have been employed, including manual (i.e., by hand), chemical, and mechanical thinning. Hand thinning allows for thinning to be highly selective, i.e., one can choose specifically which fruits on a given branch are removed. However, this process is very time consuming, and the high labor requirements make hand thinning infeasible at large scale. In chemical thinning, chemical thinner is applied to the trees at a fixed rate within an orchard. This method of thinning is an established and essential practice performed by fruit growers each year. For many apple growers, chemical thinning is becoming the single most important spray practice of the season. However, one main problem with chemical thinning is the inconsistency of response due to environmental and tree factors. The correct timing of application and choice of thinner formulation are vital factors in optimal chemical thinning that are dependent on fruit varieties and growing stages (Tyagi et al., 2017). Mechanical thinning has shown some success in fruit thinning, with one example being a drum shaker, discussed by Miller et al. (2011), which thinned 37% of apples at the green fruit stage and reduced follow-up hand-thinning costs significantly. However, similarly to chemical thinning, mechanical thinning has been shown to be non-selective, and it can cause damage to the trees.
There are pros and cons to each of the previously discussed thinning methods. It is desired to develop a robotic system that is as selective as hand thinning, while being cost-effective like mechanical and chemical methods. The proposed research is an investigation of green fruit-thinning dynamics and the development of a green fruit-thinning system, specifically for apples. The project will consist of the following three research objectives: 1) the development of a stem-cutting end-effector; 2) the development of a 3D vision system to detect target green fruit in an orchard environment and generate a boundary map for collision-free path planning; and 3) the integration of the robotic green fruit thinning system by combining the 3D vision system and stem-cutting end-effector with a robotic manipulator.
The project will consist of three primary phases. The first phase is a pre-project phase, named Phase 0, which includes all work done for the development of a robotic green fruit thinning system before the proposed project start date. This includes initial green fruit removal dynamics experiments and prototype evaluation, implementation of green fruit segmentation and orientation estimation algorithms, and robotic manipulator path planning for green fruit thinning. The next phase, named Phase 1, will include the development of the stem-cutting end-effector and 3D vision system described in objectives #1 and #2, respectively. These objectives will be completed in parallel. Lastly, Phase 2 will integrate the components developed in Phase 0 and Phase 1 to develop a robotic system for green fruit thinning. A flowchart of the project is shown in figure 1.
Figure 1. Methodology flowchart for the proposed robotic fruit thinning system
Objective #1: Green Fruit Removal End-Effector Development
In Spring 2021, a stem-cutting end-effector (figure 2) was developed based on the results of previous fruit-removal dynamics experiments conducted by the project team members. The end-effector design incorporated a PVC pipe with a DC motor attached to a side, and a utility blade attached to the shaft of the motor. For a given target fruit, the PVC pipe would encapsulate the fruit, and the motor would be activated to cut the stem of the fruit against a 3D-printed cutting mount. This prototype was found to have a high fruit-removal success rate. However, it was somewhat difficult to maneuver in an orchard environment when attached to a robotic manipulator. Neighboring fruit and canopy would need to be moved in some cases to allow the target fruit to be properly encapsulated. A snipper-based end-effector is believed to be easier to maneuver, as such an end-effector could move directly to the stem of the target fruit without having to encapsulate the fruit itself first.
Figure 2: Stem-cutting end-effector prototype developed prior to the proposed project period.
A stem-cutting end-effector for green fruit thinning modeled after garden snippers will be developed during the proposed project period. The choice of end-effector was determined through green fruit removal dynamics experiments conducted by the project team members in Spring 2021, during which a stem-cutting end-effector was determined to be both a sufficiently simple and effective device for green fruit thinning in apple orchards. It is believed that the snippers should be able to access the stems of target fruit without great difficulty. The prototype will be designed and developed between August 2022-April 2023. SOLIDWORKS design software made available by the Pennsylvania State University will be used to aid in the design process.
In the tentative proposed end-effector design, a pair of garden snippers will be used as the cutting mechanism (figure 3). A DC or stepper motor will be placed near the handles, with the shaft placed between the handles. A wire will be attached to one of the handles and wrapped and secured around the shaft of the motor. When the motor is actuated, the wire will pull on the wire-attached handle, causing the snippers to close around the stem of a target fruit. A mount will be made using a 3D printer to secure the motor and the static handle of the snippers. A microcontroller and motor driver board will be used to control and drive the end-effector motor.
Figure 3: Proposed snipper-based end-effector design. As the motor is actuated, the wire pulls down on the handle of the snippers, which causes the snippers to cut the stem of the target fruit.
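The actuation sequence described above (motor turns, wire tensions, snippers close) can be sketched as a simple timed control routine. This is a hedged illustration only; `MotorDriver` is a hypothetical stand-in for whatever motor driver board and microcontroller interface is ultimately used, not an actual project API:

```python
import time

class MotorDriver:
    """Hypothetical motor driver interface (placeholder for the real board)."""
    def __init__(self):
        self.running = False
    def forward(self):
        self.running = True   # spin the shaft, winding the wire
    def stop(self):
        self.running = False  # release tension so the snippers reopen

def actuate_cut(driver, cut_time_s=0.5):
    """Run the motor long enough to close the snippers around a stem, then stop.

    Returns True if the motor was stopped cleanly after the cut window.
    """
    driver.forward()
    time.sleep(cut_time_s)  # wire winds onto the shaft, pulling the handle closed
    driver.stop()
    return not driver.running
```

A fixed cut window is the simplest scheme; a real controller might instead stop on a limit switch or current-spike signal, which would be a straightforward extension of this loop.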
The end-effector prototype will be tested and evaluated between May-June 2023 at the Penn State Fruit Research and Extension Center in Biglerville, PA. The prototype will be evaluated while mounted onto a UR5e robotic manipulator. Fuji and Gold Rush trees located at the center will be used for testing; 50 fruit for each cultivar will be tested. The prototype will be evaluated on its success rate in fruit removal, i.e., the ratio of successful fruit removal attempts to total fruit removal attempts, and its ability to maneuver around obstacles in the orchard environment while attached to a robotic manipulator will be qualitatively determined.
The testing procedure for each test fruit will be the following: 1) record the fruit diameter and length and the stem diameter and length; 2) place the snippers of the end-effector around the stem of the target fruit; 3) record the angle between the snippers and stem; 4) actuate the end-effector motor and record whether or not fruit removal was successful. After testing and data recording, the success rate of fruit removal, as described previously, will be obtained. One-way ANOVA tests will be applied to test for relationships between fruit/stem dimensions, cutting angle, and fruit-removal success. The performance of the end-effector will be compared to that of the previous prototype developed by the project team members.
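The planned analysis (success-rate computation plus one-way ANOVA) can be sketched in a few lines. This is a minimal illustration, not the project's analysis code; the F statistic is computed directly from between- and within-group sums of squares, as in a standard one-way ANOVA:

```python
def success_rate(outcomes):
    """Fraction of successful removal attempts (True = fruit removed)."""
    return sum(outcomes) / len(outcomes)

def one_way_anova_f(groups):
    """One-way ANOVA F statistic over lists of measurements.

    F = MSB / MSW, where MSB is the between-group mean square and
    MSW is the within-group mean square.
    """
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total observations
    grand = sum(sum(g) for g in groups) / n       # grand mean
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))
```

In practice the F statistic would be compared against the F distribution (e.g., via `scipy.stats.f_oneway`) to obtain a p-value; the pure-Python version above just makes the arithmetic explicit.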
Upon the successful completion of Objective #1, an end-effector with a >90% fruit-removal success rate is expected. It is also expected that the end-effector can maneuver throughout tree obstacles, such as branches, non-target fruit, leaves, etc., without significant difficulty. The completed end-effector will be integrated with the vision system for path planning and fruit removal.
Objective #2: 3D Vision for Robotic Green Fruit Thinning
A 3D vision system will be developed to detect the 3D coordinates of green fruit, as well as determine boundary coordinates in the form of a point cloud. Data will be obtained in May-June 2022, before the project start date, for processing and algorithm development. The Intel® RealSense™ Depth Camera D435i will be used to obtain RGB-D (color-depth) images, where depth images contain pixels whose values indicate the distances of corresponding points in the environment from the camera. The camera will be mounted to the UR5e robotic manipulator, which will be mounted on a utility vehicle (UV). 15 Fuji, 10 Golden Delicious, and 10 GoldRush trees located at the Penn State Fruit Research and Extension Center will be used for image acquisition. The camera will take images from a range of 20-100 cm, and the fruit sizes will be between 15 and 30 mm. For each tree, the UV will drive up to align the mounted robotic manipulator with the tree, and the manipulator will be manually positioned to take images of each suitable cluster within the tree. At least three images of each cluster will be taken, for a total of about 1000-1500 RGB-D images.
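Converting a depth image to a 3D point cloud, as described above, follows the standard pinhole camera model: given intrinsics (fx, fy, cx, cy), pixel (u, v) at depth z back-projects to x = (u − cx)·z/fx, y = (v − cy)·z/fy. A minimal pure-Python sketch of that conversion (the intrinsics in the test are illustrative values, not the D435i's actual calibration):

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (values in metres) to 3D camera-frame points.

    `depth` is a 2D list indexed as depth[v][u]; pixels with non-positive
    depth (missing/invalid measurements) are skipped.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # invalid or missing depth reading
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

Libraries such as the RealSense SDK or Open3D perform this same deprojection (with lens-distortion handling) far more efficiently; the loop above only makes the geometry explicit.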
The color images will be used for green fruit and stem segmentation and orientation estimation, where segmentation is the generation of pixel-wise masks of a target object. The project team members have implemented algorithms for green fruit and stem segmentation and orientation estimation prior to the project start date, of which an example can be seen in figure 4. The depth images will be converted to 3D point clouds for robotic manipulator path planning. The masks of the green fruit and stems will be mapped onto corresponding points in the point cloud generated from the depth image to obtain 3D green fruit and stem objects. Since multiple point clouds of the same cluster will be taken, 3D reconstruction will be performed to properly combine the point clouds to obtain a complete point cloud of the environment around the cluster for proper collision-free path planning. The masks and orientations will be used to help determine which fruit in the 3D point cloud space to remove, as well as provide target locations for the end-effector during path planning. Overall, these algorithms will be implemented and developed between August 2022-April 2023 and evaluated in May-June 2023 in the orchards.
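One way to turn a fruit mask and its corresponding 3D points into a target location for the end-effector is to average the masked points. This is a hedged sketch of that idea, not the project's implementation; the cloud layout assumed here (one (x, y, z) tuple or None per pixel, aligned with the mask) is an illustrative convention:

```python
def fruit_centroid(mask, cloud):
    """Centroid of the 3D points whose pixels fall inside a fruit mask.

    `mask[v][u]` is truthy for fruit pixels; `cloud[v][u]` holds the
    back-projected (x, y, z) for that pixel, or None where depth was missing.
    """
    sel = [cloud[v][u]
           for v in range(len(mask))
           for u in range(len(mask[0]))
           if mask[v][u] and cloud[v][u] is not None]
    n = len(sel)
    return tuple(sum(p[i] for p in sel) / n for i in range(3))
```

A centroid is a reasonable grasp/approach target for a roughly spherical fruit; the stem mask and orientation estimate would then refine where the snippers should actually close.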
The accuracy of 3D fruit and stem object detection will be determined by comparing root-mean-square point-to-point distances between generated 3D object points and ground-truth 3D object points. If a generated 3D object has a distance below a predefined threshold to the closest ground-truth fruit, it will be labeled a true positive; otherwise, it will be labeled a false positive. The expected outputs for 3D fruit and stem object detection are >80% precision (true positives / all detections) and >80% recall (true positives / all ground-truth objects in the scene). The completed 3D detection algorithm will be included in the overall machine vision system for the robotic green fruit thinning system.
Figure 4. Green fruit segmentation and orientation estimation.
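The true-positive matching rule in the evaluation plan above can be sketched as a greedy nearest-ground-truth comparison under a distance threshold. This is an illustrative simplification (the plan compares RMS distances over object point sets; here each object is reduced to a single representative 3D point, and each ground-truth object can match at most one detection):

```python
import math

def detection_pr(detected, ground_truth, thresh):
    """Precision/recall for 3D detections under a distance threshold.

    A detection is a true positive if its nearest still-unmatched
    ground-truth point lies within `thresh`; otherwise it is a false
    positive. Unmatched ground-truth objects count against recall.
    """
    unmatched = list(ground_truth)
    tp = 0
    for d in detected:
        if not unmatched:
            break
        best = min(unmatched, key=lambda g: math.dist(d, g))
        if math.dist(d, best) <= thresh:
            tp += 1
            unmatched.remove(best)  # each GT object matches at most once
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(ground_truth) if ground_truth else 0.0
    return precision, recall
```

With the >80% targets stated above, a scene with one matched and one unmatched detection (precision = recall = 0.5, as in the test below) would fail the criterion, illustrating how the thresholded matching feeds directly into the pass/fail numbers.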
Objective #3: Robotic Green Fruit Thinning System
The end-effector and 3D vision system algorithms will be integrated with a UR5e robotic manipulator to obtain a robotic green fruit thinning system. A series of tests will be conducted to evaluate the integrated robotic fruit thinning system. Field evaluation will be carried out in research apple orchards at Penn State FREC in May-June 2023, using the Fuji and Golden Delicious cultivars. The performance in removing targeted fruits and the effects on other fruits and the canopy will be the major indicators used to evaluate the overall performance of the thinning system. Before each test, the total number of fruits and clusters in the targeted trees, as well as the number and locations of fruits to be thinned, will be recorded.
The major criteria for evaluation of the performance of the developed robotic green fruit thinning system are as follows:
Overall success of fruit thinning: the percentage of fruits removed by the robotic system relative to the total number that should theoretically be removed. Expected outcome: >90%.
Processing speed of fruit removal operations: for the individual fruit removing end-effector, the processing time for removing each fruit will be recorded. Expected outcome: <3 seconds/fruit.
Influence on neighboring fruits and tree canopy: unintended fruit removal, namely, the number of fruits removed accidentally by the robotic system, and damage to neighboring fruits and the tree canopy will be analyzed. Expected outcome: no influence on neighboring fruits or tree canopy.
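The three evaluation criteria above can be summarized with a small helper. This is a hypothetical convenience sketch, not project code; it simply computes the overall success percentage, mean processing time per fruit, and the count of accidental removals from raw trial records:

```python
def thinning_metrics(removed, should_remove, times_s, accidental):
    """Summarise one field trial against the three evaluation criteria.

    removed       -- fruits successfully removed by the robot
    should_remove -- fruits that theoretically should have been removed
    times_s       -- per-fruit processing times in seconds
    accidental    -- count of non-target fruits removed
    """
    return {
        "success_pct": 100.0 * removed / should_remove,   # target: >90%
        "sec_per_fruit": sum(times_s) / len(times_s),     # target: <3 s/fruit
        "accidental_removals": accidental,                # target: 0
    }
```

For example, a trial removing 45 of 50 target fruits in 2-3 seconds each with no accidental removals would meet all three expected outcomes.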
For objective #2, over 550 RGB-D images of the Fuji and Golden Delicious cultivars were obtained with the RealSense D415 stereo camera in May-June 2022 at the Fruit Research and Extension Center in Biglerville, PA. This number of images was determined to be sufficient for segmentation of green fruit using the stereo camera. A segmentation model was previously trained on a set of over 500 smartphone images of the Fuji, Golden Delicious, and GoldRush cultivars obtained in May-June 2021; training from this pre-trained model with the RGB-D data, instead of training a completely new model, will reduce the training dataset requirements. Processing and development of the 3D vision algorithms will take place from January-June 2023. Evaluation will take place from May-June 2023.
For objectives #1 and #3, there are currently no results to report. Development of the end-effector and the integrated robotic green fruit thinning system, respectively, will take place from January-June 2023. Evaluation will take place from May-June 2023.
Education & Outreach Activities and Participation Summary
The primary goal of this project is to prove the concept of robotic green fruit thinning by integrating green fruit removal dynamics, green fruit detection, and green fruit removal execution. The outreach program of this proposed study will enhance the visibility and acceptability of the newly developed robotic green fruit thinning system through various outreach activities. The audiences for the project range from academia and the agricultural industry to tree fruit growers. During the period of this project, we will expand our dissemination and outreach activities by providing: (i) technical presentations at professional meetings, (ii) presentations during stakeholder meetings and public events, (iii) demos and presentations during field day events, and (iv) research and extension publications. Below are a few detailed plans for outreach activities.
- Share the scientific findings in the professional community.
The results of each objective will be shared with researchers through publications, conference presentations, posters, and seminars. Project PIs will participate in various domestic conferences, including the Annual International Meeting of the American Society of Agricultural and Biological Engineers (ASABE) and the Northeast Agricultural and Biological Engineering Conference (NABEC). The results will be presented through oral presentations and exhibitions/posters at those scientific and professional society meetings to get feedback from other researchers. Major findings will also be published in flagship scientific journals, such as the Transactions of the ASABE, Computers and Electronics in Agriculture, and the Journal of Field Robotics.
- Introduce the concept of robotic green fruit thinning technologies to tree fruit growers.
An article introducing automated green fruit thinning will be published in the Fruit Times Newsletter to draw attention from tree fruit growers. Meanwhile, as the project moves forward, we will report periodic results in this newsletter and other extension publications as well. This widely distributed information to fruit growers in Pennsylvania and surrounding states will help generate interest in the system prior to other outreach activities such as workshops and demonstrations.
- Workshop and demonstrations to the stakeholders
The PIs will present the research findings to stakeholders during the annual Penn State Fruit Research and Extension Center (FREC) field day and Mid-Atlantic meetings with 50-100 participants. We could also present the research findings to the broader grower community through events such as winter and spring grower meetings organized by Penn State Extension.
- General public showcase
The PIs will also showcase the developed sensing technology and robotic thinning system to the general public, such as STEM students at Penn State and 4-H robotics groups. The results and the robotic system will be exhibited or posted at the Penn State Gamma Sigma Delta Research Exposition and Penn State Ag Progress Days.
Some outreach has been done so far on the current status of the robotic green fruit thinning project. First, mechanical demonstrations of the robotic green fruit thinning system were given at Penn State Ag Progress Days 2022 from August 9-11. Although an exact count of the farmers/ranchers, agricultural educators, or service providers who attended cannot be provided, the number of general attendees interacted with during the event is estimated to be in the hundreds. During the event, the project was featured in two TV interviews by the Pennsylvania Cable Network (PCN) and WTAJ. Second, on September 14, 2022, a brief presentation of the current progress of the robotic green fruit thinning system was given to about 30 farmers at the Plant Protection Field Day at the Fruit Research and Extension Center in Biglerville, PA. Third, the project was briefly mentioned in WQED Pittsburgh's The Growing Field: Future Jobs in Agriculture.
More outreach is planned in the future. A poster on the project will be presented at the Mid-Atlantic Fruit and Vegetable Convention 2023, and presentations are planned for ASABE AIM 2023 and NABEC 2023.
There are no results to show yet for this project, so no outcomes can yet be stated. A comprehensive update on this section will be provided with the final report.
Many of the tasks for this project have yet to take place, so there is not yet considerable change to describe. A comprehensive update on this section will be provided with the final report.