Progress Report for GS24-301
Project Information
Strawberry farming plays a crucial role in the economic landscape of US agriculture, as the USA is the world's second-largest strawberry producer. However, snails and slugs pose substantial risks to the quality of strawberry yields by feeding directly on the soft fruits. While chemical treatments exist, their efficacy is limited and they may harm surrounding ecosystems, including soil health and non-target organisms. In addition, physical and mechanical approaches require significant human labor, posing affordability and labor-availability challenges for small and mid-sized family farms. Moreover, human vision is limited at night, when these mollusks emerge from their hiding places and are most active. In response, we propose the development of a Mollusk Control roBot (MoCoBot) that can autonomously navigate strawberry fields, detect mollusks, and physically remove them from the ground and plants in a safe manner, especially during nighttime hours. Specifically, we will equip a robotic platform with infrared night-vision cameras, 3D-printed grippers, and AI models trained on a novel image dataset of snails and slugs. Our cost-effective robotic solution will be validated and improved through rigorous field testing conducted at both the KSU Field Station and local farms. Our goal is to significantly reduce pest numbers while relying less on traditional methods. Instructions will also be released so that potential users with limited programming experience can adapt the technology to different scenarios. We anticipate that our work will lay a key foundation for the development of effective pest control robots.
- Develop night vision-based visual mollusk detectors. We will create a novel hardware and software framework capable of effectively detecting individual mollusks in real agricultural environments. In particular, an affordable night-vision camera (approximately $40) will be employed to ensure accurate detection under the low-light conditions expected during early mornings and late nights, when these pests are most active. To train our AI models, we will maintain live snails and slugs in our lab facility to collect and annotate night-vision image data. Leveraging state-of-the-art machine learning algorithms, our final detector will recognize the variable structures and shapes of mollusks, whether on the ground or on plants, regardless of their size and the lighting conditions. Our goal is a minimal localization error (e.g., an error margin of less than 1/8 inch in localizing the head, tail, and central body of a 1-inch slug), significantly outperforming traditional algorithms designed for typical cameras, especially in low light; a back-of-the-envelope feasibility check of this target follows this list.
- Build an AI-empowered robot arm with customized grippers for precise and safe disposal of mollusks. We will employ a cost-effective commercial robotic platform (priced at approximately $1,300), featuring a robotic arm on a four-wheeled mobile base, to physically pick visually localized mollusks and place them into a container for disposal. Our grippers will be 3D-printed, resembling tweezers with blunt tips to ensure precise and secure handling while transporting individual snails or slugs without damaging plants. In addition, an AI model will be trained to extend the robot arm and align the grippers with the target mollusk's body orientation; for snails in particular, the grippers will maneuver to grip the shell securely. Integrating the night vision-based detector from Objective 1 will enhance this manipulation capability under varying lighting conditions. We aim to achieve a success rate of over 90%, ensuring reliable and safe collection of detected mollusks regardless of species, size, or the lighting conditions encountered in agricultural environments.
- Add mobility, and validate & improve in real fields. The night-vision cameras (Objective 1) and the developed robot arm (Objective 2) will be mounted on a four-wheeled mobile base, enabling the robot to safely navigate real strawberry fields for pest control during late-night hours. AI methods will process night-vision video frames to identify the traversable regions between plant rows. The complete robot, MoCoBot, will be demonstrated in real strawberry fields to validate its practicality; it will be deployed for 10 one-hour field test sessions at local farms. After each session, quantitative evaluations will assess the solution's effectiveness and guide improvements to overall performance. Additionally, qualitative feedback will be gathered from the farmers to gain practical insights for developing robotic solutions for on-farm pest management. Our aims are to 1) achieve fully autonomous robotic pest control requiring zero human intervention, and 2) observe a 20% increase in mollusk removal while reducing reliance on traditional methods.
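As referenced in Objective 1, a quick back-of-the-envelope check suggests the 1/8-inch localization target is plausible for a low-cost camera. The sketch below uses hypothetical resolution, field-of-view, and mounting-height values (not measured specifications of our hardware) to estimate the ground distance covered by one pixel.

```python
import math

# Hypothetical camera parameters (not measured specs of our hardware):
h_res_px = 1280          # horizontal resolution in pixels
hfov_deg = 70.0          # horizontal field of view in degrees
distance_in = 20.0       # camera-to-ground distance in inches

# Width of the ground strip visible to the camera, assuming a flat scene.
ground_width_in = 2.0 * distance_in * math.tan(math.radians(hfov_deg / 2.0))

# Ground distance covered by a single pixel.
inches_per_px = ground_width_in / h_res_px

print(f"Ground coverage: {ground_width_in:.1f} in, {inches_per_px:.3f} in/pixel")
# With these numbers, ~0.022 in/pixel, so a 1/8-inch (0.125 in) error budget
# corresponds to roughly 5-6 pixels of keypoint localization error.
```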
Cooperators
- (Researcher)
- (Researcher)
Research
We aim to develop mobile robots for physical mollusk pest control in real strawberry fields, with the goal of enhancing both the quality and quantity of production. Specifically, our robots are designed to autonomously navigate, detect, and physically remove mollusks, particularly during late-night hours when these pests are most active and easily observable. To this end, we adopt a multi-layered approach encompassing hardware, software, and AI models, with a particular focus on safety, practicality, and affordability.
- Night-vision Mollusk Detection (Q1Y1~Q2Y1): We adapt our previous AI algorithms, which demonstrated state-of-the-art performance in detecting unhealthy strawberries [1], to reliably identify mollusks (i.e., snails and slugs) in agricultural fields. Considering the low-light conditions during nighttime hours, we will adopt night-vision cameras equipped with infrared sensors to enhance visibility. While such sensors have been widely used to observe nocturnal animals [24], we will be the first to apply them to nighttime detection of mollusks. These hardware components will be empowered by cutting-edge AI models; however, since such models were mainly designed for typical color images, we employ two strategies to address this gap.
Firstly, we will maintain 15 to 30 land snails and slugs in chambers at our laboratory each year and use them to collect a dataset of 500 to 700 night-vision photos in Q1Y1. While 15 snails and slugs of legal species (e.g., Helix aspersa) will be purchased from retailers, additional specimens will be gathered at the KSU Field Station, a 25-acre outdoor space for interdisciplinary research that includes greenhouses where various crops are cultivated. Each image will display one or multiple mollusks arranged in an arbitrary fashion, with backgrounds visually diversified by different soils and plant parts. The student investigator will then manually annotate each image using a software tool such as LabelMe [25], specifically logging the locations of the head, tail, and body of each observed mollusk. These detailed annotations are required for the precise predictions needed by subsequent tasks, such as robotic picking.
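To illustrate how such annotations could be consumed downstream, here is a minimal sketch that reads head/tail/body point annotations from a LabelMe-style JSON file. The label names and the use of group_id to group one mollusk's keypoints are assumptions about our eventual annotation scheme, not a finalized format.

```python
import json
from pathlib import Path

# Keypoint labels we assume the annotator will use in LabelMe (hypothetical).
KEYPOINT_LABELS = {"head", "tail", "body"}

def load_mollusk_keypoints(json_path):
    """Return a list of {label: (x, y)} dicts, one per annotated mollusk.

    Assumes each mollusk's keypoints are grouped via LabelMe's group_id,
    and that each keypoint is stored as a 'point' shape.
    """
    data = json.loads(Path(json_path).read_text())
    mollusks = {}
    for shape in data.get("shapes", []):
        if shape.get("shape_type") != "point":
            continue
        label = shape["label"]
        if label not in KEYPOINT_LABELS:
            continue
        group = shape.get("group_id") or 0   # ungrouped points fall into one group
        x, y = shape["points"][0]            # a point shape has a single vertex
        mollusks.setdefault(group, {})[label] = (float(x), float(y))
    return list(mollusks.values())

# Example: keypoints = load_mollusk_keypoints("night_vision_0001.json")
```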
To further enhance detection performance, our second strategy is to employ an RGB-to-night-vision image converter to transform diverse color images from online sources into night-vision versions. These converted images will augment our original night-vision dataset, exposing our detector to a wider range of visual characteristics for more reliable performance. We will convert 1,000 online images for data augmentation and evaluate the trained detector's efficacy by its detection rate across 300 unseen image examples of mollusks. Our goal is a minimal localization error, e.g., an error margin of less than 1/8 inch in localizing the head, tail, and central body of a 1-inch slug.
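As a starting point before committing to a learned converter, a crude hand-crafted proxy can approximate the look of infrared night-vision footage. The channel weights, gamma, and noise level below are illustrative guesses, and a learned image-to-image model (e.g., a CycleGAN-style converter) would likely replace this in practice.

```python
import numpy as np
import cv2

def rgb_to_pseudo_nightvision(bgr, noise_sigma=6.0, gamma=0.8):
    """Convert a color (BGR) image to a rough night-vision look-alike.

    Heuristic only: true IR reflectance is not recoverable from RGB, so we
    approximate it with a red-weighted grayscale (vegetation tends to appear
    bright under near-IR), a gamma lift, and additive sensor-like noise.
    """
    b, g, r = cv2.split(bgr.astype(np.float32))
    mono = 0.6 * r + 0.3 * g + 0.1 * b                       # red-heavy luminance (assumed)
    mono = 255.0 * (mono / 255.0) ** gamma                   # lift shadows slightly
    mono += np.random.normal(0.0, noise_sigma, mono.shape)   # sensor noise
    return np.clip(mono, 0, 255).astype(np.uint8)

# Example: ir_like = rgb_to_pseudo_nightvision(cv2.imread("slug_rgb.jpg"))
```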
- Manipulation (Q3Y1~Q4Y1) & Navigation (Q1Y2~Q2Y2): Our robot platform features a single arm with 3D-printed, tweezers-like grippers to pick a target pest flexibly and precisely for removal. The arm will leverage the detection results from the night-vision cameras while it is trained for picking in our laboratory, where mollusks are repeatedly posed in different orientations. In each setting, the student investigator will manually control the arm to demonstrate effective picking, particularly teaching the robot to align its grippers with the mollusk's orientation (i.e., imitation learning [28]). For snails, the grippers will be trained to grasp the shell securely for stable transportation. Upon successful picking, the robot will deposit the target into a container of salty water for dehydration, with careful arm control to avoid damaging nearby plants. Evaluation metrics will measure the success rates of picking and placing, and we aim to achieve a success rate of 90%.
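One building block of this alignment can be written down directly: the in-plane gripper yaw that lines the tweezer tips up with the body axis implied by the detected head and tail keypoints. The sketch below is a simplified geometric illustration, assuming the keypoints have already been projected from image pixels into the robot's planar workspace frame via camera calibration.

```python
import math

def gripper_yaw_from_keypoints(head_xy, tail_xy):
    """Yaw (radians) aligning the gripper across the mollusk's body axis.

    head_xy, tail_xy: (x, y) positions in the robot's planar workspace
    frame (assumed already converted from pixels via calibration).
    The gripper closes perpendicular to the head-tail axis, hence the
    90-degree offset.
    """
    dx = head_xy[0] - tail_xy[0]
    dy = head_xy[1] - tail_xy[1]
    body_axis = math.atan2(dy, dx)      # orientation of the body axis
    return body_axis + math.pi / 2.0    # close across, not along, the body

# Example: yaw = gripper_yaw_from_keypoints((0.42, 0.10), (0.40, 0.07))
```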
Our robot will also be designed to safely navigate strawberry farms, primarily following the rows between plants. This capability will be guided by video frames captured by an additional night-vision camera mounted on the front of the robot's mobile base. As in [27], AI models will visually segment the traversable rows from plants and other background while the robot drives safely within the rows. To develop this segmentation model by Q2Y2, we will gather data with the night-vision camera at the KSU Field Station in Q1Y2. Our four-wheeled robotic vehicle will then not only identify the traversable rows but also navigate through them. Our goal is to limit operational failures, such as driving out of safe rows, to one or two instances per 30-minute operation; by the end of the project, through rigorous field testing, we aim to reduce the number of failures to zero during a 1-hour operation.
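To make the navigation loop concrete, the following sketch turns a binary traversable-row mask (as produced by a segmentation model along the lines of [27]) into a simple proportional steering correction. The gain and the choice to use only the lower third of the frame are illustrative heuristics, not tuned values.

```python
import numpy as np

def steering_from_row_mask(mask, gain=0.8):
    """Compute a normalized steering command from a traversable-row mask.

    mask: 2D array, nonzero where a pixel is classified as traversable row.
    Returns a value in [-1, 1]; negative steers left, positive steers right.
    Only the lower third of the frame is used, since it corresponds to the
    ground immediately ahead of the robot (an assumed heuristic).
    """
    h, w = mask.shape
    near = mask[2 * h // 3:, :]              # ground just ahead of the robot
    xs = np.flatnonzero(near.any(axis=0))    # columns containing row pixels
    if xs.size == 0:
        return 0.0                           # no row visible: hold course
    row_center = xs.mean()
    offset = (row_center - w / 2.0) / (w / 2.0)   # normalized lateral offset
    return float(np.clip(gain * offset, -1.0, 1.0))

# Example: cmd = steering_from_row_mask(segmentation_mask > 0.5)
```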
- Field Test (Q3Y2~Q4Y2): For field tests, we will utilize the resources of KSU's Field Station and our established connections with local farmers in Georgia to assess our robot system under authentic field conditions. In particular, we plan to conduct 10 validation sessions on real strawberry farms, scheduled between March and early July to cover different seasonal environments and fruit growth stages. Field tests will primarily be performed during night hours, gathering not only quantitative performance data but also qualitative feedback from the farmers. After each session, we will examine their questions and concerns to incorporate cooperative design ideas into our robotic solution and maximize its adaptability and practicality. The quantitative evaluation will assess the effectiveness of the system; our aim is to achieve a 20% increase in mollusk removal even with reduced use of traditional methods (e.g., chemical baits).
- Practical & Affordable Design: Our design philosophy prioritizes accessibility for a broad spectrum of farmers. Although our robot will incorporate a novel combination of components, each part will be commercially available. Our exploratory model is estimated to cost $1,500 to $2,000; however, we aim to optimize costs by substituting lower-cost alternatives to target a platform priced under $1,500 while maintaining performance standards. Additionally, all user instructions will be made publicly available through online repositories and instructional videos on YouTube.
Educational & Outreach Activities
Participation Summary:
- A workshop paper at ICRA2025 is under review. This paper presents our research on improving the reliability of autonomous robot navigation in agricultural fields. The student author will participate in the workshop and introduce the work at the poster session.
- A conference paper submission to ECMRA2025 is in preparation (due April 17th). This paper will extend the above workshop paper and will be included in the conference proceedings if accepted.
- More than two papers are in preparation for submission in Fall 2025, with more planned for Spring 2026. One of them will present the datasets we will collect from local strawberry fields.
- Kennesaw State University (KSU) will feature this project, including an interview with the faculty mentor, on its website and social media this summer to introduce it to the public.
- The faculty mentor presented the project and its progress to the public at the IT Prospect Student Day in Fall 2024 and will do so again in May 2025.
- Student collaborators will present their work at KSU's C-Day event in Fall 2025 to disseminate the gained knowledge to visitors.
Project Outcomes
Our project will enable farmers to monitor and protect their crops during nighttime hours. This will lead to economic benefits from high-quality food production by minimizing the risk of crop damage from harmful pests such as snails and slugs. In addition, farmers will need to apply significantly less pesticide, making agricultural practices more environmentally friendly. Furthermore, farmers will be able to focus on other essential tasks (e.g., marketing) while our AI/robotic solutions operate autonomously to eliminate pests outdoors. As a result, their practices will become not only more productive and profitable but also significantly less stressful for the farmers.
- We have tested low-cost night-vision cameras at a research farm. In completely dark environments, nearby plants (e.g., bell peppers and leafy greens) remained clearly visible, although their original colors were not identifiable. This observation supports our framework design employing low-cost night-vision camera sensors for mobile robot navigation and operation during nighttime hours. We are also discussing further high-impact ideas and applications of this technology beyond pest management.
- Several public datasets of snail/slug images are available. We have trained a detector of snails and slugs on these images, and the evaluation results have been promising. However, no image dataset gathered at night exists, so we are deciding whether to collect our own nighttime image dataset or to develop an AI model that requires only daytime images yet performs well on nighttime images. Given the scarcity of research in this direction despite the broad applications expected in agriculture, our work can have a significant impact on the community.
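For the detector trained on the public snail/slug images, a standard transfer-learning recipe is one plausible setup. The sketch below assumes torchvision's Faster R-CNN with a snail/slug box head; it reflects common practice rather than a record of the exact configuration we used.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Assumed label map: index 0 is reserved for background in torchvision detection.
NUM_CLASSES = 3  # background, snail, slug

def build_snail_slug_detector():
    """Faster R-CNN pre-trained on COCO, with a new 3-class box head."""
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model

model = build_snail_slug_detector()
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
# Training then follows the usual torchvision detection loop:
#   losses = model(images, targets); sum(losses.values()).backward(); optimizer.step()
```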
- Several datasets for training robot visual navigation models have been found. Using these, we have trained an AI model that detects the lines of crop rows to navigate a mobile robot reliably and safely. In addition, we have developed a novel image augmentation technique, called Crop-Aligned Cutout, that enhances line detection performance by augmenting training data in a specialized way. A workshop paper will be presented at ICRA2025 in May 2025, and another paper will be submitted to a robotics conference in April 2025.
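The details of Crop-Aligned Cutout are left to the papers above. Purely as a hypothetical illustration of the general idea suggested by the name, and not the actual technique, occlusion patches could be placed along an annotated crop-row line as follows; every parameter and design choice here is an assumption.

```python
import numpy as np

def crop_aligned_cutout(image, line_p0, line_p1, n_patches=3, patch=24):
    """Hypothetical sketch: drop square patches centered on a crop-row line.

    NOT the method from our papers; only an illustration of one way
    occlusions could be aligned with a crop row during augmentation.
    line_p0, line_p1: (x, y) endpoints of the annotated row line in pixels.
    """
    out = image.copy()
    for t in np.linspace(0.15, 0.85, n_patches):       # spread along the line
        cx = int(line_p0[0] + t * (line_p1[0] - line_p0[0]))
        cy = int(line_p0[1] + t * (line_p1[1] - line_p0[1]))
        x0, y0 = max(cx - patch // 2, 0), max(cy - patch // 2, 0)
        out[y0:y0 + patch, x0:x0 + patch] = 0          # zero out (cutout-style)
    return out
```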
- From interactions with local farmers, we have received positive feedback on our research direction. Most farmers were highly interested in AI and robotics technologies, but many were unaware of recent advancements. In particular, all of them showed interest in our idea of nighttime robot operation to protect their crops, reconfirming the potential of this work to contribute to actual agricultural practices.
- In summary, in year 1 we learned about various opportunities and limitations in conducting research toward the proposed idea. In year 2, we will focus on developing and integrating all modules to maximize their potential while solving the identified problems, and we plan to deploy a prototype in a real strawberry field. In addition, we will publish papers at various venues based on the gained knowledge to support future relevant research and to use this groundwork to initiate new impactful projects.