The multifunctional orchard robot use-case has achieved remarkable success in the development of an efficient gripper and detection system tailored for pear picking. Traditionally, research on orchard harvesting robots has targeted mainly apple picking; the latest developments in this use-case, however, present a solution for harvesting pears. The key innovation lies in the gripper system, which was realized after extensive research and development. Unlike conventional methods that use the robot’s arm movement to detach fruit from the tree, the novel concept lets the gripper mechanism itself perform the pivotal pear-picking movement.
WUR researchers wrote software to integrate the previously developed deep-learning pear detection algorithm into the robot’s control system using ROS2 (Robot Operating System 2).
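To give an impression of what such an integration typically looks like in ROS2, the sketch below shows a minimal Python (rclpy) node that subscribes to camera images and publishes detections. The topic names, message types and detector interface are assumptions made for illustration only; they do not describe the actual WUR implementation.

```python
# Minimal sketch of a ROS2 node wrapping a pear-detection model.
# All topic names and the detector hook are hypothetical examples.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from vision_msgs.msg import Detection2DArray


class PearDetectionNode(Node):
    def __init__(self):
        super().__init__('pear_detection_node')
        # Hypothetical camera topic; the real system may differ.
        self.subscription = self.create_subscription(
            Image, '/camera/color/image_raw', self.on_image, 10)
        self.publisher = self.create_publisher(
            Detection2DArray, '/pear_detections', 10)
        # Placeholder for the deep-learning detector; the source does not
        # specify the model or its interface.
        self.detector = None

    def on_image(self, msg: Image) -> None:
        detections = Detection2DArray()
        detections.header = msg.header
        # In a real node the image would be converted (e.g. via cv_bridge),
        # passed through the detection network, and the resulting bounding
        # boxes appended to detections.detections before publishing.
        self.publisher.publish(detections)


def main():
    rclpy.init()
    node = PearDetectionNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```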
After indoor testing, a 2-day real-world test was conducted during the pear harvest period in September 2023 in Randwijk, where the experimental robot platform with the ABB robotic arm was moved into the orchard.
The results are convincing: the robot was able to harvest pears using its innovative gripper and detection system. The trials also yielded invaluable insights into what works well and what can be improved. The data collected during these picking experiments will be analyzed to further refine the robot’s capabilities and make the necessary improvements.
Red currant bush pruning
The past summer months have been used to explore better options for the sensor system. The research team is searching for high-quality sensors that can map plant architecture in 3D.
Two tracks are being explored for this purpose.
First, in collaboration with sensor experts at IMEC/OnePlanet, different 3D sensors (cameras, LiDARs), combinations of these sensors, and the corresponding classification algorithms are being investigated. Second, the collaboration between WUR (Jochen Hemming) and Oregon State University (Alex You, Joe Davidson, and Cindy Grimm) has been instrumental in a study investigating how a simple 2D camera can be used to map a branch in 3D. The results of this study will be presented in May next year at the prominent International Conference on Robotics and Automation (ICRA 2024) in Japan.
By Jochen Hemming (WUR)