CE33 - Interaction, robotique 2020

Autonomous Robotic Platform for Orchards Navigation – ARPON

ARPON: A robot prototype to autonomously navigate through orchards

The ARPON project aims to develop a navigation strategy enabling a robot to autonomously navigate through a commercial orchard. Since GPS information cannot be used due to the canopy, the strategy relies on visual data provided by multiple onboard cameras for path generation and tracking. By the end of the project, the robot travelled nearly 800 meters in a commercial orchard, successfully validating the expected proof of concept.

An autonomous robot to address the challenges of 21st-century agriculture

21st-century agriculture faces two major challenges: increasing production to meet the needs of a growing global population, and addressing labor shortages while improving working conditions.

The increase in production must take place within economic and environmental constraints, meaning that resources such as biodiversity, water, and soil must be protected. To achieve this, the right amount of inputs must be applied at the right time and place to support sustainable production growth, an approach now known as precision agriculture. Agricultural robotics, by offering solutions based on intelligent sensors, either onboard or integrated into autonomous systems, can help address this first challenge.

The second challenge is equally important. Traditionally, farmers have relied on seasonal workers, often poorly paid, to carry out maintenance and harvesting tasks at certain times of the year. Recent studies suggest that, due to socio-economic, structural, and political factors, this type of employment is no longer attractive, and the agricultural workforce, whether local or not, is insufficient to meet demand in many regions of the world. This issue is further compounded by the physical strain and repetitive nature of agricultural tasks, which can lead to musculoskeletal disorders among workers and farm operators. Here again, agricultural robotics can help. Robots can be used as "cobots" (collaborative robots), working alongside human workers to assist them in tasks such as transporting heavy loads. Robots can also replace humans in low-skilled, labor-intensive tasks involving repetitive motions or awkward postures, such as manual weeding or harvesting fruits and vegetables.

The ARPON project is part of this trend and aims to address these challenges by developing a prototype capable of autonomously navigating a commercial orchard. The goal is both scientific and technical. On the one hand, it involves designing a navigation strategy that works without GPS (whose signal is disrupted by the canopy) and without any instrumentation of the orchard. On the other hand, the project aims to achieve a proof of concept: the strategy is implemented on a prototype and evaluated through experimental tests in commercial orchards.

The orchard is a large natural outdoor environment whose appearance varies greatly with the season, treatments, or sunlight. It is also dynamic, since vehicles or workers may be present. In addition, GPS is often disrupted by the canopy or tree-protection nets. The orchard is thus a very challenging environment that raises various robotic issues, particularly in perception, control, and mapping. These three areas are at the heart of the ARPON project.


The methods used sought to address these challenges. Regarding perception, the solutions were both technical and scientific. On the technical side, the robot was equipped with five cameras (three facing forward, two on the sides) to provide a wide field of view, allowing it to perceive the trees even during U-turn manoeuvres in the headlands. This setup ensures that visual information characterizing the environment is always available. On the scientific side, we chose to detect the trunks, which are the elements of an orchard whose geometry changes the least. Two detection algorithms (one classical, the other based on deep neural networks) were implemented on the robot.
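Neither detection algorithm is detailed in this summary. As a purely illustrative sketch (the function name and thresholds below are invented, not the project's method), a minimal "classical" trunk detector could exploit the fact that trunks appear as dark, near-vertical strips against a brighter background:

```python
import numpy as np

def detect_trunk_columns(gray, dark_thresh=60, min_fill=0.7):
    """Return the image columns likely to contain a trunk.

    A column qualifies when a large fraction of its pixels are dark,
    exploiting the fact that trunks show up as near-vertical dark
    strips against a brighter background (sky, grass).
    """
    dark = gray < dark_thresh      # boolean mask of dark pixels
    fill = dark.mean(axis=0)       # fraction of dark pixels per column
    return np.where(fill >= min_fill)[0]

# Synthetic test image: bright background with one dark vertical "trunk".
img = np.full((100, 50), 200, dtype=np.uint8)
img[:, 20:23] = 30                 # trunk occupies columns 20-22
cols = detect_trunk_columns(img)
print(cols)                        # -> [20 21 22]
```

A real detector would of course need to cope with shadows, grass, and lighting changes, which is precisely where the learning-based variant becomes attractive.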


Regarding control, to cope with the dynamic nature of the environment, we opted for reactive strategies relying on visual information while accounting for various constraints. The visual data provided by the perception system are processed to construct a reference path that allows the robot to follow a row or perform a U-turn, depending on its position in the orchard. The robot is then controlled to follow this path while respecting different constraints (visibility, actuator limitations, etc.). Two strategies were developed and tested; based on the results, only the second was implemented on the robot and evaluated during several experimental campaigns conducted in different orchards.
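The project's actual control laws are not given in this summary. As a generic illustration of path tracking for a car-like robot (not necessarily the strategy developed here), the standard pure-pursuit steering law computes a steering angle from a look-ahead point on the reference path:

```python
import math

def pure_pursuit_steering(pose, target, wheelbase=1.2):
    """Steering angle driving a car-like robot toward a look-ahead point.

    pose = (x, y, heading); target = look-ahead point on the reference
    path. Classic pure-pursuit law: delta = atan(2 L sin(alpha) / d),
    where alpha is the target bearing in the robot frame and d the
    look-ahead distance. The wheelbase value here is arbitrary.
    """
    x, y, theta = pose
    dx, dy = target[0] - x, target[1] - y
    alpha = math.atan2(dy, dx) - theta   # target bearing in robot frame
    d = math.hypot(dx, dy)               # look-ahead distance
    return math.atan2(2.0 * wheelbase * math.sin(alpha), d)

# Target straight ahead -> no steering needed.
print(round(pure_pursuit_steering((0, 0, 0), (5, 0)), 3))   # -> 0.0
# Target to the left -> positive (left) steering angle.
print(pure_pursuit_steering((0, 0, 0), (5, 2)) > 0)         # -> True
```

The constraints mentioned above (visibility, actuator limits) would typically appear as saturations or as terms in a more elaborate constrained control scheme.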


Finally, the large size of the orchard requires building a map. Here, we opted for a specific type of map called a "topological map", known for being less sensitive to environmental variations while remaining precise enough to efficiently perform the various tasks envisioned. The map consists of a set of images captured during a construction phase, prior to navigation. Localization is then achieved by comparing the current image with those in the map using dedicated neural networks. Additionally, to obtain an onboard localization system, the map is built using specific methods that drastically reduce computational costs.
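The map-building methods themselves are not described in this summary. As a minimal sketch of the general idea, assuming each image is reduced to a descriptor vector (toy 2-D vectors below; the project uses dedicated neural networks for this step), a sparse topological map can keep an image as a node only when it differs sufficiently from the last stored one:

```python
import numpy as np

def build_topological_map(descriptors, min_dist=0.5):
    """Keep an image as a map node only when its descriptor differs
    enough from the last stored node, yielding a sparse topological
    map instead of storing every frame of the traverse."""
    nodes = [descriptors[0]]
    for d in descriptors[1:]:
        if np.linalg.norm(d - nodes[-1]) >= min_dist:
            nodes.append(d)
    return nodes

# Toy 2-D "descriptors" along a traverse: near-duplicates are skipped.
traverse = [np.array([0.0, 0.0]), np.array([0.1, 0.0]),
            np.array([1.0, 0.0]), np.array([1.05, 0.0]),
            np.array([2.0, 0.0])]
print(len(build_topological_map(traverse)))   # -> 3
```

Such sparsification is one simple way to obtain the drastic reduction in computational cost that onboard localization requires.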

In this project, efforts focused on both scientific and technical aspects. From a scientific perspective, the project led to a complete navigation strategy allowing a mobile robot to operate in an orchard using visual data only (thus without relying on GPS, one of the most widely used sensors in agricultural robotics). Moreover, the strategy exhibits a certain level of generality, since it works regardless of the orchard's geometry (straight or circular rows) and does not require any particular instrumentation of the environment.

From a more technical perspective, a robotic platform was chosen and fully equipped to meet the needs of the project and of the navigation strategy. The technical decisions led to the selection of a car-like robot (to facilitate manoeuvres and avoid damaging the soil). It was equipped with a perception system of five cameras and a laser, designed to perceive the environment with a wide field of view. The navigation strategy was then implemented on the robot and validated on two different sites, demonstrating the versatility of the approach. The first orchard, located at the agricultural high school in Auzeville-Tolosane, is dedicated to training. Small in size (40 m long by 4 m wide rows), it was used for initial evaluations. The second orchard, at CEFEL, is dedicated to commercial production and is larger (100 m long by 4 m wide rows), allowing us to test our solutions in a more realistic context. During the final experiments, the demonstrator travelled about 800 meters within the commercial orchard at CEFEL, completing approximately eight row-following runs and the associated U-turns using vision alone. This result validates the proof of concept, which was the primary goal of the ARPON project. This video showcases the results achieved at CEFEL at the end of the project:

wp.laas.fr/arpon/2024/06/15/final-results-the-proof-of-concept-is-fulfilled/


However, to perform the desired tasks, it was also necessary to build a map and localize the robot within it. We first developed a topological map based on images from the onboard cameras, then designed an algorithm for localization within this map. This algorithm compares the image currently perceived by the robot with the images stored in the map to determine the closest match and thus establish the robot's position in the environment. Finally, we implemented the approach on the robot to evaluate its accuracy and computation time, since localization must be performed within one second given the intended application. The results obtained validate the chosen approach.
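As an illustration of the matching step only, assuming each image is summarized by a descriptor vector (the project uses dedicated neural networks for this; the names and toy vectors below are invented), localization reduces to a nearest-neighbour search over the map nodes:

```python
import numpy as np

def localize(current, map_nodes):
    """Index of the map image whose descriptor is closest to the
    current image's descriptor (brute-force nearest neighbour)."""
    dists = [np.linalg.norm(current - n) for n in map_nodes]
    return int(np.argmin(dists))

# Toy map of three node descriptors along a row.
map_nodes = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([2.0, 0.0])]
print(localize(np.array([0.9, 0.1]), map_nodes))   # -> 1
```

A brute-force scan is linear in the number of nodes; keeping the map sparse, as described above, is what makes the one-second budget reachable onboard.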

The ARPON project led to the development, and the experimental validation on a fully equipped prototype, of a GPS-free autonomous navigation strategy for orchards. By the end of the project, the demonstrator was able to travel nearly 800 meters in a commercial orchard, successfully sequencing row-following manoeuvres and U-turns based solely on data provided by the onboard cameras. The expected proof of concept has therefore been fully achieved.


These results have opened up many avenues for research in various fields of mobile robotics. In terms of perception, the currently implemented techniques can still be improved, particularly by combining the two algorithms mentioned above. It will also be necessary to make the transition between row-following and U-turn manoeuvres more robust. Replacing the "low-cost" cameras intentionally used in this project with higher-performance ones could also be considered. Finally, in the longer term, new sensors (in addition to cameras) could be introduced to allow tree detection even in reduced lighting (evening/night) or unfavourable weather conditions (rain, fog, etc.).


As for control, it would be interesting to integrate an obstacle-avoidance method into the proposed strategy to ensure safety. This will require properly detecting obstacles and estimating their potential motion before adjusting the strategy accordingly. Similarly, it would be interesting to equip the robot with an arm or an active tool to carry out specific maintenance tasks. This leads to the problem of controlling a mobile manipulator, including issues of motion coordination. Finally, in this context, managing disturbances (vibrations, uneven ground, etc.) will be necessary to increase the reliability and robustness of the prototype. Potential solutions could involve new sensors and estimation techniques, as well as adjustments to the control problem, thus opening new research directions.


Finally, regarding the mapping of the environment and localization based on images, no currently available method is sufficiently general to handle all possible scenarios that may occur in an orchard (day/night, sun/rain, occlusions, seasonal changes). Therefore, it would be relevant to address this problem in the future, as it presents a real challenge due to the extremely variable nature of the environment and the need for a fast localization algorithm given the considered application.

This project aims to develop robotic tools that help address some of the challenges of modern agriculture, such as doubling production by 2050 to meet the increasing food demand, producing sustainably, and coping with labor shortages. It proposes to develop an autonomous navigation system for a mobile base operating in orchards, which may then be used to carry devices for pruning, thinning, weeding, monitoring, or harvesting. To be suitable for commercial use, the navigation system has to be easily adaptable to any kind of orchard, safe for workers and the environment, cost-effective, and user-friendly. At the same time, it has to deal with challenges such as the absence of GPS for navigation, a constantly evolving environment, rough terrain, and varying climatic conditions. These objectives will be addressed by investigating, developing, and combining methods from control theory, image processing, graph theory, and artificial intelligence.

Project coordination

Viviane CADENAT (Laboratoire d'analyse et d'architecture des systèmes du CNRS)

The author of this summary is the project coordinator, who is responsible for the content of this summary. The ANR declines any responsibility for its content.

Partnership

LAAS-CNRS Laboratoire d'analyse et d'architecture des systèmes du CNRS
UFPE Universidade Federal de Pernambuco

ANR funding: 249,976 euros
Beginning and duration of the scientific project: February 2021 - 36 months
