
Laboratoire Angevin de Recherche en Ingénierie des Systèmes



    Research project ENGRAIS

    Merging data for the (autonomous) navigation of a symmetric agricultural robot

     

    Teams: Dynamics, Discrete Events and Optimization; Information, Signal, Image Processing and Life Sciences

     

    Labeling: none


    Term: 1 year (2019)

     

    Funding: Atlanstic RFI 2020

     

    LARIS staff involved: Rémy Guyonneau, Étienne Belin, Philippe Lucidarme, Franck Mercier and an engineer being recruited

     

    Project partners: Antoine Juan, Florian Gardes (ez-wheel)

     

     

    Abstract

    In France, the Ecophyto 2018 plan is one of the actions proposed by the Grenelle de l'environnement consultation on environmental issues at the end of 2007. The plan was revised in 2015 as the Ecophyto II plan, which aims to reduce the use of phytosanitary products by 50% by 2025 (with an intermediate target of 25% by 2020). Minimizing phytosanitary products is therefore a major challenge for French agriculture; the French state is, for example, currently taking strong action to ban glyphosate within three years.

    The robotics community is mobilizing to offer credible alternatives for weeding, and robots are starting to appear (Naïo Technologies, Ecorobotix, Vitirover, ...). The common challenge for these robots is navigation adapted to the target crops. This navigation must provide several functions (sketched in the code after this list):

    • Proper positioning of the robot within the row;
    • Management of plot coverage;
    • Ensuring the safety of humans, animals and equipment.
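
    Purely as an illustration (all names below are hypothetical, not the project's software), these three functions can be read as the interface a crop-row navigator has to provide, sketched here in Python:

        # Hypothetical interface for the three navigation functions above.
        from abc import ABC, abstractmethod

        class RowNavigator(ABC):
            @abstractmethod
            def lateral_offset(self) -> float:
                """Signed distance (m) between the robot and the row centerline."""

            @abstractmethod
            def coverage_plan(self, rows: list) -> list:
                """Order in which the crop rows are visited to cover the plot."""

            @abstractmethod
            def is_safe_to_move(self) -> bool:
                """True only if no human, animal or equipment is endangered."""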

    The weak point of the robots currently proposed lies in their dependence on mono-modal perception, which leads to operating limitations. For instance, a robot adapted to intra-row weeding in market gardening (leek, beetroot, etc.) that relies only on LiDAR¹ perception can work only from a certain development stage of the crop of interest onward (early weeding is impossible) and below a certain level of weed invasion. In the first case the LiDAR perceives no plants and the robot cannot localize itself; in the second, the LiDAR is drowned in information and cannot extract what is useful for its positioning. This reasoning generalizes to any other mono-modal robot, each with its own hazards, and leads to the conclusion that multiple, complementary means of perception would make the robots' navigation robust.
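
    The two failure regimes described above can be made concrete with a small illustrative check; the thresholds below are invented for the example and are not measurements from the project:

        def lidar_regime(scan_points, min_points=50, max_points=5000):
            """Classify a 2D LiDAR scan of a crop row by point count."""
            n = len(scan_points)
            if n < min_points:
                return "too early"  # crop too small: nothing to localize against
            if n > max_points:
                return "invaded"    # weeds flood the scan: rows cannot be extracted
            return "usable"         # enough structure to follow the crop rows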


    This is the challenge of the ENGRAIS project, which aims to propose a way of merging perception methods to extract the information useful for navigation. This data fusion opens the door to multi-crop robots, and this new possibility leads us to plan for a demonstrator able to handle different crops.
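
    As one assumption of what such a fusion could look like (the project's actual method is not specified here), a classical choice is inverse-variance weighting of the per-sensor estimates, for instance of the robot's lateral offset in the row:

        def fuse_offsets(estimates):
            """estimates: list of (offset_m, variance) pairs, one per modality."""
            weights = [1.0 / var for _, var in estimates]
            total = sum(weights)
            fused = sum(w * x for (x, _), w in zip(estimates, weights)) / total
            return fused, 1.0 / total  # fused offset and its variance

        # Example: a confident LiDAR estimate and a noisier camera estimate;
        # the fused value is pulled toward the more reliable modality.
        print(fuse_offsets([(0.12, 0.01), (0.05, 0.09)]))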

    The reflections carried out and the analysis of the state of the art have led to two important innovations complementary to the data fusion:

    • The construction of a symmetrical robot, which has the double advantage of limiting maneuvers (especially when changing row), and therefore limiting soil compaction, but also of increasing the perception data (at the front as well as at the back of the robot), which should allow more robust navigation (see the sketch after this list);
    • The integration of a new wheel technology with the motor and energy on board the wheel itself, which lowers the center of gravity for better stability and therefore allows the robot to be used on steep slopes without the risk of overturning.
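
    The benefit of symmetry at the end of a row can be sketched as follows (hypothetical code, not the project's controller): instead of performing a U-turn, the robot inverts its driving direction and swaps the roles of its two sensor ends.

        class SymmetricRobot:
            def __init__(self):
                self.direction = +1  # +1: end A leads, -1: end B leads

            def change_row(self):
                # No half-turn maneuver: fewer maneuvers and less soil
                # compaction at the headlands.
                self.direction *= -1

            def leading_scan(self, scan_a, scan_b):
                # The leading end drives row following; the trailing end
                # still perceives and feeds redundant data to the fusion.
                return scan_a if self.direction > 0 else scan_b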

    The proof of concept proposed for the ENGRAIS project is ultimately to design and build this symmetrical robot based on new mobility technology, both hardware (motorized wheels) and software (data fusion).

     

    ¹ Light Detection And Ranging


     
