Researchers at Chonnam National University have implemented 'agricultural spatial intelligence': using artificial intelligence (AI) to integrate drone footage with ground-based LiDAR data, they can accurately reconstruct and interpret complex agricultural environments in three dimensions (3D).
The research team, led by Professor Lee Kyung-hwan of the Department of Convergence Bio-System Mechanical Engineering at Chonnam National University, announced on the 20th a new spatial recognition method that operates reliably even in irregular environments such as orchards. The method is built on cross-modal AI technology that precisely registers data captured from different viewpoints and in different forms into a single coordinate system.
The work was published in 'Artificial Intelligence in Agriculture,' a leading academic journal in the field of agricultural AI, demonstrating the global competitiveness of the team's AI-based digital agriculture research.
The team developed a technology that precisely aligns aerial imagery with ground sensor data using a transformer-based deep learning architecture, overcoming the limitations of conventional localization methods that rely on Global Navigation Satellite Systems (GNSS). By delivering centimeter-level localization and 3D environmental modeling simultaneously, the system significantly improves the prospects for practical deployment in agricultural fields.
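To make the alignment idea concrete, the sketch below shows only the final, classical step of such a pipeline: once corresponding 3D points between an aerial (drone-derived) cloud and a ground-LiDAR cloud are known, a rigid transform bringing them into one coordinate system can be recovered in closed form (the Kabsch algorithm). This is a minimal illustration, not the team's method; in the paper the cross-modal correspondences would come from the learned transformer matcher, which is not reproduced here, and all names and data below are hypothetical.

```python
import numpy as np

def rigid_align(src, dst):
    """Closed-form (Kabsch) rigid transform mapping src points onto dst.

    src, dst: (N, 3) arrays of already-corresponding 3D points, e.g.
    points matched between a drone photogrammetry cloud and a ground
    LiDAR cloud. Correspondences are assumed given here.
    Returns R (3x3 rotation) and t (3,) with dst ~= src @ R.T + t.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: rotate and translate a toy "orchard" cloud, then recover it.
rng = np.random.default_rng(0)
ground = rng.uniform(0, 10, size=(100, 3))     # stand-in ground-LiDAR points
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 0.3])
aerial = ground @ R_true.T + t_true            # stand-in drone-derived points

R, t = rigid_align(aerial, ground)
aligned = aerial @ R.T + t
print(np.abs(aligned - ground).max())          # residual, near zero
```

In a real system this step would sit downstream of feature matching and be iterated with outlier rejection (as in ICP or RANSAC-based registration); the closed-form solve itself is what turns matched cross-modal points into a single shared coordinate system.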
The technology goes beyond map generation to a system that can 'understand' agricultural environments. It lays the groundwork for precise analysis of the structural characteristics and spatial relationships of orchards, showing potential for expansion into intelligent agricultural services such as crop growth analysis, operation planning optimization, and productivity prediction.
The greatest significance of the research lies in implementing a spatial intelligence system that can perceive and interpret agricultural environments. It can serve as a foundational technology enabling agricultural robots to understand their surroundings and perform tasks autonomously, without human intervention.
Because it can be integrated with technologies such as digital twins, autonomous robots, and data-driven decision-making, the approach is expected to have a ripple effect across the entire digital agriculture industry.
Professor Lee Kyung-hwan stated, “We have presented a new approach to understanding agricultural environments by integrating aerial and ground data,” adding, “We will continue to expand the method to various crops and agricultural environments, developing it into a key technology for fully autonomous AI farming systems.”
This research was supported by the BK21 Phase 4 IT-Bio Convergence System Agriculture Education Research Group, the Ministry of Agriculture, Food and Rural Affairs' Agriculture, Food and Technology Convergence Researcher Training Program, and the Ministry of Science and ICT and the Institute for Information & Communications Technology Promotion's AI Convergence Innovation Talent Training Program.