Posted on 2025-03-12, 09:24. Authored by Xuelong Sun, Michael Mangan, Jigen Peng, Shigang Yue
Achieving a comprehensive understanding of animal intelligence demands an integrative approach that acknowledges the interplay between an organism's brain, body and environment. Insects, despite their limited computational resources, demonstrate remarkable abilities in navigation. Existing computational models often fall short in faithfully replicating the morphology of real insects and their interactions with the environment, hindering validation and practical application in robotics. To address these gaps, we present I2Bot, a novel simulation tool based on the morphological characteristics of real insects. This tool equips robotic models with realistic insect morphology, physical dynamics and dynamic sensory capabilities. By integrating gait controllers and computational models into I2Bot, we have implemented classical embodied navigation behaviours and revealed some fundamental navigation principles. By open-sourcing I2Bot, we aim to accelerate the understanding of insect intelligence and foster advances in the development of autonomous robotic systems.
Funding
National Natural Science Foundation of China (Grant Nos. 62206066 and 12031003)
ActiveAI - active learning and selective attention for robust, transparent and efficient AI
Engineering and Physical Sciences Research Council
All source code, the robot model and the environments are open-sourced via Zenodo [108] and GitHub [107].
Supplementary material is available online [109].