University of Leicester

I2Bot: an open-source tool for multi-modal and embodied simulation of insect navigation

Journal contribution posted on 2025-03-12, 09:24, authored by Xuelong Sun, Michael Mangan, Jigen Peng, Shigang Yue
Achieving a comprehensive understanding of animal intelligence demands an integrative approach that acknowledges the interplay between an organism’s brain, body and environment. Insects, despite their limited computational resources, demonstrate remarkable abilities in navigation. Existing computational models often fall short in faithfully replicating the morphology of real insects and their interactions with the environment, hindering validation and practical application in robotics. To address these gaps, we present I2Bot, a novel simulation tool based on the morphological characteristics of real insects. This tool equips robotic models with realistic insect morphology, physical dynamics and dynamic sensory capabilities. By integrating gait controllers and computational models into I2Bot, we have implemented classical embodied navigation behaviours and revealed some fundamental navigation principles. By open-sourcing I2Bot, we aim to accelerate the understanding of insect intelligence and foster advances in the development of autonomous robotic systems.
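
The abstract mentions gait controllers that drive the simulated insect's legs. As a rough illustration only, and not the I2Bot implementation, the Python sketch below shows a minimal tripod-gait controller: the six legs are split into two tripods driven in anti-phase by a shared clock, and each leg is lifted only during its swing half of the cycle. All function names, leg labels and parameters here are hypothetical assumptions.

    # Minimal tripod-gait sketch; names and parameters are illustrative, not the I2Bot API.
    import math

    TRIPOD_A = ("L1", "R2", "L3")   # front-left, mid-right, hind-left legs
    TRIPOD_B = ("R1", "L2", "R3")   # front-right, mid-left, hind-right legs

    def tripod_gait(t, period=1.0, swing_amp=0.4, lift_amp=0.2):
        """Return per-leg (protraction, lift) joint targets at time t (seconds)."""
        phase = (t % period) / period                # shared clock in [0, 1)
        targets = {}
        for legs, offset in ((TRIPOD_A, 0.0), (TRIPOD_B, 0.5)):   # anti-phase tripods
            p = (phase + offset) % 1.0
            protraction = swing_amp * math.sin(2 * math.pi * p)
            lift = lift_amp * max(0.0, math.sin(2 * math.pi * p))  # lift only during swing
            for leg in legs:
                targets[leg] = (protraction, lift)
        return targets

    if __name__ == "__main__":
        # Sample one gait cycle; in a simulator these targets would be sent
        # to the corresponding leg joints at each physics step.
        for step in range(5):
            print(tripod_gait(step * 0.25))
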

Funding

National Natural Science Foundation of China (Grant Nos. 62206066 and 12031003)

ActiveAI - active learning and selective attention for robust, transparent and efficient AI

Engineering and Physical Sciences Research Council


History

Author affiliation

College of Science & Engineering, School of Computing & Mathematical Sciences

Version

  • VoR (Version of Record)

Published in

Journal of The Royal Society Interface

Volume

22

Issue

222

Pagination

20240586

Publisher

The Royal Society

ISSN

1742-5689

eISSN

1742-5662

Copyright date

2025

Available date

2025-03-11

Spatial coverage

England

Language

en

Deposited by

Professor Shigang Yue

Deposit date

2025-02-11

Data Access Statement

All source code, the robot model and the simulation environments are open-sourced via Zenodo [108] and GitHub [107]. Supplementary material is available online [109].
