Publication - Dr Tilo Burghardt

    Deep Learning for Exploration and Recovery of Uncharted and Dynamic Targets from UAV-like Vision

    Citation

    Andrew, W., Greatwood, C. & Burghardt, T., 2019, ‘Deep Learning for Exploration and Recovery of Uncharted and Dynamic Targets from UAV-like Vision’, in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018), Institute of Electrical and Electronics Engineers (IEEE), pp. 1124-1131.

    Abstract

    This paper discusses deep learning for solving static and dynamic search
    and recovery tasks – such as the retrieval of all instances of actively
    moving targets – based on partial-view Unmanned Aerial Vehicle
    (UAV)-like sensing. In particular, we demonstrate that abstracted tactical
    and strategic explorational agency can be implemented effectively via a
    single deep network that optimises, in unity, the mapping of sensory
    inputs and positional history to navigational actions. We propose a
    dual-stream classification paradigm that integrates one Convolutional
    Neural Network (CNN) for sensory processing with a second one for
    interpreting an evolving long-term map memory. In order to learn
    effective search behaviours given agent location and agent-centric
    sensory inputs, we train this design against 400k+ optimal navigational
    decision samples from each set of static and dynamic evolutions for
    different multi-target behaviour classes. We quantify recovery
    performance across an extensive range of scenarios; including
    probabilistic placement and dynamics, as well as fully random target
    walks and herd-inspired behaviours. Detailed comparisons of results show
    that our design can outperform naïve, independent-stream and
    off-the-shelf DRQN solutions. We conclude that the proposed dual-stream
    design can provide a unified, rationally motivated and effective
    architecture for solving online search tasks in dynamic, multi-target
    environments. With this paper we publish key source code and associated
    models; the source code is available at https://data.bris.ac.uk/data and
    https://github.com/CWOA/GTRF.
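
    To make the dual-stream design concrete, a minimal sketch of such a
    classifier is given below in PyTorch. The input resolutions, channel
    counts, layer widths and the four-way action space are illustrative
    assumptions; they are not details taken from the paper or the released
    code.

        import torch
        import torch.nn as nn

        class DualStreamNavNet(nn.Module):
            """Illustrative dual-stream classifier: one CNN encodes the
            agent-centric sensory view, a second CNN encodes the evolving
            long-term map memory, and the fused features are mapped to a
            discrete navigational action (all layer sizes are assumptions)."""

            def __init__(self, n_actions=4, view_channels=3, map_channels=1):
                super().__init__()
                # Stream 1: agent-centric sensory input (partial, UAV-like view)
                self.sensor_cnn = nn.Sequential(
                    nn.Conv2d(view_channels, 16, kernel_size=3, stride=2, padding=1),
                    nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                    nn.Flatten(),
                )
                # Stream 2: evolving long-term map memory (e.g. a visitation grid)
                self.map_cnn = nn.Sequential(
                    nn.Conv2d(map_channels, 16, kernel_size=3, stride=2, padding=1),
                    nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                    nn.Flatten(),
                )
                # Fusion head: concatenated stream features -> action logits
                self.head = nn.Sequential(
                    nn.Linear(32 + 32, 64),
                    nn.ReLU(),
                    nn.Linear(64, n_actions),
                )

            def forward(self, view, map_memory):
                f_view = self.sensor_cnn(view)    # (batch, 32)
                f_map = self.map_cnn(map_memory)  # (batch, 32)
                return self.head(torch.cat([f_view, f_map], dim=1))

        # Example usage: supervised training against optimal navigational
        # decisions, mirroring the classification framing in the abstract.
        model = DualStreamNavNet(n_actions=4)
        view = torch.randn(8, 3, 64, 64)        # batch of agent-centric views
        map_memory = torch.randn(8, 1, 64, 64)  # batch of map-memory grids
        logits = model(view, map_memory)
        loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 4, (8,)))

    Training such a joint model end-to-end, rather than training the two
    streams independently, is what the abstract refers to as optimising the
    mapping from sensory inputs and positional history to actions in unity.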

    Full details in the University publications repository