TY - JOUR
T1 - Navigating the landscape for real-time localization and mapping for robotics and virtual and augmented reality
AU - Saeedi, Sajad
AU - Bodin, Bruno
AU - Wagstaff, Harry
AU - Nisbet, Andy
AU - Nardi, Luigi
AU - Mawer, John
AU - Melot, Nicolas
AU - Palomar, Oscar
AU - Vespa, Emanuele
AU - Spink, Tom
AU - Gorgovan, Cosmin
AU - Webb, Andrew
AU - Clarkson, James
AU - Tomusk, Erik-Arne
AU - Debrunner, Thomas
AU - Kaszyk, Kuba
AU - Gonzalez-de-Aledo, Pablo
AU - Rodchenko, Andrey
AU - Riley, Graham
AU - Kotselidis, Christos
AU - Franke, Bjoern
AU - O'Boyle, Michael
AU - Davison, Andrew J
AU - Kelly, Paul H. J.
AU - Luján, Mikel
AU - Furber, Steve
N1 - This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/K008730/1 (PAMELA Project).
PY - 2018/11
Y1 - 2018/11
AB - Visual understanding of 3D environments in real time, at low power, is a huge computational challenge. Often referred to as SLAM (Simultaneous Localisation and Mapping), it is central to applications spanning domestic and industrial robotics, autonomous vehicles, and virtual and augmented reality. This paper describes the results of a major research effort to assemble the algorithms, architectures, tools, and systems software needed to enable delivery of SLAM, by supporting application specialists in selecting and configuring the appropriate algorithm, hardware, and compilation pathway to meet their performance, accuracy, and energy consumption goals. The major contributions we present are (1) tools and methodology for systematic quantitative evaluation of SLAM algorithms, (2) automated, machine-learning-guided exploration of the algorithmic and implementation design space with respect to multiple objectives, (3) end-to-end simulation tools to enable optimisation of heterogeneous, accelerated architectures for the specific requirements of the various SLAM algorithmic approaches, and (4) tools for delivering, where appropriate, accelerated, adaptive SLAM solutions in a managed, JIT-compiled, adaptive runtime context.
KW - SLAM
KW - Automatic Performance Tuning
KW - Hardware Simulation
KW - Scheduling
DO - 10.1109/JPROC.2018.2856739
M3 - Article
SN - 0018-9219
VL - 106
SP - 2020
EP - 2039
JO - Proceedings of the IEEE
JF - Proceedings of the IEEE
IS - 11
ER -