Multisensory locomotion control
We develop a multisensory locomotion control architecture that uses visual feedback to adaptively improve the acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The circuit uses a model of the peripheral auditory system of lizards to extract sound direction information, which modulates the parameters of the locomotor central pattern generators driving the turning behaviour. The visual information adaptively changes the strength of this acoustomotor coupling to adjust the robot's turning speed. For more details, see Shaikh et al., CLAWAR 2017.
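The control pathway described above (auditory cue → adaptive visual gain → CPG modulation) can be sketched minimally as follows. This is an illustrative sketch, not the published implementation: `auditory_direction_cue` stands in for the lizard ear model with a simple placeholder mapping, and the gain-adaptation rule, parameter names, and bounds are all assumptions chosen for clarity.

```python
import math

def auditory_direction_cue(azimuth_rad):
    """Stand-in for the lizard peripheral auditory model: returns a
    signed cue in [-1, 1] (negative = sound to the left, positive =
    sound to the right). The real model filters binaural signals
    through the internally coupled ears; sin() is only a placeholder."""
    return math.sin(azimuth_rad)

def adapt_gain(gain, visual_error, lr=0.1, g_min=0.2, g_max=2.0):
    """Hypothetical visual adaptation rule: a large visual bearing
    error strengthens the acoustomotor coupling (faster turning),
    a small error weakens it (slower, finer turning)."""
    gain += lr * (abs(visual_error) - 0.5)
    return min(g_max, max(g_min, gain))

def cpg_turn_command(cue, gain):
    """Modulation term biasing the locomotor CPG parameters towards
    a turn, clipped to a safe range."""
    return max(-1.0, min(1.0, gain * cue))

# One step of the loop: sound 30 degrees to the right, target far
# off-centre in the visual field, so coupling strength increases.
gain = 1.0
cue = auditory_direction_cue(math.radians(30))
gain = adapt_gain(gain, visual_error=0.8)
turn = cpg_turn_command(cue, gain)
```

In this sketch the turning command grows with both the auditory cue and the visually adapted gain, mirroring the described behaviour where vision tunes how strongly sound direction drives the turn.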