The peripheral auditory system of a lizard is strongly directional in the azimuth plane due to the acoustical coupling of the animal's two eardrums. This feature alone is insufficient for accurate sound localization, as the extracted directional information cannot be mapped directly to the sound direction; neural post-processing is therefore necessary. We implement a model of the auditory system coupled with a neural network based on the Cerebellar Model Articulation Controller (CMAC) and employ online reinforcement learning to build an accurate representation of sound direction in simulation.
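The core learning component named above, a CMAC, can be sketched as a tile-coding function approximator trained online with a delta rule. The sketch below is an illustrative assumption, not the paper's implementation: the input cue (a stand-in for the eardrum directional response, here a made-up `tanh` of azimuth), the input range, and all parameter values are hypothetical.

```python
import numpy as np

class CMAC:
    """Minimal 1-D CMAC (tile coding): several offset tilings cover the
    input range; the output is the sum of one weight per tiling."""

    def __init__(self, n_tilings=8, n_tiles=32, x_min=-1.0, x_max=1.0, lr=0.1):
        self.n_tilings, self.n_tiles = n_tilings, n_tiles
        self.x_min, self.lr = x_min, lr
        self.tile_w = (x_max - x_min) / n_tiles
        # One extra tile per tiling absorbs the offset overhang at the edge.
        self.w = np.zeros((n_tilings, n_tiles + 1))

    def _active(self, x):
        # Each tiling is shifted by a fraction of one tile width.
        for t in range(self.n_tilings):
            offset = t * self.tile_w / self.n_tilings
            idx = int((x - self.x_min + offset) / self.tile_w)
            yield t, min(max(idx, 0), self.n_tiles)

    def predict(self, x):
        return sum(self.w[t, i] for t, i in self._active(x))

    def train(self, x, target):
        # Delta rule: spread the correction equally over the active tiles.
        err = target - self.predict(x)
        for t, i in self._active(x):
            self.w[t, i] += self.lr * err / self.n_tilings

# Learn to invert a hypothetical nonlinear binaural cue back to azimuth.
rng = np.random.default_rng(0)
net = CMAC()
for _ in range(5000):
    az = rng.uniform(-60, 60)        # azimuth in degrees (assumed range)
    cue = np.tanh(az / 30.0)         # placeholder directional cue
    net.train(cue, az)

print(net.predict(np.tanh(30 / 30.0)))   # should be close to 30 degrees
```

The coarse overlapping tilings give the CMAC its characteristic fast, local online updates, which is what makes it a natural fit for the reinforcement-learning setting described here.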
International Workshop on Bio-Inspired Robots, 2011