In natural environments, animals are bombarded with a myriad of sensory stimuli, much of which is irrelevant to the tasks of their daily lives. I am interested in how the sensory and neural systems of diverse animals are adapted to filter information from complex habitats, allowing them to efficiently solve problems pertinent to their survival and reproduction.
I took a behavioral approach to this question during my PhD at Duke University (USA), where I worked under the supervision of Sönke Johnsen on the arboreal jumping spider Lyssomanes viridis. Using a variety of methodologies, including computer animation, I decomposed relevant stimuli into their component parts to determine which visual and chemical cues these spiders attend to, and which they ignore, across a range of contexts, including rival assessment, species and sex recognition, and microhabitat selection.
Currently I am working as a postdoc in the Lund Vision Group (Sweden) under the supervision of Dan-Eric Nilsson, where I am taking a computational approach to the question of how animals' sensory organs are adapted to filter relevant information from their habitats. Using custom-made light measurement equipment and existing data on the optical and neural architectures of animal visual systems, we are developing methods to model entire visual scenes through the eyes of major groups of animals. Such models will enable us to demonstrate what visual information is available to an animal and what information its visual system filters out. This will better position us, and others working in sensory and behavioral ecology, to generate hypotheses about what animals might be using their eyes for.