New video camera system captures the coloured world that animals see
Posted on behalf of: Lauren Ellis
Last updated: Monday, 5 February 2024
Researchers at the University of Sussex, together with colleagues at George Mason University in the US, have developed a new camera system that allows ecologists and filmmakers to produce videos that accurately replicate the colours different animals see in natural settings.
Animals perceive the world differently because their eyes contain different types of photoreceptor. For example, honeybees and some species of bird can see UV light, which is outside the range of human perception. Reconstructing the colours that animals actually see can help scientists better understand how they communicate and navigate the world around them.
In a new paper published today (Wednesday 24 January) in PLOS Biology, researchers detail how they developed a novel camera and software system that captures animal-view videos of moving objects under natural lighting conditions.
The camera simultaneously records video in four colour channels: blue, green, red and UV. This footage can then be processed into “perceptual units”, that is, measures of photoreceptor responses, to produce an accurate video of how those colours are perceived by a given animal, based on existing knowledge of the photoreceptors in its eyes.
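For readers curious about what processing footage into “perceptual units” can look like in practice, the sketch below shows one minimal way to map four-channel pixel values onto an animal’s photoreceptor responses using a linear conversion matrix. It is not the authors’ released software: the function name, the honeybee receptor set and the matrix values are illustrative assumptions, and a real conversion matrix would be fitted from the measured spectral sensitivities of the camera and of the animal’s eyes.

import numpy as np

# Hypothetical conversion matrix for a trichromatic honeybee (UV, blue and
# green photoreceptors) from the camera's four channels (blue, green, red, UV).
# Real coefficients would be derived from the camera's spectral sensitivities
# and the animal's photoreceptor sensitivity curves.
BEE_FROM_CAMERA = np.array([
    [0.05, 0.02, 0.00, 0.93],   # UV receptor
    [0.80, 0.15, 0.05, 0.10],   # blue receptor
    [0.10, 0.85, 0.25, 0.02],   # green receptor
])

def camera_to_perceptual_units(frame_bgru: np.ndarray) -> np.ndarray:
    """Map a (height, width, 4) frame in blue/green/red/UV channel order
    to per-pixel photoreceptor responses ("perceptual units")."""
    h, w, _ = frame_bgru.shape
    pixels = frame_bgru.reshape(-1, 4).astype(float)
    catches = pixels @ BEE_FROM_CAMERA.T          # linear quantum-catch estimate
    # Normalise so each receptor's response to a white standard equals 1,
    # a common step when modelling animal colour vision.
    white = np.ones(4) @ BEE_FROM_CAMERA.T
    catches /= white
    return catches.reshape(h, w, 3)

# Example: a synthetic 2x2 frame with channel values in [0, 1].
frame = np.random.rand(2, 2, 4)
print(camera_to_perceptual_units(frame))

The same frame could be passed through conversion matrices for different species, which is what makes a single four-channel recording reusable across many animal viewpoints.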
The team tested the system against traditional spectrophotometry methods and found that the new system predicted perceived colours with an accuracy of over 92%.
Dr Vera Vasas, Research Fellow in Ecology and Evolution at the University of Sussex, says:
“Our new approach allows researchers and filmmakers to record animal-view videos that capture the interplay of colours over time. Now we can fully appreciate how much information we missed when we were only photographing immobile objects in the lab. This newfound ability to accurately record animal-specific colours in motion is a crucial step towards our understanding of how animals see the world.”
This novel camera system will open new avenues of research for scientists and allow filmmakers to produce dynamic, accurate depictions of how animals see the world around them, the authors say. The system is built from commercially available cameras housed in a modular, 3D-printed casing, and the software is open source, allowing other researchers to use and build on the technology in the future.
Senior author Daniel Hanley, Assistant Professor of Biology at George Mason University, adds:
“We’ve long been fascinated by how animals see the world. Modern techniques in sensory ecology allow us to infer how static scenes might appear to an animal; however, animals often make crucial decisions on moving targets (e.g., detecting food items, evaluating a potential mate’s display, etc.). Here, we introduce hardware and software tools for ecologists and filmmakers that can capture and display animal-perceived colours in motion.”