Plants, animals, and fungi display a rich tapestry of colors. Animals, in particular, use colors in dynamic displays performed in spatially complex environments. In such natural settings, light is reflected or refracted from objects with complex shapes that cast shadows and generate highlights. In addition, the illuminating light changes continuously as viewers and targets move through heterogeneous, continually fluctuating light conditions. Although traditional spectrophotometric approaches for studying colors are objective and repeatable, they fail to document this complexity. Worse, they miss the temporal variation of color signals entirely. Here, we introduce hardware and software that provide ecologists and filmmakers with the ability to accurately record animal-perceived colors in motion. Specifically, our Python code transforms photos or videos into perceivable units (quantum catches) for any animal of known photoreceptor sensitivity. We provide the plans, code, and validation tests necessary for end-users to capture animal-view videos. This approach will allow ecologists to investigate how animals use colors in dynamic behavioral displays, the ways natural illumination alters perceived colors, and other questions that have so far gone unaddressed for lack of suitable tools. Finally, our pipeline provides scientists and filmmakers with a new, empirically grounded approach for depicting the perceptual worlds of non-human animals.
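The core transformation mentioned above — converting light spectra into quantum catches — can be sketched in Python. This is a minimal illustration under standard assumptions (a quantum catch is the receptor-sensitivity-weighted integral of the radiance reaching the eye), not the paper's actual pipeline; the function name, the Gaussian stand-in sensitivity curve, and its 436 nm peak (roughly where a honeybee blue receptor peaks) are our own illustrative choices.

```python
import numpy as np

def quantum_catch(wavelengths, radiance, sensitivity):
    """Quantum catch of one photoreceptor class: the radiance spectrum
    weighted by the receptor's spectral sensitivity and integrated over
    wavelength (trapezoidal rule, written out for NumPy-version safety)."""
    y = radiance * sensitivity
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(wavelengths))

# Hypothetical example: wavelengths sampled from 300 to 700 nm in 1 nm steps.
wl = np.arange(300.0, 701.0)

# Flat stimulus spectrum: equal radiance at every wavelength.
radiance = np.ones_like(wl)

# Gaussian curve standing in for a measured photoreceptor sensitivity,
# peaking at 436 nm with a 30 nm standard deviation (illustrative values).
sensitivity = np.exp(-((wl - 436.0) ** 2) / (2 * 30.0 ** 2))

q = quantum_catch(wl, radiance, sensitivity)
```

In a full model, one quantum catch is computed per photoreceptor class (e.g., UV, blue, and green for a trichromatic bee), and the resulting values are what the animal's early visual system actually has to work with.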