This paper explores the use of Google's Edge TPU, a purpose-built ASIC designed to run AI inference at the edge. We evaluate it on the application of automated cattle activity classification, which requires classification (inference) to run on energy-limited embedded devices. For this application, we consider a deep neural network classifier, which has traditionally been challenging to run on resource-constrained edge devices. Using a real cattle activity dataset and a joint time-frequency data representation (spectrogram), we explore the trade-offs between classification accuracy and energy efficiency. Our results show that the Edge TPU can provide both excellent classification performance and energy efficiency, but that it exhibits a surprising bimodal and nonlinear behaviour, which makes it highly sensitive to the chosen neural network model size. This demonstrates the potential and importance of a scalable data representation, such as the spectrogram used in this paper, for tuning the Edge TPU to its optimal operating point, where it can provide high classification accuracy with minimal energy consumption.