Graph Neural Networks (GNNs) have improved prediction accuracy and reduced model size in many applications. In this paper, we explore a new approach to human activity recognition that leverages GNNs. Our technique focuses on inertial sensor data acquired from accelerometers and gyroscopes in wearable devices. We introduce TinyGraphHAR, a novel GNN with attention and recurrence that constructs graphs reflecting the temporal relationships of human events in the signals. TinyGraphHAR has superior discriminative power for complex events that are interconnected in time. On the KU-HAR, PAMAP2, UCI-HAR, and Daphnet benchmarks, we achieve macro F1 scores of up to 95.46%, 75.59%, 90.16%, and 59.75%, respectively. In addition, TinyGraphHAR's model sizes are up to three orders of magnitude smaller than those of comparable state-of-the-art neural architectures on the same data sets, thanks to the abstraction of the graph representation. TinyGraphHAR thus opens up new opportunities for a multitude of applications at the edge, including wearables and mobile devices with limited memory and energy budgets.
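To make the graph-construction idea concrete, the following is a minimal sketch of turning an inertial signal into a temporal graph: windows of the signal become nodes and temporally adjacent windows are linked by edges. The window size, stride, per-window statistics, and chain-edge rule here are illustrative assumptions, not the actual construction used by TinyGraphHAR.

```python
import numpy as np

def build_temporal_graph(signal, window=50, stride=25):
    """Segment a 1-D inertial signal into overlapping windows (nodes)
    and connect consecutive windows (edges).

    Node features (mean and std per window) and the chain-edge rule are
    illustrative choices; TinyGraphHAR's features and edge construction
    are not specified here.
    """
    starts = range(0, len(signal) - window + 1, stride)
    # One node per window, with simple per-window statistics as features.
    nodes = np.array([[signal[s:s + window].mean(),
                       signal[s:s + window].std()] for s in starts])
    n = len(nodes)
    adj = np.zeros((n, n))
    # Undirected edges between temporally adjacent windows (a chain graph).
    for i in range(n - 1):
        adj[i, i + 1] = adj[i + 1, i] = 1.0
    return nodes, adj

# Toy accelerometer trace: 500 samples of a smooth oscillation.
sig = np.sin(np.linspace(0, 20, 500))
nodes, adj = build_temporal_graph(sig)
```

A GNN would then propagate and attend over these node features along the temporal edges, which is what lets interconnected events inform each other during classification.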