Wearable human activity recognition (HAR) systems have received much attention due to their robustness to environmental conditions and their capability for privacy protection in human-centric applications such as health-care services and intelligent surveillance. Recently, researchers have made various attempts to construct accurate HAR models; however, from the perspectives of user needs and sensory data, two important issues remain in current models. On the one hand, it is often difficult for these models to provide interpretable recognition results for users in practical scenarios. On the other hand, uncertainties may exist in sensor measurements, such as data imprecision caused by sensor displacement or movement with respect to the body, which also hinders the derivation of accurate recognition results. Motivated by these considerations, a multi-framework evidential association rule fusion-based activity recognition method (Multi-EARF) is proposed, which combines knowledge characterizing the uncertain association relationships between activity features and classes under different recognition frameworks. To this end, procedures for confusion matrix-based coarse class partition and adaptive threshold-based evidential association rule mining are successively proposed to generate rules in the coarse-grained framework. A rule set optimization strategy and a belief reasoning mechanism are then presented to obtain an accurate and interpretable recognition result for each unknown input instance. In addition, a self-collected dataset is provided to illustrate the feasibility and universality of our proposal in real-world applications. Experiments on the public UCI Smartphone and mHealth datasets demonstrate the superiority of our proposal in terms of recognition accuracy and interpretability.