Federated self-supervised learning (FedSSL) is an emerging machine-learning paradigm that collaboratively learns a powerful feature extractor from the distributed unlabeled data of multiple participants. However, conventional FedSSL suffers from statistical heterogeneity caused by the non-independent and identically distributed (Non-IID) data across participants. In this work, we introduce a novel method to tackle the Non-IID data issue in FedSSL. First, relation knowledge distillation is employed to enhance local learning from the global model. Then, we dynamically update the local model with a divergence-aware update (DAU) to preserve each client's knowledge of its Non-IID data. Experimental results demonstrate that the proposed approach outperforms competing methods by up to 8% on linear evaluation, verifying its effectiveness.
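The abstract names relation knowledge distillation without detailing its form. A common instantiation (following Park et al.'s relational KD) distills the pairwise similarity structure of a batch from the downloaded global model (teacher) into the local model (student). The sketch below is a minimal, assumed PyTorch version, not the authors' implementation; the function name and the `temperature` parameter are illustrative choices.

```python
import torch
import torch.nn.functional as F

def relation_distillation_loss(student_emb, teacher_emb, temperature=0.5):
    """Hypothetical relation-KD loss: match each sample's distribution of
    similarities to other samples between teacher and student embeddings."""
    # L2-normalize so dot products become cosine similarities.
    s = F.normalize(student_emb, dim=1)
    t = F.normalize(teacher_emb, dim=1)

    # Pairwise similarity matrices over the batch (B x B), temperature-scaled.
    sim_s = s @ s.t() / temperature
    sim_t = t @ t.t() / temperature

    # Mask self-similarities so each row describes relations to *other* samples.
    mask = torch.eye(sim_s.size(0), dtype=torch.bool, device=sim_s.device)
    sim_s = sim_s.masked_fill(mask, float('-inf'))
    sim_t = sim_t.masked_fill(mask, float('-inf'))

    # KL(teacher relations || student relations), averaged over the batch.
    p_t = F.softmax(sim_t, dim=1)
    log_p_s = F.log_softmax(sim_s, dim=1)
    return F.kl_div(log_p_s, p_t, reduction='batchmean')
```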
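Likewise, the DAU step is only named here. One plausible reading, in the style of divergence-aware EMA updates from prior FedSSL work, is to interpolate between the local and global weights with a coefficient that grows with their divergence, so clients with highly Non-IID data retain more local knowledge. This is a sketch under that assumption; the `scaler` hyperparameter and function name are hypothetical.

```python
import torch

@torch.no_grad()
def divergence_aware_update(local_model, global_model, scaler=1.0):
    """Assumed DAU: blend the global model into the local model with a
    coefficient mu proportional to their parameter divergence."""
    local_params = list(local_model.parameters())
    global_params = list(global_model.parameters())

    # L2 distance between the flattened local and global parameter vectors.
    divergence = torch.sqrt(sum(
        torch.sum((lp - gp) ** 2) for lp, gp in zip(local_params, global_params)
    ))

    # mu -> 1 keeps the local model; mu -> 0 adopts the global model.
    mu = torch.clamp(scaler * divergence, max=1.0)

    # In-place interpolation of local weights toward the global weights.
    for lp, gp in zip(local_params, global_params):
        lp.copy_(mu * lp + (1.0 - mu) * gp)
    return mu.item()
```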