Satellite federated learning (FL) extends the distributed training framework to space, enabling scattered ground devices to collaboratively train an artificial intelligence model without directly sharing raw data. Low Earth orbit (LEO) mega-constellations offer relevant use cases, such as inference based on satellite imagery. However, recent studies have shown that heterogeneous data across ground devices can cause local training to drift, which degrades the performance of the FL model. Several solutions have been developed to overcome this challenge, but their ability to handle data heterogeneity is limited and their generalization performance is poor. Against this background, we propose satellite-terrestrial collaborative federated learning with alternating contrastive training (FedAC). First, we design a model initialization algorithm that equips the initial model with more prior knowledge, improving the generalization performance of the FL model and its adaptability to new label domains. Meanwhile, we propose an alternating contrastive training algorithm in which only the local public model is uploaded to the satellite station for aggregation while the local private model is kept on the device, reducing the impact of data heterogeneity. In addition, two dedicated contrastive losses are introduced to correct the direction of local training, which prevents local updates from drifting. Finally, simulation results show that the proposed method effectively mitigates the adverse impact of data heterogeneity and performs better when ground devices encounter new label domains.
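The abstract does not give the exact form of the contrastive losses used to correct the local training direction. As an illustrative sketch only, one common construction (in the style of model-contrastive federated learning) pulls the local representation toward the global model's representation and pushes it away from the previous local one; the function names, the temperature `tau`, and the use of raw representation vectors are assumptions for illustration, not the authors' definition:

```python
import math

def cosine(u, v):
    # Cosine similarity between two representation vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """Illustrative contrastive term: treat the global model's
    representation as the positive pair and the previous local
    model's representation as the negative pair, so minimizing the
    loss steers local updates back toward the global model."""
    pos = math.exp(cosine(z_local, z_global) / tau)
    neg = math.exp(cosine(z_local, z_prev) / tau)
    return -math.log(pos / (pos + neg))

# A representation aligned with the global model incurs a lower
# loss than one that has drifted toward the stale local model.
z_global = [1.0, 0.0]
z_prev = [0.0, 1.0]
loss_aligned = model_contrastive_loss([1.0, 0.0], z_global, z_prev)
loss_drifted = model_contrastive_loss([0.0, 1.0], z_global, z_prev)
```

Minimizing such a term during local epochs penalizes representations that move away from the aggregated (public) model, which is one plausible way to realize the "correct local training direction" behavior described above.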