Effective identification and classification of microseismic events form the bedrock of data analysis in microseismic monitoring systems, supporting real-time source location, rockburst prediction, and mine safety. However, the complex mining environment means that sensor-collected microseismic signals are plagued by noise and require preprocessing, and traditional methods often yield inaccurate results when different event types exhibit similar characteristics. Machine learning offers a promising alternative: by learning the patterns of historical microseismic events and applying them to real-time data for predictive analysis, it can separate signal classes with high precision and anticipate safety alerts, mitigating the inefficiency and error of manual recognition. Machine learning has therefore gained substantial traction in microseismic monitoring. This paper reviews recent machine learning applications in microseismic signal recognition and classification: it examines the limitations of traditional methods, highlights disparities in development across approaches, presents a machine learning-based categorization, and summarizes advances in signal recognition models. Finally, the potential and remaining challenges of machine learning in microseismic signal recognition are discussed.
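The workflow described above, learning from labeled historical events and then classifying incoming signals, can be sketched as a minimal supervised pipeline. The feature choices (dominant frequency, peak amplitude), the synthetic values, and the class names below are illustrative assumptions, not the specific methods surveyed in this review; a nearest-centroid rule stands in for the more sophisticated models discussed later.

```python
# Minimal sketch: learn patterns from labeled historical microseismic events,
# then apply them to new signals. All features, numbers, and class names are
# hypothetical, for illustration only.

from statistics import mean

# Historical events: (dominant_frequency_hz, peak_amplitude), label.
HISTORY = [
    ((120.0, 0.8), "rock_fracture"),
    ((110.0, 0.9), "rock_fracture"),
    ((130.0, 0.7), "rock_fracture"),
    ((40.0, 2.5), "blast"),
    ((35.0, 3.0), "blast"),
    ((45.0, 2.2), "blast"),
]

def train(history):
    """Learn one centroid (mean feature vector) per event class."""
    by_label = {}
    for features, label in history:
        by_label.setdefault(label, []).append(features)
    return {
        label: tuple(mean(dim) for dim in zip(*rows))
        for label, rows in by_label.items()
    }

def classify(centroids, features):
    """Assign an incoming signal to the class with the nearest centroid."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))

centroids = train(HISTORY)
print(classify(centroids, (115.0, 0.85)))  # fracture-like signal -> rock_fracture
print(classify(centroids, (38.0, 2.8)))    # blast-like signal -> blast
```

In practice the surveyed approaches replace the hand-picked features and centroid rule with learned representations and deep models, but the train-on-history, predict-on-live-data structure is the same.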