Coreference resolution aims to link all mentions that refer to the same entity, and it is widely used in many biomedical and bioinformatics tasks, such as biomedical knowledge graph construction and metabolic pathway integration. Many recent studies focus on improving neural model architectures. However, we argue that a practical method that integrates commonsense knowledge can further improve coreference resolution performance, because commonsense provides extra prior knowledge for reasoning and enriches the learned representations beyond naive mention-context co-occurrence modeling. In this work, we propose an effective method to integrate external commonsense knowledge into a neural coreference resolution model. Specifically, a gated attention mechanism is employed to leverage commonsense knowledge selectively, according to the context. Using ConceptNet as the knowledge base with three span-ranking backbone models, the models yield significant performance gains on the evaluated datasets. We also observe improvements in long mention detection and cross-sentence coreference resolution after incorporating knowledge.
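The gated attention step described above can be sketched as follows. This is a minimal illustration, not the paper's exact architecture: the shapes, the sigmoid-gate formulation, and all function and parameter names (`gated_knowledge_fusion`, `W_att`, `W_gate`, `b_gate`) are assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_knowledge_fusion(span_repr, knowledge_embs, W_att, W_gate, b_gate):
    """Attend over retrieved knowledge embeddings for a mention span,
    then gate how much knowledge flows into the fused representation.

    span_repr:      (d,)   contextual span representation
    knowledge_embs: (k, d) embeddings of retrieved ConceptNet concepts
    W_att:          (d, d) bilinear attention parameters
    W_gate:         (d, 2d), b_gate: (d,) gate parameters
    """
    # Bilinear attention scores between the span and each knowledge embedding
    scores = knowledge_embs @ (W_att @ span_repr)        # (k,)
    alpha = softmax(scores)
    knowledge_summary = alpha @ knowledge_embs           # (d,)

    # Sigmoid gate decides, per dimension, how much knowledge to use
    # given the current context
    pre_gate = W_gate @ np.concatenate([span_repr, knowledge_summary]) + b_gate
    gate = 1.0 / (1.0 + np.exp(-pre_gate))               # (d,) in (0, 1)
    return gate * span_repr + (1.0 - gate) * knowledge_summary

# Toy usage with random parameters and embeddings
rng = np.random.default_rng(0)
d, k = 8, 4
fused = gated_knowledge_fusion(
    rng.normal(size=d),           # span representation
    rng.normal(size=(k, d)),      # k retrieved concept embeddings
    rng.normal(size=(d, d)),
    rng.normal(size=(d, 2 * d)),
    rng.normal(size=d),
)
print(fused.shape)
```

Because the gate lies in (0, 1) per dimension, each output coordinate is a convex combination of the contextual and knowledge signals, so the model can fall back to the pure contextual representation when retrieved knowledge is irrelevant.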