Deep-learning-based semantic segmentation models have achieved remarkable results in recent years. However, many of these models suffer from catastrophic forgetting: when a model must learn a new task without labels for old objects, its performance on previous tasks drops significantly. To address this problem, this paper develops an incremental learning method, a Combination of Old Prediction and Modified Label (COPML). The proposed method combines the prediction results of the old model with the modified labels of the new task to create pseudo labels that are close to the ground truths. By training on these pseudo labels, the model is expected to preserve the knowledge of old tasks. In addition, knowledge distillation, replay, and a parameter-freezing strategy are applied to further help the model overcome catastrophic forgetting. The effectiveness of the proposed method is validated on two semantic segmentation models, Unet and Deeplab3, using the Pascal VOC 2012 dataset and a self-made dataset. The experimental results demonstrate that COPML enables the model to retain most of its old knowledge while achieving excellent performance on a new task.
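The core pseudo-labeling idea can be illustrated with a minimal sketch. This is not the paper's implementation; it only assumes that the old model outputs per-class scores and that the new task's ground truth annotates pixels of old classes as background, which is the typical setup in incremental segmentation. The function name and parameters are illustrative.

```python
import numpy as np

def make_pseudo_label(old_logits, new_label, background=0):
    """Fuse the old model's prediction with the new task's label.

    old_logits: (C_old, H, W) class scores from the frozen old model.
    new_label:  (H, W) ground truth for the new task, in which pixels
                belonging to old classes are annotated as `background`.
    Returns a (H, W) pseudo label: new-task pixels keep their label,
    background pixels inherit the old model's prediction.
    """
    old_pred = old_logits.argmax(axis=0)   # (H, W) map of old-class ids
    pseudo = new_label.copy()
    bg = new_label == background           # pixels unlabeled in the new task
    pseudo[bg] = old_pred[bg]              # fill them from the old model
    return pseudo
```

Training the new model against `pseudo` instead of `new_label` supervises the old-class regions with the old model's knowledge, which is what lets the combined label approximate the full ground truth.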