Differentiable architecture search (DARTS) is an effective continuous-relaxation-based neural architecture search (NAS) method with low search cost. It has attracted significant attention in AutoML research and has become one of the most effective paradigms in NAS. Although DARTS is far more efficient than traditional NAS approaches at handling the complex parameter search process, it often suffers from instability: the continuous architecture it finds can deteriorate sharply when discretized. To address this issue, we propose mean-shift-based DARTS (MS-DARTS), which improves stability through architecture sampling, perturbation, and shifting. The mean-shift approach in MS-DARTS improves both the stability and the accuracy of DARTS by smoothing the loss landscape and sampling architecture parameters within a suitable bandwidth. We analyze the convergence of the mean-shift procedure and the effect of bandwidth selection on stability and accuracy. Evaluations on CIFAR-10, CIFAR-100, and ImageNet show that MS-DARTS achieves competitive performance among state-of-the-art NAS methods with reduced search cost.
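The sample-and-shift idea above can be illustrated with the standard mean-shift update: perturbed copies of the architecture parameters are drawn around the current point, and the point is moved to their kernel-weighted mean, which has a smoothing effect on the objective. This is a minimal sketch of generic mean-shift, not the authors' implementation; the Gaussian kernel, the perturbation scale, the bandwidth value, and the dimensionality are all illustrative assumptions.

```python
import math
import random

def mean_shift_step(alpha, samples, bandwidth):
    """One mean-shift update: move alpha toward the kernel-weighted
    mean of the sampled points (Gaussian kernel with bandwidth h)."""
    weights = []
    for s in samples:
        d2 = sum((a - b) ** 2 for a, b in zip(alpha, s))
        weights.append(math.exp(-d2 / (2.0 * bandwidth ** 2)))
    total = sum(weights)
    return [
        sum(w * s[i] for w, s in zip(weights, samples)) / total
        for i in range(len(alpha))
    ]

# Illustration only: perturb a 3-dimensional architecture-parameter
# vector (hypothetical values) and iterate the shift until it settles.
random.seed(0)
alpha = [0.5, 0.2, 0.3]
samples = [[a + random.gauss(0.0, 0.05) for a in alpha] for _ in range(32)]
for _ in range(10):
    alpha = mean_shift_step(alpha, samples, bandwidth=0.1)
```

In the DARTS setting the sampled points would correspond to perturbed architecture parameters evaluated around the current solution, so the bandwidth controls how aggressively the loss landscape is smoothed, which is the trade-off the abstract refers to.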