Recent research has produced various robust principal component analysis (PCA) methods that automatically remove the optimal mean calculation in order to handle high-dimensional data with outliers. However, these methods remain sensitive to outliers because they cannot efficiently mitigate the effect of samples that deviate significantly from the rest of the data. To address this problem, we propose a novel robust PCA method based on the nested ℓp-norm and ℓ2,p-norm, which does not require calculating the optimal mean. Specifically, our method applies the ℓ2-norm along the spatial dimension to retain desirable properties of PCA such as rotational invariance, while leveraging the ℓp-norm as the distance metric to measure the projected difference between each pair of instances and suppress the effect of outliers. Consequently, our method enhances robustness to outliers. We provide a theoretical analysis and an iterative optimization algorithm that obtains the optimal solution of our method. Empirical results on synthetic datasets, UCI datasets, and face recognition databases validate the effectiveness of our method in processing high-dimensional data.
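The abstract does not spell out the objective or the iterative algorithm. As an illustration only, the sketch below shows a common iteratively reweighted scheme for an ℓ2,p-style robust PCA in its reconstruction-error form, min over orthonormal W of Σᵢ ‖xᵢ − WWᵀxᵢ‖₂^p; the function name, the exact objective, and all parameters are assumptions, not the method proposed in the paper.

```python
import numpy as np

def robust_pca_l2p(X, k, p=1.0, n_iter=50, eps=1e-8):
    """Illustrative IRLS sketch (assumed objective, not the paper's algorithm):
    minimize sum_i ||x_i - W W^T x_i||_2^p over orthonormal W of shape (d, k).

    X : (d, n) data matrix, one sample per column.
    p : 0 < p <= 2; smaller p down-weights outlying samples more aggressively.
    """
    d, n = X.shape
    w = np.ones(n)  # per-sample weights, refined each iteration
    for _ in range(n_iter):
        # Weighted scatter matrix; outliers get small weights, so they
        # contribute little to the estimated subspace.
        C = (X * w) @ X.T
        # Minimizing the weighted squared residual over orthonormal W is
        # equivalent to taking the top-k eigenvectors of the weighted scatter.
        _, vecs = np.linalg.eigh(C)
        W = vecs[:, -k:]
        # Per-sample reconstruction residual after projecting onto span(W).
        R = X - W @ (W.T @ X)
        r = np.linalg.norm(R, axis=0)
        # IRLS weight for the l2,p loss: w_i ∝ (||r_i||^2 + eps)^(p/2 - 1);
        # large residuals -> small weights when p < 2.
        w = (p / 2.0) * (r**2 + eps) ** (p / 2.0 - 1.0)
    return W
```

With p = 2 the weights stay uniform and the scheme reduces to standard (zero-mean) PCA; with p = 1 it approximates an ℓ2,1-norm PCA that is markedly less sensitive to corrupted samples.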