Many existing pattern recognition techniques require the estimation of a covariance matrix. When the number of available samples is sufficiently large relative to the feature dimension, the maximum likelihood estimator or a related unbiased covariance matrix estimator can be applied. In hyperspectral image classification, however, the number of available observations is often very limited, and may even be smaller than the number of spectral bands, because acquiring ground-truth samples is costly. In this case, the maximum-likelihood-based estimators perform poorly, and the classification accuracy of methods built on them is unsatisfactory. Based on the idea of combining several different structures in one estimator, a new covariance matrix estimator, called the localized shrinkage covariance estimator (LSCE), is proposed in this study. The performance of LSCE is evaluated via the classification accuracy of the linear discriminant classifier (LDC) using LSCE as the estimator of its covariance matrix. Simulation results show that LSCE is an effective covariance estimator and that the classical LDC, when equipped with it, is highly competitive with other popular techniques for hyperspectral data classification.
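The abstract does not specify the exact form of LSCE, so the following is a minimal, hypothetical sketch of the general shrinkage idea it builds on: a convex combination of the sample covariance with a simpler structured target (here, its diagonal), plugged into a linear discriminant classifier. The function names, the diagonal target, and the fixed shrinkage weight `alpha` are illustrative assumptions, not the authors' method; the point is only that shrinkage keeps the estimate invertible even when samples are fewer than bands.

```python
import numpy as np

def shrinkage_covariance(X, alpha=0.5):
    """Generic shrinkage estimator (illustrative, not LSCE itself):
    a convex combination of the sample covariance S and a structured
    target, here the diagonal of S. For 0 < alpha <= 1 the result is
    positive definite even when S is singular (n < p)."""
    S = np.cov(X, rowvar=False)
    target = np.diag(np.diag(S))
    return (1.0 - alpha) * S + alpha * target

def lda_fit(X, y, alpha=0.5):
    """Fit a linear discriminant classifier with a pooled shrinkage
    covariance estimate in place of the maximum likelihood estimate."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    # Center each class by its own mean, then pool for one covariance.
    centered = np.vstack([X[y == c] - means[c] for c in classes])
    cov_inv = np.linalg.inv(shrinkage_covariance(centered, alpha))
    return classes, means, cov_inv

def lda_predict(model, X):
    """Assign each row of X to the class with the largest linear
    discriminant score (equal priors assumed)."""
    classes, means, cov_inv = model
    scores = np.column_stack([
        X @ cov_inv @ means[c] - 0.5 * means[c] @ cov_inv @ means[c]
        for c in classes])
    return classes[np.argmax(scores, axis=1)]
```

A small "n smaller than p" example: with 20 samples per class and 30 features, the plain sample covariance is singular, but the shrunken estimate remains invertible and the classifier can still be fit.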