k-Nearest Neighbors (k-NN) is a well-established classification algorithm widely used in machine learning applications. Although k-NN has many advantages, it suffers severely from high latency and low throughput due to its computation-intensive nature. This paper proposes a novel, highly parallel approach based on Most Significant Digit First (MSDF) computing to accelerate the k-NN algorithm with the Euclidean distance as the distance metric. The proposed method consists of three phases: in the first phase, subtraction and squaring are performed on the serially arriving input data to produce the Most Significant Digit (MSD) of the distances. Then, the serially processed MSDF data are sorted, and finally, the label of the test data is determined by majority voting. Having the MSD value available in advance makes it possible to terminate unnecessary computations early, which leads to significantly higher performance and lower power consumption. Moreover, digit-serial computation reduces the area and memory footprint, which paves the way to instantiating the maximum number of parallel processing elements. The experimental results on an SoC platform indicate up to 88.3% improvement in performance and energy consumption compared to the best previous designs.
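The three phases described above (distance computation, sorting, and majority voting) can be illustrated in conventional software form. This is a minimal sketch of the baseline algorithm being accelerated, not the digit-serial MSDF hardware design; all names are illustrative:

```python
from collections import Counter

def knn_classify(train_points, train_labels, test_point, k=3):
    """Classify test_point by majority vote among its k nearest
    training points, using squared Euclidean distance."""
    # Phase 1: subtract and square coordinate-wise, accumulate distances.
    dists = []
    for point, label in zip(train_points, train_labels):
        d = sum((a - b) ** 2 for a, b in zip(point, test_point))
        dists.append((d, label))
    # Phase 2: sort by distance and keep the k nearest neighbors.
    dists.sort(key=lambda t: t[0])
    nearest = dists[:k]
    # Phase 3: majority voting over the neighbors' labels.
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Example: two clusters; a point near the first cluster gets label "A".
train = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
labels = ["A", "A", "B", "B"]
print(knn_classify(train, labels, (0.2, 0.1), k=3))  # prints "A"
```

In the MSDF scheme, phase 1 would instead emit distance digits most-significant first, so a candidate whose partial distance already exceeds the current k-th best can be discarded before its remaining digits are computed.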