Feature selection plays a key role in classification problems with high-dimensional data. The main idea of feature selection is to reduce the dimensionality of the input space while preserving classification accuracy. The effective range is an efficient way to measure the importance of features in a classification problem, as shown in a recent study named improved feature selection based on effective range (IFSER). However, IFSER considers only the overlapping area and the including area of the effective ranges of each class for every feature; it fails to investigate how separated the effective ranges are. To overcome this limitation, we propose the concept of the extent of separation. In addition, we minimize redundancy among features by introducing a new measure that captures the common discriminative power of features on the classes and by determining a reflection degree of redundancy suited to each dataset. Finally, we present experimental results comparing our proposed method with several benchmark methods and select the appropriate features through a forward selection algorithm.
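The forward selection step mentioned above can be sketched as a greedy loop that repeatedly adds whichever remaining feature most improves a subset score. This is a minimal illustrative sketch, not the paper's actual criterion: `score_fn`, the small per-feature penalty, and the toy data are all assumptions introduced here for demonstration.

```python
# Hedged sketch of greedy forward selection: at each step, add the feature
# that most improves a user-supplied score (e.g. classification accuracy);
# stop when no remaining candidate improves the current score.
# `score_fn` and the toy data below are illustrative assumptions, not the
# measure proposed in the paper.

def forward_selection(features, score_fn):
    selected = []
    best_score = score_fn(selected)
    improved = True
    while improved:
        improved = False
        best_candidate = None
        for f in features:
            if f in selected:
                continue
            s = score_fn(selected + [f])
            if s > best_score:
                best_score, best_candidate = s, f
        if best_candidate is not None:
            selected.append(best_candidate)
            improved = True
    return selected

# Toy score: rewards hypothetical "useful" features and applies a small
# penalty per selected feature, so only the useful ones are kept.
useful = {"f1", "f3"}
def toy_score(subset):
    return sum(1.0 for f in subset if f in useful) - 0.1 * len(subset)

print(forward_selection(["f1", "f2", "f3"], toy_score))  # → ['f1', 'f3']
```

In practice, `score_fn` would evaluate a subset with the paper's combined relevance and redundancy measure rather than this toy function.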