Investigation of a Joint Splitting Criteria for Decision Tree Classifier Use of Information Gain and Gini Index
- Resource Type
- Conference
- Authors
- Jain, Vikas; Phophalia, Ashish; Bhatt, Jignesh S.
- Source
- TENCON 2018 - 2018 IEEE Region 10 Conference (TENCON), pp. 2187-2192, Oct. 2018
- Subject
- Aerospace; Bioengineering; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Engineered Materials, Dielectrics and Plasmas; Engineering Profession; Fields, Waves and Electromagnetics; General Topics for Engineers; Geoscience; Photonics and Electrooptics; Power, Energy and Industry Applications; Robotics and Control Systems; Signal Processing and Analysis; Transportation; Decision trees; Vegetation; Indexes; Training; IEEE Regions; Conferences; Decision tree; Gini Index; Information Gain; joint splitting criterion; Random Forest
- Language
- ISSN
- 2159-3450
The Decision Tree is a well-established supervised classifier in machine learning. It recursively splits the given data points on a feature and an associated threshold value. In general, a single predefined splitting criterion is used, which may lead to poor performance. To this end, this paper investigates a joint splitting criterion combining the two most widely used criteria, Information Gain and the Gini index: data points are split where Information Gain is maximized and the Gini index is minimized. The proposed approach is rigorously tested and compared by constructing decision-tree-based random forests. All experiments are performed on UCI machine learning datasets.
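The idea of a joint criterion can be illustrated with a minimal sketch. The code below, which assumes NumPy and a simple lexicographic tie-break (prefer higher Information Gain, then lower weighted Gini index) since the abstract does not spell out how the two criteria are combined, scores every candidate threshold on a single feature:

```python
import numpy as np

def entropy(y):
    # Shannon entropy of the class labels in y
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    # Gini impurity of the class labels in y
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def weighted(metric, y_left, y_right):
    # Size-weighted impurity of a candidate binary split
    n = len(y_left) + len(y_right)
    return (len(y_left) / n) * metric(y_left) + (len(y_right) / n) * metric(y_right)

def joint_split(x, y):
    """Pick the threshold on feature x that maximizes Information Gain
    and, via lexicographic tie-breaking, minimizes the weighted Gini
    index (a hypothetical combination rule, not the paper's exact one)."""
    best_key, best_t = None, None
    for t in np.unique(x)[:-1]:        # every value but the max is a candidate
        mask = x <= t
        yl, yr = y[mask], y[~mask]
        ig = entropy(y) - weighted(entropy, yl, yr)
        g = weighted(gini, yl, yr)
        key = (ig, -g)                 # higher gain first, then lower Gini
        if best_key is None or key > best_key:
            best_key, best_t = key, t
    return best_t
```

For the toy feature `x = [1, 2, 3, 4]` with labels `y = [0, 0, 1, 1]`, the sketch selects the threshold `x <= 2`, which separates the classes perfectly (maximum gain, zero Gini). A forest would apply such a chooser per node over random feature subsets.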