The Use of Binary Choice Forests to Model and Estimate Discrete Choice Models
- Authors
- Guillermo Gallego; Zhuodong Tang; Ningyuan Chen
- Source
- SSRN Electronic Journal.
- Subject
- Discrete choice; Mathematical optimization; Computer science; Binary decision diagram; Parametric model; Measure (mathematics); Equivalence (measure theory); Preference (economics); Random forest; Data-driven
- ISSN
- 1556-5068
We show the equivalence of discrete choice models and a forest of binary decision trees. This suggests that standard machine learning techniques based on random forests can be used to estimate discrete choice models with interpretable output: the underlying trees can be viewed as the internal choice process of customers. Our theoretical results show that random forests can consistently predict the choice probabilities of any discrete choice model, and that the mechanism and error of predictions for unseen assortments can be analyzed theoretically. We also prove that the splitting criterion used in random forests, the Gini index, is capable of recovering the preference rankings of customers. The framework has unique practical advantages: it can capture behavioral patterns such as irrationality and sequential search; it handles the nonstandard training-data formats that result from aggregation; it can measure the importance of a product by how often a random customer's decision depends on that product's presence in the assortment; and it can incorporate price information and customer features. Our numerical results show that random forests can outperform the best parametric models on synthetic and real datasets when enough data are available or when the underlying discrete choice model cannot be correctly specified by existing parametric models.
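To make the idea concrete, the following is a minimal sketch, not the paper's algorithm, of using an off-the-shelf random forest to estimate choice probabilities: each training example encodes an offered assortment as a 0/1 vector, and the label is the product the customer chose (0 denoting no purchase). The multinomial logit (MNL) ground truth and its utility values are illustrative assumptions introduced here, not taken from the paper.

```python
# Sketch: estimating a discrete choice model with a random forest.
# Features = 0/1 assortment vectors; labels = chosen product (0 = no purchase).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_products = 5
# Assumed ground-truth MNL utilities (illustrative, not from the paper).
utilities = np.array([1.0, 0.5, 0.2, -0.3, -0.8])

def mnl_choice(assortment, rng):
    """Sample one MNL choice from the offered assortment (0 = no purchase)."""
    weights = np.exp(utilities) * assortment     # unoffered products get weight 0
    probs = np.append(weights, 1.0)              # outside option has weight 1
    probs /= probs.sum()
    pick = rng.choice(n_products + 1, p=probs)
    return 0 if pick == n_products else pick + 1

# Generate training data: random nonempty assortments and sampled choices.
X, y = [], []
for _ in range(5000):
    assortment = rng.integers(0, 2, n_products)
    if assortment.sum() == 0:
        continue
    X.append(assortment)
    y.append(mnl_choice(assortment, rng))

forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(np.array(X), np.array(y))

# Predict choice probabilities for the assortment {product 1, product 2}.
test_assortment = np.array([[1, 1, 0, 0, 0]])
probs = forest.predict_proba(test_assortment)[0]
print(dict(zip(forest.classes_, probs.round(3))))
```

Averaging the trees' votes yields an estimated choice distribution for any assortment, including ones never offered in the training data; each individual tree can be read as one customer's internal decision process, which is the interpretability claim in the abstract.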