A Generalization Bound of Deep Neural Networks for Dependent Data
- Resource Type: Working Paper
- Authors: Do, Quan Huu; Nguyen, Binh T.; Ho, Lam Si Tung
- Subject: Statistics - Machine Learning; Computer Science - Machine Learning
- Abstract
Existing generalization bounds for deep neural networks require the data to be independent and identically distributed (i.i.d.). This assumption may not hold in real-life applications such as evolutionary biology, infectious disease epidemiology, and stock price prediction. This work establishes a generalization bound for feed-forward neural networks on non-stationary $\phi$-mixing data.
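For context, $\phi$-mixing is a standard weak-dependence condition; one common formulation of the mixing coefficient (the paper's exact definition may differ in details) is:

```latex
% For a sequence (X_i), let \sigma_1^i denote the sigma-algebra generated by
% X_1, ..., X_i, and \sigma_{i+k}^\infty the one generated by X_{i+k}, X_{i+k+1}, ...
\[
\phi(k) \;=\; \sup_{i \ge 1}\;
\sup_{\substack{A \in \sigma_1^{i},\; \mathbb{P}(A) > 0 \\ B \in \sigma_{i+k}^{\infty}}}
\bigl|\, \mathbb{P}(B \mid A) - \mathbb{P}(B) \,\bigr|,
\]
% The sequence is \phi-mixing if \phi(k) -> 0 as k -> infinity, i.e. events
% separated by a large time gap become nearly independent.
```

Intuitively, $\phi(k)$ measures how much observing the past can shift the probability of events $k$ steps in the future; i.i.d. data has $\phi(k) = 0$ for all $k \ge 1$.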