To Shuffle or Not To Shuffle: Mini-Batch Shuffling Strategies for Multi-class Imbalanced Classification
- Resource Type: Conference
- Authors: Mao, Yuwei; Gupta, Vishu; Wang, Kewei; Liao, Wei-keng; Choudhary, Alok; Agrawal, Ankit
- Source: 2022 International Conference on Computational Science and Computational Intelligence (CSCI), pp. 298-301, Dec. 2022
- Subject: Computing and Processing; Training; Deep learning; Scientific computing; Computer architecture; Data models; Computational intelligence; neural networks; shuffling; imbalanced classification; deep learning
- ISSN: 2769-5654
Mini-batch shuffling is an important part of the deep learning training process. Most practitioners use random shuffling, which produces a new random permutation of the training dataset in every epoch. In this study, we explore mini-batch shuffling for multi-class imbalanced data classification by investigating several shuffling strategies. We find that the order in which the input data are presented can significantly affect the results of deep learning models. The results show that our proposed strategies can improve accuracy by around 2%, demonstrating that higher diversity and a lower imbalance ratio in each mini-batch can lead to better results.
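The paper does not include code; the sketch below illustrates one class-aware shuffling strategy of the kind the abstract describes, not necessarily the authors' exact method. Indices are reshuffled within each class every epoch and then interleaved round-robin, so early mini-batches mix all classes (higher diversity, lower per-batch imbalance ratio) instead of relying on a single global permutation. The function name `class_interleaved_batches` and all parameters are illustrative assumptions.

```python
# Hypothetical sketch of a class-aware mini-batch shuffling strategy
# (not the authors' published implementation).
import numpy as np

def class_interleaved_batches(labels, batch_size, rng=None):
    """Yield mini-batch index arrays with classes interleaved.

    labels: 1-D array of integer class labels for the training set.
    batch_size: number of samples per mini-batch.
    Each sample is still used exactly once per epoch; once the minority
    classes are exhausted, the remaining batches contain only the
    majority class, so this is only a simple illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    labels = np.asarray(labels)

    # Shuffle indices independently within each class.
    per_class = [rng.permutation(np.flatnonzero(labels == c))
                 for c in np.unique(labels)]

    # Round-robin over the classes until every index has been emitted once.
    order = []
    cursors = [0] * len(per_class)
    while any(cursors[i] < len(idx) for i, idx in enumerate(per_class)):
        for i, idx in enumerate(per_class):
            if cursors[i] < len(idx):
                order.append(idx[cursors[i]])
                cursors[i] += 1

    order = np.array(order)
    for start in range(0, len(order), batch_size):
        yield order[start:start + batch_size]

# Example with three imbalanced classes: each early batch of 8 contains all
# classes, whereas plain random shuffling would often produce batches made
# up entirely of class 0.
labels = np.array([0] * 90 + [1] * 8 + [2] * 2)
for batch in class_interleaved_batches(labels, batch_size=8):
    print(np.bincount(labels[batch], minlength=3))
```

In practice such a sampler would replace the default random shuffling of a framework's data loader; the trade-off is that per-batch class composition becomes more uniform at the cost of a less random global ordering.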