Tea-cake content-based image retrieval (CBIR) is a key problem in tea-cake traceability. Popular CBIR methods target domains such as medical and social-media imagery, whereas tea-cake images exhibit dense, highly similar representations across classes. In tea-cake CBIR, low inter-class and high intra-class distances make retrieval difficult for traditional CBIR methods. Thus, this paper proposes a deep-neural-network-based approach to tea-cake CBIR. In the model, we establish a feature extractor with specially designed dense blocks, trained with a cross-entropy loss to capture fine-grained features. Furthermore, to shrink intra-class and enlarge inter-class distances, a masked autoregressive discriminative normalizing flow (MADNF) is presented to map the extracted high-dimensional features to corresponding representations in a Gaussian space. In particular, a maximum-likelihood objective is developed to train the MADNF and avoid non-convergence. Extensive experiments on the tea-cake dataset show that our method performs significantly better than current competitors, and experiments on a bird-species dataset further demonstrate the effectiveness of the proposed approach.
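The abstract does not specify the MADNF architecture, but the core idea of a masked autoregressive flow trained by maximum likelihood can be sketched generically: a triangular (autoregressive) affine transform maps a feature vector to a latent vector, and the log-likelihood under a standard Gaussian base plus the Jacobian log-determinant gives the training objective. The layer below is a minimal, hypothetical single-layer illustration in NumPy, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # feature dimension (illustrative)

# Strictly lower-triangular masks enforce the autoregressive property:
# output i may depend only on inputs with index < i.
mask = np.tril(np.ones((D, D)), k=-1)
W_mu = rng.normal(scale=0.1, size=(D, D)) * mask      # shift network (linear, masked)
W_alpha = rng.normal(scale=0.1, size=(D, D)) * mask   # log-scale network (linear, masked)

def forward(x):
    """Map a feature vector x to latent z; return z and log|det J|."""
    mu = W_mu @ x
    alpha = W_alpha @ x                 # per-dimension log-scale
    z = (x - mu) * np.exp(-alpha)
    log_det = -np.sum(alpha)            # Jacobian is triangular, so det is easy
    return z, log_det

def log_likelihood(x):
    """Log-density of x under the flow with a standard Gaussian base.
    Maximizing this over the flow parameters is the training objective."""
    z, log_det = forward(x)
    log_pz = -0.5 * np.sum(z ** 2) - 0.5 * D * np.log(2.0 * np.pi)
    return log_pz + log_det

def inverse(z):
    """Invert the autoregressive transform dimension by dimension."""
    x = np.zeros(D)
    for i in range(D):
        mu_i = W_mu[i] @ x              # depends only on already-filled x[:i]
        alpha_i = W_alpha[i] @ x
        x[i] = z[i] * np.exp(alpha_i) + mu_i
    return x

x = rng.normal(size=D)                  # a stand-in extracted feature vector
z, _ = forward(x)
x_rec = inverse(z)                      # exact invertibility of the flow
ll = log_likelihood(x)
```

Because the masked transform is triangular, the forward pass, its inverse, and the log-determinant are all cheap to compute, which is what makes maximum-likelihood training of such flows tractable.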