Analysis of Different Deep Learning Architectures to Learn Generalised Classifier Stacking on Riemannian and Grassmann Manifolds
- Resource Type
- Conference
- Authors
- Tayanov, Vitaliy; Krzyzak, Adam; Suen, Ching Y.
- Source
- 2022 26th International Conference on Pattern Recognition (ICPR), pp. 2735-2741, Aug. 2022
- Subject
- Computing and Processing; Robotics and Control Systems; Signal Processing and Analysis; Manifolds; Geometry; Architecture; Stacking; Deep architecture; Computer architecture; Predictive models
- Language
- ISSN
- 2831-7475
This paper considers different deep learning architectures for learning patterns that are objects lying on Riemannian and Grassmann manifolds. Among them, we considered cascades of classifier ensembles (CCEs), convolutional neural networks (CNNs), and deep neural forests (DNFs). All of the aforementioned architectures have linearized and nonlinearized versions. Patterns that are objects of Riemannian manifolds are classifier prediction pairwise matrices (CPPMs), while objects of Grassmann manifolds are obtained using decision profiles (DPs). We also compared our architectures with CCEs operating in Euclidean geometry. As the experimental results show, the deep learning architectures based on CNNs provided the best results.
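To make the Grassmann-manifold representation concrete, here is a minimal sketch of how a decision profile could be built and mapped to a subspace. It assumes Kuncheva-style decision profiles (one row of class posteriors per base classifier) and uses an SVD-based orthonormal basis as the Grassmann point; the paper's exact construction may differ, and the `decision_profile` and `to_grassmann` helper names are hypothetical.

```python
import numpy as np

def decision_profile(posteriors):
    """Stack per-classifier class-posterior vectors into an L x C matrix
    (L base classifiers, C classes) -- a Kuncheva-style decision profile."""
    return np.vstack(posteriors)

def to_grassmann(dp, k):
    """Map a decision profile to a point on the Grassmannian Gr(k, C):
    the k-dimensional subspace spanned by its top right singular vectors,
    represented by a C x k matrix with orthonormal columns."""
    # Thin SVD; rows of dp live in R^C, so rows of vt span the row space.
    _, _, vt = np.linalg.svd(dp, full_matrices=False)
    return vt[:k].T

# Example: an ensemble of three classifiers over four classes.
posteriors = [
    np.array([0.7, 0.1, 0.1, 0.1]),
    np.array([0.6, 0.2, 0.1, 0.1]),
    np.array([0.1, 0.1, 0.2, 0.6]),
]
dp = decision_profile(posteriors)   # shape (3, 4)
point = to_grassmann(dp, k=2)       # shape (4, 2), orthonormal columns
```

Distances between such subspace representations (e.g. via principal angles) are what make the Grassmann geometry usable by the downstream stacked classifiers.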