Prototype-based linear sub-manifold learning
- Resource Type
- Conference
- Authors
- Fan, MengLing; Tang, Fengzhen; Zhao, Xingang
- Source
- 2023 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, Jun. 2023
- Subject
- Components, Circuits, Devices and Systems
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Manifolds
Learning systems
Geometry
Training
Symmetric matrices
Quantization (signal)
Prototypes
Dimension reduction
Sub-manifold learning
Generalized learning vector quantization
Riemannian manifold
- Language
- ISSN
- 2161-4407
- Abstract
- Sub-manifold learning has been widely used to project high-dimensional data onto a low-dimensional manifold while preserving the structure of the original data as much as possible. Existing sub-manifold learning methods either learn an embedding manifold whose geometric properties may differ from those of the original data space, or learn a sub-manifold without considering the nonlinear structure of the data. In this paper, we learn a sub-manifold of the original data based on learned prototypes that represent prior knowledge about the intrinsic features of the data. This makes it possible to incorporate the prior knowledge captured by the prototypes when searching for a suitable sub-manifold. The sub-manifold and the prototypes are jointly learned under a unified cost function via the gradient descent algorithm. The prototypes are first learned in the original high-dimensional space and then used to learn a projection matrix that maps the high-dimensional data into a lower-dimensional subspace with better separability. The prototypes are subsequently relearned in the projected subspace, and the relearned low-dimensional prototypes serve as prior knowledge to induce the learning of a better projection matrix, leading to a better subspace. The proposed subspace learning is realized for data points living on the Riemannian manifold of symmetric positive definite (SPD) matrices via the generalized learning Riemannian space quantization (GLRSQ) method. Experiments on both synthetic and real-world data sets show the effectiveness of the proposed dimension reduction scheme.
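The alternating scheme sketched in the abstract (prototypes and a projection matrix jointly updated by gradient descent on a GLVQ-style cost) can be illustrated with a minimal Euclidean toy example. This is not the paper's GLRSQ method, which operates on the manifold of SPD matrices; the data, shapes, and learning rates below are all hypothetical, and the cost is the standard GLVQ relative-distance measure.

```python
import numpy as np

# Illustrative Euclidean analogue of the alternating scheme in the abstract:
# GLVQ-style prototypes and a linear projection W are updated together by
# gradient descent on the relative-distance cost mu = (dp - dm) / (dp + dm).
# The paper itself works on the SPD manifold via GLRSQ; everything below
# (shapes, learning rates, toy data) is a hypothetical setup.
rng = np.random.default_rng(0)

D, d, n = 10, 2, 100                            # ambient dim, subspace dim, points per class
shift = np.r_[3.0 * np.ones(2), np.zeros(D - 2)]
X = np.vstack([rng.normal(size=(n, D)) + shift,
               rng.normal(size=(n, D)) - shift])
y = np.array([0] * n + [1] * n)

W = rng.normal(0.0, 1.0 / np.sqrt(D), (d, D))   # projection matrix, d x D
protos = (X[[0, n]] @ W.T).copy()               # one prototype per class, in the subspace
plabels = np.array([0, 1])
lr_p, lr_W, eps = 0.05, 0.001, 1e-9

for epoch in range(30):
    for xi, yi in zip(X, y):
        z = W @ xi                                   # project the sample
        dists = ((protos - z) ** 2).sum(axis=1)      # squared distances to prototypes
        same = plabels == yi
        jp = int(np.argmin(np.where(same, dists, np.inf)))   # nearest correct prototype
        jm = int(np.argmin(np.where(~same, dists, np.inf)))  # nearest wrong prototype
        dp, dm = dists[jp], dists[jm]
        gp = 2.0 * dm / (dp + dm + eps) ** 2         # d mu / d dp
        gm = -2.0 * dp / (dp + dm + eps) ** 2        # d mu / d dm
        # joint gradient step: W reshapes the subspace, prototypes move inside it
        W -= lr_W * 2.0 * (gp * np.outer(z - protos[jp], xi)
                           + gm * np.outer(z - protos[jm], xi))
        protos[jp] += lr_p * gp * 2.0 * (z - protos[jp])     # attract correct prototype
        protos[jm] += lr_p * gm * 2.0 * (z - protos[jm])     # repel wrong prototype (gm < 0)

# nearest-prototype classification in the learned subspace
Z = X @ W.T
pred = np.array([np.argmin(((protos - z) ** 2).sum(axis=1)) for z in Z])
acc = (pred == y).mean()
print(f"accuracy in the learned {d}-D subspace: {acc:.2f}")
```

In the paper's setting the samples are SPD matrices and the distance, projection, and gradient updates are taken with respect to the Riemannian geometry; the Euclidean inner loop above only conveys the structure of the joint prototype/projection optimization.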