Tensor-Based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations
- Resource Type
- Periodical
- Authors
- Frolov, E.; Oseledets, I.
- Source
- IEEE Access, vol. 11, pp. 6357-6371, 2023
- Subject
- Aerospace; Bioengineering; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Engineered Materials, Dielectrics and Plasmas; Engineering Profession; Fields, Waves and Electromagnetics; General Topics for Engineers; Geoscience; Nuclear Engineering; Photonics and Electrooptics; Power, Energy and Industry Applications; Robotics and Control Systems; Signal Processing and Analysis; Transportation; Tensors; Filtering; Biological system modeling; Encoding; Data models; Computer architecture; Computational modeling; Sequential analysis; Learning systems; Sequential learning; sequence-aware tensor factorization; collaborative filtering; next item prediction
- Language
- ISSN
- 2169-3536
Self-attentive transformer models have recently been shown to solve the next item recommendation task very efficiently. The learned attention weights capture sequential dynamics in user behavior and generalize well. Motivated by the special structure of the learned parameter space, we ask whether it can be mimicked with an alternative, more lightweight approach. We develop a new tensor factorization-based model that embeds structural knowledge about sequential data into the learning process. We demonstrate how certain properties of a self-attention network can be reproduced with our approach, which is based on a special Hankel matrix representation. The resulting model has a shallow linear architecture. Remarkably, it achieves significant speedups in training time over its neural counterpart and performs competitively in terms of recommendation quality.
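The abstract does not spell out the paper's exact construction, but the core idea of a Hankel matrix representation of a sequence can be illustrated with a small sketch: sliding windows over a user's interaction history are stacked into a matrix whose anti-diagonals are constant. The function name `hankel_windows` and the window length are illustrative assumptions, not the authors' code.

```python
import numpy as np

def hankel_windows(seq, w):
    """Stack sliding windows of length w from seq into a Hankel-structured
    matrix: row i holds seq[i : i + w], so every anti-diagonal is constant.
    (Illustrative sketch; not the paper's actual implementation.)"""
    seq = np.asarray(seq)
    n = len(seq) - w + 1  # number of full windows
    return np.stack([seq[i:i + w] for i in range(n)])

# A toy "interaction history" of item IDs, windowed with length 3.
H = hankel_windows([10, 20, 30, 40, 50], 3)
# H:
# [[10 20 30]
#  [20 30 40]
#  [30 40 50]]
```

A factorization of such a matrix (or a tensor built from it) shares parameters across time-shifted windows, which is one way a linear model can capture the kind of positional regularities that attention weights learn.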