Forecasting time series is an engaging and vital mathematical topic. Theories and applications in related fields have been studied for decades, and deep learning has provided reliable tools in recent years. The Transformer, capable of capturing long-range sequence dependencies, has been exploited as a powerful architecture for time series forecasting. While existing work has largely focused on breaking the memory bottleneck of the Transformer, how to effectively leverage multivariate time series has received little attention. In this work, we propose a novel architecture built around a primary Transformer for multivariate time series prediction. Our proposed architecture has two main advantages. First, it accurately predicts multivariate time series across both short and long sequence lengths and prediction horizons; we benchmark our model against various baseline architectures on real-world datasets, and it achieves significant improvements over them. Second, it can easily be incorporated into Transformer-based variants, which ensures broad applicability of our work.