Federated learning, a distributed machine learning paradigm that prioritizes data privacy, has gained increasing popularity across various domains. However, in cross-device scenarios, clients often have limited computational resources, which hinders their ability to participate effectively in federated learning tasks. In this paper, we present a novel framework called U-shaped Split Federated Learning (USFL), which addresses these resource constraints by splitting the full model into three parts. By offloading the computationally demanding middle part of the model to the server, USFL significantly reduces the client-side burden and improves the overall execution efficiency of federated learning. Additionally, our framework preserves data integrity, achieving a higher level of privacy preservation than existing methods.
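To make the three-way split concrete, the following is a minimal NumPy sketch of a U-shaped forward pass, not the paper's implementation: the layer sizes, weight shapes, and function names (`client_head`, `server_body`, `client_tail`) are illustrative assumptions. The client keeps the first and last parts of the model, so raw inputs and labels stay on the device and only intermediate activations cross the network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative weights (sizes are assumptions, not from the paper):
W_head = rng.standard_normal((16, 8))  # client-side head
W_body = rng.standard_normal((8, 8))   # server-side middle (the heavy part)
W_tail = rng.standard_normal((8, 4))   # client-side tail

def client_head(x):
    # Runs on the client: raw data -> smashed activations sent to the server.
    return np.maximum(x @ W_head, 0.0)

def server_body(h):
    # Runs on the server: the computationally demanding middle of the model.
    return np.maximum(h @ W_body, 0.0)

def client_tail(z):
    # Runs back on the client: produces predictions, so the loss is
    # computed against labels that never leave the device.
    return z @ W_tail

x = rng.standard_normal((2, 16))  # raw data stays on the client
pred = client_tail(server_body(client_head(x)))
print(pred.shape)  # (2, 4)
```

The "U" shape comes from the activation path: client, then server, then client again, which is what lets both the input data and the labels remain local.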