Motor imagery (MI) enhances rehabilitation in brain-injured patients by leveraging neuroplasticity to rebuild neuronal connections. Combining MI with exoskeleton robotics and virtual reality (VR) can further improve recovery outcomes. In this study, we integrated a lower-limb exoskeleton robot (MAX-1), an Oculus Quest 2 VR headset, and an MI-BCI system. Patients wearing the exoskeleton and VR headset perform motor imagery tasks, and the exoskeleton's movements are driven by EEG signals recorded with an Emotiv EPOC Flex headset. These signals are processed and translated into leg-movement commands, with concurrent feedback presented in the VR environment. The research comprises three experiments: an offline experiment for initial data collection and model training, a pseudo-online experiment simulating real-time processing, and an online serious-game experiment assessing the system's real-world applicability. In the offline experiment, an EEGNet model achieved an accuracy of 85.2% for MI detection and 78.3% for MI classification. The pseudo-online experiment achieved an accuracy of 67.2%, indicating the system's potential in a real-time scenario. In the online serious-game experiment, participants were able to complete the game but reported difficulties with game control, underscoring areas for future enhancement.
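The pseudo-online pipeline described above (replaying recorded EEG through sliding windows and mapping each window to a leg command) can be sketched as follows. This is a minimal illustration, not the study's implementation: the sampling rate, window and hop lengths, channel split, and the mu-band power heuristic standing in for the trained EEGNet classifier are all assumptions made for the example.

```python
import numpy as np

FS = 128            # assumed sampling rate (Hz) for the Emotiv EPOC Flex
WINDOW_S = 2.0      # sliding-window length in seconds (assumed)
STEP_S = 0.5        # window hop in seconds (assumed)

def bandpower(window, fs, lo, hi):
    """Mean power of each channel in the [lo, hi) Hz band via FFT."""
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window, axis=1)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[:, mask].mean(axis=1)

def classify(window, fs=FS):
    """Toy stand-in for the trained EEGNet model: compare mu-band
    (8-12 Hz) power between the first and second half of the channel
    array (a crude left/right hemisphere split) and map the result
    to a leg command."""
    half = window.shape[0] // 2
    left = bandpower(window[:half], fs, 8, 12).mean()
    right = bandpower(window[half:], fs, 8, 12).mean()
    # MI suppresses mu power contralateral to the imagined movement
    return "LEFT_STEP" if right < left else "RIGHT_STEP"

def pseudo_online(eeg, fs=FS, window_s=WINDOW_S, step_s=STEP_S):
    """Replay a recorded EEG array (channels x samples) through
    overlapping sliding windows, emitting one command per window,
    as in a pseudo-online evaluation."""
    win, hop = int(window_s * fs), int(step_s * fs)
    commands = []
    for start in range(0, eeg.shape[1] - win + 1, hop):
        commands.append(classify(eeg[:, start:start + win]))
    return commands
```

In an online setting the same `classify` step would run on live windows from the headset stream, with each emitted command forwarded to the exoskeleton controller and mirrored as feedback in the VR scene.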