The metaverse has become a powerful tool for conducting research in many domains, including education, social science, and healthcare. It blends virtual and physical environments and can produce a variety of stimuli that immerse users in a combined virtual-real experience. At present, however, these stimuli are preset and static, unresponsive to users' changing needs. In addition, little research has examined how brain signals might indicate a user's demand or preference for specific VR content, or whether and how VR can interact with users' brains directly, hands-free and without verbal instruction. Given the metaverse's natural association with learning and brain activity, receiving signals directly from the user's brain offers a distinct advantage for exploring mental health issues. This research proposes a new framework, Brain-Metaverse Interaction (BMI), which enables direct interaction between users' brain signals and the adaptation of VR content in an iterative, evolving manner. An experiment based on this framework shows promising results, despite the typical limitations of hardware and data acquisition, such as noise in the EEG signals and the sensitivity and latency of the EEG device.