In collaborative settings where multiple individuals work toward a shared goal, understanding a partner’s emotional state can be crucial to achieving a successful outcome. This is particularly relevant in remote collaboration, where physical distance can impede understanding, empathy, and mutual comprehension between partners. In this paper, we demonstrate a system that represents emotional patterns derived from physiological data in a shared Virtual Reality (VR) environment, and we explore how it affected communication styles. A user study investigated the potential effects of this emotional representation on fostering empathetic communication during remote collaboration. The study’s findings revealed that although observing physiological cues produced minimal variance in workload, participants generally preferred monitoring their partner’s attentional state. However, in the chosen assembly task, most participants directed only a small proportion of their attention toward their partner’s physiological cues and were frequently uncertain how to interpret and use the information obtained. We also discuss limitations of the research and opportunities for future work.