We’re excited to share that our latest research has been accepted to HumanSys 2025!
Studying how people collaborate in Mixed Reality (MR) usually means complex, time-consuming data collection. So we built MoCoMR—a simulator that creates synthetic but realistic MR behavior data, capturing speech, gaze, and movement during a collaborative image-sorting task.
MoCoMR models individual behaviors in collaborative settings, helping researchers understand how behavior shapes teamwork and performance. With a flexible API, it’s easy to run different scenarios and generate insights that power more human-centered MR experiences.
Catch us at HumanSys’25 as we dive into how simulated behavior can unlock real-world insights!