Mr Shivam Mukherjee

Research project: Asynchronous Virtual Reality

Abstract

Collaborating with others is an essential part of work and play, and digital platforms that facilitate remote collaboration are becoming ever more important. This is due to companies' increasing reliance on teams distributed around the globe, a growing awareness of the sustainability costs of travel, and the adoption of remote-working practices in the aftermath of Covid-19.

Such globally distributed teamwork typically involves tasks and personnel spread over different time zones, which requires systems that support asynchronous work practices. Importantly, asynchronous collaboration has several advantages over synchronous collaboration: it increases teams' ability to work in parallel and to coordinate their time flexibly, while also giving them more time to review and reflect on the tasks at hand (Mayer, 2022). Currently, instant messaging platforms such as Slack and WhatsApp natively support asynchronous interactions, while groupware such as Microsoft Office, Google Workspace and GitHub provides rich sets of tools for large numbers of distributed teams to coordinate. However, despite the importance and ubiquity of asynchronous collaboration, there are currently no commercially available virtual or augmented reality (XR) platforms that support asynchronous collaboration. The main reason is that, although the interaction design criteria for standard groupware are well established, building such platforms in XR requires addressing multiple interrelated Human-Computer Interaction challenges in 3D space, whose constraints and affordances are still poorly understood (Chow, 2019).

For example, instant messaging platforms such as WhatsApp support asynchronous interaction by providing simple but subtly sophisticated affordances that go beyond what is possible in face-to-face communication. Instead of improvising on the spot, users can privately formulate, edit and perfect messages before making them public. This is further enhanced by the ability to scroll back through the conversational history, which allows users to refer directly to previous context by resending, quoting or responding to specific messages. Crucially, this functionality makes communication more robust, and is used by interlocutors to identify, signal and recover from miscommunication (Clark, 1996). Moreover, unlike face-to-face interaction, where people can only participate in a single conversation at a time, removing the pressure to respond immediately enables users to engage quasi-simultaneously in multiple conversations; when (re)joining a conversation, users can rapidly read through the most recent messages to "catch up" and respond appropriately.
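To illustrate the kind of data model that underpins these affordances, the minimal Python sketch below shows one plausible representation of a persistent conversation history with support for quoting earlier messages and catching up on recent ones. The names and structure are illustrative assumptions only and are not drawn from any particular platform.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class Message:
        sender: str
        text: str
        sent_at: datetime
        reply_to: Optional[int] = None  # index of the quoted message, if any

    @dataclass
    class Conversation:
        messages: list[Message] = field(default_factory=list)

        def post(self, sender: str, text: str, reply_to: Optional[int] = None) -> int:
            """Append a (possibly pre-composed and edited) message; return its index."""
            self.messages.append(Message(sender, text, datetime.now(timezone.utc), reply_to))
            return len(self.messages) - 1

        def catch_up(self, n: int = 10) -> list[Message]:
            """Return the n most recent messages so a rejoining user can catch up."""
            return self.messages[-n:]

    # Usage: quoting a specific earlier message when responding.
    chat = Conversation()
    q = chat.post("Asha", "Shall we review the draft tomorrow?")
    chat.post("Ben", "Yes, 10am works for me.", reply_to=q)
    for m in chat.catch_up(2):
        print(m.sender, ":", m.text)

Because the full history persists, quoting is just a reference to an earlier index, and "catching up" is simply reading the tail of the log; the open question for XR is what the analogue of this log should be.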

With the affordances of instant messaging now well understood, there is an urgent need to investigate how to provide such functionality in XR. Arguably the biggest conceptual and technological hurdle is that, unlike standard platforms, which are designed to support the exchange of simple messages (e.g. text, audio, video), asynchronous XR transmits "messages" that comprise a massive causal nexus of data, which needs to be reconciled with a potentially incommensurable causal flow in the current user's reality (Fender, 2022). Consider, for example, a distributed design team working asynchronously on different versions of the same artefact: each team's update consists of a detailed recording of avatars interacting with each other and with the artefacts in the virtual environment. A system supporting such interactions must be able to update and synchronise separate causal flows whose ramifications cannot, in general, be predicted in advance (Chow, 2019). The most immediate challenge, which is still unsolved, is how to give users the ability to seamlessly and collaboratively view, edit and reconcile multiple past histories of interaction from within the interaction itself.
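To make the contrast with simple message exchange concrete, the sketch below, again a simplification built on assumed names rather than any existing XR system, represents each asynchronous update as a timeline of recorded interaction events. It implements only the trivial part of reconciliation, interleaving two timelines by timestamp and flagging that they touch the same artefact; deciding how such causal conflicts should actually be resolved is precisely the open problem described above.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class InteractionEvent:
        t: float      # seconds since the recording started
        avatar: str   # who acted
        target: str   # artefact or avatar acted upon
        action: str   # e.g. "grab", "move", "annotate"

    Timeline = list[InteractionEvent]

    def interleave(a: Timeline, b: Timeline) -> Timeline:
        """Merge two recorded timelines by timestamp (the easy part)."""
        return sorted(a + b, key=lambda e: e.t)

    def first_conflict(a: Timeline, b: Timeline) -> Optional[tuple[InteractionEvent, InteractionEvent]]:
        """Naive conflict check: both recordings touch the same artefact.
        Real reconciliation must decide whose change to the shared causal state
        prevails and how the rest of each timeline replays afterwards, which
        cannot in general be decided automatically."""
        targets_a = {e.target for e in a}
        for e in b:
            if e.target in targets_a:
                clash = next(x for x in a if x.target == e.target)
                return clash, e
        return None

    # Two designers record edits against the same artefact in separate sessions.
    session_a = [InteractionEvent(1.0, "avatar_A", "chair_model", "move")]
    session_b = [InteractionEvent(0.5, "avatar_B", "chair_model", "annotate")]
    merged = interleave(session_a, session_b)
    print(first_conflict(session_a, session_b) is not None)  # True: reconciliation needed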

Biography

I am a lifelong student with brief experience in the games industry. I contributed to the development of '1971: Indian Naval Front', which won the Indian Game Developer Conference Game of the Year award in 2023. In my spare time, I am designing my own programming language, Chaka, and learning high-performance computing.

Areas of research interest

  • Virtual Reality Applications
  • Computer Games
  • Programming Languages

Qualifications

  • MSc Game Development (Programming)
  • BTech Software Engineering

Funding or awards received

  • IGDC Game of the Year 2023 ("1971: Indian Naval Front")