Internship - Enhancing Workspace Awareness in Asymmetric XR Collaboration

Augmented Reality (AR) has been widely explored for remote assistance across a broad range of application domains, such as industrial maintenance or home assistance. In this context, one user, equipped with an AR system, asks for guidance from one or several remote users (collaborators). This internship aims to explore how fast radiance field methods can be used to enable workspace awareness.

To ensure effective collaboration, remote users first need a precise understanding of the AR user's workspace (workspace awareness), which allows them to assess the environment in which the AR user is located. Second, users should be aware of each other's actions (user awareness) and able to communicate through traditional interaction modalities such as voice, gaze, and gestures; this awareness is required for efficient communication and interaction among users. Finally, workspace and user awareness must be synchronized, as interactions are tied to elements of the physical workspace.

However, because it is difficult to give remote collaborators free exploration of the space, workspace awareness remains an open problem; it is typically supported either by virtual replicas/reconstructions or by video feeds [1]. With the advent of radiance field rendering methods such as Neural Radiance Fields (NeRF) [2] and more efficient point-based solutions [3], real-time, high-fidelity reconstruction of physical workspaces is becoming possible. We will leverage fast radiance field methods to enable efficient collaboration in asymmetric configurations, in which an AR user asks remote collaborators in virtual reality for support.
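For background, these methods share the same core operation: volume rendering along camera rays. The sketch below recalls the rendering integral from [2] (notation taken from that paper, not from this project), where $\sigma$ denotes volume density and $\mathbf{c}$ the view-dependent color along a ray $\mathbf{r}(t) = \mathbf{o} + t\mathbf{d}$:

$$C(\mathbf{r}) = \int_{t_n}^{t_f} T(t)\,\sigma(\mathbf{r}(t))\,\mathbf{c}(\mathbf{r}(t), \mathbf{d})\,dt, \qquad T(t) = \exp\!\left(-\int_{t_n}^{t} \sigma(\mathbf{r}(s))\,ds\right)$$

Point-based approaches such as [3] approximate this integral by rasterizing and alpha-blending explicit 3D primitives, which is what makes real-time rendering of captured workspaces feasible.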

[1] Fages, A., Fleury, C., and Tsandilas, T. Understanding Multi-View Collaboration between Augmented Reality and Remote Desktop Users. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), Article 549, 1-27, 2022.

[2] Mildenhall, B., Srinivasan, P. P., Tancik, M., Barron, J. T., Ramamoorthi, R., and Ng, R. NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis. Communications of the ACM, 65(1), 99-106, 2021.

[3] Kerbl, B., Kopanas, G., Leimkühler, T., and Drettakis, G. 3D Gaussian Splatting for Real-Time Radiance Field Rendering. ACM Transactions on Graphics, 42(4), 2023 (presented at ACM SIGGRAPH 2023).