Brown CS Blog

The Telepresence Of Furniture In Extended Reality


In the current issue of ACM Interactions Magazine, Brown University Assistant Professor of the Practice of Engineering Ian Gonsher, who teaches the Brown CS course CSCI 1951-C Designing Humanity Centered Robots, presents a collection of prototypes developed at the intersection of robotics, ubiquitous computing, mixed reality, and furniture design. You can read the full article here.

These design research projects, developed in collaboration with Computer Science, Engineering, and RISD students, explore alternative paradigms for “spatial computing,” moving away from so-called “face computers” and wearables and toward the integration of extended reality and telepresence directly into furniture and the built environment. They also call attention to inequalities between local and remote telepresence users, and offer viable alternatives to the dominant paradigm of personal devices, pointing instead toward the development of extended reality infrastructure as a public good.

These design research projects explore three related questions:

How might telepresence be designed to be a more equitable experience for both remote and local users?

Videoconferencing has changed the way we work. Meetings that once required going into the office can now be attended virtually with common videoconferencing apps such as Skype and Zoom. But this convenience has come at a cost, introducing new inequalities into the way these conversations occur. The contributions of local users can dominate the conversation at the expense of remote users, especially when meetings involve more than a few participants, and especially when an understanding of the physical context is essential to contributing to the conversation.

How might attention to design principles such as scale, movement, and context create affordances for a more immersive and embodied telepresence experience?

Awareness frames reality. The ways designers and engineers frame an awareness of others through telepresence technology, both physically and remotely, set the conditions for the kinds of interpersonal experiences that are possible. These prototype studies establish three design principles for the development of telepresence in mixed reality: scale, movement, and context. Commensurate scale sets the initial conditions for encounters between remote and local users; images that are smaller than life size disrupt an expectation of verisimilitude. Integrating movement into users' experiences allows for more dynamic and collaborative interactions. And context, which is to say the user's physical and virtual proximity to the surrounding space, to the built environment and the furniture already around us, allows for an experience that aligns more comfortably with expectations about unmediated reality.

How might the integration of mixed reality into furniture provide a viable alternative paradigm to the emergence of “face computers” and other wearables?

Through the development of working prototypes, and in collaboration with students from RISD and Brown Computer Science and Engineering, new design typologies emerged at the intersection of furniture design and robotics. Demos of this work, along with links to published papers, can be found below.

Janus Table:


Large Screen Mobile Telepresence Robot:

1. Gonsher, I., Han, Y., Desingh, K., and Gokaslan, A. Prototyping mixed reality large screen mobile telepresence robots. Proc. of the 5th International Workshop on Virtual, Augmented, and Mixed Reality for HRI, 2022.

2. Gonsher, I. and Kim, J.Y. Robots as furniture, integrating human-computer interfaces into the built environment. Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction. ACM, New York, 2020, 215–217.

3. Gonsher, I. et al. Integrating interfaces into furniture: New paradigms for ubiquitous computing, mixed reality, and telepresence within the built environment. Paper presented at the Media Architecture Biennale 2023, Toronto, Ontario, Canada.

4. Gonsher, I., Ma, Y., Pineda-Dominguez, I., Lee, M., and Han, Y. The mixed reality passthrough window: rethinking the laptop videoconferencing experience. Human Interaction and Emerging Technologies (IHIET-AI 2023): Artificial Intelligence and Future Applications 70 (2023).