Brown CS Blog

Exploring The Rise Of The Metaverse: Final Projects From Brown's CSCI 1951C Designing Humanity Centered Robots

    Click the links that follow for more stories about Ian Gonsher and other recent accomplishments by our students.

    During the Fall 2021 semester of CSCI 1951C Designing Humanity Centered Robots, students explored how emerging technologies might shape our lives in the near future. They did this by designing and building working prototypes that probe the "how" and "why" of new technologies. The class is taught by Ian Gonsher, Assistant Professor of the Practice in the School of Engineering and Department of Computer Science at Brown University. His course fosters a creative, collaborative environment in which students develop working prototypes that integrate both hardware and software.

    The final project explored the future of virtual, augmented, and mixed reality through the development of three prototypes. Ian explained that the class was “especially interested in ways we might think about and critique the rise of the metaverse.” The first prototype explores "Pepper's Ghost," an illusion technique used in the theatre, by developing a low-cost phone accessory that creates a holographic-like experience. The second integrates the "Pepper's Ghost" effect into a table. The third critiques virtual reality through the creation of a virtual world.

    Click the links that follow for much more about the three prototypes, including photos, videos, and behind-the-scenes details.

    The images shown here come from the links above. The top right image is Dan Rapoport’s Holotable. Together with Ian Gonsher, Dan and his team at Brown are developing a holographic viewing table derived from Pepper’s Ghost, a theatrical illusion and optical principle dating to the 1860s. Building on a study by Xuan Luo from the University of Washington, they are developing a simplified process that gives the viewer the illusion of seeing an object rendered in 3D, live. Their table was designed to incorporate the electronics and optics necessary for correct viewing from all angles, driven by an internally developed Python script.
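    The team's script isn't published, but the standard rendering trick behind a four-sided Pepper's Ghost display is to tile four rotated copies of each frame around the edges of the screen, so that each copy reflects upright off one face of the pyramid (or angled glass). A minimal NumPy sketch of that layout step, with a hypothetical function name and assuming a square input frame:

```python
import numpy as np

def pepper_ghost_layout(frame: np.ndarray) -> np.ndarray:
    """Tile four rotated copies of a square frame around a black canvas.

    Each copy faces one side of a four-sided reflective pyramid placed at
    the center of the screen, so every reflection appears upright to a
    viewer standing on that side. (Illustrative sketch, not the course's
    actual script.)
    """
    h, w = frame.shape[:2]
    assert h == w, "this sketch assumes a square frame"
    canvas = np.zeros((3 * h, 3 * w) + frame.shape[2:], dtype=frame.dtype)
    canvas[0:h, w:2 * w] = frame                    # top copy, as-is
    canvas[2 * h:3 * h, w:2 * w] = np.rot90(frame, 2)   # bottom, 180 deg
    canvas[h:2 * h, 0:w] = np.rot90(frame, -1)      # left, 90 deg clockwise
    canvas[h:2 * h, 2 * w:3 * w] = np.rot90(frame)  # right, 90 deg counterclockwise
    return canvas
```

    In a live setup this function would run per video frame, with the canvas sent fullscreen to the display under the glass; the center cell stays black so only the four reflections are visible.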

    For more information, click the link that follows to contact Brown CS Communication and Outreach Specialist Jesse C. Polhemus.