Virtual Reality Robot Duplicate Mirrors Your Actions


Scientists from Cornell and Brown University have engineered an advanced telepresence robot that responds in real time to a user's movements and gestures in virtual reality.


This robotic system, known as VRoxy, permits a user in a confined space, such as an office, to collaborate using virtual reality with colleagues in a larger area. VRoxy showcases the most recent breakthrough in remote robotic embodiment developed by researchers at Cornell’s Ann S. Bowers College of Computing and Information Science.


“The beauty of virtual reality is that we can employ various movement techniques available in virtual reality games, such as instantaneous shifts from one location to another,” explained Mose Sakashita, a doctoral candidate in the field of information science. “This feature allows remote users to occupy a minimal physical space while collaborating with colleagues in a significantly larger remote setting.”


Sakashita is the primary author of “VRoxy: Extensive Collaboration from an Office Using a VR-Driven Robotic Proxy,” which will be unveiled at the ACM Symposium on User Interface Software and Technology (UIST), scheduled for October 29 to November 1 in San Francisco.


The automatic real-time responsiveness of VRoxy is a significant advantage for both remote and local collaborators, according to the researchers. A robot proxy like VRoxy enables a remote collaborator in a small office to take part in a group task in a considerably larger area, such as a collaborative design scenario.


For local collaborators, VRoxy mimics the remote user's body posture and other crucial nonverbal cues that are otherwise lost over telepresence robots or Zoom. For instance, VRoxy's display screen, which shows a representation of the user's face, reorients based on where the user is focusing.


VRoxy builds on a similar Cornell robot called ReMotion, which worked only if the local and remote users had matching equipment and workspaces of identical size. VRoxy removes that constraint: the system translates small movements made by the remote user in virtual reality into larger motions in the physical space, according to the researchers.
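The paper's actual mapping code isn't published here, but the idea of translating a small motion in the user's confined VR space into a larger motion of the physical proxy can be sketched as a simple scale-and-clamp transform. All names, scale factors, and limits below are hypothetical, not VRoxy's real values:

```python
# Hypothetical sketch: map a small displacement in the user's VR space
# to a larger displacement in the robot's physical workspace.
# The scale factor and step limit are illustrative only.

from dataclasses import dataclass


@dataclass
class Vec2:
    x: float
    y: float


def map_vr_to_physical(vr_delta: Vec2, scale: float = 4.0,
                       max_step: float = 0.5) -> Vec2:
    """Scale a VR-space step (meters) up to a physical-space step,
    clamping each move so the proxy never lurches too far at once."""
    px = vr_delta.x * scale
    py = vr_delta.y * scale
    # Clamp the magnitude of the resulting step.
    mag = (px ** 2 + py ** 2) ** 0.5
    if mag > max_step:
        px *= max_step / mag
        py *= max_step / mag
    return Vec2(px, py)


# A 5 cm shift by the user becomes a 20 cm move of the robot proxy.
step = map_vr_to_physical(Vec2(0.05, 0.0))
print(round(step.x, 2))  # 0.2
```

The clamp matters in practice: without it, a scale factor large enough to cover a big room would turn an ordinary reach or lean into a sudden physical lunge by the robot.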


VRoxy is equipped with a 360-degree camera, a screen displaying the user’s facial expressions captured by the VR headset, a robotic pointer finger, and omnidirectional wheels.


Wearing a VR headset, a VRoxy user can choose between two viewing modes: Live mode provides an immersive view of the collaborative space in real time, facilitating interactions with nearby colleagues, while navigational mode displays digitally rendered paths throughout the room, allowing remote users to “teleport” to their desired location. This navigation mode enhances the remote user’s mobility, reducing motion sickness, according to the researchers.


The system’s automated functionality lets remote collaborators focus entirely on teamwork rather than on manually steering the robot, the researchers noted. In future work, Sakashita intends to add robotic arms to VRoxy, enabling remote users to manipulate physical objects in the real space through the robot proxy. He also envisions VRoxy mapping its surroundings on its own, much like a Roomba vacuum cleaner; currently, the system relies on ceiling markers to help the robot navigate a room. Sakashita believes that real-time mapping support could make VRoxy viable in educational settings like classrooms.