PHREPS: Physical Remote Presence

Proposer: Sameer Kishore, EventLab, University of Barcelona, Spain

Visited laboratory: University College London (UCL), United Kingdom

Visit Dates: 18 April to 30 April 2012 and 2 May to 5 May 2012

PHREPS, the Physical Remote Presence project, set out to develop a platform for virtually and instantaneously transporting visitors from one physical location in the world to a destination, so that they can interact with the people and surroundings there. This is achieved by simultaneously transferring streams of multi-sensory data (audio, visual, haptic) back and forth between the two physical places, and using this information to create a unified shared environment that represents the physical space of the destination in real time. The visitor’s sensory information is streamed to the destination as well, where the visitor is represented in one of several ways, including as a physical robot.
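To make this bidirectional exchange concrete, the sketch below shows how one frame of multi-sensory state might be bundled and sent between the two sites. It is a minimal sketch and not the project's actual code: the packet layout, field names, and UDP transport are all assumptions made here for illustration.

```python
# Minimal sketch of a bidirectional multi-sensory stream between two sites.
# The packet layout, field names, and UDP transport are illustrative
# assumptions, not the PHREPS implementation.
import json
import socket
import time

REMOTE_ADDR = ("127.0.0.1", 9000)  # stand-in for the other site's endpoint

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def make_packet(audio_chunk, video_frame_id, haptic_forces):
    """Bundle one frame of sensory state with a capture timestamp so the
    receiving side can keep the shared environment synchronised."""
    return json.dumps({
        "t": time.time(),               # capture time, used for sync
        "audio": audio_chunk,           # e.g. base64-encoded PCM samples
        "video_frame": video_frame_id,  # reference to a separately streamed frame
        "haptics": haptic_forces,       # e.g. per-actuator force values
    }).encode("utf-8")

def send_state(audio_chunk, video_frame_id, haptic_forces):
    sock.sendto(make_packet(audio_chunk, video_frame_id, haptic_forces),
                REMOTE_ADDR)

def receive_state(timeout=0.05):
    """Return the next incoming packet, or None if nothing arrived in time,
    so the local render loop can keep running at its own frame rate."""
    sock.settimeout(timeout)
    try:
        data, _ = sock.recvfrom(65536)
        return json.loads(data.decode("utf-8"))
    except socket.timeout:
        return None
```

Each site runs both directions of this loop, which is what makes the environment shared rather than a one-way broadcast.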

There are several applications for this project, including social interaction and communication between people, remote medical support and rehabilitation, and teaching. One example developed for social interaction is the “Acting” application: a form of social interaction between two or more people that requires the exchange of several layers of sensory information to work correctly.

It involves professional actors re-enacting a scene from a specific film while physically present in different locations, interacting together in a shared environment. In the current scenario, the actor ‘beams’ from Barcelona to a location at University College London. All the people involved are body-tracked, and their movements and expressions are mapped onto the avatars in which they are embodied.
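The sketch below illustrates what this embodiment mapping involves: each frame, tracked joint rotations and facial expression weights are copied onto the participant's avatar. The bone and blendshape names are invented for this example; a real system would drive a game-engine rig rather than a plain dictionary.

```python
# Minimal sketch of applying one participant's tracked state to the avatar
# that embodies them. All bone and blendshape names are assumptions.
TRACKED_TO_AVATAR_BONE = {          # tracker skeleton -> avatar rig
    "spine": "Spine1",
    "left_forearm": "LeftForeArm",
    "right_forearm": "RightForeArm",
    "head": "Head",
}

def apply_to_avatar(avatar_pose, tracked_rotations, expression_weights):
    """Copy tracked joint rotations and facial blendshape weights onto the
    avatar so its movements and expressions follow the real person."""
    for tracked_bone, rotation in tracked_rotations.items():
        avatar_bone = TRACKED_TO_AVATAR_BONE.get(tracked_bone)
        if avatar_bone:
            avatar_pose["bones"][avatar_bone] = rotation  # e.g. a quaternion
    # Expressions (smile, brow raise, ...) drive blendshape weights in [0, 1].
    avatar_pose["blendshapes"].update(expression_weights)
    return avatar_pose

avatar = {"bones": {}, "blendshapes": {}}
apply_to_avatar(avatar,
                {"head": (0.0, 0.1, 0.0, 1.0)},  # quaternion (x, y, z, w)
                {"smile": 0.6})
print(avatar)
```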

The visitor, at UCL, is represented by a robot. This significantly increases the quality of interaction between the participants, since there is a physical instantiation of the visitor. The movements of the visitor’s limbs are mapped directly onto the robot’s limbs, and two-way audio communication is provided by the microphone and speakers on the robot. The stereoscopic cameras on the robot’s head provide real-time stereoscopic video that is streamed to the HMD worn by the visitor. The robot used for this project is the RoboThespian, developed by Engineered Arts.
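One detail such a limb mapping has to handle is that a robot's joints move through a narrower range than a human's. The sketch below shows that clamping step; the joint names and limits are made up for illustration, since the actual RoboThespian mapping is not described in this report.

```python
# Minimal sketch of retargeting tracked human joint angles onto a robot
# whose joints have a narrower range of motion. Joint names and limits
# are illustrative assumptions, not RoboThespian specifications.
from dataclasses import dataclass

@dataclass
class JointLimit:
    lo: float  # minimum angle in radians
    hi: float  # maximum angle in radians

# Hypothetical subset of robot joints and their mechanical limits.
ROBOT_LIMITS = {
    "shoulder_pitch": JointLimit(-1.5, 1.5),
    "shoulder_roll":  JointLimit(-0.5, 1.2),
    "elbow_flex":     JointLimit(0.0, 2.0),
}

def retarget(tracked_angles):
    """Map tracked human joint angles (radians) onto robot joint commands,
    clamping each value so an out-of-range human pose never drives an
    actuator past its mechanical limit."""
    commands = {}
    for joint, angle in tracked_angles.items():
        limit = ROBOT_LIMITS.get(joint)
        if limit is None:
            continue  # the robot has no counterpart for this joint
        commands[joint] = min(max(angle, limit.lo), limit.hi)
    return commands

# Example: an elbow bend that exceeds the robot's range gets clamped.
print(retarget({"shoulder_pitch": 0.8, "elbow_flex": 2.4}))
# -> {'shoulder_pitch': 0.8, 'elbow_flex': 2.0}
```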

The entire scenario was executed successfully, and a participant from Barcelona was ‘beamed’ to University College London. The participant was represented by a robot placed in the Computer Science Department at UCL, and interacted in real time with the people present in the room. A news crew from the BBC was also present and conducted an interview with the visitor; the video can be viewed on the BBC website: http://www.bbc.co.uk/news/world-europe-17905533.