Today, telepresence is often associated with videoconferencing. However, telepresence technology can be extended to allow a user to experience a remote location as if physically present: the user's senses must be stimulated as they would be at that location. The user may also act on the remote location, for example through a robot (telerobotics).
This project will involve using a robot to embody (act for) the user at a remote location, for example performing an activity, interacting with others, and transmitting stimuli back to the user. Information such as voice, video, and haptic (tactile) data may be transmitted in both directions.
The proposed project aims to support persons with limited mobility through telerobotics. The objectives are to apply computational algorithms to simulate the sequence of events and to make recommendations on everyday technologies to serve as the user interface. The sequence of events can be simulated with probabilistic techniques such as Hidden Markov Models.
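As a minimal sketch of the Hidden Markov Model idea, the snippet below samples a sequence of (hidden activity, sensor observation) pairs. The states, observations, and all probabilities are illustrative assumptions, not part of the project specification:

```python
import random

# Hypothetical hidden activity states and sensor observations
# (illustrative only; a real project would learn these from data).
STATES = ["resting", "eating", "socialising"]
OBSERVATIONS = ["no_motion", "utensil_sound", "speech"]

# Transition probabilities between hidden states (each row sums to 1).
TRANSITIONS = {
    "resting":     {"resting": 0.7, "eating": 0.2, "socialising": 0.1},
    "eating":      {"resting": 0.3, "eating": 0.5, "socialising": 0.2},
    "socialising": {"resting": 0.2, "eating": 0.1, "socialising": 0.7},
}

# Emission probabilities: likelihood of each observation given a state.
EMISSIONS = {
    "resting":     {"no_motion": 0.8, "utensil_sound": 0.1, "speech": 0.1},
    "eating":      {"no_motion": 0.1, "utensil_sound": 0.7, "speech": 0.2},
    "socialising": {"no_motion": 0.1, "utensil_sound": 0.1, "speech": 0.8},
}


def weighted_choice(dist):
    """Draw one key from a {key: probability} distribution."""
    r = random.random()
    cumulative = 0.0
    for key, p in dist.items():
        cumulative += p
        if r < cumulative:
            return key
    return key  # guard against floating-point rounding


def simulate(length, start="resting"):
    """Generate a list of (hidden_state, observation) pairs."""
    state = start
    sequence = []
    for _ in range(length):
        sequence.append((state, weighted_choice(EMISSIONS[state])))
        state = weighted_choice(TRANSITIONS[state])
    return sequence


if __name__ == "__main__":
    for state, obs in simulate(5):
        print(f"{state:12s} -> {obs}")
```

In practice the transition and emission matrices would be estimated from sensor logs (e.g. with the Baum-Welch algorithm), and the most likely activity sequence decoded from observations with the Viterbi algorithm; the sampler above only illustrates the generative structure.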
The project will have the opportunity to make use of a wide range of sensing technologies and robots (http://www.drrobot.com/products_item.asp?itemNumber=Sputnik3) within the Smart Environments Research Group.
First Supervisor: Monekosso, DN Dr
Second Supervisor: Grigorash, AV Dr
Collaboration: This project does not involve collaboration with another establishment