As regular readers of this blog will know, multi-touch is close to our heart, which is why we were very interested to find out about a proximity-aware multi-touch tabletop born of a collaboration between Autodesk Research, the University of Alberta and the University of Toronto.
Known as Medusa, the tabletop combines a Microsoft Surface device with 138 infrared proximity sensors. These sensors are arranged in three rings: an outward-facing ring around the perimeter of the base, and two rings on top of the table facing upwards from the bezel.
The sensors allow the system to continuously track each user’s position and distance from the tabletop, to detect the presence of hands and arms above the table, and even to work out the angle of a user’s arm. Combining this information with the multi-touch surface means that Medusa can match touch events to users and their hands, establishing which touches belong to which user.
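To give a feel for how touch events might be matched to users, here is a minimal sketch. It is purely illustrative and not the actual Medusa implementation: the user records, the `arm_origin` point (where a sensed arm crosses the bezel), the `arm_angle`, and the scoring weight are all hypothetical assumptions.

```python
import math

def assign_touch_to_user(touch_xy, users):
    """Assign a touch point to the tracked user whose sensed arm best
    explains it: close arm origin plus matching arm direction.
    `users` is a list of dicts with hypothetical keys
    'name', 'arm_origin' (x, y on the table edge) and 'arm_angle' (radians)."""
    best_user, best_score = None, float("inf")
    for user in users:
        ox, oy = user["arm_origin"]
        # distance from the arm's entry point on the bezel to the touch
        dist = math.hypot(touch_xy[0] - ox, touch_xy[1] - oy)
        # angle from the arm origin to the touch, compared with the sensed arm angle
        touch_angle = math.atan2(touch_xy[1] - oy, touch_xy[0] - ox)
        diff = touch_angle - user["arm_angle"]
        angle_err = abs(math.atan2(math.sin(diff), math.cos(diff)))  # wrap to [0, pi]
        score = dist + 100.0 * angle_err  # arbitrary weighting of the two cues
        if score < best_score:
            best_user, best_score = user, score
    return best_user["name"] if best_user else None
```

In practice a system like Medusa could fuse many more cues (per-ring sensor readings, hand dominance, touch history), but the nearest-consistent-arm idea above captures the basic matching step.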
To demonstrate Medusa’s capabilities, the project team has created ‘Proxi-Sketch’, a system for building a user interface through the addition and manipulation of UI components. Among the features of Proxi-Sketch are different actions depending on whether the dominant or non-dominant hand is used; differentiation between group and individual manipulation; and manipulation options that depend on which hand, and how many fingers, touch an object.
The proximity sensors also allow Medusa to provide ‘pre-touch’ information – for example, a ‘hovering’ arm can be used to bring up gesture guides or to access additional functionality.
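A pre-touch feature of this kind might boil down to classifying a proximity reading before any contact occurs. The sketch below is a hypothetical illustration only; the thresholds and state names are assumptions, not values from the Medusa project.

```python
# Illustrative hover thresholds (assumed, not from the actual system)
HOVER_NEAR_CM = 10.0   # arm close enough to show gesture guides
HOVER_FAR_CM = 25.0    # arm detected, but too far for pre-touch UI

def hover_state(distance_cm):
    """Classify an upward-facing IR proximity reading into a coarse
    pre-touch state for the UI layer to act on."""
    if distance_cm <= HOVER_NEAR_CM:
        return "show_gesture_guide"
    if distance_cm <= HOVER_FAR_CM:
        return "arm_present"
    return "idle"
```

The interesting point is that the surface can react before the finger lands, so guides or extra controls appear exactly when a user reaches towards the table.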
We think this project is another great example of how multi-touch technology can be used to enrich and optimise the user experience by enhancing application functionality, and we look forward to seeing some practical deployments in the near future.