Take a look at triMirror's first virtual mirror experiment. It uses a Kinect for skeletal animation, combined with triMirror's avatar customization, real-time cloth simulation and animation, and fitting technologies.
The OpenNI drivers from PrimeSense are used with the Kinect.
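In outline, Kinect-driven skeletal animation boils down to reading tracked joint positions each frame and orienting the avatar's bones from them. The sketch below illustrates that mapping; the joint names and positions are hypothetical, and this is not triMirror's or OpenNI's actual API.

```python
import math

# Hypothetical 3D joint positions (in metres), roughly as a skeleton tracker
# might report them each frame. Names and values are illustrative only.
skeleton = {
    "shoulder_right": (0.20, 1.40, 2.00),
    "elbow_right":    (0.45, 1.15, 2.00),
}

def bone_direction(skeleton, parent, child):
    """Unit vector pointing from a parent joint to its child joint.

    An avatar rig can use this direction each frame to orient the
    corresponding bone, which is the core of skeleton-driven animation.
    """
    px, py, pz = skeleton[parent]
    cx, cy, cz = skeleton[child]
    dx, dy, dz = cx - px, cy - py, cz - pz
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

direction = bone_direction(skeleton, "shoulder_right", "elbow_right")
```

In a real pipeline this runs once per tracked bone per frame, with the cloth simulation reacting to the resulting avatar pose.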
Aurasma is an Augmented Reality (AR) application created by Autonomy, a company based in Cambridge (UK). It is currently available for the iPhone 4, iPad 2 and Android devices, and a free Lite version can be downloaded from the App Store or Android Market.
Rather than having to recognise barcodes or other visual tags, such as special AR markers, it can recognise objects and images in the real world and superimpose content associated with that object on the display screen.
Simply pointing the camera on your mobile device at an image lets you view an Aura associated with that image.
Auras are augmented reality actions: a photo or video overlaid on a static image, a movie clip overlaid on a specific geographical location, or an animation overlaid on a cereal box.
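The flow described above is essentially: recognise a trigger (an image or a location), look up the Aura registered for it, and render that overlay. A minimal sketch of the idea follows; the names and entries are hypothetical and not Aurasma's actual data model.

```python
# Hypothetical registry mapping recognised triggers to overlay actions.
auras = {
    ("image", "movie_poster"): {"overlay": "video", "content": "trailer.mp4"},
    ("location", "51.2,0.1"):  {"overlay": "movie_clip", "content": "clip.mp4"},
    ("image", "cereal_box"):   {"overlay": "animation", "content": "mascot.anim"},
}

def aura_for(trigger_kind, trigger_id):
    """Return the overlay action for a recognised trigger, or None."""
    return auras.get((trigger_kind, trigger_id))

overlay = aura_for("image", "cereal_box")
```

The hard part in practice is the recognition step itself (markerless image matching), which this sketch deliberately treats as a solved input.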
It may be easier to watch a video demonstrating the kinds of things that can be done.
Leonar3Do is an interactive desktop VR system consisting of a spatial input device with six degrees of freedom, 3D glasses and monitor-mounted sensors.
With Leonar3Do, you control how you move within the space: you can create objects and pull them out from the monitor with the cursor.
To enjoy the benefits of virtual reality with Leonar3Do, you really only need desktop space for the palm-sized control box and a few cables and connectors, plus a little extra room to put down the glasses and the bird when they are not in use. That's all.
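The bird's six degrees of freedom are three of position and three of orientation. A generic way to represent that per-frame pose, and the kind of translation that "pulling an object out of the monitor" implies, is sketched below; this is an illustrative representation, not the vendor's API.

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """A 6-degrees-of-freedom pose: 3 for position, 3 for orientation."""
    x: float      # position in metres
    y: float
    z: float
    roll: float   # orientation in radians
    pitch: float
    yaw: float

    def translated(self, dx, dy, dz):
        """Move the cursor in space while keeping its orientation."""
        return Pose6DOF(self.x + dx, self.y + dy, self.z + dz,
                        self.roll, self.pitch, self.yaw)

cursor = Pose6DOF(0.0, 0.0, 0.5, 0.0, 0.0, 0.0)
pulled = cursor.translated(0.0, 0.0, -0.2)  # pull toward the user
```

Real spatial input devices typically report orientation as a quaternion rather than Euler angles to avoid gimbal lock; Euler angles are used here only to keep the sketch short.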
Demonstrated at the recent SIGGRAPH 2010 Conference and Exhibition, Meta Cookie combines augmented reality technology with olfactory display technology to create an interactive gustatory display.
Meta Cookie is the world's first pseudo-gustation system: by changing only visual and olfactory information, it induces a cross-modal effect among vision, olfaction and gustation that lets people perceive various tastes without any change in chemical substances.
The system allows users to feel that they are eating a flavored cookie even though they are eating a plain cookie with an AR marker.
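In outline, the system maps the detected marker on the cookie to a matching pair of stimuli: a flavoured-cookie texture overlaid on the video feed and a scent released at the same time. The sketch below illustrates that pairing; the marker IDs, texture files and scent names are all hypothetical.

```python
# Hypothetical table pairing each AR marker with coordinated visual and
# olfactory stimuli, as the Meta Cookie concept describes.
flavours = {
    7: {"texture": "chocolate_cookie.png", "scent": "chocolate"},
    9: {"texture": "lemon_cookie.png",     "scent": "lemon"},
}

def present_flavour(marker_id):
    """Given a detected marker ID, pick the visual and olfactory stimuli.

    Returns a (texture, scent) pair, or None for an unknown marker.
    """
    entry = flavours.get(marker_id)
    if entry is None:
        return None
    return (entry["texture"], entry["scent"])

stimuli = present_flavour(7)
```

The key design point is that the two stimuli must stay in lockstep: presenting a chocolate texture with a lemon scent would break the cross-modal illusion.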