Category Archives

Tablescape Plus

The goal of this project, from researchers at the University of Tokyo, is to expand the possibilities for interacting with physical objects by using tabletop objects both as projection screens and as input devices.

[Image: puppet show demo]

Details taken from the project website at

https://www.hc.ic.i.u-tokyo.ac.jp/project/tablescape/

Tablescape Plus integrates diverse technologies:

  • Special optics based on material that is normally used for window blinds. The material is used as a horizontal tabletop screen.
  • A unique method of calibrating camera-projector combinations to display images onto a horizontal tabletop screen and onto upstanding objects.
  • The ARToolKit library detects the position and direction of objects placed on the tabletop. While head-mounted displays are often used in the field of augmented reality, Tablescape Plus does not require any special equipment.
  • Tangible user interfaces to deliver a broad range of interaction.
  • Software architecture for collaborative display of heterogeneous information.

Tablescape Plus uses 4 key technical innovations:

  1. Optical design of a special screen system that is diffusive from one privileged direction and highly transparent to incoming light from other directions. The diffusive direction is used to show images on the tabletop screen. By projecting images from the transparent direction, the system can also project onto the surfaces of objects standing on the table.
  2. A method for detecting the tabletop objects from inside the system. An infrared camera is installed underneath the table facing upward. Infrared light is projected from beside the camera towards the screen, and a retro-reflective marker of known shape is attached underneath each object. When physical objects are placed on the table, the ID, position, and rotation of each object are recognized using the ARToolKit library (a rough sketch of this step appears after this list).
  3. A method for calibrating the projector images geometrically, so that each projector displays the same part of an image at the same physical position.
  4. A method for harmonizing the projected images: each projected image changes relative to the others according to the positions and orientations of the placed objects.
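
Item 2 is the part most readily expressed in code. Below is a minimal sketch of that detection step, not the authors' implementation: it assumes an OpenCV-readable infrared camera under the table and uses OpenCV's ArUco markers (legacy cv2.aruco API) as a stand-in for the retro-reflective ARToolKit patterns attached beneath each object; the device index and marker dictionary are illustrative.

    import cv2
    import numpy as np

    # Marker dictionary standing in for the ARToolKit patterns under each object.
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    parameters = cv2.aruco.DetectorParameters_create()

    cap = cv2.VideoCapture(0)  # hypothetical device index of the infrared camera

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Each detected marker ID corresponds to one tabletop object.
        corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=parameters)
        if ids is None:
            continue
        for object_id, quad in zip(ids.flatten(), corners):
            pts = quad.reshape(4, 2)
            position = pts.mean(axis=0)                                 # where the object stands (pixels)
            edge = pts[1] - pts[0]
            rotation = float(np.degrees(np.arctan2(edge[1], edge[0])))  # in-plane rotation of the object
            print(object_id, position, rotation)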

The project has three subsidiary goals:

  • To develop a display system that can project separate images onto a horizontal tabletop screen and vertically placed tabletop objects simultaneously.
  • To add interactivity to this new display system by introducing camera-based tracking methods and infrared optical systems.
  • To explore the new paradigm of Tablescape Plus applications by developing attractive and specialized demos including games, simulation, education, and scientific visualization.

Deskrama

Deskrama uses a high-resolution position and rotation sensor mounted on a lightweight LCD panel to allow a user to interactively explore a 3D model from a 2D plan.

Deskrama"

Moving the LCD panel on a plan drawing of a building allows the user to see a cross-section through a 3D model of the building.
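
As a way of thinking about that interaction, here is a small sketch (my illustration, not the Deskrama code): the panel's 2D position and rotation on the plan define a vertical cutting plane, and the model is sliced against it. The function name, the slab thickness, and the random point-cloud "model" are assumptions made purely for illustration.

    import numpy as np

    def section_mask(points, panel_xy, panel_angle_deg, thickness=0.05):
        """Flag model vertices lying within `thickness` of the panel's cutting plane.

        points          -- (N, 3) model vertices, with z as height above the plan
        panel_xy        -- (2,) position of the LCD panel on the plan drawing
        panel_angle_deg -- rotation of the panel about the vertical axis
        """
        theta = np.radians(panel_angle_deg)
        normal = np.array([-np.sin(theta), np.cos(theta)])  # horizontal normal of the vertical plane
        signed_dist = (points[:, :2] - panel_xy) @ normal   # signed distance from the plane
        return np.abs(signed_dist) < thickness

    # Toy example: how many vertices of a random "model" fall in the displayed section.
    model = np.random.rand(1000, 3)
    mask = section_mask(model, panel_xy=np.array([0.5, 0.5]), panel_angle_deg=30.0)
    print(mask.sum(), "vertices lie in the cross-section shown on the panel")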

There is a demonstration video at MIT

https://cat2.mit.edu/deskrama/deskrama01_128KB.wmv

a synopsis is available here

https://www.kf12.com/blogs/techno/wp-content/uploads/deskrama.pdf

and further details about the project are available from the Deskrama home page at MIT.

https://cat2.mit.edu/deskrama/

I found this to be an interesting concept which may be applicable to Mechanical CAD for visualising 3D Models. Particularly interesting is the concept of combining a motion sensor (position and rotation) with the lightweight screen - perhaps this is something which could be explored further.

HoloVizio

HoloVizio from Holografika was shown at Siggraph 2006.

https://www.holografika.com/

Holografika is a Hungarian venture active in the field of emerging photonic technologies. The company developed a proprietary technology in 3D visualisation, including real 3D display devices, software applications, and 3D data compression solutions and holds several patents.

The company at present develops and sells 3D display systems. It started selling its 26” and 32” HoloVizio™ 3D displays in 2004, and plans to offer larger-scale holographic projection systems soon. Second-generation displays, a 3D camera system and a full 3D software environment are all under development.

It is claimed to be the first 3D monitor that enables you to watch the screen image in 'true' 3D in a 50-degree continuous field of view without the need for artificial aids such as special glasses or headsets - in effect, the viewer is viewing a hologram.

https://en.wikipedia.org/wiki/Holography

Theory

Since HoloVizio is not a stereoscopic or multi-view system, it avoids most of the limitations and drawbacks currently associated with 3D displays. Nor is HoloVizio a purely holographic system handling an enormous amount of redundant information. It is instead based on holographic geometrical principles, with a special focus on reconstructing the key elements of spatial vision. The pixels, or rather voxels, of the holographic screen emit light beams of different intensity and colour in various directions. A light-emitting surface composed of these voxels acts as a digital window or hologram and is able to show scenes that are genuinely three-dimensional.

The underlying principle is shown in the following image

[Image: HoloVizio principle]

Each pixel (voxel) of the display is able to emit light beams of different colour and intensity in various directions.

If these light beams are controlled appropriately, they appear as if they were emitted from behind or in front of the screen. In these cases, the viewer perceives the points of an image floating in space.
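
To make the geometry concrete, here is a small sketch of that principle (my own illustration, assuming a flat screen in the plane z = 0, not Holografika's algorithm): for a point floating behind or in front of the screen, each voxel must fire its beam along the line joining that point and the voxel.

    import numpy as np

    def beam_directions(voxel_positions, point):
        """Unit emission directions needed at each screen voxel to reproduce one 3D point."""
        directions = voxel_positions - point                   # line from the point through each voxel
        directions /= np.linalg.norm(directions, axis=1, keepdims=True)
        directions[directions[:, 2] < 0] *= -1                 # always emit towards the viewer side (+z)
        return directions

    # Voxels on a coarse grid in the screen plane z = 0 (metres, illustrative scale).
    xs, ys = np.meshgrid(np.linspace(-0.3, 0.3, 5), np.linspace(-0.2, 0.2, 5))
    voxels = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)

    behind = np.array([0.0, 0.0, -0.2])     # a point 20 cm behind the screen
    print(beam_directions(voxels, behind)[:3])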

Developer tools are available to provide an interface between 3rd party applications and the hardware.

Sample videos of the device are available from their homepage, although you will need to install the DivX Codec to view them.

https://www.divxmovies.com/codec/

True 3D Display

The National Institute of Advanced Industrial Science and Technology (AIST), Japan, was demonstrating its True 3D Display concept as part of the Siggraph 2006 Emerging Technologies.

The device uses the plasma emission phenomenon near the focal point of focused laser light to construct dot arrays in the air (3D Space) (and is incredibly noisy!)

Here it is in action

MRI - Mixed Reality Interface

KOMMERZ develops individualised concepts and products in the fields of visual media and design. They were showing their Mixed Reality Interface (MRI) concept at Siggraph 2006, which was also seen at last year's CeBIT.

https://www.kommerz.at/pages_en/20050722053921.php

This is an input device for computer applications where a user can interact with the system by manipulating physical objects such as models, figures, blocks etc.

Positioning and rotating the physical object defines the movement in the application.

I tried it out using objects including a camera and lights to set up an environment similar to that shown where I could experiment with various lighting and camera settings in the car photo-studio representation. It seemed a natural and easy way to interface with the system.
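
The core of the idea is a direct mapping from the tracked pose of each physical prop to the pose of its virtual counterpart. The sketch below is my own illustration of such a mapping (the class, function names and scale factor are assumptions, not KOMMERZ's interface):

    from dataclasses import dataclass

    @dataclass
    class Pose2D:
        x: float       # position on the table surface, metres
        y: float
        angle: float   # rotation about the vertical axis, degrees

    def table_to_scene(prop: Pose2D, scale: float = 10.0) -> Pose2D:
        """Map a tracked prop's table pose onto the corresponding virtual object's pose."""
        return Pose2D(prop.x * scale, prop.y * scale, prop.angle)

    # Moving and turning the physical "camera" prop drives the virtual camera directly.
    virtual_camera = table_to_scene(Pose2D(x=0.42, y=0.31, angle=75.0))
    print(virtual_camera)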

Better quality videos can be viewed from their website, but here is a 'live' video showing the system in operation at Siggraph.

Morphovision

Morphovision - the goal of this project, seen in the emerging technologies exhibition hall at Siggraph 2006, is to create unique 3-dimensional images and to pursue this technology as a new 3D image system of the future.

The display system allows you to transform and animate a 3D solid object. The object, in this case a model house, is rotated at high speed and illuminated with special lighting from a digital projector. A touch screen allows a user to choose different visual effects. The house will appear to distort in front of your eyes - this is a visual effect as the model remains unchanged and is actually reliant on persistence of vision (the ability of the retina to retain an image for a brief moment) to achieve the effect.
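
One plausible way to reason about the distortion (a toy model of my own, not a description of the NHK system): if the projector lights the spinning model one thin horizontal slice at a time, delaying each successive slice shifts the rotation angle at which that slice is seen, so the persistence-of-vision image appears twisted even though the model never changes. All numbers below are illustrative only.

    import numpy as np

    def apparent_twist(num_slices, rpm, slice_delay_s):
        """Apparent twist (degrees) of each horizontal slice of the spinning model.

        num_slices    -- number of horizontal slices the projector lights in turn
        rpm           -- rotation speed of the model
        slice_delay_s -- extra delay before lighting each successive slice
        """
        degrees_per_second = rpm / 60.0 * 360.0
        slices = np.arange(num_slices)
        # Each slice is lit at a slightly later rotation angle, so the retained
        # (persistence-of-vision) image shows the unchanged model twisted about its axis.
        return (degrees_per_second * slice_delay_s * slices) % 360.0

    print(apparent_twist(num_slices=10, rpm=1800, slice_delay_s=0.0005))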

It was co-developed by Toshio Iwai and NHK (Japan Broadcasting Corporation) Science and Technical Research Laboratories. Read more HERE.

So much for the technology; I also found this interesting 360-degree panoramic photograph of the Morphovision booth, which also lets you browse around part of the exhibition hall. Check it out HERE

livePic

Shown at Siggraph 2006, livePic is a research project from the Keio University Inakage Lab, Japan.

livePic allows a user to create a drawing with a brush and palette on a 'paper'-like screen; the user can then animate the drawing by blowing on the screen.

On the tip of the brush is an infrared LED which lights up as the tip touches the screen. A web-cam captures the infrared images, which are processed to obtain the position of the brush. Drawings are generated from this data and displayed through a projector onto the screen. An infrared thermography device detects the user's breath when they blow on the screen, and this is stored as a temperature image. These images are processed to calculate the position and direction of blowing, and the parts of the drawing at those positions are then animated.
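
The brush-tracking step is straightforward to sketch. The code below is my own approximation (the threshold value and camera index are assumptions, and it ignores the thermography/breath part): the IR LED at the brush tip shows up as the brightest blob in the webcam frame, and its centroid gives the drawing position.

    import cv2

    cap = cv2.VideoCapture(0)   # hypothetical index of the IR-sensitive webcam

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # The LED lights only while the brush touches the screen, so a plain
        # intensity threshold separates drawing frames from idle ones.
        _, bright = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
        m = cv2.moments(bright)
        if m["m00"] > 0:
            x = m["m10"] / m["m00"]    # centroid of the bright blob = brush tip position
            y = m["m01"] / m["m00"]
            print("brush at", (x, y))  # fed to the renderer that draws onto the screen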

See more at the website or view a demonstration video HERE.