- ARC Project Investigators: Jeffrey Shaw, Dennis Del Favero, Neil Brown, Volker Kuchelmeister, Nikos Papastergiadis, Scott McQuire, Andy Arthurs, Sarah Kenderdine, Kevin Sumption, Grace Cochrane
- ARC Project Title: Reformulating museological narrative using three models of cinematic interactivity. LP0453638
iDome is a proprietary hardware/software platform developed by the iCinema Centre that offers a cost-effective and compact immersive visualisation environment for panoramic and spherical content, whether video or computer generated. Ideally suited to museological applications, it is configured as a three- to five-metre fibreglass hemisphere that stands vertically in front of the viewer, together with a projector, computer, surround audio equipment and a user interface.
The iDome utilises a three-metre fibreglass dome as the surface for a 180-degree projection, made possible by a high-resolution projector and a spherical mirror as the reflection surface. The size and shape of this projection set-up cover the peripheral vision of a user standing directly in front of it, resulting in a truly immersive experience.
This approach has advantages over the "projector with fisheye lens" set-up used for the installation Conversations@the Studio at the Powerhouse Museum, Sydney: it does not require a large projector stand (which partly blocks the view), it is much more cost-effective, and it can take advantage of even higher-resolution projectors as they become available.
360-degree global recording
Video for the iDome was shot with a Ladybug camera system from Point Grey Research, which allows digital spherical recording with a 360-degree horizontal and 240-degree vertical field of view. The camera has a tightly packed cluster of six CCD sensors with wide-angle lenses and slight overlap between the images. In post-processing these individual frames are colour-corrected and geometrically corrected, then stitched into a high-resolution equirectangular image (3600 × 1800 pixels).
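The equirectangular format maps longitude to the image's horizontal axis and latitude to its vertical axis. A minimal sketch of that mapping, assuming the stitched 3600 × 1800 frame described above (the function name and coordinate conventions here are illustrative, not the Ladybug SDK's):

```python
import math

WIDTH, HEIGHT = 3600, 1800  # resolution of the stitched equirectangular frame

def direction_to_equirect(x, y, z):
    """Map a unit view-direction vector to (column, row) pixel coordinates
    in an equirectangular image.

    Longitude (yaw) spans the full width; latitude (pitch) spans the
    height from +90 degrees (top row) to -90 degrees (bottom row).
    """
    lon = math.atan2(x, z)                    # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. pi/2
    col = (lon / (2 * math.pi) + 0.5) * WIDTH
    row = (0.5 - lat / math.pi) * HEIGHT
    return col, row
```

For example, the straight-ahead direction (0, 0, 1) lands at the centre of the frame, and directions above the horizon map to rows nearer the top.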
Alternatively, the iCinema Spherecam can be used. The Spherecam has sufficient resolution to be able to “zoom” into the video and obtain a “close-up” of details.
A custom 3D software engine streams the high-resolution video from a disk array and uses it as a texture map projected onto the inside of a sphere. The geometric correction for the mirror is performed by a high-resolution distortion mesh in the render pipeline. The video texture is shifted inside the sphere according to the user's point of view, which simulates "looking around" the scene.
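Because the texture is equirectangular, rotating the user's view in yaw amounts to a horizontal shift of the texture coordinates, applied before the mirror-correction mesh. A minimal sketch under that assumption (the mesh is represented here simply as a list of (u, v) texture coordinates, one per vertex; the actual engine's data layout is not specified in the source):

```python
def warped_texcoords(mesh_uv, yaw_fraction):
    """Shift equirectangular texture coordinates horizontally to simulate
    the user rotating their point of view.

    mesh_uv: list of (u, v) pairs in [0, 1), one per distortion-mesh vertex.
    yaw_fraction: the user's yaw as a fraction of a full turn.
    Returns shifted (u, v) pairs, with u wrapping around the seam.
    """
    return [((u + yaw_fraction) % 1.0, v) for (u, v) in mesh_uv]
```

A quarter-turn (`yaw_fraction=0.25`) shifts every u coordinate by 0.25, wrapping values past 1.0 back to the start of the texture, so the panorama scrolls seamlessly across the 360-degree seam.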
The spherical movie is projected inside the dome with 180-degree coverage. A trackball allows the user to rotate the projection freely while the movie plays, and the multi-channel sound field is rotated together with the image: a simple vector-based panning algorithm distributes the multi-channel sound according to the user's principal point of view.
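One common form of vector-based panning over a ring of speakers is pairwise constant-power panning: the source direction is bracketed by the two nearest speakers and the signal is split between them with sine/cosine gains. A minimal sketch of that idea, assuming a horizontal speaker ring (the source does not specify the exact panning law, so this is an illustrative stand-in):

```python
import math

def ring_pan_gains(source_angle_deg, speaker_angles_deg):
    """Constant-power pan of a source direction across a ring of speakers.

    Finds the adjacent speaker pair bracketing the source angle and splits
    the signal between them with cos/sin gains; all other speakers are
    silent. Angles in degrees; speaker_angles_deg sorted ascending.
    """
    n = len(speaker_angles_deg)
    src = source_angle_deg % 360.0
    gains = [0.0] * n
    for i in range(n):
        a = speaker_angles_deg[i] % 360.0
        b = speaker_angles_deg[(i + 1) % n] % 360.0
        span = (b - a) % 360.0 or 360.0
        offset = (src - a) % 360.0
        if offset <= span:
            t = offset / span
            gains[i] = math.cos(t * math.pi / 2)
            gains[(i + 1) % n] = math.sin(t * math.pi / 2)
            break
    return gains
```

With four speakers at 0, 90, 180 and 270 degrees, a source at 45 degrees gets equal gains of about 0.707 on the first two speakers, keeping total power constant as the image rotates.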