The CAVE2 Immersive Visualisation Platform at Monash University

The availability of massive data sets is driving the development of new large-scale visualization systems. One such system is the CAVE2 Immersive Visualisation Platform built at Monash University. Described as a next-generation immersive hybrid 2D and 3D virtual-reality environment, the facility adopts the latest design deployed at the Electronic Visualization Laboratory at the University of Illinois at Chicago. It combines a scalable-resolution display wall with virtual-reality methods to create a seamless 2D/3D environment, with the aim of matching the resolution of the virtual-reality simulation to human visual acuity.

By the numbers, it features:

  • A display system with a curved video wall of eighty 46″ 3D LCD panels, arranged in 20 four-panel columns;
  • Head and motion tracking facilities;
  • 3D sound facilities;
  • A high-performance compute and render cluster delivering one trillion computations per second for each of the 80 screens, for real-time display of 2D and 3D imagery;
  • A super-fast local disk system, enabling, for example, 30 frames-per-second playback of 84-megapixel images; and
  • A high-performance 10 Gbps network fabric.
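To see why such a fast disk system is needed, it helps to work through the arithmetic behind the 84-megapixel playback figure. The sketch below assumes uncompressed 24-bit RGB frames (an assumption for illustration; a real playback pipeline may use compression):

```python
# Back-of-envelope bandwidth estimate for the quoted playback figures.
PIXELS = 84_000_000      # 84-megapixel frame, as quoted above
BYTES_PER_PIXEL = 3      # assumed uncompressed 24-bit RGB
FPS = 30                 # target playback rate

bytes_per_second = PIXELS * BYTES_PER_PIXEL * FPS
print(f"{bytes_per_second / 1e9:.2f} GB/s")  # prints 7.56 GB/s
```

Roughly 7.5 GB/s of raw pixel data, well beyond a conventional single-disk system, which is why a parallel local disk layer matters here.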

Users will be able to build their own applications to take advantage of the following middleware:

  • The Scalable Adaptive Graphics Environment (SAGE) for large-screen, hybrid 2D/3D collaborative applications and meetings. SAGE comes configured with applications including image display (2D and 3D), movie playing (e.g. H.264 HD content), audio file playback and PDF document display. Additional applications we will be supporting in the near term (late 2013, early 2014) include desktop sharing. For programmers, SAGE provides the ability to stream pixels in from third-party programs running locally or remotely.
  • Omegalib and CalVR, two alternative frameworks for immersive 3D mode, each providing control input and head-tracked stereoscopic visualisation.
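To make the pixel-streaming idea concrete, here is a minimal sketch of streaming raw frames to a display server over a socket. This is a generic illustration of the concept only, not SAGE's actual API (SAGE has its own streaming library); the wire format with a width/height header is an assumption invented for this example:

```python
import socket
import struct

def pack_frame(width: int, height: int, rgb_bytes: bytes) -> bytes:
    """Prefix a raw 24-bit RGB frame with a width/height header.

    The '!II' header layout is a hypothetical wire format for this
    sketch, not SAGE's real protocol.
    """
    assert len(rgb_bytes) == width * height * 3, "expected 24-bit RGB payload"
    return struct.pack("!II", width, height) + rgb_bytes

def stream_frame(sock: socket.socket, width: int, height: int,
                 rgb_bytes: bytes) -> None:
    """Send one packed frame over an already-connected socket."""
    sock.sendall(pack_frame(width, height, rgb_bytes))
```

A producer application would call `stream_frame` once per rendered frame; the display-wall side would read the header, then the pixel payload, and composite it onto the wall.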

Here are some of the content formats that are already supported:

  • 3D geometry models composited from one or more Alias Wavefront (OBJ) format files;
  • 3D geometry stored in a single Autodesk FBX format model;
  • H.264-encoded QuickTime (MOV) files up to HD resolution; and
  • Single-frame high-resolution 2D images (JPEG).
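The Wavefront OBJ format mentioned above is a simple line-oriented text format, which is part of why it is so widely supported. The minimal reader below illustrates the format itself; it is a sketch for explanation, not the facility's actual loader, and it handles only vertex (`v`) and face (`f`) records:

```python
def parse_obj(text: str):
    """Minimal Wavefront OBJ reader: vertex positions and faces only.

    Ignores normals, texture coordinates, materials, and groups.
    """
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            # Vertex position: "v x y z"
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # Face indices are 1-based; "v/vt/vn" forms keep only
            # the leading vertex index, converted to 0-based.
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces

# Example: a single triangle.
verts, tris = parse_obj("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3")
```

Because several OBJ files can be loaded side by side, composite scenes can be assembled from independently modelled parts.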



