Dirtchamber: Mixed Reality Testing


The Dirtchamber, inspired by the Prodigy album of the same name, is a mixed reality (MR) testing environment for real-time global illumination algorithms. It evolved while I was writing papers to figure out whether a global illumination algorithm can be adapted to serve as a relighting solution in an MR scenario. Two main products came out of this work: Delta Light Propagation Volumes and Delta Voxel Cone Tracing.

This suite features four samples:

  • A Light Propagation Volume renderer sample
  • A Delta Light Propagation Volume MR renderer
  • A Voxel Cone Tracing renderer sample
  • A Delta Voxel Cone Tracing MR renderer

Both MR and non-MR samples are rendered with a deferred renderer and a post-processing pipeline that includes FXAA, SSAO, depth of field, crepuscular rays, bloom, a CRT monitor effect, TV grain, exposure adaptation, and tonemapping. The model loader uses Assimp, so you should be able to load a wide range of model formats. The deferred shader uses a physically-based BRDF with GGX as its normal distribution function. The two MR samples share a common source file, as do the two non-MR renderers, and each switches between the volumetric GI methods with a preprocessor define.
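
For reference, the GGX (Trowbridge-Reitz) normal distribution function mentioned above has a compact closed form. The C++ sketch below shows the standard formula only; the actual Dirtchamber shaders are HLSL and may use a different roughness remapping, so the function name and the alpha = roughness^2 convention here are assumptions, not taken from the code.

    // GGX / Trowbridge-Reitz normal distribution function.
    // n_dot_h: cosine between the surface normal and the half vector.
    // roughness: perceptual roughness, remapped here via alpha = roughness^2
    // (a common convention, assumed rather than taken from the Dirtchamber).
    float ggx_ndf(float n_dot_h, float roughness)
    {
        const float pi = 3.14159265358979f;

        float alpha  = roughness * roughness;
        float alpha2 = alpha * alpha;
        float denom  = n_dot_h * n_dot_h * (alpha2 - 1.0f) + 1.0f;

        return alpha2 / (pi * denom * denom);
    }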

The Dirtchamber uses Dune, a Direct3D helper library with many classes that simply wrap and manage the absurd number of pointers and structs needed to upload anything to the GPU. Hopefully you'll find something useful in there too.
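
To illustrate the kind of boilerplate such a helper library hides, here is a minimal, self-contained sketch of a constant-buffer wrapper for Direct3D 11. The class name and interface are hypothetical and not taken from Dune; the sketch only shows the general pattern of owning the GPU resource and re-uploading a CPU-side struct.

    #include <d3d11.h>
    #include <wrl/client.h>

    // Hypothetical wrapper, not the actual Dune API: owns an ID3D11Buffer and
    // re-uploads a CPU-side struct of type T to the GPU on demand.
    template <typename T>
    class cbuffer
    {
        Microsoft::WRL::ComPtr<ID3D11Buffer> buffer_;

    public:
        T data = {};

        void create(ID3D11Device* device)
        {
            D3D11_BUFFER_DESC desc = {};
            // Constant buffer sizes must be multiples of 16 bytes.
            desc.ByteWidth = (sizeof(T) + 15u) & ~15u;
            desc.Usage     = D3D11_USAGE_DEFAULT;
            desc.BindFlags = D3D11_BIND_CONSTANT_BUFFER;
            device->CreateBuffer(&desc, nullptr, buffer_.GetAddressOf());
        }

        void upload(ID3D11DeviceContext* context)
        {
            // Copy the CPU-side struct into the GPU buffer.
            context->UpdateSubresource(buffer_.Get(), 0, nullptr, &data, 0, 0);
        }

        ID3D11Buffer* get() const { return buffer_.Get(); }
    };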

Tracking is implemented with OpenCV on top of textures representing Kinect cameras or simple webcams. If you have access to neither, you can still use the tracking code with still images, since it only operates on cached textures containing an image, regardless of where that image came from. This functionality is also useful for exchanging scenes with other researchers to compare different rendering solutions for mixed reality.
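
As a rough illustration of that source-agnostic design (an assumed interface, not the Dirtchamber's actual tracking code), the following C++ sketch fills a cv::Mat either from a live capture device or from a still image on disk; everything downstream only ever sees the resulting frame.

    #include <string>
    #include <opencv2/opencv.hpp>

    // Hypothetical helper: return a frame from a live camera if one is open,
    // otherwise fall back to a still image. Downstream code (caching the frame
    // as a texture, running the tracker) does not care which path was taken.
    cv::Mat grab_frame(cv::VideoCapture& camera, const std::string& fallback_image)
    {
        cv::Mat frame;

        if (camera.isOpened())
            camera >> frame;                     // webcam or Kinect color stream

        if (frame.empty())
            frame = cv::imread(fallback_image);  // still image stand-in

        return frame;
    }

A webcam would be opened with cv::VideoCapture cam(0); passing a default-constructed capture makes the function read scene images from disk instead, which is exactly how a cached still image can stand in for a camera.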

