Full Panoramic Mixed Reality


Photorealistic rendering of synthetic objects into omnidirectional RGB-D panoramas


Panorama images have long served many roles: as the final representation of a set, as background plates, as reflection maps, as environment light maps, or even to support the modeling of the environment geometry. However, when a panorama's light representation is combined with the scene geometry, it enables rendering calculations not yet explored. These particular sets of panoramas are here called light-depth maps, or RGB-D panoramas. The intent of this project is to show that they can produce more accurate photorealistic rendering of shadows, lighting, and reflections for synthetic elements inserted into a captured scene than their current alternatives.
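The key idea behind an RGB-D panorama is that each pixel of an equirectangular image carries not only a color but also a distance along its view ray, so the panorama doubles as a point cloud of the scene. A minimal sketch of that mapping is shown below; the function name and the axis convention (y up, z forward) are illustrative assumptions, not part of the project's specification.

```python
import math

def panorama_pixel_to_point(u, v, depth, width, height):
    """Map an equirectangular RGB-D panorama pixel (u, v), whose depth
    value is the distance along the view ray, to a 3D point.

    Assumes u grows left-to-right over longitude [-pi, pi) and
    v grows top-to-bottom over latitude [pi/2, -pi/2].
    """
    lon = (u / width) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v / height) * math.pi
    # Unit direction on the sphere (y up, z forward by assumption)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (depth * x, depth * y, depth * z)

# The center pixel looks straight ahead, so a depth of 2.0 lands
# the point two units along the forward axis.
p = panorama_pixel_to_point(512, 256, 2.0, 1024, 512)
```

Points recovered this way give the scene geometry against which shadows and reflections of inserted synthetic objects can be computed.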

The relevance of this project is reinforced by the growing demand for panorama content production. This is driven in part by the modernization of old planetariums into digital fulldome projection systems, by panorama-capturing devices such as the Gigapan and the LadyBug, and by new gyroscope-friendly consumer devices and applications such as Google Street View and the soon-to-be-released Nintendo Wii U Panorama View.

Additionally, panoramas can be used in traditional filmmaking aimed at conventional displays. Light-depth maps can be built affordably and can raise the quality of augmented-reality rendering in such productions.