The Metamorphosis of Storytelling: Time-based Interactivity in Virtual Reality Filmmaking

Research by Kai Feng Wu & Dana Karwas

read the paper (pre-publication) 

Our work presents a novel technique for free navigation in a Virtual Reality film and explores the spatial-temporal narrative implications of such a technique.

With the increasing popularity of VR filmmaking, filmmakers who choose to represent a realistic scene currently face a choice between graphically intensive real-time 3D rendering on a high-end VR system and a solution based on Omnidirectional Stereoscopic (ODS) imaging captured by a camera. The latter, commonly known as 360 video, suffers from a fixed viewpoint. To address this issue, many recent advancements in the field have proposed novel view reconstruction from spherical panoramas, planar image-based rendering, light field rendering, and other techniques for free-viewpoint navigation.

We explore the interactive implications of implementing such techniques in VR filmmaking: specifically, the temporal relationship between viewpoints and the experience of navigating through them. From an initial proof of concept with a linear path, we continued to develop the technique through installations and experimented with an infinitely looping circular path around the clock tower of New York City's Grand Central Terminal.
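The looping circular path can be thought of as a ring of pre-captured viewpoints that the viewer walks through. A minimal sketch of the core mapping, assuming a hypothetical capture setup in which panoramas are taken at evenly spaced angles around the path's center (the function name and parameters are illustrative, not the authors' actual implementation):

```python
import math

def nearest_viewpoint(user_x, user_z, num_captures):
    """Map a user's position on the floor plane to the index of the
    nearest pre-captured viewpoint on a closed circular path.

    Assumes `num_captures` panoramas captured at evenly spaced
    angles around the circle's center (hypothetical setup).
    """
    # Angle of the user around the path center, normalized to [0, 2*pi)
    theta = math.atan2(user_z, user_x) % (2 * math.pi)
    # Angular spacing between captured viewpoints
    step = 2 * math.pi / num_captures
    # Wrapping with modulo is what makes the path loop infinitely:
    # walking past the last viewpoint returns to the first
    return round(theta / step) % num_captures

# Example: 360 captures, one per degree of the circle
print(nearest_viewpoint(1.0, 0.0, 360))   # viewpoint at 0 degrees -> 0
print(nearest_viewpoint(0.0, -1.0, 360))  # wraps around the loop -> 270
```

Because the index wraps, the viewer never reaches an edge of the capture set, which is what allows an infinitely looping walk around a subject like the clock tower.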

We aim to integrate multiple optimizations and techniques into an easy-to-use open-source package for filmmakers, including storage optimizations, novel view synthesis, and locomotion redirection. We have found that while the technical package is complicated to explain, the experience is delightfully intuitive. By sharing our discoveries, we hope to enable filmmakers of the future to tell compelling stories that matter.

The interactive installation was developed and showcased at Yale CCAM in December 2021.

The resulting paper was accepted for publication and demonstration at ACM SIGCHI 2024.