Studios are increasingly using multi-camera systems to simplify editing and post-production. Several cameras record the scene simultaneously from different angles.
The Fraunhofer Institute for Integrated Circuits (IIS) has developed a plugin for The Foundry’s Nuke that combines the data from multiple cameras and computes a virtual camera move from it, something that was previously possible only with full CGI. The plugin derives new depth information from the recordings of multi-camera arrays, corrects colour and light, and thus provides the basis for a deceptively realistic virtual set.
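The article does not describe Fraunhofer’s actual algorithm, but the underlying idea of recovering depth from multiple viewpoints can be illustrated with a deliberately simple sketch: naive block matching between a rectified stereo pair, where the horizontal shift (disparity) that best matches each pixel’s neighbourhood is inversely related to depth. All names below are hypothetical; real multi-camera pipelines use far more robust methods.

```python
import numpy as np

def disparity_map(left, right, max_disp=8, block=3):
    """Naive block-matching disparity for a rectified stereo pair.

    For each pixel in the left image, search horizontal shifts d = 0..max_disp
    in the right image and keep the shift with the lowest sum of absolute
    differences over a small window. Larger disparity = closer object.
    """
    h, w = left.shape
    pad = block // 2
    # Pad with edge values so windows at the border stay in bounds.
    L = np.pad(left, pad, mode="edge").astype(np.float64)
    R = np.pad(right, pad, mode="edge").astype(np.float64)
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(h):
        for x in range(w):
            patch = L[y:y + block, x:x + block]
            best_cost, best_d = np.inf, 0
            # A left-image feature at column x appears at x - d on the right.
            for d in range(min(max_disp, x) + 1):
                cand = R[y:y + block, x - d:x - d + block]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic check: a textured image shifted by 2 pixels between the views.
rng = np.random.default_rng(0)
left = rng.integers(0, 256, size=(20, 30)).astype(np.float64)
right = np.empty_like(left)
right[:, :-2] = left[:, 2:]       # right view sees everything 2 px to the left
right[:, -2:] = left[:, -2:]      # fill the exposed border with junk
disp = disparity_map(left, right)
```

With this synthetic input, interior pixels of `disp` recover the known 2-pixel shift; a real pipeline would additionally handle occlusions, sub-pixel disparities, and per-camera colour differences.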
There’s already a taste of what’s to come at www.iis.fraunhofer.de/lightfield: in collaboration with Stuttgart Media University, IIS is demonstrating what light field technology can do in live-action filming.
Of interest to professionals: the researchers are currently looking for first-time users to test the new tool in their productions.