VFX solutions company Foundry is shaking up virtual production with Nuke Stage, which gives filmmakers an end-to-end solution, from preproduction to final pixel, within a single content creation pipeline, eliminating the need for a real-time game engine in the process. Moreover, Nuke Stage uses industry-standard VFX tools and offers live compositing and layout.
So much for tradition. Virtual production has changed how films are being made, replacing the age-old bluescreen and greenscreen. It has grown rapidly over the past several years thanks to the many advantages it brings to filmmakers, from directors to the cast and crew, all of whom can visualize the set and environment in real time. The technique combines traditional methods with digital ones: high-resolution CG imagery is displayed on large LED screens, resulting in realistic environments and effects.

There are plenty of advantages to virtual production, but they come with changes to the traditional, familiar VFX workflow. One is that the CG imagery needs to be produced in advance, in preproduction instead of in post. Another is that, to achieve the real-time rendering key to virtual production, a game engine such as Epic's Unreal Engine is typically inserted into the process.
However, Foundry is changing the virtual production game, announcing Nuke Stage, an end-to-end solution for virtual production and in-camera visual effects (ICVFX) workflows.
Nuke Stage is purpose-built, stand-alone software that links preproduction through final pixels in one pipeline, giving VFX artists full creative control over imagery and color from start to finish. It enables real-time playback of photorealistic environments on LED walls, along with live compositing and layout, all without a game engine or other virtual production or ICVFX tools the team may be unfamiliar with.
Foundry says Nuke Stage brings key aspects of compositing, tailored for real-time playback in an on-set environment, and utilizes open standards to ensure consistency across preproduction, on-set work, and post.
Through a shared color management system, open file formats, and a familiar node graph-based compositing environment, Nuke Stage enables VFX artists to create, refine, and deliver content that transitions seamlessly from preproduction to real-time iteration on set. The high-res content can then be played back as 2D, 2.5D, or 3D imagery on the LED walls.
Teams can use industry-standard open technologies, including OpenUSD, OpenEXR, and OpenColorIO, throughout the process, as well as standard content creation tools. Nuke Stage is also hardware-agnostic, so studios can use their preferred hardware to synchronize playback across render node clusters.
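Foundry has not detailed Nuke Stage's internals, but the shared color management it describes is the kind of workflow OpenColorIO enables. As a rough, hypothetical illustration of how a single OCIO config can keep color consistent between preproduction renders and on-set playback, the Python sketch below loads a config and converts a pixel value between color spaces; the config path and color space names are placeholders and would come from a production's own setup, not from Nuke Stage itself.

```python
# Illustrative sketch only: one shared OpenColorIO config driving consistent
# color across tools. The path and color space names are placeholders.
import PyOpenColorIO as OCIO

# Every tool in the pipeline would reference the same .ocio config,
# typically pointed to by the OCIO environment variable.
config = OCIO.Config.CreateFromFile("/shows/myshow/config.ocio")  # hypothetical path

# Build a processor from the working space to the LED wall's display space.
# "ACEScg" and "Rec.2020 - Display" are example names; real ones depend on the config.
processor = config.getProcessor("ACEScg", "Rec.2020 - Display")
cpu = processor.getDefaultCPUProcessor()

# Convert a single mid-gray RGB pixel; in practice this runs over full EXR frames.
rgb = cpu.applyRGB([0.18, 0.18, 0.18])
print(rgb)
```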
One virtual production studio user described Nuke Stage as offering a handshake between VFX and virtual production, something he says has been missing in VP. He adds that getting VFX teams on board will help push the adoption of virtual production. No doubt, having the Foundry branding behind the new application is a plus, especially given studios' familiarity with the Nuke compositor. In fact, as part of the Nuke family, Nuke Stage has a UI that will be familiar to users, Foundry says, offering them a recognizable workflow for building their imagery.
Meanwhile, Sam Kemp, production technical lead at Garden Studios, expressed his excitement about Foundry bringing VFX compositing tools into real time, enabling users to manipulate and blend virtual and physical sets with a node-based compositing toolkit optimized for real-time performance.
The virtual production process has been evolving recently. Last week, Chaos rolled out Chaos Arena, which enables teams to use their typical pipeline to quickly move 3D scenes from popular DCC software onto LED walls, using the same assets from preproduction to post.
