Virtual production revolutionized filmmaking. Despite the large, expensive setups, the technology opened the door to new, streamlined, less costly methods of production. But, as we know, there is always something new and better around the corner. To that end, Lightcraft Technology, provider of one such virtual production system, has pivoted from those roots and developed an entirely new method that’s faster, mobile, and much, much cheaper, making it affordable for nearly everyone. This new system is built around an iPhone.
To be successful in filmmaking today, production and post teams must adapt to ongoing demands and changes within the industry. Lightcraft Technology understands this and, in fact, has followed this principle even within its own company walls.

Lightcraft has undergone a transformation since its founding nearly two decades ago by CEO Eliot Mack. A few years ago, with assistance from Bill Warner, founder of Avid (known for its video and audio editing software used across the media and entertainment industry), Lightcraft radically reinvented its virtual production system and, in the process, is transforming the filmmaking industry once again.
Mack and Warner found common ground in their shared drive to find a better way of doing things. Years ago, Mack had caught the filmmaking bug and tried to make a short film but had become frustrated with the visual effects process. Warner had been working on marketing videos at the time while at Apollo Computer and had grown frustrated with linear editing decks, commenting to Mack that he should come up with a better system. Just two guys venting, right? Wrong.
Warner went on to launch Avid, which shook up the industry by introducing a real-time digital non-linear editing system in 1989, following in the years since with a host of other innovative tools and technology for the ever-evolving M&E industry.
Meanwhile, Mack was convinced that he could build a machine that could figure out the location of a camera in 3D space. So, he put his engineering/robotic experience to work (he had designed the original iRobot Roomba and walking dinosaurs at Disney Imagineering) and started Lightcraft in an attempt to solve the visual effects problems he himself had encountered.
“I wanted my own sort of Industrial Light & Magic in a box that did all the groundwork for me so I could have this expansive vision without having to hire a thousand people—something unavailable to me [and others] at the time,” Mack explains.
In his quest, Mack had learned a bit about motion-picture work and enough about visual effects to understand how they worked. He also realized that tracking was a vital part of the process.
This was about 20 years ago, and Lightcraft was born. The company built its first product, Previzion, a real-time visual effects system that provided high-precision camera tracking, rendering, and VFX keying for use in on-set compositing of virtual backgrounds and CGI characters. The Previzion Virtual Studio System also provided accurate camera-tracking metadata for use in postproduction rendering, as well as on-set visualization for feature films, commercials, and high-end television production. The systems were large, hard to use, and expensive—$150,000–$180,000 each, which was out of reach for indie creators, including Mack. However, they made it possible to do things that nothing else could do at the time.

In 2013, the company received an Engineering Emmy for the Previzion system, which was used on notable, cutting-edge productions including Once Upon a Time, Alice in Wonderland, and many others.
Round Two
Looking to the future, Mack began rethinking the system. He contacted Warner about five or six years ago, telling him, “I know there’s something here, some type of magic,” but it was eluding him, and he couldn’t figure it out. The two joined forces and began dissecting the process into its necessary parts, determining what was really needed to create pixels on the screen. “We were rethinking this fundamental process,” Mack says.
“People think of virtual production as giant LED walls and roomfuls of stuff. It’s really the merging of live action and CG in real time so you can see what you’re doing. That’s it. You need to know where the camera is and how to put the pieces together. LED walls are one way of doing it, but they’re an extraordinarily expensive way to do this—it’s $40,000 a day to light one of those things up,” says Mack. “We realized, wait a second, we can do this with the phone that’s already in everybody’s pocket.”
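Mack boils the problem down to two parts: knowing where the camera is, and combining the live and CG pieces. As a rough illustration of the first half only (hypothetical code, not Lightcraft’s implementation), the sketch below projects a 3D point into pixel coordinates given a camera extrinsic matrix and pinhole intrinsics, the kind of per-frame math any camera-tracking pipeline performs:

```python
import numpy as np

def project_point(point_world, world_to_cam, fx, fy, cx, cy):
    """Project a 3D world point into pixel coordinates using a
    pinhole camera model. world_to_cam is a 4x4 extrinsic matrix."""
    p = world_to_cam @ np.append(point_world, 1.0)  # to camera space
    x, y, z = p[:3]
    if z <= 0:
        return None  # point is behind the camera
    return (fx * x / z + cx, fy * y / z + cy)

# Camera at the origin looking down +Z, with made-up 1080p intrinsics.
u, v = project_point(np.array([0.0, 0.0, 2.0]), np.eye(4),
                     1000, 1000, 960, 540)
print(u, v)  # a point on the optical axis lands at the principal point
```

With the tracked camera pose updated every frame, the same projection places CG elements so they stay locked to the live-action plate.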
This, of course, was the time when iPhones were starting to do tracking in 3D space, a capability that substantially improved with the iPhone 12 Pro and later generations.
“This was one of those times when, every once in a while, a watershed moment occurs and the world changes,” says Mack. “So we switched gears. We stopped building [Previzion] hardware and started building Jetset.”
Fast-forward a few years, and Lightcraft had implemented essentially every piece of the virtual production pipeline, building all the infrastructure and tooling, an entire toolchain. “We started out aiming for and accelerating the postproduction side of things. But this new piece, the iPhone, meant we could bring virtual production everywhere,” Mack notes.

Jetset, introduced in February 2024, is a mobile app that enables users to access a full range of virtual production tools for any stage in the process—from early story development to the last days of post, all using an iPhone or iPad. No longer were large, cumbersome, expensive systems needed for virtual production. While this new invention spelled the end for Lightcraft’s Previzion system, that endeavor still lives in part within the Jetset tech.
Lightcraft has been working continuously on Jetset for a half decade now, having just announced a major update that introduces live cinematic compositing (made possible by Accsoon SeeMo), enabling users to block shots and create dailies quickly. And even though the iPhone runs at 30 fps, since Jetset is handling the rendering and compositing, it can generate output footage at 24 fps, Lightcraft says. The update also adds Gaussian Splats support, improving the quality of the composited footage and enabling near-photorealism at real-time speeds.
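Lightcraft hasn’t published how Jetset derives 24 fps output from 30 fps capture, but one common retiming approach, shown here purely as an assumption, is to map each output frame’s timestamp to the nearest captured sample:

```python
def retime_nearest(n_out, fps_in=30.0, fps_out=24.0):
    """Map each output frame index at fps_out to the nearest
    captured frame index at fps_in (nearest-sample retiming)."""
    return [int(i * fps_in / fps_out + 0.5) for i in range(n_out)]

# Which 30 fps capture frames feed one second of 24 fps output:
print(retime_nearest(24))
```

The same index mapping can be applied to the tracking samples so that camera pose and picture stay in sync after retiming.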
New integrations with Aximmetry—known for its high-quality keyer and virtual production software—provide a fast path to Epic’s Unreal Engine for a wide range of productions, including YouTube, live-to-tape, and even LED wall-based projects. Lightcraft points out that Jetset puts Gaussian Splats, compositing, keying, tracking, scanning, and more all in one place.
Running on an iPhone, Jetset can operate as the primary camera or, with the Cine version, connect with mirrorless cinema cameras (e.g., the Sony FX3 and Canon R5) or popular production cameras from RED and ARRI. Mack explains that once connected, Jetset will capture the camera’s original video, a real-time composite, a lidar depth map, and the complete tracking information, so an organized set of data can be used throughout the entire production.
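Mack lists four data streams captured per take. Purely as an illustration, with hypothetical type names and file paths not taken from Jetset, such a per-take bundle might be organized like this:

```python
from dataclasses import dataclass, field

@dataclass
class TrackingSample:
    """One camera pose sample: a timestamp plus a 4x4 camera-to-world
    transform stored row-major as 16 floats."""
    time_s: float
    transform: tuple

@dataclass
class Take:
    """Hypothetical bundle of the per-take data the article describes:
    original camera video, real-time composite, lidar depth, tracking."""
    original_video: str   # path to the camera's original footage
    composite_video: str  # path to the real-time composite
    depth_map: str        # path to the lidar depth sequence
    tracking: list = field(default_factory=list)  # TrackingSample list

# Example take with made-up file names and an identity camera pose.
take = Take("A001_cam.mov", "A001_comp.mov", "A001_depth.exr")
take.tracking.append(TrackingSample(0.0, (1.0, 0, 0, 0,
                                          0, 1.0, 0, 0,
                                          0, 0, 1.0, 0,
                                          0, 0, 0, 1.0)))
```

Keeping the four streams together under one record is what lets the same organized data travel from set to post, as the article describes.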

And because it runs on an iPhone or iPad, Jetset can bring more freedom to any environment, indoor or outdoor, allowing creatives to blend real-time 3D and live-action footage just about anywhere, and do it fast and efficiently.
“People just accept that productions have to be slow and heavy, when they really don’t,” says Mack. “They are used to heavy infrastructure and the brain bar, and it running in the phone is a mental leap.”
Nor do the systems have to be super expensive. According to Lightcraft, Jetset makes virtual production 20× cheaper while offering near-photorealistic quality. As a result, it opens the door for more independent filmmakers to pursue their vision.
The new Jetset update is available now in Standard, Pro, and Cine versions. Standard licenses are free, while the Pro and Cine versions, which offer more production-focused features, cost $20 and $80 per month, respectively.
The Jetset update is off to a running start. Introduced at this year’s NAB, it received a 2025 NAB Product of the Year Award in the Camera Support, Control, and Accessories category.
Meanwhile, Mack and his team continue refining Jetset. The company is working on AI features, particularly to facilitate the creation of photorealistic 3D sets, including an AI relighting system.