Navaz and Roman Dowling extensively used virtual production to produce Entrenched, a polished Star Wars fan film that has seen them gain work in film and TV

Producing a visually ambitious Star Wars-inspired fan film for just $3,000 (£2,300) wasn’t a stunt - it was a strategic exercise in rethinking how cinematic storytelling can be achieved under constraint. Virtual production wasn’t just a tool we used; it was the foundation that allowed us to bypass the cost structures of traditional filmmaking while preserving creative control, visual fidelity, and narrative scale.
In conventional workflows, the cost of production scales rapidly with physical logistics. Location scouting, set construction, lighting rigs, camera rentals, crew hires, and post-production VFX all contribute to ballooning budgets. Even a modest sci-fi short can easily cross into five- or six-figure territory. Our approach inverted that model. By front-loading asset creation and leveraging real-time virtual production tools, we eliminated the most expensive line items before they ever appeared on a budget sheet.
Jetset was central to this transformation. It allowed us to render and manipulate virtual environments in real time, compositing live-action footage directly into digital scenes during the shoot. This meant we didn’t need to build physical sets or rent studio space. We shot the entire film in a garage using an iPhone and a green screen. Jetset’s real-time feedback loop let us adjust lighting, camera angles, and scene composition on the fly - something that would typically require a full lighting crew and post-production color grading.
Blender, the open-source 3D creation suite, was our asset pipeline. We spent months building the terrain, vehicles, and environmental details of Hoth with forensic precision. In a traditional VFX workflow, this would involve multiple artists, proprietary software licenses, and render farms. Blender allowed us to work independently, iterate freely, and maintain full control over the visual language - all without incurring those costs.
But virtual production isn’t just about software—it’s about shifting the entire production logic. In traditional filmmaking, many creative decisions are deferred until post: visual effects, environment integration, even final framing. With virtual production, those decisions move to the front. We had to lock in story beats, camera logic, and asset fidelity before we ever hit record. That required more planning, but it also meant that once we were on set, we could shoot with surgical efficiency.
This front-loaded approach created a stable foundation for iteration. After screening our teaser at AI On The Lot, we received feedback from industry professionals and went back to refine several scenes. In a conventional pipeline, that would mean reassembling crews, reshooting footage, and re-rendering effects. For us, it was a matter of adjusting virtual camera paths, tweaking terrain geometry, and re-exporting composite shots. The cost? Minimal. The flexibility? Game-changing.
We also made deliberate choices about hardware. Instead of a cinema-grade camera such as a Blackmagic 6K, we used an iPhone; for our reshoots, we used a Canon 80D. That wasn’t a compromise - it was a calculated decision based on our controlled lighting setup and Jetset’s compositing precision. The footage held up because the environment was engineered to support it. In traditional workflows, camera choice dictates much of the budget and workflow complexity. We flipped that equation by designing the environment to accommodate the camera, not the other way around.
Costuming and props followed the same logic. Props were 3D printed, weapons were hand-painted, and outfits were sewn by friends. We didn’t outsource fabrication or rent gear. By designing assets to integrate with virtual environments, we avoided the need for high-end physical detailing. In traditional productions, props and costumes must withstand close scrutiny under varied lighting and camera conditions. In our pipeline, they were optimized for compositing—meaning we could focus on silhouette, texture, and integration rather than durability or realism under studio lights.
The result was a film that visually evokes the scale and atmosphere of Star Wars while operating on a fraction of the budget. More importantly, it demonstrates that virtual production isn’t just a cost-saving measure - it’s a creative equalizer. It allows independent filmmakers to tackle stories that would otherwise be out of reach, not by cutting corners, but by redesigning the corners themselves.
Virtual production gave us the ability to plan with precision, shoot with agility, and iterate without penalty. Compared to traditional workflows, it replaced logistical overhead with creative freedom. And in doing so, it turned a garage shoot into a cinematic experience.

Navaz and Roman Dowling are co-founders of Bad Beetle Entertainment