
Section 31 / Behind the Virtual Production - The Baraam Lounge

  • Fausto Tejeda
  • Jul 3, 2025
  • 4 min read

Since its debut in 1966, Star Trek has consistently pushed the boundaries of storytelling and technology. From imaginative alien landscapes to futuristic starships, the franchise has always used cutting-edge tools to transport audiences to bold new worlds. Over the decades, this commitment to innovation has only grown stronger.

In Section 31, the latest chapter in the Star Trek universe, that legacy continues—with the help of groundbreaking Virtual Production techniques. In this post, we’ll take a closer look at how state-of-the-art technology played a pivotal role in bringing this cinematic vision to life. From immersive LED volumes to real-time environments powered by Unreal Engine, we’ll explore how the PXO team used modern tools to tell a timeless story in a way that feels both epic and grounded.


Creative Process


For Star Trek: Section 31, Virtual Production wasn’t just a visual tool—it was a storytelling partner. Two of the film’s key locations, the Baraam Lounge and the Outpost Tunnel, were fully realized using real-time environments projected on an LED volume.

These environments offered the creative team unprecedented flexibility while preserving the cinematic feel expected from the Star Trek franchise.


The Baraam Lounge


  • The Baraam Lounge was a significant technical achievement—it was Pixomondo's first fully enclosed 360° Virtual Production Environment. It was engineered with integrated wild walls to allow physical camera movement in and out of the space, offering filmmakers an unprecedented level of flexibility while maintaining seamless visual continuity.

    Two large practical pillars on set were seamlessly extended into the Volume, which wrapped around the back of the virtual bar.

  • We had to account for two different lighting scenarios: Neutral and Party Mode. While Neutral provided a more nuanced, calm environment, Party Mode included moving spotlights and flashing lights. These lighting changes also had to affect all the characters and structural elements in the environment.

  • To enhance the realism and scale of the scene, we incorporated a total of 599 digital doubles (a mixture of humans and aliens alike). These were a mix of full digi doubles and flipbook elements built from pre-recorded clips of the actors in costume. The crowd broke down into 258 2D flipbook characters and 341 3D characters.

  • Our team relied heavily on Unreal Engine's features, using Nanite to handle dense geometry and light bakes to deal with the various lighting conditions.

  • Actors could interact directly with their environment, which led to more grounded performances and faster turnaround times. Rather than relying on green screen compositing in post, we captured final-pixel in-camera VFX (ICVFX), meaning what the camera saw on set was already the final look—reducing postproduction workloads and enabling more collaborative lighting and cinematography decisions in the moment.
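The crowd numbers above suggest a level-of-detail split between cheap 2D flipbook cards and full 3D digi doubles. As a rough illustration of that kind of trade-off (not PXO's actual pipeline), here is a minimal sketch in which a hypothetical distance threshold decides which representation each background character gets; the `CrowdAgent` type, `choose_representation` function, and the 15 m cutoff are all invented for this example:

```python
from dataclasses import dataclass

@dataclass
class CrowdAgent:
    name: str
    distance_m: float  # hypothetical distance from the nearest camera hotspot

def choose_representation(agent: CrowdAgent, threshold_m: float = 15.0) -> str:
    """Pick a cheap 2D flipbook card for distant agents and a full
    3D digi double for anyone close enough to read in camera.
    The threshold is an illustrative value, not a production number."""
    return "3d_digi_double" if agent.distance_m <= threshold_m else "2d_flipbook"

# Toy crowd: two patrons near a hotspot, one far in the background.
crowd = [
    CrowdAgent("alien_patron_01", 4.2),
    CrowdAgent("human_patron_07", 22.8),
    CrowdAgent("bartender", 2.0),
]
counts = {"3d_digi_double": 0, "2d_flipbook": 0}
for agent in crowd:
    counts[choose_representation(agent)] += 1
```

In a real volume the decision would also weigh GPU budget and shot coverage, but the idea is the same: spend full 3D characters only where the camera can tell the difference.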

Virtual Art Department

  • The process began in preproduction, where we worked closely with Production Designer Paul Kirby, the Art Department team, and Director Olatunde Osunsanmi to translate concept art into a workable layout inside Unreal Engine. This allowed us to prototype environments at scale, plan lighting, and explore camera moves weeks before physical production began. Our Virtual Art Department (VAD) played a crucial role here, quickly iterating high-level design ideas into Unreal-ready assets. We scanned practical set elements and integrated them into our environment to ensure visual continuity.


  • During this process, we held bi-weekly reviews on the stage leading up to the shoot, in which we defined the scale and layout of our virtual assets. We also established which sections would be our hotspots (main areas of interest).





On-Set Integration

  • On set, the real-time nature of Virtual Production meant that departments had to work in concert—often simultaneously. For example, as the DP adjusted lighting for a scene, our Unreal team would tune environmental light sources to match. Since we were capturing final-pixel content using in-camera VFX (ICVFX), these decisions were no longer theoretical: they were locked in with each take, often within seconds or minutes.


  • To maintain spatial accuracy, we used camera tracking systems synced with Unreal’s virtual cameras. This ensured correct parallax and frustum alignment as the real camera moved. This proved to be quite complicated, given the number of practical elements in the lounge, but our tracking team was ready with a solution for every hurdle. One of the final steps before the shoot was scanning our practical set and lining it up with our virtual environment, to ensure pixel-perfect accuracy.
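The parallax idea above comes down to simple geometry: the LED wall must display each virtual point where the line from the tracked camera to that point crosses the wall. Here is a deliberately simplified sketch, assuming the wall is the plane z = 0, the camera sits in front of it (z < 0), and the virtual scene extends behind it (z > 0); the function name and coordinate setup are invented for illustration:

```python
def wall_sample_point(camera, virtual_point):
    """Intersect the ray from the tracked camera to a virtual scene point
    with the LED wall plane z = 0. Returns the (x, y) position on the wall
    that must display the point so it lines up from the camera's view."""
    cx, cy, cz = camera
    vx, vy, vz = virtual_point
    t = -cz / (vz - cz)  # fraction along the ray where it crosses z = 0
    return (cx + t * (vx - cx), cy + t * (vy - cy))

# A virtual point 2 m behind the wall, camera 2 m in front of it:
print(wall_sample_point((0.0, 0.0, -2.0), (0.0, 1.0, 2.0)))  # (0.0, 0.5)
# Slide the camera 1 m right and the wall sample shifts: that shift is parallax.
print(wall_sample_point((1.0, 0.0, -2.0), (0.0, 1.0, 2.0)))  # (0.5, 0.5)
```

Because the result depends on the camera position, any tracking error moves the projected image on the wall, which is why the practical set had to be scanned and aligned to the virtual environment before shooting.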


Real-Time Control & Flexibility

  • What made the workflow truly powerful was our ability to respond to creative input live. If the director wanted a different color on the nebula, a deeper shadow on the balconies, or a pulsing light effect, those changes could be implemented instantly by the Virtual Production team without resetting the scene. At PXO, we create all our environments to be R.A.F. ready:

    • Reliable | Needs to be performant, first and foremost. Without a working environment, we can’t shoot.

    • Adaptable | Environment needs to be built in a way that anyone who’s operating it during a shoot can easily find their way around and quickly address notes.

    • Flexible | The ability to make changes (within reason) on the fly, to accommodate DP and Director feedback (such as adding a new hotspot on the fly).

  • For instance, we could trigger different reactions from our digi doubles based on Director cues, from standing idle to reacting to a stunt sequence.

  • We also had quick controls to switch between our lighting scenarios: a standard relaxed mode, and a party mode with spotlights swinging left and right and flickering lights on the balconies. All of these affected light bounces on the environment and characters.

  • We included controls to change the color and intensity of the background nebulae and the twinkle of our stars.
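The live controls described above amount to a small set of named parameters the on-stage operator can change without resetting the scene. As a conceptual sketch only (the class, mode names, and parameters are invented for this example, not PXO's actual tooling), a minimal state holder for that kind of control might look like:

```python
class LoungeControls:
    """Toy state holder for live show controls: a lighting mode plus
    nebula color/intensity, all adjustable mid-shoot without a scene reset."""
    MODES = {"neutral", "party"}

    def __init__(self):
        self.mode = "neutral"
        self.nebula_color = (0.6, 0.2, 0.8)  # RGB in 0..1, illustrative default
        self.nebula_intensity = 1.0

    def set_mode(self, mode: str) -> None:
        if mode not in self.MODES:
            raise ValueError(f"unknown lighting mode: {mode}")
        self.mode = mode

    def set_nebula(self, color=None, intensity=None) -> None:
        if color is not None:
            self.nebula_color = color
        if intensity is not None:
            self.nebula_intensity = max(0.0, intensity)  # clamp to non-negative

    def spotlight_active(self) -> bool:
        # Moving spotlights and balcony flicker only run in party mode.
        return self.mode == "party"
```

In a real stage setup these parameters would drive Unreal light and material properties over a live link, but the operator-facing idea is the same: a handful of safe, named knobs that can be turned between takes or mid-rehearsal.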



In the end, everyone involved was thrilled with the results, and all the innovations we provided for this environment. It was a testament to what a great, synergistic working relationship can do.
