
VR and Filmmaking Collide in the Making of “The Lion King”

Disney’s “The Lion King,” directed by Jon Favreau, is a nearly full CG feature with a photoreal aesthetic that was essentially filmed in virtual reality (VR). To achieve the photoreal look, Favreau and his team, including VFX Supervisor Rob Legato, the creative geniuses at Magnopus and artists at MPC, developed the film’s environments as 360-degree assets, and then optimized them along with the film’s characters for real-time game engine rendering. This virtual production approach effectively transformed “The Lion King” world into a VR experience, allowing Favreau, Legato and Cinematographer Caleb Deschanel to shoot the film utilizing traditional live-action techniques.

As with most large-volume, multiparticipant VR experiences, motion-capture technology helped define the film’s shooting volume, which was approximately 120 feet by 60 feet overall but segmented for different uses. A ceiling-mounted truss with 60 OptiTrack Slim 13 and Prime 17W cameras outlined roughly 40 feet by 60 feet of the overall volume, precisely tracking active-markered objects in the space for the movie’s sweeping shots.

“Our original directive was to design a virtual production system using commercially available technology that would feel familiar to professional filmmakers,” said Magnopus Virtual Production Producer AJ Sciutto. “In building out the setup, we knew we’d need the flexibility to record large-scale movements, like running characters, so room-scale tracking wouldn’t cut it for a fair number of shots. High performance with the lowest possible solve latency was a must, and OptiTrack Active provided that.”

“We designed the volume truss system with the OptiTrack system in mind and their support team worked closely with ‘The Lion King’ Key Grip Kim Heath to deploy the setup on stage,” added Magnopus CTO Lap van Luu. “OptiTrack’s auto-calibration feature saved us a ton of time once we got it working in production.”

Magnopus built intuitive virtual production interfaces that allowed users to interact with the technology in a creatively driven way. The custom tools took the data tracked and solved by the OptiTrack system and multicast it over the facility’s network to the game engine. The data arrived as a solved object, simplifying how filmmakers could determine orientation and position in the volume and make changes at will. Magnopus Virtual Production Operators Fernando Rabelo and Mark Allen kept all systems running smoothly throughout production and addressed one-off requests from filmmakers.
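OptiTrack does ship a NatNet SDK for streaming solved data, but the Magnopus tools themselves are proprietary, so the sketch below only illustrates the general pattern the paragraph describes: packing one solved rigid-body pose (position plus orientation quaternion) and fanning it out over UDP multicast. The packet layout, multicast group, and function names are assumptions, not the production protocol.

```python
import socket
import struct
import time

# Assumed multicast endpoint; 1511 is OptiTrack's default NatNet data port,
# reused here purely for flavor.
MCAST_GROUP = "239.255.42.99"
MCAST_PORT = 1511

# One solved object per packet: id, position (x, y, z in meters),
# orientation quaternion (qx, qy, qz, qw), capture timestamp.
POSE_FMT = "<i3f4fd"

def pack_pose(body_id, position, quaternion, timestamp):
    """Serialize a solved rigid-body pose into a fixed-size datagram."""
    return struct.pack(POSE_FMT, body_id, *position, *quaternion, timestamp)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    # Low TTL keeps multicast traffic on the local facility network.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    # A hypothetical solved camera pose: 1.8 m up, identity orientation.
    packet = pack_pose(1, (0.0, 1.8, 0.0), (0.0, 0.0, 0.0, 1.0), time.time())
    sock.sendto(packet, (MCAST_GROUP, MCAST_PORT))
```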

In the volume, tracked objects mainly comprised representations of real-world cameras, which were designed, 3D printed and embedded with active LEDs by Magnopus Virtual Production Hardware Supervisor Jason Crosby. This made shooting with the virtual cameras, whether with a Steadicam or a drone-mounted camera, more akin to live action. Filming was essentially conducted inside a massive digital file, so the filmmakers could make changes on the fly in VR, such as adjusting the lighting, repositioning the sun or tweaking a character’s animation, and see real-time results rather than waiting for renders. Assets and animation were continuously refined throughout the filmmaking process by MPC artists, with the work made progressively photoreal as shots and sequences were determined.
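Continuing the sketch above, a render-side listener would join the multicast group, unpack each solved pose, and apply it to the virtual camera every frame. The engine hook `set_camera_transform` is a hypothetical stand-in for whatever update call the game engine actually exposes.

```python
import socket
import struct

MCAST_GROUP = "239.255.42.99"   # must match the publisher sketch above
MCAST_PORT = 1511
POSE_FMT = "<i3f4fd"

def set_camera_transform(position, quaternion):
    # Hypothetical engine hook; in practice this would update the tracked
    # virtual camera's transform inside the game engine each frame.
    print(f"camera -> pos={position} rot={quaternion}")

def listen():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    # Join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    size = struct.calcsize(POSE_FMT)
    while True:
        data, _ = sock.recvfrom(size)
        body_id, x, y, z, qx, qy, qz, qw, ts = struct.unpack(POSE_FMT, data)
        set_camera_transform((x, y, z), (qx, qy, qz, qw))

if __name__ == "__main__":
    listen()
```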

Left: Magnopus mechatronics engineer Jason Crosby fabricating custom 3D-printed OptiTrack pucks.
Right: Low-latency camera data is captured by the OptiTrack system, rendered by the engine, and displayed on Henry Tirl’s Steadicam monitor in real time.

Over the course of a nine-month principal photography schedule, two days of shooting stuck out to Sciutto as indicative of virtual production’s potential. “The first day we shot with the drone was a big aha moment,” he explained. “We went through a lot of trial and error leading up to the shoot, both in terms of where and how to attach the LEDs and what type of drone would work best within the building. We brought in a helicopter pilot to try out the volume using a few different drones, and determined the longest distance we could cover, corner to corner, tracking-wise. The shot was ultimately intended to showcase the African landscape and reveal Pride Rock, so we scaled the volume 35 to 1, added some C-stands as physical placeholders for trees, then our pilot expertly flew the drone through the virtual shot with perfect tracking from OptiTrack. Everyone loved what we captured, and it was used for the final shot.”

The subtle banking motion of the drones (operated by Kevin LaRosa of Helinet) was captured inside the OptiTrack volume with a Magnopus custom-designed, 3D-printed ‘spider’ puck.

Sciutto was equally impressed when the team started filming with the virtual Steadicam. In preparation for the shoot, Crosby put steel blocks on the camera operator’s Steadicam rig to mimic the weight and feel of a true 75-lb rig, and added a 3D-printed tracker with active LEDs tracked by the OptiTrack system. The camera operator primarily filmed in a 25-foot by 25-foot space within the larger volume that was optimized for high-fidelity tracking. Outside this space, a dolly was used to drive the volume through the virtual world, within which the Steadicam operator could then film. While up to 20 feet of physical dolly track could be used on stage, the virtual distance covered could scale to as much as 600 yards by recalibrating the real-to-virtual world translation: every foot of physical track traveled equated to moving 30 yards in the virtual world. This methodology was applied across a wide range of distances and directions, not just lateral moves. While most objects in the volume were tracked optically, Magnopus chose to track the dolly using positional encoders, a conversion sketched below.
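The real-to-virtual translation scaling reduces to a single multiplication, shown below with the article’s numbers. The encoder counts-per-foot constant is a made-up placeholder, since the actual dolly calibration isn’t described.

```python
# Illustrative constants: the article states 30 virtual yards per physical
# foot of track; the encoder resolution is an assumed placeholder value.
COUNTS_PER_FOOT = 1000.0

def virtual_yards(encoder_counts, yards_per_foot=30.0,
                  counts_per_foot=COUNTS_PER_FOOT):
    """Map raw dolly-encoder counts to virtual-world travel in yards."""
    physical_feet = encoder_counts / counts_per_foot
    return physical_feet * yards_per_foot

# The full 20 ft of on-stage track covers 600 virtual yards at 30 yd/ft.
assert virtual_yards(20 * COUNTS_PER_FOOT) == 600.0
```

Because the ratio is just a parameter, the same mapping could be recalibrated per shot for different distances and directions, as the paragraph above describes.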

“Since ‘The Lion King’ is shot like a documentary, the DP, Caleb Deschanel, ended up using longer lenses than is typical for features,” Sciutto noted. “And when you’re shooting with a 600mm lens, just the blood pumping through your veins can cause the camera to shake. That would get picked up by the OptiTrack system because it’s so accurate, so using the encoded system allowed Caleb to shoot however he wanted while keeping the image steady.”

After pioneering a new way of filmmaking alongside Favreau, Legato and Deschanel, the Magnopus team is applying lessons learned on “The Lion King” as they continue experimenting with workflows and refining virtual production processes for new projects.
