About the Virtual Production Institute
The Virtual Production Institute is the nation’s first institute of its kind that will comprehensively integrate real-world scenarios and the latest in extended reality technology to advance problem-solving and support workforce development across industries. The new institute is part of the Texas A&M College of Performance, Visualization and Fine Arts.
The institute is based on the Bryan-College Station campus with an extension at the new Texas A&M-Fort Worth campus. Institute faculty, staff and equipment to support performance capture, large-scale mixed-reality environments, technology-infused classrooms and high-performance computing and instrumentation were funded at $25 million as a special item by the 88th Texas Legislature.
Students learn the art and science of the development and applied use of extended reality — which incorporates augmented and virtual reality, display technology, sensing technology, artificial intelligence, real-time 3D graphics and simulation — using the latest technology that will prepare them for an expanding Texas job market. The institute will also support enhanced curriculum across the university as other schools and departments tap into virtual production capabilities that align with changing workforce needs.
A minor in virtual production and related courses in Texas A&M’s Visualization program are offered at both the main campus in Bryan-College Station and the Fort Worth campus.
The institute’s reach extends beyond media and entertainment, branching into product and architectural design; training for health care, first responders and the military; live performances; and the creation of digital twins in manufacturing and aerospace. The university can collaborate with industry members to provide hands-on experience to students and explore new applications for virtual production.
The Bryan-College Station location provides direct linkage with the existing strengths of the academic programs in Visualization and their resources, including personnel, facilities and the large number of enrolled students at the main campus. Proximity to Austin’s media and entertainment companies and to the simulation and training activities at the RELLIS Campus also strengthens the institute’s activities in Bryan-College Station.
The Fort Worth location provides the opportunity to augment Texas A&M’s initiatives there with a visually compelling and technology-forward enterprise through the institute. Fort Worth also provides proximity to manufacturing, logistics, media and entertainment industries.
What is Virtual Production?
Bringing the virtual and physical together, virtual production creates immersive worlds where a subject can see and be affected by what is happening in that world, all of which is captured in-camera in real time.
This is achievable through virtual production stages, which incorporate walls of LED screens projecting computer-generated imagery to create these environments. This technology surpasses what was possible with green screens, which required building the imagery in the postproduction process. In virtual production, the compositing happens primarily in-camera, meaning that the virtual environment and the live onstage actors, props and set pieces appear integrated from the viewer’s point of view. This augmented reality enhances the experience both onstage and for external audiences.
The computer-generated imagery is built in real-time game engines, a key aspect of virtual production. Game engines historically lacked the fidelity needed to make images look believable as real-world items. Recent advancements in computing have allowed game engines to process and display photorealistic imagery that matches the fidelity of television and film. Motion tracking on the virtual production stage keeps the viewer’s or camera’s point of view and the perspective of the displayed virtual environment aligned.
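This perspective alignment is commonly handled with an off-axis (asymmetric) perspective projection: as the tracked eye or camera moves relative to the flat LED wall, the rendering frustum is recomputed so the imagery stays locked to that viewpoint. As a rough illustration only, the sketch below computes near-plane frustum extents from a tracked eye position and the wall's corner positions; the specific coordinates, function names and wall dimensions are illustrative assumptions, not details of any particular stage system.

```python
# Minimal sketch of an off-axis projection for a flat LED wall.
# All coordinates and names here are illustrative assumptions.
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def offaxis_frustum(lower_left, lower_right, upper_left, eye, near):
    """Near-plane extents (left, right, bottom, top) for a tracked eye."""
    vr = normalize(sub(lower_right, lower_left))  # wall's right axis
    vu = normalize(sub(upper_left, lower_left))   # wall's up axis
    vn = normalize(cross(vr, vu))                 # wall normal, toward eye
    va = sub(lower_left, eye)
    vb = sub(lower_right, eye)
    vc = sub(upper_left, eye)
    d = -dot(va, vn)                              # eye-to-wall distance
    left   = dot(vr, va) * near / d
    right  = dot(vr, vb) * near / d
    bottom = dot(vu, va) * near / d
    top    = dot(vu, vc) * near / d
    return left, right, bottom, top

# An eye centered in front of a 4 m x 2 m wall yields a symmetric frustum;
# as the eye moves off-center the extents become asymmetric.
print(offaxis_frustum((-2, -1, 0), (2, -1, 0), (-2, 1, 0), (0, 0, 2), 0.1))
```

Recomputing these extents every frame from the tracker data is what keeps the displayed perspective glued to the moving camera.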
The result is the ability to create realistic scenery and new worlds that subjects can interact with. It’s a wide-open canvas for creativity, and a significant technological step forward.
What are the Benefits of Virtual Production?
Immersion
An early game-changer in virtual production arrived in 2019: the Lucasfilm streaming series “The Mandalorian.” The Disney+ program featured a wide array of realistic virtual settings — from desolate landscapes to industrial interior scenes — creating a seamless environment with the performers. Virtual production allows these worlds to respond to the camera as if the camera is actually in that world. The environment, effects and performances were all captured in-camera in real time. The talent, meanwhile, is immersed in the environment and can respond to the look, feel and action displayed on the LED panel walls surrounding the stage.
Lighting
The ambient light from the LED panels interacts with the performers, bringing the environment’s lighting closer to the director’s vision as it appears on camera. This is an improvement over green-screen technology, especially with transparent or reflective items. “The Mandalorian” again illustrates the point: Mando, the main character, wears a shiny helmet and suit that perfectly reflect the environment. In front of a green screen, those surfaces would instead reflect green, requiring extensive postproduction to remove the spill, complex on-set lighting setups to prevent it, or a redesign of the character’s costume.
Camera Work
Virtual production unlocks new capabilities for cameras, both physical and virtual. Sensors in the LED wall and on the physical camera track the camera’s movement. The data collected — including location and lensing information — is fed to a virtual camera in the game engine that renders the computer-generated scene on the LED wall. The virtual camera replicates the real one, mirroring its movements in tandem. Multiple points of view can also be supported.
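One piece of that data flow can be sketched concretely: converting the physical camera's lens information into the matching field of view for the virtual camera, using the standard pinhole-camera relation. The sensor width and focal lengths below are illustrative assumptions, not figures from this article or any specific camera system.

```python
# Sketch: mirror tracked lens data from a physical camera onto a virtual one.
# Pinhole relation: hfov = 2 * atan(sensor_width / (2 * focal_length)).
# Sensor width and focal lengths are illustrative assumptions.
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view in degrees for a lens on a full-frame sensor."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# An 18 mm lens on a 36 mm-wide sensor gives a 90-degree horizontal FOV:
print(round(horizontal_fov_deg(18.0), 1))  # → 90.0
```

Feeding this value, along with the tracked position and rotation, into the engine's virtual camera each frame is what keeps the rendered scene consistent with the physical lens.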
Flexibility
Fast, on-set alterations are a significant benefit in virtual production. If the computer-generated imagery includes a building that needs to be shifted elsewhere, that can be done on the fly. If a boulder doesn’t line up quite right with the planned shot, the boulder can be moved. Stopping to modify a physical structure has been replaced with a quick click of a mouse. Flexibility extends to time of day; the use of maps or environments from multiple sources; and the capacity to integrate real-time data to modify or manipulate the environment or objects within it.
Time
Flexibility saves time. Filmmakers often aim to shoot during the ideal times around sunrise and sunset, but such “magic hours” are naturally limited. Shoots often extend through long hours or weather changes, which can alter the angle or intensity of the environment lighting. Shooting at multiple locations requires travel time. These time and location limits are significantly reduced with virtual production, which is a major boost to live-action shoots and training scenarios. Real-time rendering also cuts out lengthy offline rendering and reduces the postproduction compositing of visual effects, which are now captured in-camera.
Efficiency
These benefits add up and can lead to significant cost savings. There is more control over the time it takes to film. The locations can be changed with ease. And the environments, effects and performances are all in-camera, reducing the need for changes in the postproduction process.