Project Description:
At SIGGRAPH 2019, Epic Games gave the first technical demonstration of a new generation of LED-based in-camera VFX virtual production technology built on Unreal Engine. Because it frees creation from the constraints of time and space, it quickly won the favor of many Hollywood visual effects studios, and major productions such as "The Mandalorian" and "Westworld" adopted the technology. In 2020 it brought a glimmer of light to a film and television production industry under the shadow of COVID-19.
Seeing their European and American counterparts put the new technology to use in real productions, many domestic film and television companies, unwilling to fall behind, invested in developing their own new-generation virtual production solutions based on Unreal Engine's in-camera VFX. Among them is Surreal, a startup founded by a group of veterans of the virtual production industry. When they saw Epic Games' demo, they immediately realized that, compared with the previous generation of green-screen-based virtual production, in-camera VFX solves the problems of green spill on light-colored surfaces and green reflections on metallic surfaces, and its ability to simulate real ambient light lets actors perform in an environment much closer to the final one. It is fair to say that Unreal Engine has once again upended the entire virtual production industry! So, not surprisingly, Surreal immediately invested in developing a new version of SURA, its in-house virtual production solution.
As the saying goes, "practice is the sole criterion for testing truth." That is why Epic Games' own Fortnite is always the first game to "eat crab," as the Chinese idiom puts it; that is, the first to try out new features. SURA's new LED-stage virtual production capability was, of course, no exception.
So, not long ago, Surreal teamed up with the well-known Craft Creations (Shanghai Hejiang Film and Television Production Co., Ltd.) to complete a "live-fire exercise" driven by the real needs of commercial films and TVCs and focused on the application of LED stages in virtual production.
How to Use Digital Technology to Tell a Story:
For this joint test, the team built a giant curved LED wall measuring 45m x 6m, composed of 1,080 LED panels for a total of 38,776,320 pixels. And that is not all: the team also built a huge LED ceiling measuring 21m x 15m, composed of 1,280 panels for a total of 20,643,840 pixels.
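As a quick sanity check on those figures (simple arithmetic on the numbers above, not data from the test itself), dividing total pixels by panel count gives the average resolution each panel contributes:

    #include <cstdio>

    int main() {
        // Pixel and panel counts quoted above.
        const long long wallPixels    = 38776320, wallPanels    = 1080;
        const long long ceilingPixels = 20643840, ceilingPanels = 1280;

        // Wall: 38,776,320 / 1,080 = 35,904 pixels per panel.
        // Ceiling: 20,643,840 / 1,280 = 16,128 pixels per panel.
        std::printf("wall: %lld px/panel\n",    wallPixels / wallPanels);
        std::printf("ceiling: %lld px/panel\n", ceilingPixels / ceilingPanels);
        return 0;
    }

Taken together, that is nearly 60 million pixels the rendering cluster described below has to fill on every frame.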
Rendering images this large in real time at photorealistic quality is a huge challenge for any engine. The new version of SURA uses Unreal Engine's nDisplay as the cornerstone of its smart LED virtual production system. nDisplay in UE 4.25 adds features such as RTT viewports, the picp projection policy, and support for multiple screen-synchronization policies, all of which are indispensable for realizing in-camera VFX.
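For readers who have not used nDisplay, a UE 4.25 cluster is described in a plain-text .cfg file in which screens, projection policies, and viewports reference one another. The fragment below is a minimal, hypothetical sketch (the IDs, sizes, and positions are illustrative, not the production configuration):

    [screen]     id=scr_wall loc="X=3,Y=0,Z=1" rot="P=0,Y=0,R=0" size="X=5,Y=2.8" parent=wall_root
    [projection] id=proj_wall type="simple" screen=scr_wall
    [viewport]   id=vp_wall x=0 y=0 width=3840 height=2160 projection=proj_wall
    [camera]     id=cam_default loc="X=0,Y=0,Z=1.7"

The screen entry describes the physical LED surface in meters, while the viewport defines the pixels rendered onto it; a picp-style policy would slot in where the simple projection is declared here.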
The test used a cluster of five workstations for rendering, hardware-synchronized through nDisplay's newly supported NVIDIA Quadro Sync technology: four machines rendered the LED wall and one rendered the LED ceiling.
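In the same .cfg format, the cluster topology and swap synchronization are declared per node. Again, this is a hedged sketch only; the addresses, window sizes, and node split below are assumptions standing in for the real setup, and only one of the five window/viewport pairs is shown:

    [cluster_node] id=node_wall_1 addr=192.168.0.101 window=wnd_wall_1 master=true
    [cluster_node] id=node_wall_2 addr=192.168.0.102 window=wnd_wall_2
    [cluster_node] id=node_wall_3 addr=192.168.0.103 window=wnd_wall_3
    [cluster_node] id=node_wall_4 addr=192.168.0.104 window=wnd_wall_4
    [cluster_node] id=node_ceiling addr=192.168.0.105 window=wnd_ceiling
    [window]       id=wnd_wall_1 viewports=vp_wall fullscreen=true
    [general]      swap_sync_policy=2

In UE 4.25, swap_sync_policy=2 selects the NVIDIA swap lock policy, which is what lets Quadro Sync hardware keep all five outputs flipping on the same vertical blank.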
To validate the dual-camera setup and keep system latency as low as possible, the project had to run stably at 50 FPS at all times. To achieve this, Surreal developed a feature that intelligently and dynamically allocates rendering resources to each nDisplay window according to the camera pose.
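The write-up does not detail the allocation algorithm, but the idea can be sketched as follows (a minimal, self-contained C++ illustration; WallSegment, frustumCoverage, and the 0.5 fallback scale are all assumptions, not SURA's actual code): windows that the tracked camera currently looks at get full resolution, while the others render at a reduced scale to protect the 50 FPS budget.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // One wall segment driven by one render node, described by its
    // horizontal extent in meters along the unrolled curved wall.
    struct WallSegment { const char* node; double begin; double end; };

    // Horizontal interval of the wall covered by the camera's frustum,
    // simplified here to a camera looking straight at the wall.
    struct Coverage { double begin; double end; };

    static Coverage frustumCoverage(double camX, double distToWall,
                                    double horizFovDeg) {
        const double kPi = 3.14159265358979323846;
        const double halfWidth =
            distToWall * std::tan(horizFovDeg * 0.5 * kPi / 180.0);
        return {camX - halfWidth, camX + halfWidth};
    }

    // Full resolution where the camera frustum lands, reduced elsewhere.
    static double renderScaleFor(const WallSegment& s, const Coverage& c) {
        const bool overlaps = s.begin < c.end && c.begin < s.end;
        return overlaps ? 1.0 : 0.5;
    }

    int main() {
        // The 45 m wall split evenly across the four wall-rendering nodes.
        const std::vector<WallSegment> wall = {
            {"node_wall_1", 0.0, 11.25},  {"node_wall_2", 11.25, 22.5},
            {"node_wall_3", 22.5, 33.75}, {"node_wall_4", 33.75, 45.0}};
        // Camera 4 m from the wall, centered at 17 m, 60 degree horizontal FOV.
        const Coverage cov = frustumCoverage(17.0, 4.0, 60.0);
        for (const WallSegment& s : wall)
            std::printf("%s -> render scale %.1f\n", s.node, renderScaleFor(s, cov));
        return 0;
    }

In a real system the scale would feed a per-viewport resolution control such as Unreal's screen-percentage setting, and the camera pose would come from the tracking system rather than constants.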
At the same time, to leave more creative room for post-production, SURA can display the LED region covered by the camera's frustum as a green screen. Combined with on-set lighting and blocking, this largely eliminates green spill from the LED screen while preserving the advantage of LED-simulated ambient lighting.
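Conceptually the mechanism looks like the fragment below (an illustrative C++ sketch with assumed types; SURA's actual implementation is not published): each LED pixel's position is tested against the tracked camera's frustum, and pixels the taking lens can see are replaced with chroma-key green, while the rest keep the rendered environment so it still lights the set.

    #include <cmath>

    struct Vec3  { double x, y, z; };
    struct Color { double r, g, b; };

    // Camera at the origin looking down +X, with lens half-angles in radians.
    struct Camera { double halfFovX, halfFovY; };

    // True when a point on the LED wall (in camera space) falls inside
    // the tracked camera's frustum, i.e. is visible to the taking lens.
    static bool insideFrustum(const Camera& cam, const Vec3& p) {
        if (p.x <= 0.0) return false;  // behind the camera
        return std::fabs(std::atan2(p.y, p.x)) <= cam.halfFovX &&
               std::fabs(std::atan2(p.z, p.x)) <= cam.halfFovY;
    }

    // Per-pixel choice: chroma green inside the frustum (for keying in post),
    // the rendered environment everywhere else (for ambient light on set).
    static Color ledPixel(const Camera& cam, const Vec3& p, Color scene) {
        return insideFrustum(cam, p) ? Color{0.0, 1.0, 0.0} : scene;
    }

    int main() {
        const Camera cam{0.52, 0.35};        // roughly a 60 x 40 degree lens
        const Vec3 onAxis{4.0, 0.0, 0.0};    // a pixel straight ahead of the lens
        const Color c = ledPixel(cam, onAxis, Color{0.2, 0.2, 0.2});
        return c.g > 0.5 ? 0 : 1;            // expect green
    }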
It is also worth mentioning that, to cope with rapidly changing on-set needs, the testing team used Unreal Engine's built-in Multi-User Editing feature, which allowed multiple people at multiple workstations, each with a different job, to edit the same project simultaneously without interfering with one another, greatly improving on-set response speed and shooting efficiency.
How to Enhance User Experience:
With the help of Unreal Engine technology, we were able to create a world-leading visual virtual production tool, a standardized production workflow, and a complete virtual production pipeline spanning pre-production creation, real-time on-set monitoring, and post-production rendering, compositing, and color correction. Going forward, we will continue to develop and upgrade these processes and tools, steadily unlocking creative productivity, reducing staffing and time costs, and helping to drive the digital transformation of the film, television, and advertising industry.
Potential Market Value:
LED virtual production is the direction the industry is now heading. It further liberates the middle and later stages of film and television production, shouldering part of the gaffer's burden while giving visual effects artists a more realistic reference and reducing part of the VFX workload.
Making good use of this system can help new directors plan and experiment with their ideas, and can help directors accustomed to live-action shooting cross the green-screen barrier, touch subjects they previously dared not touch, give fuller play to their abilities, and bring audiences better and more thoughtful works.