For as long as filmmaking has existed, there has been a need to build fantastic worlds in front of cameras. The earliest techniques borrowed from theatre: painted curtains and wooden, two-dimensional backdrops. Then, we painted worlds onto glass, photographing them onto the film itself to blend the real and the fake. At the same time, artists worked out how to project previously-shot film onto a screen behind the actors. These days, we’ve flipped this story, dropping real actors into digital environments that only exist inside computers. But now we’re blending the very old and the very new: digital backdrops in “virtual” studios could end the blight of green-screened cinema and its myriad problems. That’s what’s being experimented with in a former newspaper press in Oxfordshire, UK, where the first “all virtual” film has just been shot.
But filmmaking is a side project to Jason Kingsley’s day job as co-founder and CEO of Rebellion Developments, the British studio behind Sniper Elite. Rebellion doesn’t just make games, though: it owns comics giant 2000AD, the name behind Judge Dredd and Rogue Trooper, as well as publishers Abaddon and Solaris. Now, the company is building its own studio to create TV series and movies based on its vast library of IP.
Percival is a short film, first broadcast on Kingsley’s YouTube channel, and marks a new chapter in Rebellion’s filmmaking ambitions. The five-minute clip depicts a battle-scarred Knight of the Round Table, played by Kingsley, who lies close to death in a forest. King Arthur is dying and the Holy Grail is missing; the titular knight is left broken and bloodied until some unknown force intervenes. Suddenly, time speeds up, aiding his recovery and transporting him to the eerie ruins of a church, where he receives a vision that inspires a new quest.
Rebellion says that it’s the world’s first “all virtual” production, with all of the action playing out entirely in front of a halo of large flat screen displays. These monitors are connected to PCs running Unreal Engine, where the virtual environments are produced. Essentially, the painted curtains and matte paintings of yesteryear have been replaced with LED TVs showing footage from a game engine.
You might have heard of the technique before. The first high-profile instance of its use was Disney’s The Mandalorian, which shot the majority, but not all, of its scenes in these studios. In that instance, the action was filmed in a 270-degree horseshoe of LED displays 20 feet high.
Star Wars has long been a standard-bearer for titles that push the state of the art in filmmaking. The prequel movies, shot between 1997 and 2003, leaned heavily on filming actors against green screens, with CGI backgrounds added in afterward. This process, standing actors in front of blue or green curtains and swapping that color out for new imagery in post, is known as “chroma keying.” And after Star Wars, chroma key became ubiquitous for even modestly-budgeted films with special effects. The mid-noughties saw a trend of films shot almost exclusively with the technique, including Sky Captain and the World of Tomorrow, Sin City and 300.
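The core idea behind chroma keying is simple enough to sketch in a few lines: classify each pixel as “backdrop” or “foreground” based on how green it is, then composite a new background wherever the mask says backdrop. Below is a minimal, toy illustration using NumPy; real keyers also handle color spill, soft edges and motion blur, which is exactly where the problems the article describes come from.

```python
import numpy as np

def chroma_key(frame, background, threshold=40):
    """Replace green-dominant pixels in `frame` with `background`.

    frame, background: HxWx3 uint8 RGB arrays of the same shape.
    A pixel counts as 'green screen' when its green channel exceeds
    both red and blue by more than `threshold`.
    """
    f = frame.astype(np.int16)  # avoid uint8 underflow when subtracting
    mask = (f[..., 1] - f[..., 0] > threshold) & (f[..., 1] - f[..., 2] > threshold)
    out = frame.copy()
    out[mask] = background[mask]  # composite the new background in
    return out

# Toy example: a 2x2 "shot" where only the top-left pixel is green screen.
frame = np.array([[[10, 200, 10], [120, 110, 100]],
                  [[30, 40, 50], [200, 20, 20]]], dtype=np.uint8)
background = np.full((2, 2, 3), 255, dtype=np.uint8)  # plain white backdrop
keyed = chroma_key(frame, background)
print(keyed[0, 0])  # the green pixel is replaced: [255 255 255]
```

Note how a reflective prop like Kingsley’s armor breaks this: the armor’s pixels pick up green from the screen, so the naive mask starts eating into the foreground, and fixing that is the expensive post-production work the article mentions.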
These days, green screen is everywhere: the Battle of New York from Avengers Assemble, for instance, was mostly shot in a New Mexico green-screen studio and then tinkered with for months by an army of CGI artists.
Virtual studios have the potential to make a big difference to filmmaking. Because the background and environments are already visible in the shot, there’s no need to add them in afterward. The approach also gives actors a better handle on what they’re doing, since performing in an entirely green void can understandably hamper performances. And it’s a lot easier, and cheaper, to shoot this way than to send your crew across the globe to real-world jungles and deserts, trips that even a lavishly-budgeted show like The Mandalorian could hardly afford.
He Sun is the head of Rebellion VFX, having previously worked on The Lion King (2019), Maleficent: Mistress of Evil and the aforementioned Mandalorian. He had originally planned to run a visual effects experiment with a rented LED wall, but: “Jason [Kingsley] said ‘let’s make a movie!’” The studio would exist for just three days, and COVID-19 meant the production could only use a skeleton crew. With less than three weeks of preparation time, the shoot was run as a test to see if the virtual studio could deliver under some of the toughest conditions.
Eben Bolter BSC is the cinematographer behind Avenue 5, iBoy and The Feed, and was brought in to shoot and light Percival. Bolter explained that the film was designed to show what a virtual studio could offer over chroma key. The first, and perhaps least obvious, benefit is how virtual studios enable better lighting than green or blue screen. “Jason wanted to wear his [own] suit of armor and I thought ‘great’,” said Bolter, “because straight away you’ve got this reflective surface […] blue and green screen would be a problem.” Any reflected green or blue hues would be difficult to remove in post-production.
Another chroma key drawback is that it dramatically reduces cinematographers’ options for how they shoot. “Anamorphic lenses are old-fashioned, and in digital photography we love to use them because they give us that old-ness, to make it feel more tangible and grounded,” Bolter said. He cited Michael Mann’s Heat (1995), filled with oval bokeh, as an example of how these older lenses create “beautiful” images. VFX crews, said Bolter, “hate anamorphic lenses because they have to artificially emulate [oval bokeh] in post-production,” making effects-heavy shots look less real as a consequence. Percival was shot with a vintage anamorphic lens, with objects behind the subject falling away into blur and atmospherics like in-studio smoke: all things that would be difficult, or expensive to do well, in post-production.
Bolter added that if he wanted to shoot Percival on location, it would be a long process. “If I was in a moonlit forest, we’d [shoot] from 9pm to 7am, which is horrible, and a moonlit forest is interesting because it’s so fake,” he said.
“Woods at night are [pitch dark] like The Blair Witch Project. There’s no such thing as a ‘moonlit forest’. We have massive lights on cranes and we backlight it all,” Bolter said. He continued that if the crew needs the lights adjusted on the day of the shoot, taking down each crane and moving it to change the setup is a long process.
In Percival, the knight is “pushed” through time with a faux-timelapse in which days and nights fly past. Since the studio has a screen covering the inside of the roof, the team could build a digital moon that moved across the sky as it would in reality. But this didn’t “look and feel right” against the audience’s expectations of how cinematic moonlight works. So Bolter swapped the moon for a white, circular JPEG that rolled across the ceiling to create a more cinematic feel. Compared to the hours such a change would take in the real world, being able to make it in mere minutes inside Unreal Engine was a revelation.
Traditional big-budget movies rely on pre-visualization: crudely-made clips that help creators get a better sense of their film. Much of the VFX material won’t be finished until as long as 18 months after shooting wraps, so these clips help them understand what the film will look like. Pre-visualization also helps them plan the shoot, picking shot angles and lighting styles before set builders bust out the drills and hammers. With a virtual studio, “All of the VFX and set extension is done in-camera and on set,” said Sun. Rather than employing a legion of CGI artists, Percival’s environments were created in Unreal Engine by three artists in just two weeks. “Virtual production,” said Sun, “is a real-time technology, it’s rendered in real-time, and it’s interactive.”
“We had this massive chain of emails and Dropboxes,” said Bolter, “with CGI renders of what [the artists] built, and we’d be able to say ‘okay, this is great, take that away, move that.’” The crew could even make last-minute changes to the set as he was traveling to the shoot, “then, suddenly, an hour later, it’s changed.” Compared to the setup of traditional films, this was almost luxurious, especially in terms of the levels of control afforded to filmmakers.
It helps that, in the longer term, it should be far less expensive to shoot specific scenes this way than to travel to a location. Bolter suggested a hypothetical short scene of two people talking in a desert at sunset, which could take up to three weeks to shoot traditionally. “You’d have to fly a crew to Morocco, go out into the desert, and you’d have about 15 minutes of filming per day.” By comparison, Bolter could fly himself to Morocco and “shoot a [background] plate” which could then be played back on the virtual studio’s screens over and over again. Suddenly, that sequence takes a day, with all of the cost savings that brings.
It’s this part that excites the people who bankroll productions, like Orlando Pedregosa. He’s Chief Production Officer at Film.io, a company backing film and TV projects, and said that LED screens can offer savings close to “99 percent of the construction and production design budget,” with background plate shots saving around “70 percent, thanks to a reduction in items like transport, insurance, hotels and cast/crew expenses.” In real terms, he said that’s a potential saving of up to €1.5 million ($1.7 million) on a single set-up for a movie. Given the effect the pandemic is having on cinema right now, with major chains facing closure for the next few months as screenings become impractical, cheaper is better.
Bolter believes that we’ve barely scratched the surface of what virtual studios can do. “If you wanted to build an office block [for a movie], you can go crazy and have a thousand cubicles,” he said, “or an infinitely ridiculous, Charlie Kaufman-esque environment and you just need to build one cubicle [as a real set] and block the edges of the physical set to hide the problems.” That trick points to one of the key downsides, too: the virtual studio has a number of physical limitations to work around. “You’ve got the ceiling, you’ve got the floor, and those things don’t blend together [with the wall],” said Bolter. The freedom to shoot horizontal vistas is great, but you can’t pan up and down without risking viewers seeing the very obvious join.
“The ceiling is for lighting, really, unless you’re doing some sort of weird shot when you’re looking up someone’s nose,” said Bolter. It’s on filmmakers to work around these limits, which may be possible using virtual camera movements inside the computer. “The background itself can ‘conveyor’ around, to fake the sort of Michael Bay spinning steadicam shots,” said Bolter, so long as there are no props or scenery in the real studio, since “trees won’t spin.”
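The “conveyor” trick Bolter describes amounts to scrolling the background plate instead of spinning the camera: a 360-degree panorama wrapped around the wall is offset horizontally in step with the virtual camera’s yaw, so the view appears to rotate. A small, hypothetical helper sketches the mapping (this illustrates the idea only; it is not Rebellion’s or Unreal Engine’s actual pipeline):

```python
def background_offset(yaw_degrees, panorama_width_px):
    """Horizontal pixel offset into a 360-degree background plate for a
    given camera yaw. Yaw wraps around so the plate 'conveyors' endlessly.
    Hypothetical illustration of the technique, not a production tool.
    """
    # Map yaw onto [0, 360), then scale linearly to the plate's width.
    wrapped = yaw_degrees % 360
    return int(round(wrapped / 360 * panorama_width_px)) % panorama_width_px

# A camera spin maps linearly onto the scrolling plate (8192px-wide panorama):
print(background_offset(0, 8192))    # 0
print(background_offset(90, 8192))   # 2048
print(background_offset(450, 8192))  # 450 degrees wraps to 90: 2048
print(background_offset(-90, 8192))  # 6144
```

Because only the image moves, the trick falls apart the moment a physical prop sits in frame, which is exactly the “trees won’t spin” caveat above.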
It’s worth saying that the technology isn’t yet at the point where you could put every project in front of it and expect success. Bolter says that the current setup is “around 90 percent” there, and looks “720p to the naked eye”: impressive, but only nearly good enough. Percival isn’t aiming for photorealism, and leans into its stagey, ethereal quality, but you wouldn’t guess it was shot in front of a TV unless you already knew. And I have seen other, unpublished demos from Rebellion that looked a lot more realistic. A test sequence of a motorbike riding through video of central London is remarkably convincing, and footage of a person in a floaty dress standing on top of a hill (think: a perfume commercial) similarly looks like it was shot in the Welsh countryside. Give the technology a year or two to mature, even a little, and this could become a viable tool for any number of films and TV shows.
It’s likely that Rebellion will lean on virtual studios as it builds its new multimedia empire. In 2018, it announced that it was working with Duncan Jones (Moon, Warcraft, Mute) on a Rogue Trooper movie, and it also has the long-in-the-works Judge Dredd: Mega City One project looming on the horizon. Making a TV series on that scale would surely cost a huge amount with traditional green screen techniques, but perhaps this technology will make such projects financially viable.
via Engadget http://www.engadget.com
October 17, 2020 at 01:09AM