The futures of AAA game development and filmmaking are merging, thanks to increasingly powerful game engines and new virtual production techniques.



Jon Favreau and MPC Film's Lion King remake was revolutionary because, unlike traditional films, it was neither conventional CGI nor live action – something that stumped the Oscars committee when they tried to categorise it. It was made entirely in the Unreal Engine, the same product AAA studios like EA and Ubisoft use to build their highly realistic blockbuster video games, virtual worlds, and characters, and it was then filmed in Virtual Reality (VR) using revolutionary new techniques and rigs.




Not only was this approach, first tentatively trialled during the production of The Jungle Book, groundbreaking, but it's now being hailed as the future of almost all filmmaking as the virtual worlds and characters created this way become increasingly lifelike and malleable and fly past the uncanny valley – the point beyond which you can no longer tell real content from fake or synthetic content. And now the first TV series, The Mandalorian, has been given the same treatment – something that makes George Lucas' dream of creating a Star Wars TV series a reality. Even if it's still a virtual one…


Behind the scenes of Disney and MPC’s The Lion King

Lucas was dreaming of a Star Wars live-action TV show long before The Mandalorian got off the ground. One of his collaborators at Industrial Light & Magic, Richard Bluff, remembers Lucas talking about it as far back as 2008, but there was just one problem: going to a galaxy far, far away on a TV budget was nearly impossible.

“At the time, he felt he was limited in regards to how he was able to tell the story based upon the vast number of locations and worlds we would need to go to for the small screen,” says Bluff, a visual effects supervisor. “But the audience simply wouldn’t accept a Star Wars without them.”




Now, a mere decade later, everything's changed. For one, Disney bought Lucasfilm and everything that came with it, including ILM. It also put out a lot more Star Wars movies, let Jon Favreau remake The Lion King in VR, and launched a streaming service, Disney+, where any potential live-action series could live forever.

So, when Favreau started working on The Mandalorian in early 2018, the possibility that he could make a TV series without sending a whole crew to Jordan – as J.J. Abrams had just done for Star Wars: The Rise of Skywalker – was far more feasible.


Behind the scenes of The Mandalorian

“Through his experience on Jungle Book and The Lion King,” Bluff says, “he felt very strongly that there had been breakthroughs in game-engine technology that were the key to solving this problem.”

Indeed. Working with Epic Games – the studio behind the Unreal Engine and the hit game Fortnite – along with Bluff at ILM, cinematographer Greig Fraser, who had done extensive work shooting against LED screens on Rogue One, and other tech companies like graphics card maker Nvidia, Favreau and his team at Golem Creations developed a new virtual production platform that lets filmmakers generate digital backdrops in real time, right in front of the camera. The tech, now called StageCraft and available to filmmakers everywhere, allowed the directors of each of The Mandalorian's eight episodes to film in every part of the galaxy without ever leaving Manhattan Beach Studios in Los Angeles.




Even though Favreau achieved what he'd set out to do, not everyone was convinced he could at the outset. At the time, there was scepticism that the technology was good enough to produce photo-real backgrounds, but "we pushed forward anyway," Favreau says. His hope was that he would be able to get a few shots for the first season and then improve the tech as the show went on. Eventually, he thought, if the tech got good enough it could be used on other Disney productions, whether Star Wars films or Marvel movies. Lucasfilm honcho Kathleen Kennedy agreed and committed to letting Favreau figure it out.

“I came in with The Mandalorian and said, ‘Let this be the North Star we’re going for.’ Maybe we won’t get all the way there the first season, but at least we’ll plant our flag and try to do this,” Favreau says. “It just took one production to see what could be done with the tools we have.”

And here’s how it works. Imagine the scene at the cantina on Tatooine. The bounty hunter is there; there’s a general hive-of-scum-and-villainy vibe. But only a chunk of it is real. The booth is there, and some of the actors, but the rest is being rendered on a 20-foot-tall, 270-degree semi-circular LED video wall. It’s like a traditional Hollywood backdrop, except this one uses the Unreal Engine – the same one behind Fortnite – to place 28 million pixels’ worth of characters and objects exactly where they need to be for the camera to capture them. All told, more than half of The Mandalorian was shot on virtual sets, with the rest done using practical effects on another part of the LA lot. And if that sounds like a lot, it is.
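The reason the wall has to know where the camera is, rather than just playing back a static image, comes down to parallax: distant virtual objects must appear to shift correctly as the camera moves. Here's a conceptual sketch of that core idea – not ILM's StageCraft code, just a minimal geometric illustration assuming a flat wall and a single tracked camera position:

```python
# Conceptual sketch (not StageCraft's actual code): an LED volume must
# redraw its backdrop per-frame from the tracked camera's position.
# Each virtual object behind the wall is drawn where the ray from the
# camera to that object crosses the wall plane, so parallax looks
# correct from the camera's point of view (and only from there).

def project_to_wall(camera, point, wall_z=0.0):
    """Intersect the camera->point ray with the wall plane z = wall_z."""
    cx, cy, cz = camera
    px, py, pz = point
    t = (wall_z - cz) / (pz - cz)  # ray parameter where the ray hits the wall
    return (cx + t * (px - cx), cy + t * (py - cy))

# A virtual mountain 50 m behind the wall (wall at z=0, camera 5 m in front).
mountain = (10.0, 8.0, 50.0)

# As the camera dollies sideways, the mountain's on-wall position shifts,
# recreating the parallax a genuinely distant object would show.
for cam_x in (-2.0, 0.0, 2.0):
    x, y = project_to_wall((cam_x, 1.5, -5.0), mountain)
    print(f"camera x={cam_x:+.1f} -> draw mountain at wall ({x:.2f}, {y:.2f})")
```

Because the projection is only correct for one viewpoint, the engine renders the high-fidelity image inside the camera's view frustum and a lower-cost approximation elsewhere – which is why actors standing beside the camera see a plausible but not pixel-perfect world.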




Fortunately for the filmmakers behind the Disney+ show, a lot of the groundwork was already in place. The StageCraft platform works by allowing filmmakers to do a lot of their pre-visualization and shot blocking ahead of time in VR, something Favreau had done with The Lion King. So as soon as the show’s concept artists and production designers came up with ideas, they could be created virtually, and the directors could put on a headset and see the world they would be filming in. That means, Bluff says, a lot of what is normally considered postproduction work is actually done in preproduction. On the day they’re shooting, the directors are working with almost fully rendered VFX, capturing everything in-camera, using Arri rigs.

StageCraft wasn’t just a boon to the show’s directors though. Because the virtual production platform was showing the backdrops in real time, actors who normally would have been standing in front of a green screen imagining some distant land could now see it. For example, when Mando meets up with Ranzar Malk in the “Prisoner” episode, Pedro Pascal and Mark Boone Jr., who plays Malk, are standing near a spaceship prop just like they likely would be on any other set, but instead of being surrounded by a sea of green, the LED screens are showing the rest of the hangar around them. Baby Yoda floating through the desert in a space bassinet? Same thing.

“Visual effects have sort of taken the fun out of making movies,” says Epic Games’ chief technology officer, Kim Libreri, who spent years at Lucasfilm and ILM before moving into gaming. “When you’re confronted with a sea of green and representations of characters on ping-pong balls or tennis balls, it becomes a pretty daunting experience for the actors and the director. I think what we’ve been able to do here is give control back to the filmmakers.”




Control and speed, and all at an increasingly affordable price. What’s most incredible about The Mandalorian is that it went from Favreau saying “Let’s do this!” to a fully formed eight-episode season on Disney+ in less than two years. That seems fairly unbelievable when you consider it took Abrams much longer than that to make just one movie. But as streaming services ramp up their production slates, quick-turnaround shows that don’t require a ton of location shoots are going to be vital, and it’s hard to imagine that other showrunners won’t jump at the chance to use ILM’s virtual production platform. It’s already happening on The Mandalorian itself.

“The great thing about Season 2 is that a number of the directors are coming back, and people are aware of what this technology can do now, so they’re writing to it,” Bluff says. “Dave Filoni, for example, is designing his episode around what it’s capable of.”

In other words, the galaxy is no longer so far, far away, and it’s not only the Rebels that are stirring a revolution.

About author

Matthew Griffin

Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between the dates of 2020 to 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
