THE MANDALORIAN
ILM DISCUSS A BOUNTY OF VISUAL EFFECTS
3D WORLD sits down for a conversation with Richard Bluff, visual effects supervisor and Hal Hickel, animation supervisor, both at ILM in San Francisco, to unpack the creative adventure of telling the story of a galactic bounty hunter.
Werner Herzog, world-renowned filmmaker and occasional actor, takes a key role in the new Star Wars TV series The Mandalorian. Speaking in 2019 about the series, Herzog made the key point that it deploys what he described as mythic images, and his description speaks to the visually compelling quality of the series and the rich tradition of Star Wars. It’s a pop-culture phenomenon that has captivated the imagination in two ways: through the story unfolding on screen, and through the story of the creative impulses, choices and challenges involved in putting those stories in motion.
Richard Bluff begins our conversation by identifying the project’s landmark approach to environment creation: “I think the biggest challenge was wrapping our head around how we wanted to utilise real-time game technology in collaboration with the LEDs, effectively prototyping out what that technology would look like, and then of course executing a production-ready tool for the first day of shooting. That was by far and away one of the greatest challenges that I’ve faced in the visual effects industry.”
Fascinatingly, ILM has a connection to game engine use that dates back to their work on Steven Spielberg’s dazzling science-fiction fairy tale, A.I. Artificial Intelligence, where it was used for virtual production approaches during filming of the Rouge City sequence.
Bluff sketches out the longstanding relationship between ILM and its use of LED: “There had been an awful lot of work done prior to The Mandalorian utilising LED screens at ILM and Lucasfilm: they’d been used on Rogue One for example, and the game engine technology, particularly Unreal Engine for season one, had been used extensively by X Lab, our immersive development department at ILM, on various augmented reality and VR projects. So, there were various pieces of all of the pipeline that had been utilised in visual effects, or with Jon Favreau [series creator] and his past projects including The Lion King and The Jungle Book. I think the biggest challenge was pulling all of that together, but more than that it was the goal that we set ourselves of shooting half of the season in the LED volume, and within that amount of work making sure more than 50 per cent of every take would constitute an in-camera final. And, as a result, that would have meant us building over 110 real-time environments that in theory had to be photoreal and play in-camera. There was nothing that existed prior to The Mandalorian that had attempted anything near what we tried to do. Up until now it was isolated to one or two shots or scenes with content that was intended as previz only or for dynamic lighting, whereas we were attempting in-camera set extensions - effectively taking the post-production aspect and putting it in prep.”
A practical set piece of the Razor Crest cockpit positioned in the StageCraft volume for real-time interactive lighting
Bluff goes on to offer context for ILM’s work in applying game engine technology to their production and visual effects collaboration: “ILM has a rich history of projects, plus supervisors and artists working in new media. Prior to Kim Libreri joining Epic he was a visual effects supervisor at ILM and had been the lead supervisor behind an ambitious project to take a video game environment (that was never used) and imagine how we could utilise that world and those characters in a real-time game engine to generate content. So, he’d already been pursuing game engines for television or theatrical content for a long time.”