Andy Serkis has credited advancements in motion and performance capture in part to improvements in video game rendering engines such as Unreal and Unity.
Serkis first played a CG character in The Lord of the Rings, when he starred as Gollum. He collaborated with Peter Jackson again when he starred in King Kong.
After filming King Kong, Serkis thought he would return to conventional theatre and cinema — a sentiment he also had at the end of filming The Lord of the Rings trilogy, which took around four years. But he soon found himself directing performance capture for Heavenly Sword, a video game being developed by Ninja Theory.
The lack of motion capture facilities in the UK meant Serkis had to fly the cast and technicians to New Zealand. “I had only ever worked with Weta for The Lord of the Rings, which is based in New Zealand,” he says. “That led me and my business partner Jonathan Cavendish to build a company (in the UK) that was creatively driven, where we bring talent to create digital characters on the fly and have them changed with an actor in the room. The Imaginarium was born out of a concept to do Shakespeare plays as video games.” Since then, The Imaginarium has been involved in big-budget movies such as Avengers: Age of Ultron and Godzilla, to name a few.
Technologies such as augmented reality and virtual reality have transformed the way performance capture is done, and advancements in video game rendering engines have helped lower production costs.
“It’s [financially better] now because of real-time rendering using video game engine technologies. Video game engines such as Unreal and Unity are used in large-scale filmmaking, where they help with background shots and sequences. You can render in real time. There are certain shots, even now, that are fully rendered using video game engine technology, which saves thousands of man-hours of post-production.” But the tools need to improve further. While rendering engines portray hard or shiny surfaces well, it is much more difficult to do the same with textures such as fur and water. “It’s a matter of time, and that will further reduce cost and man-hours. As soon as you can see those textures and facial performance capture rendered in real time, you’re going to start seeing costs come down hugely,” Serkis tells Digital Studio Middle East.
The Imaginarium works closely with 3D Lateral, along with Unity and Unreal. “I worked on a test to make a digital version of myself transformed into a completely different character literally within hours of shooting. We are a little way off, but not too far,” Serkis says eagerly.