Ultimatte Transforms Visual Effects for Shark Attack 360
Discover how Little Shadow built a virtual studio set for the Disney+ docuseries using Ultimatte, DaVinci Resolve Studio and Unreal Engine.
“Shark Attack 360” dives into the factual landscape of shark behavior, investigating why sharks bite humans. The 8×60 docuseries for Disney+ was released as part of National Geographic’s SHARKFEST, using evidence, personal encounters and state-of-the-art visual effects (VFX) to explore what leads to shark attacks.
London-based production company Arrow Media relied on VFX studio Little Shadow for 3D VFX and graphics. This included bringing to life the virtual interactive sharks and a 360 shark lab used to demonstrate the sharks’ movement and anatomy.
Finding the ideal filming location was crucial to scaling up the imagery for season two. “We found Collins’ Music Hall in Islington, an incredible underground theatre that was never finished. It was the perfect space for building the virtual 360 shark lab using live action footage and computer generated imagery (CGI),” begins Little Shadow director Simon Percy.
“To start, we used a LiDAR scan to create a 3D model of the venue for planning and scaling the project,” explains Percy. “That digital replica of the venue allowed for meticulous planning and previsualization, ensuring that each shot could be crafted ahead of time, helping to streamline both the shoot and edit.”
Although the venue met the visual requirements, its lack of soundproofing and layout posed challenges. “Everything echoed, and any sound from the kit was amplified, so we stationed our core workflow away from the set floor. This meant running a 4K signal across 110 meters and four floors using BNC cable,” he explains.
Instead of using a traditional LED volume, Little Shadow used a mix of custom-built systems and off-the-shelf tools to facilitate live green screen keying, camera tracking and live monitoring. Percy worked closely with senior VFX artist Lucas Zoltowski and series DP Matthew Beckett to develop that workflow.
“We needed a flexible solution to capture live content and integrate CGI assets, which played back on a virtual production (VP) box and provided immediate visual feedback. This was crucial for building the immersive underwater setting,” notes Percy.
The flight case was built around a custom PC for Unreal Engine and the Ultimatte 12 4K real-time compositing processor for live keying. Signal management was handled by a Blackmagic Videohub 20×20 12G router, while several HyperDeck Studio 4K Pro broadcast decks were used for recording and playback. “This was crucial for real-time decision making on set and immediate footage review, ensuring optimal takes and adjustments,” notes Percy.
“To bring the underwater scenes to life, we used a green screen and the Ultimatte, which allowed us to integrate the virtual 3D elements into the scenes using augmented reality (AR). This enabled the presenter to interact with the sharks in real time, creating a more engaging viewing experience,” he continues.
One of the biggest challenges during production was managing the intricate camera movements the hybrid shoot required. This demanded detailed tracking and rotoscoping to seamlessly blend live footage with virtual backgrounds, particularly for fine details such as hair and close interactions with virtual sharks.
“The Ultimatte was essential for live green screen keying as it guided the camera movements based on the virtual elements in the shot, meaning everyone on set could view this in real time. It gave us more room to experiment with the creative possibilities in the scenes while ensuring that we were all on the same page moving into post production,” Percy explains.
However, some fixes and shots did not involve a green screen. “DaVinci Resolve Studio’s AI tools, such as Magic Mask for rotoscoping, proved hugely effective in these scenarios. We are increasingly exploring the use of DaVinci Resolve Studio and Fusion as more of our process is starting to lean that way. It’s incredibly fast in terms of EXRs and it’s all GPU accelerated, which is amazing. On this project it really defined the visuals and helped to blur the lines between CGI and live action,” he concludes.