Autodesk Media & Entertainment brings new capabilities to enhance creativity and efficiency

Access editorial sequencing data directly in Maya with Flow Animating in Context.

The Media and Entertainment industry has shown resilience through years of transformation and continues to reinvent how movies, TV, and games are made. Talented creatives continue to produce remarkable content like Inside Out 2 (the highest-grossing animated film ever) and crossover content from games to TV series like The Last of Us and Fallout, all while looking for efficiencies in their pipelines.

Autodesk understands that content needs to be both compelling and profitable. The company is committed to enhancing the performance and capabilities of its M&E tools and is integrating AI workflows to help automate tedious tasks, giving artists more space for creative iteration and storytelling. AI is a hot topic in entertainment, perhaps more so than in any other industry. Yet despite the hype, the industry needs solutions that augment pipelines without disrupting them, while keeping an eye on reinvention in the future.

Enhancing Autodesk’s core M&E tools with AI

Autodesk is adding AI capabilities to its existing creative tools to help accelerate artist workflows:

·       Producing clean, low-noise renders takes a lot of computing power, so AI in Arnold is helping to solve this problem. Artists can now quickly denoise images every time they render a scene.

·       If an artist working in Flame wants to slow down a video, AI can generate additional in-between frames to produce a much more realistic result.

·       Maya now includes ML Deformer. The algorithm learns how a complex character moves from the data already in a scene, and animators can then pose the character in real time as the tool approximates complex deformations for them (a minimal scripting sketch follows this list).
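For technical artists who script their rigs, here is a minimal sketch of what attaching an ML Deformer to a mesh could look like with Maya's Python commands. It is illustrative only: the "MLDeformer" plugin and node type names, and the placeholder geometry, are assumptions, and training the deformer on scene data is driven through Maya's ML Deformer editor rather than this snippet.

    import maya.cmds as cmds

    # Load the ML Deformer plugin; the plugin and node type names ("MLDeformer")
    # are assumptions for this sketch and may differ by Maya version.
    cmds.loadPlugin("MLDeformer", quiet=True)

    # Stand-in geometry; in practice this would be the character mesh driven by the rig.
    mesh = cmds.polySphere(name="body_geo")[0]

    # Attach the deformer with Maya's generic deformer command.
    ml_node = cmds.deformer(mesh, type="MLDeformer", name="body_mlDeformer")[0]

    # Training on the scene's deformation data and real-time posing are then handled
    # in the ML Deformer editor, which approximates the learned deformations as the
    # animator poses the rig.
    print("Created deformer node:", ml_node)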

Moving the industry forward with AI

Today, artists can use Wonder Studio to easily place characters into live-action scenes. At AU, the team is unveiling a powerful new AI capability in Wonder Studio, Motion Prediction, which predicts character poses even when the view of an actor is obstructed by an object, producing more natural movement with less shaking and noise.

Peter France, a visual effects (VFX) artist at the independent studio Corridor Digital, shared that “a lot of the time, as an independent artist, you want to make really ambitious projects […]. Wonder Studio is going to be an invaluable tool for these kinds of people where they can go after more of those ambitious ideas.”

Autodesk Research is also pushing the boundaries of what AI can do for artists. A good example is Neural Motion Control, a prototype that lets animators direct a character’s actions using a handful of keyframes and a neural network. This can save animators significant time while maintaining the low-level control they are used to, freeing them to focus on crafting the character’s unique performance.

Increasing efficiency with Autodesk Flow

Autodesk Flow, the industry cloud for Media and Entertainment, helps connect workflows, data, and teams across the entire production lifecycle, from the earliest concept to final delivery. At the heart of Flow is a common data model that underpins the new capabilities being showcased at AU.

·       The reimagined user experience in Flow Capture, Autodesk’s camera-to-cloud solution, puts assets front and center, where they should be! Studios can now find their assets in a central place, organize them with a new drag-and-drop capability, and much more.

·       The new Flow Graph Engine API is also being launched, allowing developers to run Bifrost graphs in the cloud and build custom compute solutions for their studio (see the hedged API sketch after this list).

Autodesk is creating an ecosystem where others can build on the Flow Graph Engine. DigitalFish, a company that develops novel tools for creating digital media and immersive content in media & entertainment, has already put the Flow Graph Engine API to work. It is building an XR workflow on Apple’s iOS and visionOS that brings compute and pre-visualization of effects to the set. The workflow starts with the crew scanning the set to create a digital twin, then using the Flow Graph Engine to complete the mesh. The VFX artist can add 3D assets and VFX simulations that react to the digital twin. Visual effects are simulated in the cloud, and the director can see it all to communicate precisely how the visual effects will transform the scene. The actors can then refine their performances, the cinematographer can compose the best camera angles, and the entire crew can interact with the 3D elements.

Autodesk’s Flow Retopology service (available in both Maya and 3ds Max 2025) has also been built on the Flow Graph Engine, enabling artists to offload the preparation of complex meshes for animation and rendering to the cloud.

·       Artist workflows are being enhanced with new Flow features that connect to M&E desktop solutions, one of which is Flow Animating in Context for Maya. The creative intent of the editor is the heartbeat of any production, but today it isn’t visible to artists working in Maya. Flow Animating in Context changes that by connecting data from Flow Production Tracking to Maya and vice versa: animators will be able to see the editorial timeline as they work and create better animations faster.
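To make the Flow Graph Engine API bullet above a little more concrete, here is a minimal, hypothetical sketch of how a studio tool might submit a Bifrost graph job for cloud compute. The endpoint path, payload fields, and identifiers are illustrative assumptions rather than the documented API; authentication is assumed to use a standard Autodesk Platform Services OAuth token.

    import requests

    # Hypothetical sketch of submitting a Bifrost graph job to the Flow Graph Engine.
    # The URL and payload fields below are assumptions for illustration; consult the
    # published Autodesk Platform Services documentation for the real endpoints.
    APS_TOKEN = "<oauth-token-from-autodesk-platform-services>"
    BASE_URL = "https://developer.api.autodesk.com/flow/graphengine/v1"  # assumed path

    job = {
        "graph": "urn:assumed:bifrost-graph-id",        # published Bifrost graph to run
        "inputs": {"meshUrn": "urn:assumed:source-mesh"},
        "priority": "standard",
    }

    resp = requests.post(
        f"{BASE_URL}/jobs",
        headers={"Authorization": f"Bearer {APS_TOKEN}"},
        json=job,
        timeout=30,
    )
    resp.raise_for_status()
    print("Submitted job:", resp.json().get("id"))

The point of the sketch is simply that the heavy compute runs server-side: a desktop tool submits a graph and its inputs, then polls for the result, leaving the artist’s workstation free.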

Autodesk strives to help artists and studios be more creative and to make their pipelines more efficient. It will continue to bring value to the products in use today, and to add even more with Autodesk Flow and meaningful AI solutions.