Inside an AI animation studio: how we’re rewriting the rules
Award-winning studio director Douglas McGinness reveals how the film industry is changing.
For years, visual effects (VFX) ran like a factory assembly line. Work moved step by step, change was expensive, creative flexibility was limited, and big teams were the norm. AI is changing that.
Over the next five years, we’ll see a shift from massive studios to small, focused teams capable of delivering high-quality results faster, cheaper, and with greater creative control. This evolution isn’t just about cutting costs; it’s about redesigning the entire pipeline to make better decisions sooner, iterate faster, and invest more deeply in the work that audiences actually notice.
Critics worry that lower costs must mean lower quality, but the reality is that AI has far more to offer the VFX industry than many are prepared to admit.
The dawn of a new creative industry
VFX has always been a slow, resource-heavy process. The pipeline was linear, with little room for deviation, and every iteration required coordination, time, and budget. The result was clunky and inflexible. AI brings fluidity to that process.
Previously, teams would bid on each task, render it overnight, and review it daily. AI now enables creators to block, test, and restyle within hours, trying multiple looks before selecting one to scale. Previs and look development can happen together. Quick compositing lets teams validate creative choices early, so only shots that pass editorial are executed, saving both time and resources. Further savings come from automating first passes: with rotoscoping, cleanup, and plate preparation now beginning as AI-generated drafts, artists can focus on edge cases and creative polish, speeding up the whole process.
Critically, however, that speed doesn't come at the cost of quality. Used properly, AI adds quality by codifying consistency: show-specific style files keep colour, texture, and design language aligned across shots, raising overall production values. When you work in thin slices, you can ship a sliver, learn fast, and iterate again. Repeat by lunch.
This cadence has already proven effective for my company. On recent campaigns and long-form sequences, AI-driven workflows have cut calendar time by 30–50% while improving visual consistency, because looks are now systematised rather than reinvented by each artist.
The question is, can small teams really deliver?
The answer here comes with a caveat: with AI, small VFX teams can deliver, but only when given clear boundaries. And some limitations remain, so it's down to you to pick the right fights. While complex VFX sequences, like fluid simulations, are still better served by traditional stacks for now, small-scale creative labs are more than capable of handling character-driven shots, stylised worlds, and design-led compositions. As an example, for an animated series we recently built a custom AI solution that generated in a few hours what would previously have been outsourced for tens of thousands of dollars a month.
On one drama sequence, we tested three aesthetics against story and blocking before building any complex rigs, with a crew of just five people. The advantage was the decision loop, not the headcount: in that scenario, fewer people mean clearer accountability.
The key is to let editorial lead and to make sure everyone is on the same page. Killing weak ideas early assures quality, and AI makes that fast and cheap, particularly when you treat the pipeline like a product – version it, measure it, and add simple QA gates.
Misinformation surrounds AI's strengths and weaknesses alike. Some clients come into an AI project expecting the world to be delivered in a matter of hours; others write AI off too quickly, assuming it's just for hobbyists and can't survive a live production environment. Neither is true. Clear communication is essential to navigating AI projects, as is having people on the team who understand AI deeply enough to guide both colleagues and clients.
Can off-the-shelf AI deliver the same results?
This is where the differentiation lies. Off-the-shelf AI models can get you 70-80% of the way; the final lift comes from targeted additions. Most of these are small, show-specific files rather than big retrains: lightweight adapters or presets that are quick to make and easy to swap, but without which results can be patchy. As a general rule, we use custom-trained models wherever possible to keep outputs stylistically consistent, and leverage off-the-shelf tools only when custom solutions can't get us there.
Off-the-shelf solutions are high quality but, currently, less controllable - “prompt and pray” is not a strategy when working on real campaigns. On the other hand, custom solutions, though much more controllable, can’t always deliver the pixel-perfect quality that’s often desired. A mix of solutions is key, as is flexibility - these tools change every day.
But is there really a market for AI-driven VFX?
There’s been pushback against AI in the creative industries, but if you dig deep, you’ll find that most major buyers are not anti-AI; they just want guardrails.
Take Netflix and its recently published guidance on AI usage, for example. Its main requests are that vendors disclose intended AI use early, avoid tools that store or train on production inputs, prefer enterprise-secured environments, keep AI elements out of finals without approval, and never create or alter talent performances without consent. Anything touching final frames, third-party IP, personal data, or digital replicas needs written approval.
So, if you build a custom workflow from multiple tools, each step must meet those standards. These are sensible rules that small teams can meet with per-show clean rooms and clear sign-off points.
How can small VFX teams make AI work for them? For me, successful AI integration into VFX usually comes down to five key areas:
- Front-load decisions: Test multiple looks before heavy renders. Only scale what survives.
- Automate the grind: Let AI handle the repetitive tasks, such as roto and cleanup, so artists can focus on lighting, composition, and storytelling.
- Isolate data by show: Track every input and output to ensure reproducibility and compliance.
- Keep humans in the loop for finals: Compositors are the VIPs of the AI world.
- Measure everything: Log time saved, promote wins, and fix bottlenecks.

When it comes down to it, AI does not lower production value. Bad pipeline choices do. If you use AI to remove toil, pull decisions forward, and protect the time that creates emotion on screen, the results can be outstanding, and small, focused teams that operate this way will not just match legacy pipelines. In many cases, they will beat them.
If you want to read more, check out our features on the 'Netflix of AI', what we discovered when we went to an AI film conference, or our list of the best animation software for a traditional approach.

Douglas McGinness is founder and director of the AI creative studio Animated Company. His work has received a D&AD Wooden Pencil award and has screened at the Festival de Cannes. Clients include Apple, Nike, Google, Epic Games, BBC, Paramount, and more.