CGI and other VFX have transfixed movie viewers over the years, making us believe in astounding characters and seemingly impossible fantasy worlds (see our pick of the best CGI movie moments for highlights). Techniques have changed over time, driven by advances in technology but also by audience expectations and industry dynamics.
Over that time, the term 'CGI' has often been used disparagingly despite there being so many great movies full of CGI. But now there seems to be a belief that CGI has somehow got worse after peaking around ten to 20 years ago with movies like Avatar (2009), and a YouTuber's theory about why this might be is generating some intense debate.
Citing the groundbreaking creation of Davy Jones in Pirates of the Caribbean: Dead Man's Chest in 2006 (see below), the YouTuber Andrew Price, AKA Treehouse Detective, has put forward a theory about why VFX looks worse 20 years later.
Andrew suggests part of the problem may be that audiences, or at least focus groups, expressed a preference for movies with heavy CGI, leading to more CGI in general. But he pins a big part of the blame on the bidding process that studios use to select providers, which can pressure VFX houses to keep costs down.
The VFX bidding process has certainly come in for criticism, including from some VFX artists, due to a lack of consistency and the challenges it poses for VFX teams. There can be big problems with fixed bids, which don’t allow flexibility for shot changes, and with under-bidding, where studios bid at a loss in the hope of getting more work in the future. Mismanaged bidding and scheduling can also overload teams.
But VFX bidding isn’t just about choosing the lowest quote; it can also take into consideration historical shot data and providers' track records and capacities. And the mechanism isn't new. This process has existed since the explosion of digital VFX in the 1990s.
Has CGI got worse?
So is there another reason for the perceived decline in CGI quality? Several people who have worked in the industry have responded to Andrew's video with views based on their own experience.
Some blame technological changes, such as the adoption of game engines like Unreal Engine.
"Around 2018-2022, there was a big jump from traditional render engines (path tracing, unbiased rendering) to game engines (biased rendering, ray tracing)," one VFX artist writes. "The reason was the traditional render engines were more accurate but MUCH slower, very expensive. Game engines can be realtime, but not as accurate." The artist suggests that films now have a "gamey look" as a result.
But many blame a change in the way big studios approach filmmaking, and an increasing tendency to make major decisions on the fly.
"It's because studios now shoot everything on green screens so they can make thousands of last minute changes in post production instead of actually planning out a film properly," one person writes. "It's just lazy writing/planning from the directors/studios and they now change big chunks of films after all the shooting is done."
"They want to change things so much or do many different things and then don't give the VFX studio nearly enough time for them to get it done and look polished and good," someone else wrote.
"It’s cheaper to throw everything on green screen and have all of the costumes be CG than to plan for months and get costumes and sets made," someone else agrees.
"Scripts are being written quickly in a sketched out kind of way because there's not a clear vision for the movie, and then when the test screening doesn't meet expectations or the executives don't like something they change the script and expect the VFX studio to redo tons of CGI without enough time to make it look good," another person comments. "There's also a huge problem with directors asking for multiple versions of the same shot, which pushed up the number of shots sevenfold."
I wouldn't agree that modern CGI is necessarily worse. There's plenty of CGI work so impressive that audiences don't notice it (take the Ghoul's nose in Fallout), but it seems there may be an issue of quantity over quality. Good VFX does require time, money and planning.
What do you think? Has CGI got worse, or is there just more of it?

Joe is a regular freelance journalist and editor at Creative Bloq. He writes news, features and buying guides and keeps track of the best equipment and software for creatives, from video editing programs to monitors and accessories. A veteran news writer and photographer, he now works as a project manager at the London and Buenos Aires-based design, production and branding agency Hermana Creatives. There he manages a team of designers, photographers and video editors who specialise in producing visual content and design assets for the hospitality sector. He also dances Argentine tango.