Does the rise of the NPU spell the beginning of the end for discrete GPUs?
NVIDIA would say 'no' but I'm not so sure...

I was recently writing my MSI Prestige A16 AI+ review, and it got me thinking about the future of laptops. My drifting off had absolutely nothing to do with the quality of MSI's latest offering (I did give it a solid 4 stars, after all), but was rather a result of the huge upsurge we've seen in AI-branded laptops.
In the last 18 months, I've lost track of the number of AI laptops we've tested, and it shows no signs of slowing down. As a result, I've been questioning if high-end creatives are going to need discrete GPUs in the future or if CPUs and Neural Processing Units (NPUs) will begin to dominate the landscape.
To make sure we're all on the same page, an NPU is a specialised microprocessor designed to dramatically accelerate artificial intelligence workloads. From a creativity perspective, an NPU takes on the AI inference that was previously handled inefficiently by the GPU or, for lighter tasks, the CPU. As a result, laptops can make fast, informed decisions without needing to run extensive calculations on the GPU.
What we're finding is that the type of creative tasks we're performing is changing, and as a result, the gravitational pull of processing is moving away from discrete GPUs and towards NPUs. That may be overstating the point, but at the very least, entry-level discrete GPUs are becoming increasingly redundant.
For example, why pay for a separate, small discrete GPU when the NPU and integrated graphics can handle all but the most intensive tasks while saving battery and space?
At this point in time, we've not yet witnessed a reduction in the requirement for high-end dedicated GPUs in specialised fields. These powerful standalone cards, with their massive parallel processing cores and dedicated high-speed VRAM, remain indispensable for workloads where power efficiency is secondary to raw performance.
Discrete GPUs are still required for high-fidelity 4K video editing, graphics-heavy gaming, and professional 3D rendering, but it's not beyond the realms of possibility for AI to become so powerful that discrete GPUs become entirely unnecessary. One example from my own world of 3D visualisation: rather than needing a GPU to calculate physically accurate results, an AI model could draw on its knowledge base to generate the same results.
I appreciate we're many years away from this reality, but given the AI progress we've witnessed in the past two years, I wouldn't be surprised if we see it a lot sooner than most would think.
For now, the CPU with an NPU and integrated GPU dominates for energy-efficient, everyday computing and basic on-device AI, while the discrete GPU maintains its stronghold by offering the unmatched computational horsepower required for the most demanding visual and AI workloads.
The discrete GPU market doesn't look like it's being eliminated any time soon, but I'll be interested to see how AI-branded laptops evolve in 2026 and whether they begin to erode the need for high-end and very expensive GPUs.

Paul is a digital expert. In the 20 years since he graduated with a first-class honours degree in Computer Science, Paul has been actively involved in a variety of different tech and creative industries that make him the go-to guy for reviews, opinion pieces, and featured articles. With a particular love of all things visual, including photography, videography, and 3D visualisation, Paul is never far from a camera or other piece of tech that gets his creative juices going. You'll also find his writing in other places, including Creative Bloq, Digital Camera World, and 3D World Magazine.