Asus primes us for integrated graphics making discrete GPUs irrelevant
There are some key nuances to consider here, though
(Image credit: Canva)
- An Asus exec was asked whether integrated graphics on laptops represented the future of PC gaming
- He replied that "we're definitely getting into the territory where that becomes a possibility" and that "it's just a matter of time"
- The exec acknowledged it's a thorny issue, though, with a whole lot of nuances and other factors to take into account
As integrated graphics take further leaps forward in terms of performance – with the latest being Intel's Panther Lake CPUs – talk has again turned to when discrete GPUs might become effectively irrelevant.
And according to Asus, in an interview with Tom's Guide at CES 2026, we're now getting into the territory where it's possible that integrated graphics could be the future of PC gaming.
For the uninitiated, a discrete GPU is, as the name suggests, a standalone graphics card, as opposed to integrated graphics built into the CPU – and with a full, separate board to work with, it can deliver considerably higher levels of performance.
Dedicated gaming laptops still use discrete GPUs for that reason, but how close are we getting to the point where these standalone boards are going to be effectively sidelined?
Tom's Guide asked Sascha Krohn, Director of Technical Marketing at Asus, about whether integrated graphics on laptops represented the future of PC gaming.
Krohn replied: "I would say we're definitely getting into the territory where that becomes a possibility. I think that's something that, in the past, you couldn't really do, but I think now we're getting to the point where, and just the fact that you're asking the question – you're not the only one – it shows that if you follow this trend, it is probably going to happen. And it's just a matter of time.
"Are we there right now? I'm not sure if we're already there right now. It's going to be very interesting in the end how the market reacts, like how end users react to this."
Analysis: Panther Extreme Halo effect
It's a bold statement: while Krohn says he isn't 'sure' we're at this point yet, that very doubt implies we might just be – or at least be getting close. The new integrated graphics on Intel's Panther Lake chips are impressive, and the same is true of Qualcomm's Snapdragon X2 Elite Extreme, and indeed AMD's Strix Halo, the beefiest integrated graphics of them all, with jaw-dropping performance (albeit with caveats around higher power consumption and pricing).
Krohn points to Cyberpunk 2077 running impressively on integrated graphics these days, which is remarkable. But the question of when integrated GPUs might effectively barge aside discrete boards is a knotty one, as the Asus exec acknowledges.
On the subject of integrated graphics, Krohn observes: "And I think it really depends on who you ask, right? I think there's a lot of people who will say yes, this [integrated GPU performance] is good enough for me, this is totally fine. I don't need more. But calling that a dedicated gaming device is a whole other story, right?"
He continues: "I think the expectations, once you call it gaming laptop, are probably higher. Gaming laptops are not going to go away anytime soon, even in the long term. And dedicated GPUs are going to still be around for many years. How many people are going to go for dedicated GPUs and how many people are going to go for integrated GPUs, that's something that everybody has a different take on."
And that's the crux of the matter – we aren't talking about the death of the discrete GPU here, because that's a long, long way off, if it ever happens at all; enthusiasts will always want better, faster GPUs to get 4K gaming running fluidly at native resolution (with no AI tricks), with all the bells and whistles turned on. And discrete GPUs will continue to get faster, just as integrated solutions will, so discrete will naturally continue to sit at the top of the tree.
So, what we're really talking about is the point when integrated graphics become good enough that the vast majority of gamers will be happy using them – and granted, that point may not be so far off in the future. But I do think it's still a good way off, and while integrated graphics will doubtless continue to progress nicely in terms of performance, as noted, so will standalone GPUs. It's true that within the confines of a laptop chassis, ever-higher power consumption could become a problem for discrete GPUs, although advances in cooling solutions may help.
It's a difficult call, for sure, but I think Asus is leaning somewhat toward the optimistic side here. In the end, one factor that could be key for discrete GPUs is whether there will even be the willingness to keep developing faster and faster models: if AI continues to boom, the drive behind GeForce gaming GPUs could falter. It's not difficult to envisage Nvidia throwing all its weight behind AI at the expense of gamers, and folks have been theorizing for a while now that Team Green may not continue its GeForce gaming line forever.
TechRadar will be extensively covering this year's CES, and will bring you all of the big announcements as they happen. Head over to our CES 2026 live news page for the latest stories and our hands-on verdicts on everything we've seen.
Darren Allan
Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel, 'I Know What You Did Last Supper', was published by Hachette UK in 2013).