
I was alt-tabbing out of Cyberpunk 2077 when I saw it: Jensen Huang on stage at a Morgan Stanley conference, casually dropping the line that Nvidia had “created the modern videogame industry.”
Meanwhile, Night City was paused in the background, my RTX card wheezing at native 4K with ray tracing on, only looking remotely playable because DLSS was doing heavy lifting in the shadows. And that was the exact moment it clicked for me: whatever “modern videogame industry” Huang thinks Nvidia created, I’m not sure it’s one I’m happy to live in anymore.
I’ve been building PCs since the days of beige cases and 3dfx Voodoo stickers. I remember when the question was “can it run Crysis?” not “which AI hallucination method are you comfortable with?” I’ve lived through the PS2 era’s black magic optimization, the GeForce 256 revolution, the early days of shaders, all of it. So hearing the CEO of what is now essentially an AI megacorp erase arcades, consoles, and decades of dev sweat by saying Nvidia “pulled it all together” and “created” modern gaming… yeah, that didn’t sit right.
And the thing is: he’s not entirely wrong. That’s what makes this so frustrating.
Let’s get one thing out of the way. Nvidia did not “create the modern videogame industry.” Before Jensen was out there in a leather jacket talking about “neural graphics,” people were pumping coins into arcades, wearing out NES controllers, and getting motion sick on PS1 3D. Sega, Nintendo, Sony, 3dfx, ATI, Microsoft – those names are carved into gaming’s DNA long before RTX was a marketing term.
What Nvidia did create was a critical part of the modern PC graphics stack. From GeForce 256 in 1999 to CUDA in 2006 to RTX ray tracing in recent years, it has absolutely shaped how games look and run on PC. I’ve owned enough GeForce cards – from the 8800 GT to the RTX 40-series – to respect that legacy. When Huang talks about contributing “algorithms” and “libraries” to engines like Unreal, that’s not just hot air. Developers genuinely rely on that stuff.
But when he follows it up with “without RTX, there would be nothing today” and that Nvidia “created the modern videogame industry,” that’s not confidence, that’s amnesia. It erases game developers who spend years wrestling with engines, memory limits, physics, AI, and player expectations just to make something halfway coherent. It airbrushes out the people who made Shadow of the Colossus work on a PS2, or GTA: San Andreas run on hardware that looks like a toy by today’s standards.
Here’s the irony: if there’s one thing I do think Nvidia can take credit for “creating,” it’s the current culture of shipping unoptimized games and letting AI upscaling and frame generation bail everyone out after the fact.
I’m not going to sit here and pretend DLSS is bad tech. It’s not. It’s borderline black magic when it works. I’ve seen Cyberpunk 2077, Alan Wake 2, and other visual monsters go from “cinematic slideshow” to “smooth as butter” just by flipping DLSS from off to Quality or Performance. Frame generation on top of that can turn a struggling 40 fps into a “looks like” 80 fps experience. I get why people call it a cheat code.
But it’s a cheat code in more ways than one. It doesn’t just cheat the rendering pipeline; it cheats the conversation about what’s acceptable performance and what we’re actually paying for.
Run a heavy RT scene in Cyberpunk 2077 natively, no DLSS, full-fat path tracing. Watch your expensive GPU crawl. Even very recent high-end cards struggle to maintain a stable 60 fps at 4K with everything maxed. Turn DLSS on, and suddenly we’re in “playable” territory. Add frame generation, and it feels fluid. Great, right?
Now imagine reviewing that GPU without DLSS. No AI upscaling, no optical flow wizardry, just raw silicon versus the render pipeline. A lot of these cards would look embarrassingly underpowered for the price brackets they occupy. We’re talking about hardware that can’t reliably handle a five-year-old game at peak settings without leaning on a software parachute – despite costing more than full consoles.
That’s my problem. DLSS has gone from “cool bonus feature” to “this is the actual product you’re buying.” You’re not buying raw raster power anymore; you’re buying access to Nvidia’s proprietary AI bandage. And Huang standing up and taking a victory lap for that as if it’s purely a gift to gamers ignores the fact that this bandage is only necessary because of how the company has chosen to prioritize its business.

Let’s talk about the game in the middle of all this: Cyberpunk 2077. It’s the perfect symbol of what I love and hate about where we’ve ended up.
I adore this game. I’ve sunk well over a hundred hours into Night City between launch, patches, and Phantom Liberty. The path-traced “Overdrive” mode is, hands down, one of the most stunning visual showcases I’ve ever seen. Neon reflections, global illumination, the whole city reacting to light in a way that still makes my jaw drop. This is exactly the sort of future-looking insanity I want PC gaming to chase.
But here’s the catch: that future only exists because DLSS exists. Turn DLSS off and try native 4K with all the ray tracing bells and whistles on even a recent RTX card, and the whole thing collapses. You either drop the resolution to something that looks sad on a modern display, or you turn off the showcase features the marketing campaign was built on. In a very literal sense, the “modern videogame” Huang is bragging about only actually exists because of his company’s proprietary upscaling.
Is that innovation, or is that dependence?
Don’t get me wrong: of course ultra modes should push hardware to its limits. I don’t expect locked 120 fps in fully path-traced Night City. But when a five-year-old game’s flagship mode still absolutely requires AI upscaling on brand new GPUs, I start to question whether hardware progress has kept up with ambition – or whether everyone secretly decided it was fine because DLSS will just handle it.
And when that same company starts telling the world it created the very industry now chained to its software, that’s where my patience runs out.
Back on the PS2, devs were pulling off dark magic. Shadow of the Colossus was doing stuff with world streaming and animation that had no right to exist on that hardware. Burnout 3 looked and felt impossibly fast. GTA: San Andreas crammed an entire state into 32 MB of RAM. Those games were “technically demanding” for their time, but they were tuned to the box people actually owned.
Today? Games are often built for hardware that doesn’t even exist yet, then shoved out the door half-baked on PC with the unspoken assumption that AI upscaling and driver patches will mop up the blood. We’ve had a string of big-budget PC releases – from cinematic action games to open-world RPGs to even some Sony ports – that launched in states ranging from “rough” to “what the hell is this?” even on high-end rigs.
And the pattern is depressingly familiar: day one, performance is a mess. Native resolution struggles, CPU and GPU usage is all over the place, memory leaks everywhere. But if you turn on DLSS or another upscaler, things become “acceptable enough” for a big chunk of players, at least visually. That helps blunt the backlash, and within a few weeks the narrative has shifted from “this port is broken” to “just run DLSS Quality and you’re fine.”
I don’t blame devs for this nearly as much as I blame the ecosystem they’re working in. Modern PC development is a nightmare: infinite configurations, ever-changing drivers, engine updates, publisher deadlines, cross-gen constraints, and now the added joy of catering to console, multiple GPU vendors, and three competing upscaling methods. If you’re a studio under pressure to ship something, the existence of DLSS is a massive relief valve. Of course they’re going to lean on it.
But that is exactly why Huang crowing that “without RTX there would be nothing today” feels so off. Nvidia didn’t just swoop in and rescue a helpless industry. It helped design the current reality where releasing a highly optimized, performant PC build is optional, because players have been trained to accept that flipping an Nvidia-branded switch is just “how it’s done now.”
The other part of Huang’s speech that made me roll my eyes is the way he framed Nvidia as some kind of gaming-first hero company. Look at the actual numbers. Nvidia is now valued in the trillions, and its data center and AI business dwarfs gaming. Recent reports put gaming revenue somewhere around $16 billion a year. Sounds huge until you park it next to the roughly $190+ billion coming from data center and AI. Gaming isn’t the main event; it’s a profitable side hustle and a very convenient marketing story.
When Nvidia cuts back on RTX 50-series production to prioritize AI clients, or when global component shortages hit and Jensen calls the situation “fantastic” because demand is so high for their chips, what do you think that tells you about where gamers sit on the priority list? Spoiler: we’re not at the top.

The knock-on effect is obvious: fewer gaming GPUs, higher prices, longer waits between launches, and a stronger incentive to position software features like DLSS as the thing that justifies those price tags. If the raw silicon generational leap is modest – or cannibalized by data center design priorities – you need another reason to convince people a $1,000+ card is “worth it.” Enter frame generation and proprietary “neural graphics.”
Strip that away and imagine reviewing modern Nvidia GPUs as just chips. No DLSS, no Reflex, no RTX gimmicks turned on – just old-school, native rendering benchmarks. Would that hardware still deserve the kind of praise it’s getting, at the prices it’s charging? Or would a lot of cards suddenly look like what they secretly are: overpriced tickets to a software stack gamers have been trained to depend on?
That’s the part Huang’s “we created the modern videogame industry” line conveniently leaves out. Nvidia didn’t just build a platform; it built a dependency. And now it’s cashing in on that dependency while its real attention is focused on AI clusters and data center contracts.
I’m not uninstalling DLSS. I’m not pretending I don’t enjoy watching Cyberpunk 2077 go from 30 fps to 80+ with a single toggle. I still recommend upscaling to friends who just want their games to feel better on mid-range hardware. I get why it’s popular, because I use it too.
But I’ve changed how I look at Nvidia’s ecosystem – and how I spend my money – because of all this.
I no longer treat DLSS as a free bonus; I treat it as a hidden tax. If I’m buying an Nvidia GPU, I remind myself I’m paying for hardware that often can’t stand on its own at the settings it’s advertised around. Those “4K ray tracing” headlines are, in practice, “4K, if you’re okay with AI reconstructing half the image and generating fake frames.” That doesn’t make the experience invalid, but it does make the marketing feel dishonest.
I’ve also stopped upgrading every generation. If the jump in native performance isn’t meaningful, and the main selling point is “better DLSS” or “new frame generation mode,” I stay on my existing card. I’d rather keep my money than reward a roadmap that increasingly treats gamers as beta testers for whatever AI trick Nvidia wants to normalize next.
Most importantly, I’ve stopped giving Nvidia free moral credit for “supporting” the gaming industry. When your business makes ten times as much from AI and data centers as it does from games, you don’t get to stand on stage and act like you alone built this hobby. You built a giant, profitable section of the infrastructure that games run on, and then you architected a world where that infrastructure is required to compensate for weaker optimization and slower raw hardware gains.
I don’t want DLSS to disappear. I don’t want ray tracing to go away. I want more mad experiments like path-traced Night City, not fewer. But I want them on top of solid fundamentals, not instead of them.
From Nvidia, that means selling GPUs whose native performance justifies their prices, and marketing DLSS and frame generation as the bonuses they started out as – not as the product itself.
From developers and publishers, it means treating DLSS (and FSR, XeSS, whatever) as a bonus, not a lifeline. Ship PC versions that run respectably without upscaling, then let these tools push things even further. Don’t design your performance targets assuming every player will be comfortable with reconstructed images and tech that’s vendor-locked to a single company.
And from us, the people actually playing these things, it means stopping the reflex to clap every time a CEO tells us how lucky we are to have them. Nvidia has done a ton for PC gaming. It’s also helped create a reality where weaker native performance, higher prices, and software dependence are just “the way it is now.” Both things can be true at once.
So no, Jensen, Nvidia didn’t create the modern videogame industry. What it created is the current deal we’re stuck with: stunning visuals, AI wizardry, and a creeping sense that without one company’s secret sauce, the whole illusion falls apart. As someone who’s spent years in Night City and decades on PC, I’m not willing to call that a fantastic outcome – no matter how good the frame rate counter looks.