At this point, most PC-gaming enthusiasts are familiar with some flavor of Nvidia’s DLSS technology and its ability to boost gaming frame rates. But DLSS just entered new territory in its fourth generation. Nvidia introduced DLSS 4 alongside its upcoming GeForce RTX 50-series graphics cards, and we’ve now been able to put it to the test first-hand in Cyberpunk 2077. Here’s a brief overview of what DLSS 4 is, its benefits, who can use it, what negative impacts it may have, and how it performed in my early testing with Cyberpunk 2077 on an RTX 5090.


What Is DLSS 4?

From one perspective, DLSS 4 is closely related to Nvidia’s current DLSS 3 technology. Both focus on generating additional frames beyond those rendered by the GPU’s primary cores, using the dedicated AI Tensor cores inside the GPU, to boost frame rates. The key difference is that DLSS 4 introduces a new Transformer AI model (more on that in a bit) for creating these frames that, Nvidia claims, is more accurate and delivers better image quality. DLSS 4 also aims to generate more artificial frames per rendered frame than DLSS 3 could.

The way this works is relatively simple in concept; TVs and computer software have done it for decades. Ever walk into a store and see TVs advertising a super-high refresh rate for sports content, perhaps with a sign promising 240 images every second? Or maybe the retail display didn’t mention the refresh rate directly and instead just talked up how smooth the picture looks. Either way, you’ve seen some version of this technology in use before, though having evaluated various forms of artificial frame generation, I will say Nvidia’s approach looks better than any standalone TV’s.

Nvidia GeForce RTX 5090

(Credit: John Burek)

How does it work? First, the graphics card creates a couple of frames the old-fashioned way. Imagine a baseball player swinging a bat at a ball. In the first frame the GPU renders, the player may be holding the bat vertically, ready to swing; in the next, the bat could be horizontal and colliding with the baseball. In this case, you have missed all the motion between the bat being raised and it hitting the ball, and you can’t get that back.

However, frame generation attempts to fill these gaps by examining the two frames and creating artificial frames that reflect the visual states between them. In the example above, it might add a frame with the bat horizontal but farther back, not yet hitting the ball. This frame is essentially an interpolated image, so it is not 100% accurate, but it will probably look decent enough as it flashes by in a split second. In that time, you aren’t likely to notice any slight graphical errors it produces, provided they aren’t grievous.

While the three GPU makers (AMD, Intel, Nvidia) use differing methods to achieve the effect, this is how frame generation works in general. It’s also an accurate reflection of how frame generation works in Nvidia’s last-generation DLSS 3, and things don’t change much for DLSS 4. The difference: DLSS 4 generates even more frames by looking at the differences between that first artificially created frame and the two original frames (before it and after it) it is based on, creating additional artificial frames between them. In principle, there’s no limit to how many artificial frames you could make this way. But for now, Nvidia caps it at three artificial frames for every rendered frame (a 4x multiplier). (According to the company during its Editor’s Day held earlier this month at CES, it didn’t go beyond one artificial frame with earlier DLSS frame gen because the image-quality trade-offs were too much.)
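To make the idea concrete, here’s a toy sketch in Python of interpolation-based frame generation. It uses a simple linear blend between two rendered frames, which is far cruder than Nvidia’s motion-vector-driven AI models; the function names and the blend itself are illustrative only:

```python
import numpy as np

def blend(frame_a, frame_b, t):
    """Toy linear blend between two frames at time fraction t (0..1).
    Real frame generation uses motion vectors and an AI model,
    not a straight pixel average."""
    return (1.0 - t) * frame_a + t * frame_b

def multi_frame_gen(frame_a, frame_b, multiplier):
    """Return the in-between frames for a given multiplier.
    A x4 multiplier means three generated frames per rendered frame."""
    count = multiplier - 1
    return [blend(frame_a, frame_b, (i + 1) / multiplier)
            for i in range(count)]

# Two 'rendered' 2x2 grayscale frames: bat raised (0.0) vs. bat swung (1.0)
raised = np.zeros((2, 2))
swung = np.ones((2, 2))
generated = multi_frame_gen(raised, swung, 4)
print(len(generated))        # 3 generated frames
print(generated[1][0, 0])    # the middle frame sits halfway: 0.5
```

The point of the sketch is the structure, not the math: the generated frames exist only between two frames the GPU actually rendered, which is also why frame generation adds latency (the card must hold back the newest real frame while it fills in the gaps).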


Who Can Use DLSS 4, and Why Not to Use It

Technically, no one at this writing can use DLSS 4, as the GeForce RTX 50 series hasn’t shipped yet, and only the RTX 50 series has been confirmed to support it, starting with the Nvidia GeForce RTX 5090 and GeForce RTX 5080 launching January 30. I’ve seen speculation that Nvidia could extend DLSS 4 support to the RTX 40 series, but the company has not confirmed this.


The only other known cards that will support it in the near future, besides the RTX 5090 and RTX 5080, are the Nvidia GeForce RTX 5070 Ti and the Nvidia GeForce RTX 5070, due to release in February. Mobile variants of all these GPUs will launch in March and support DLSS 4, too.

However, there are valid reasons you may not want to use DLSS 4 at all (or any flavor of DLSS, for that matter), even if you can. All versions of DLSS, just like all versions of AMD FSR and Intel XeSS, present trade-offs. When used without frame generation, all of these tools work by rendering frames at a lower resolution and then upscaling the result. That’s effective at increasing the frame rate but negatively affects image quality.

With frame generation, you take a more significant hit to image quality than with standard DLSS, FSR, or XeSS, and you also introduce additional latency. You may also see “1% lows” (the average frame rate of the slowest 1% of frames during a test) that are substantially lower than your average frame rate, which can feel jarring or stuttery in games. And using frame generation can, counterintuitively, lower your frame rate instead of raising it. Here’s why.
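For the curious, the “1% low” metric mentioned above is easy to compute yourself from per-frame data. This sketch averages the slowest 1% of frame-rate samples; benchmarking tools differ in exact methodology (some work from frame times instead), so treat the details as illustrative:

```python
import numpy as np

def one_percent_low(fps_samples):
    """Average of the slowest 1% of frame-rate samples.
    Tools vary in exact methodology; this is the common,
    simple frame-rate-based version."""
    fps = np.sort(np.asarray(fps_samples, dtype=float))
    count = max(1, int(len(fps) * 0.01))
    return fps[:count].mean()

samples = [120.0] * 99 + [40.0]   # one visible stutter in 100 frames
print(one_percent_low(samples))   # 40.0, far below the ~119fps average
```

A single bad frame drags the 1% low way down even when the average barely moves, which is exactly why the metric captures that “jarring or stuttery” feel better than average fps does.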


Frame generation increases the frame rate only when the graphics card has excess processing power, thermal headroom, and electrical power available to do the extra work. That’s usually the case when the graphics card is bottlenecked by either the game engine or your CPU. When it isn’t, and the GPU itself is the bottleneck, performance can actually go down, as we observed when we tested the Nvidia GeForce RTX 4060.

Due to these issues, I generally avoid using DLSS, FSR, or XeSS. In my free time, I primarily run games at 4K, and I only use DLSS when my system struggles to stay close enough to 60fps for my liking. I go to it as a last resort, as it’s a better option than lowering the resolution, but that’s it.

I view frame generation similarly, but if you are trying to get higher frame rates in a competitive shooter, for instance, it might be advantageous to use it so long as you aren’t going past your monitor’s refresh rate. If your monitor can only display, say, 120 or 240 frames per second, you have no reason to push your graphics card to feed it more frames per second than that.
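That refresh-rate rule of thumb can be expressed as a quick calculation. This helper is purely illustrative (it’s not an Nvidia setting); it picks the highest frame-gen multiplier that won’t push output past your monitor’s refresh rate:

```python
def max_useful_multiplier(base_fps, refresh_hz, multipliers=(2, 3, 4)):
    """Highest frame-generation multiplier whose output stays at or
    below the monitor's refresh rate; None if even x2 overshoots."""
    useful = [m for m in multipliers if base_fps * m <= refresh_hz]
    return max(useful) if useful else None

print(max_useful_multiplier(60, 240))    # 4 (60fps x4 = 240fps)
print(max_useful_multiplier(130, 240))   # None: even x2 exceeds 240Hz
```

In other words, the slower your baseline frame rate relative to your monitor, the more room the higher multipliers have to be useful.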



Test-Driving DLSS 4 and Cyberpunk 2077

With DLSS 4, you get a few more options than you had previously with DLSS 3. First is the AI model for frame generation. This option may not be present in every game, but in Cyberpunk 2077, you can manually select either the older Convolutional Neural Network (CNN) model used by DLSS 3 or the newer Transformer model debuting with DLSS 4. The Transformer model is supposed to provide better image quality, but as you’ll see in the chart below, it also leads to a slightly lower frame rate overall.

You also have options for setting a multiplier for how many frames you want to generate. Cyberpunk 2077 defaulted to x2 for us with both the CNN and Transformer models, but you can run either model with a multiplier of x3 or x4, too. The higher the multiplier, the higher the frame rate, but also the greater the chance of graphical defects, larger 1% low deltas, and longer latency.

Nvidia attempts to get ahead of the issue on the latency front by forcing its RTX Reflex technology on whenever frame generation is used. This is supposed to help reduce latency, and you have no reason not to use it except when you want an even comparison with non-Nvidia graphics cards. But the fact that it gets locked on shows that either Nvidia or the game’s developers deemed it essential for frame generation. The game likely needs an update here, however: Nvidia also announced RTX Reflex 2, which Cyberpunk 2077 supports and which is supposed to be better, but I could only activate it with frame generation off.

As you can see from the chart, the Nvidia GeForce RTX 5090 ran Cyberpunk 2077 like a champ with or without DLSS on. This game is extremely demanding, and this is the first time I’ve seen a graphics card maintain a frame rate of more than 60fps at 4K with the Ray-Tracing Ultra graphics preset. That it takes a $2,000 GPU released years after the game first launched to do this is quite astonishing in itself—and this isn’t even the highest graphics preset.

Anyway, as you can see from the numbers, the multiplier does more or less what you’d expect. From 130fps at 1080p with DLSS 2 (upscaling only), you double your frame rate to 261fps with DLSS 3 frame generation enabled. You get right around triple that 130fps score with DLSS 4 frame generation set to x3, and nearly four times that baseline with frame generation set to x4. It doesn’t scale quite this neatly across the board, due to the overhead of running DLSS and frame generation on the GPU, and the CPU becomes a bottleneck at various points. But it’s relatively linear.
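The scaling above can be sanity-checked with simple arithmetic. The 130fps and 261fps figures are from my 1080p testing; the “efficiency” calculation is just an illustrative way to see how close frame generation gets to perfect scaling:

```python
baseline_fps = 130    # 1080p, DLSS upscaling only, no frame generation

def ideal_fps(baseline, multiplier):
    """Frame rate if frame generation scaled perfectly, with no
    overhead and no CPU bottleneck."""
    return baseline * multiplier

observed_x2 = 261     # measured with x2 frame generation
efficiency = observed_x2 / ideal_fps(baseline_fps, 2)
print(ideal_fps(baseline_fps, 2))   # 260
print(round(efficiency, 2))         # 1.0, essentially perfect scaling
```

At x3 and x4 the real numbers drift a little below the ideal figures of 390fps and 520fps, which is where the DLSS overhead and CPU bottlenecks mentioned above show up.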

Now, you must decide whether you want to use these features. As mentioned before, DLSS has some real drawbacks. While I encountered problems with Nvidia’s FrameView tool that prevented me from measuring 1% lows or latency, these issues are well documented even if they’re not always clearly defined. You’ll also find some unquestionably advantageous places to use it, particularly if you have a high-refresh-rate monitor or struggle to run a game smoothly. The key to using DLSS to your advantage is recognizing when that is, and not using it when your GPU already produces high enough frame rates.


About Michael Justin Allen Sexton

Senior Analyst


For as long as I can remember, I’ve had a love of all things tech, spurred on, in part, by a love of gaming. I began working on computers owned by immediate family members and relatives when I was around 10 years old. I’ve always sought to learn as much as possible about anything PC, leading to a well-rounded grasp on all things tech today. In my role at PCMag, I greatly enjoy the opportunity to share what I know.

I wrote for the well-known tech site Tom’s Hardware for three years before joining PCMag in 2018. In that time, I reviewed desktops, PC cases, and motherboards as a freelancer, while also producing deals content for the site and its sibling ExtremeTech. Now, as a full-time PCMag analyst, I’m focusing on reviewing processors and graphics cards while dabbling in all other things PC-related.




