WAR OF THE UPSCALING ALGORITHMS
Rendering fewer pixels to boost framerates
Nvidia’s DLSS tech is used in games such as Battlefield 2042 to get better performance without sacrificing image quality.
© EA
VIDEO UPSCALING is hardly a new technology. From the time the first DVDs started shipping, companies have worked to create higher-quality renditions of low-resolution video. First it was VHS to DVD quality; then came 720p TVs and Blu-ray players that would turn DVD quality into Full HD; and now there are TVs and other devices that will upscale 1080p content to 4K, or even 8K if you have the requisite display and hardware. But upscaling video content is different from upscaling games in a variety of ways.
For one, games have a lot more data available behind the scenes, such as motion vectors and depth information for every frame. They also run at much higher frame rates and are highly susceptible to latency issues. While the underlying concepts of upscaling may be similar, the hardware and technology involved are much more complicated.
The proof of the pudding, as always, is in the eating—or, in this case, the rendering of the pixels. Join us as we dive into the current game upscaling solutions from Nvidia and AMD—and even Intel, sort of. Which will reign supreme, or can they all possibly coexist in an increasingly crowded world of GPUs and APIs?
–JARRED WALTON
THE NEED FOR UPSCALING
We’ve seen a lot of hype over the past few years around the topic of upscaling technologies. Nvidia is largely to blame, having launched DLSS—that’s Deep Learning Super Sampling to the nerds—not long after the first RTX 20-series GPUs landed in late 2018. The idea wasn’t exactly revolutionary: render fewer pixels and then intelligently upscale the result to a higher resolution. Gamers would get better performance, hopefully without losing much in the way of image quality.
The thing is, upscaling tech has been around for decades. Lanczos resampling, for example, is named after its inventor, Hungarian mathematician Cornelius Lanczos, who died in 1974. He wasn’t using his mathematical formula for video and image processing back in the day, but the principles certainly apply there. It’s also a relatively simple algorithm compared with the alternatives, meaning it can be used without causing a significant loss in performance for things like real-time rendering.
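As a rough illustration of what a simple spatial upscaler does (a hypothetical snippet, not anything DLSS or a game engine actually ships, with placeholder file names and resolutions), here’s how Pillow’s built-in Lanczos filter can blow a lower-resolution frame up to 4K:

```python
# Minimal illustration of spatial Lanczos upscaling with Pillow.
# File names and resolutions are placeholders for this example.
from PIL import Image

# Pretend this 1440p image is a frame rendered below native resolution.
low_res_frame = Image.open("frame_1440p.png")

# Resample up to 4K using the Lanczos kernel, the same basic math
# Cornelius Lanczos is known for, applied to image reconstruction.
upscaled = low_res_frame.resize((3840, 2160), resample=Image.Resampling.LANCZOS)

upscaled.save("frame_4k_lanczos.png")
```

A filter like this only sees the pixels in the current frame; game-focused upscalers have far more information to draw on, as noted above, which is how they can do better without a big performance hit.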
But why do we need upscaling in the first place? If you’ve paid attention to increasing resolutions over the past decade or two and looked at where things are headed, you’ll know we’re fast approaching a time when only the highest of the high-end graphics cards will be able to handle gaming at native resolution. Even today, with a top-tier GPU, many games can’t break 60fps at maximum quality settings at 4K. For instance, Total War: Warhammer 3 only managed 50fps on the RTX 3090 Ti at 4K ultra, and 38fps on the RX 6950 XT. Games with complex ray tracing, such as Control, Cyberpunk 2077, Fortnite, and Minecraft, averaged 31fps on the 3090 Ti and 17fps on the 6950 XT.
Now imagine trying to do anything at all with future 8K displays. Four times as many pixels as 4K? Performance will implode. Not that most of us need an 8K display, but they’re coming regardless, and we’ll have far better luck using such a high resolution if we don’t have to natively render every single pixel that gets sent to the screen. But even if you don’t run an 8K monitor, 4K displays are becoming ubiquitous; nearly every new TV launched these days sports at least a 4K resolution.
That might not be a problem for the fastest GPUs, but anyone running a budget or midrange solution will have to look for some alternative to native 4K gaming.
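To put those resolutions in perspective, a bit of quick arithmetic shows how steep the climb gets (plain resolution math, nothing specific to any particular GPU or game):

```python
# Back-of-the-envelope pixel counts for common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080  # use 1080p as the reference workload
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels / 1e6:.1f} million pixels "
          f"({pixels / base:.1f}x the work of 1080p)")
```

Native 4K is already four times the pixel count of 1080p, and 8K quadruples that again, which is exactly the wall that even the fastest cards are starting to hit.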
Upscaling fills the gap, and it’s been doing so quite successfully for some time. The previous-generation Xbox One X and PlayStation 4 Pro were both marketed as being 4K capable. Technically, they were, especially with lighter fare. But more demanding games on those consoles invariably resorted to dynamic upscaling, and most console gamers never even considered whether that was a problem.
Of course, console gamers have the benefit of typically sitting on a couch 10–15 feet away from the TV. The eagle-eyed among you might be able to tell the difference between native 4K rendering and upscaled 1080p at that distance, especially on larger TVs, but for anyone just wanting to enjoy some games, losing a bit of detail while gaining a smoother experience represents a net win. Computer gaming, on the other hand, typically happens at a monitor distance of around two feet, where upscaling artifacts are far more noticeable.