Several hours ago, NVIDIA officially pulled back the veil on two of its latest mobile gaming offerings: the GeForce RTX 3080 Ti and RTX 3070 Ti for gaming laptops.
Keeping the announcement brief, NVIDIA only skimmed through the specifications of the new GPUs. Starting with the RTX 3080 Ti: the laptop GPU houses 16GB of what NVIDIA calls its “fastest ever GDDR6” graphics memory, and the company even claims it outclasses the desktop TITAN RTX. As for performance, NVIDIA says the GPU is expected to sustain an average of 120 fps in the most demanding titles at 1440p resolution, with graphics settings at Ultra, naturally.
As for availability, NVIDIA says that GeForce RTX 3080 Ti laptops will be available via the brand’s laptop partners, including ASUS and Razer, to name a few, starting 1 February.
Moving on to the RTX 3070 Ti, NVIDIA didn’t share many details about this laptop GPU, save for the fact that it is approximately 1.7x faster than the desktop GeForce RTX 2070 SUPER graphics card. In addition, its average performance should hover around 100 fps at 1440p resolution in the majority of titles at their Ultra graphics preset.
Laptops equipped with the GeForce RTX 3070 Ti will be available from 1 February onwards, with brands such as ASUS, Alienware, MSI, and Razer among the first to produce said laptops.
At the same time, NVIDIA also spoke about its new and improved 4th generation Max-Q technologies. These include CPU Optimiser, a feature that allows the onboard GPU to tweak the CPU’s performance, temperature, and power on the fly to maximise efficiency. Then there’s Rapid Core Scaling, a feature NVIDIA created for creators who depend heavily on programs such as Adobe Premiere Pro, Blender, and MATLAB; it enables the GPU to sense the real-time demands of applications and utilise only the cores it needs, rather than hogging them all. Lastly, there’s a new and improved Battery Boost 2.0 that now uses AI to find the optimal balance between GPU and CPU usage.