anayman_k7 wrote: DLSS = rendering at a lower resolution and then upscaling
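To put that in concrete terms, here's a rough sketch of the pixel-count savings you get from rendering at a lower internal resolution and upscaling to the target. The 4K target and 1440p internal resolution are my own illustrative assumptions, not Nvidia's actual DLSS internals:

```python
# Illustrative sketch: how much work a lower internal render saves.
# The 4K output and 1440p internal resolution are assumptions for
# illustration, not confirmed DLSS parameters.

def pixel_count(width: int, height: int) -> int:
    return width * height

target = pixel_count(3840, 2160)    # 4K output resolution
internal = pixel_count(2560, 1440)  # assumed lower internal render

print(f"Internal render is {internal / target:.0%} of the target pixels")
# The upscaler then has to reconstruct the remaining detail.
```

So under these assumed resolutions the GPU shades well under half the pixels per frame, which is where the DLSS performance headroom comes from.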
Hardware Unboxed inspected the performance chart Nvidia provided. For some reason Nvidia tested with HDR enabled, which has been shown to hurt performance on 10-series cards, an issue they may have solved with the 20 series. The benchmark chart Nvidia put out, which arrived late for some odd reason, still doesn't give clear information about 20-series performance.
https://www.youtube.com/watch?v=QoePGSmFkBg
"Just Buy It: Why Nvidia RTX GPUs Are Worth the Money"
https://www.tomshardware.com/…/nvidia-rtx-gpus-worth-the-mo…
Done. Got rid of the 1080 Ti Strix and waiting for the 2080 Ti Strix to drop into my case. No time for "fake convincing" myself that there's no need to upgrade, that I should skip to 7nm, telling myself RT is a gimmick, there's no great leap in performance, DLSS is a gimmick, 30 fps in Tomb Raider with RT at 1080p, etc., just because I don't want to lose resale value while trashing the new gen.
Turing is a leap in fast-moving tech.
My next post will be a mocking post for those who still believe Turing = 1.3× Pascal in pure rasterization (without DLSS/RT). Benchmarks will be revealed sooner or later.
Look at the 2080 Ti vs the 1080 Ti as raw rasterization on paper:
21% increase in CUDA cores
616 GB/s vs 484 GB/s memory bandwidth
~14 TFLOPS of FP32 vs 11.3
That is not a big leap in paper specs, yet Nvidia, with the shift to 12nm FF from 16nm on Pascal, made a leap in efficiency and in the SMs. The streaming multiprocessors (SMs) are the part of the GPU that runs CUDA kernels, and that is where the leap in rasterization performance without RT or DLSS comes from.
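The paper numbers above can be sanity-checked from core counts and boost clocks. The core counts and reference boost clocks below are my own assumptions (the post only states the percentages and TFLOPS), so treat them as a rough sketch:

```python
# Sanity check of the paper specs above:
# peak FP32 = CUDA cores * 2 FLOPs per core per clock (one FMA) * clock.
# Core counts and boost clocks are assumed reference values,
# not taken from the post itself.

def fp32_tflops(cuda_cores: int, boost_clock_mhz: float) -> float:
    # Each CUDA core can retire one fused multiply-add (2 FLOPs) per clock.
    return cuda_cores * 2 * boost_clock_mhz * 1e6 / 1e12

gtx_1080_ti = fp32_tflops(3584, 1582)  # assumed reference boost clock
rtx_2080_ti = fp32_tflops(4352, 1635)  # assumed Founders Edition boost

print(f"1080 Ti: {gtx_1080_ti:.1f} TFLOPS")          # ~11.3
print(f"2080 Ti: {rtx_2080_ti:.1f} TFLOPS")          # ~14.2
print(f"CUDA core increase: {4352 / 3584 - 1:.0%}")  # ~21%
```

Under these assumed clocks the formula lands right on the post's figures: ~11.3 vs ~14 TFLOPS and a 21% core-count bump, which is exactly why the on-paper rasterization gap looks modest.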