anayman_k7 wrote:
The benchmarks are out. Based on price per performance, the RTX 2080 seems like a bad buy: it offers only slightly better performance than the GTX 1080 Ti, with less VRAM, for $100 to $150 more.

Tech Guru wrote:
Once the drivers mature and DLSS rolls out, the 1080 will not stand a chance. The 1080 Ti needs to be compared with the 2080 Ti, and the 2080 with the 1080.
The 1080 Ti is in deep trouble, but people who paid hefty amounts for their 1080 Tis are trying to convince themselves that upgrading to Turing is not worth it. Deep down they know that gen-to-gen architectural changes, more CUDA cores, new memory, new optimizations, and new features are always a win. The truth about evolving tech is that they are "self-convincing" to stick with the old gen.

PowerPC wrote:
So people with Tis are just "convincing themselves" that upgrading is not worth it? Nothing to do with the multitude of reviewers saying this generation of cards is poor value as it stands right now?
If you want to be on the bleeding edge, money no object, then by all means enjoy your 2080 Ti. But the majority like to consider the value they're getting for their investment. AS IT STANDS NOW (i.e. with little to no data on RTX and DLSS in actual games), the 20 series offers very poor FPS per dollar compared to Pascal.
People being hesitant is totally understandable. Things might change with drivers and more games being released, but few want to pay for promises; they'd rather see the results first.
Here is the catch
Why RTX Turing is a leap (in summary, without diving into the architecture):
First - 2160p Ultra with AA Performance:
Digital Foundry is a very objective outlet, and in their analysis the RTX 2080 Ti stayed above 60 fps in many of the games they tested (max IQ and the most brutal AA settings were used). It is the first truly enjoyable 2160p experience without IQ compromises. Even the i7 8700K bottlenecked the RTX 2080 Ti at 1440p high refresh rates.
Take Far Cry 5 2160p TAA Ultra for example:
2080 Ti: 72 fps average, 64 fps minimum
1080 Ti: 55 fps average, 48 fps minimum
Minimums like that killed the experience for me on the 1080 Ti at 2160p, in Far Cry 5 and many other demanding AAA titles. The RTX 2080 Ti's performance at 2160p Ultra with heavy AA is phenomenal even without DLSS.
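To put those Far Cry 5 numbers in perspective, here is a quick sketch computing the uplift, plus the FPS-per-dollar figure PowerPC is arguing from. The launch prices are my assumption (the widely quoted Founders Edition MSRPs, $1,199 for the 2080 Ti and $699 for the 1080 Ti), not numbers from this thread.

```python
# Far Cry 5, 2160p TAA Ultra (the numbers quoted above)
fps_2080ti = {"avg": 72, "min": 64}
fps_1080ti = {"avg": 55, "min": 48}

# Assumption: Founders Edition launch MSRPs, not stated in the thread
price_2080ti = 1199
price_1080ti = 699

avg_uplift = fps_2080ti["avg"] / fps_1080ti["avg"] - 1  # ~31% faster on average
min_uplift = fps_2080ti["min"] / fps_1080ti["min"] - 1  # ~33% higher minimums

fpd_2080ti = fps_2080ti["avg"] / price_2080ti  # ~0.060 fps per dollar
fpd_1080ti = fps_1080ti["avg"] / price_1080ti  # ~0.079 fps per dollar

print(f"avg uplift: {avg_uplift:.0%}, min uplift: {min_uplift:.0%}")
print(f"FPS/$: 2080 Ti {fpd_2080ti:.3f} vs 1080 Ti {fpd_1080ti:.3f}")
```

So in this one game the 2080 Ti wins outright on frame rate, while the 1080 Ti still wins on FPS per dollar at launch prices, which is exactly the disagreement in this thread.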
Digital Foundry Review of the RTX 2080 / 2080Ti
https://youtu.be/pgEI4tzh0dc
Second - Pricing:
Concerning the price, it is a launch price, and prices will normalize after launch. Remember, new technology always costs, especially with the current high prices of VRAM/RAM. The TU102 in the RTX 2080 Ti packs 18.6 billion transistors into a 754 mm² die and includes three discrete processor types: the Turing SM, the RT Core, and the Tensor Core. Compare that to the GP102 in the 1080 Ti, with 11.8 billion transistors in a 471 mm² die.
HDR:
Turing's display engine now supports HDR processing natively, which reduces the input latency issues Pascal faced. It also supports hardware-based HDR tone mapping, which is beneficial for mapping HDR scenes onto SDR displays.
Additions:
USB Type-C for VR (data and power over a single cable), NVLink for dual graphics cards, an updated decoder supporting HEVC YUV444 10/12b HDR at 30 fps, H.264 8K, and VP9 10/12b HDR, DisplayPort 1.4a with DSC, and a better VR experience overall.
2080 vs 1080 Ti
Comparing the 2080 to the 1080 Ti is not fair, since the 2080 replaces the 1080 and the 2080 Ti replaces the 1080 Ti. Yet:
Fewer CUDA cores than the 1080 Ti, yet the 2080 is 13% faster so far, and the gap will increase as drivers become more optimized.
DLSS , RT Cores , Tensor Cores.
HDR HW processing and Tone Mapping
Hybrid Rendering
It is a definite leap in technology.
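The "fewer cores, yet faster" point can be made concrete. Taking the commonly published core counts (2944 CUDA cores for the 2080, 3584 for the 1080 Ti; these are spec-sheet values I'm supplying, not from the thread) together with the ~13% lead quoted above:

```python
# Assumption: NVIDIA spec-sheet CUDA core counts
cores_2080 = 2944
cores_1080ti = 3584

speedup = 1.13  # the ~13% average lead quoted above

# Relative throughput per CUDA core (folds architecture and clocks together)
per_core_gain = speedup * cores_1080ti / cores_2080 - 1
print(f"per-core throughput gain: {per_core_gain:.0%}")  # ~38%
```

Roughly 38% more performance per core, though note this lumps clock-speed differences in with the architectural gains, so it overstates the pure IPC improvement.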
Take the 2080 vs the 1080 Ti, for example:
Compare the die sizes and transistor counts.
Compare the cost of GDDR6 in a high-cost memory market.
Consider the addition of new cores (RT and Tensor).
Still, people are underestimating the potential of Turing based on the initial set of benchmarks. It lays a new foundation for graphics cards, moving away from pure rasterization toward a hybrid rendering model. Wait for developers to use it and understand it, and the gap will grow steadily with better code and drivers.
In the end, whether to upgrade depends on your gaming needs; you cannot generalize that a new technology "is not worth it on price/performance criteria." As a 2160p HDR 60 fps gamer focused on story-driven AAA titles, Turing will enhance my gameplay and HDR satisfaction. In addition, I will be replacing the PG348Q with the PG35VQ (3440 × 1440, 200 Hz, 10-bit HDR), and the RTX card will be a better fit for that too. I would have kept the 1080 Ti and waited for the 7 nm RTX 3080 Ti, but who knows when that will drop, and the 1080 Ti's experience at 2160p and in HDR is unsatisfactory (especially the minimums). As an owner of an HTC Vive Pro, using VirtualLink will minimize cable hassle too.