Since some people are asking why the RTX 2080 Ti has an MSRP of $1,200, I'll summarize the key points:
2080 Ti (TU102)
754 mm² die size
18.6 billion transistors
CUDA cores: 4352
Tensor cores: 544
RT cores: 68
GDDR6 @ 14 Gbps on a 352-bit memory bus = 616 GB/s memory bandwidth
1080 Ti (GP102)
471 mm² die size
11.8 billion transistors
CUDA cores: 3584
Tensor cores: N/A
RT cores: N/A
GDDR5X @ 11 Gbps on a 352-bit memory bus = 484 GB/s memory bandwidth
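Both bandwidth figures fall straight out of the data rate and bus width; here's the arithmetic as a quick Python sketch (values taken from the lists above):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate (Gbps) times bus width, over 8 bits/byte."""
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(14, 352))  # 2080 Ti, GDDR6:  616.0 GB/s
print(memory_bandwidth_gbs(11, 352))  # 1080 Ti, GDDR5X: 484.0 GB/s
```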
% increase in die size:
(754 - 471) / 471 × 100% = +60.1%
% increase in transistor count:
(18.6 - 11.8) / 11.8 × 100% = +57.6%
% increase in CUDA cores:
(4352 - 3584) / 3584 × 100% = +21.4%
Add to that the cost of GDDR6, which brings:
% increase in memory speed:
(14 - 11) / 11 × 100% = +27.3%
% increase in memory bandwidth:
(616 - 484) / 484 × 100% = +27.3% (identical to the speed increase, since the 352-bit bus width is unchanged)
Then factor in the cost of adding 544 Tensor cores and 68 RT cores.
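All of the deltas above come from the same percent-increase formula; here's the whole comparison in one Python sketch (spec values as listed earlier):

```python
def pct_increase(old: float, new: float) -> float:
    """Percent increase from old to new."""
    return (new - old) / old * 100

specs = {  # name: (1080 Ti value, 2080 Ti value)
    "die size (mm^2)":       (471, 754),
    "transistors (billion)": (11.8, 18.6),
    "CUDA cores":            (3584, 4352),
    "memory speed (Gbps)":   (11, 14),
    "bandwidth (GB/s)":      (484, 616),
}
for name, (old, new) in specs.items():
    print(f"{name}: +{pct_increase(old, new):.1f}%")
# die size (mm^2): +60.1%
# transistors (billion): +57.6%
# CUDA cores: +21.4%
# memory speed (Gbps): +27.3%
# bandwidth (GB/s): +27.3%
```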
Tech is always becoming obsolete, and waiting is a pain in the ...
First, what are your gaming needs in the #present? For me, that's 2160p at 60 Hz and even 3440×1440 at 100 Hz, and the 1080 Ti (heavily OC'd to a 2100 MHz core and 11.8 Gbps memory) failed there, especially on minimum fps at 2160p. A ~30% average uplift in regular rasterization (more after launch, with driver updates, developer optimization, etc.), even aside from RT or DLSS, improves my gaming experience now. Why should I wait a year or more, gaming with several IQ compromises at 2160p on the 1080 Ti? I enjoy what tech #currently offers me.
Second, HDR processing on Pascal is done in software, without a dedicated pipeline, which induces stutter and latency. Turing resolves this with a dedicated hardware display pipeline and HDR tone mapping.
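To illustrate what the tone mapping step does (a minimal sketch using the textbook Reinhard operator, not NVIDIA's actual pipeline or curves): it compresses unbounded HDR scene luminance into the range a display can show, per pixel, which is exactly the kind of work you'd rather have in dedicated hardware than in software on the render path.

```python
def reinhard(luminance: float) -> float:
    """Simple Reinhard tone map: compresses HDR luminance in [0, inf) into [0, 1).

    Illustrative only -- real HDR pipelines use more elaborate curves,
    e.g. the HDR10 PQ transfer function, not this textbook operator.
    """
    return luminance / (1.0 + luminance)

for hdr in (0.25, 1.0, 4.0, 16.0):
    print(f"{hdr:5.2f} -> {reinhard(hdr):.3f}")  # 0.200, 0.500, 0.800, 0.941
```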
With more and more AAA titles shipping HDR10 or Dolby Vision metadata, why should I wait again to enjoy HDR gaming as someone with a compatible HDR display?
Third, the 1080 Ti fails to hit 144 fps at 1440p 144 Hz, or 100 fps at 3440×1440 100 Hz, without IQ compromises, and these two resolutions need proper AA (2x MSAA at least), not the shitty FXAA. I am willing to drop the PG35VQ (3440×1440 @ 200 Hz with 10-bit HDR); the RTX 2080 Ti would be the better value for me.
Fourth, RT, the AI/neural-network Tensor cores, and DLSS set a baseline for hybrid rendering alongside the CUDA cores used for traditional rasterization. It's an added advantage, but the points above are my key motivators to upgrade, not RT, DLSS, etc.