• Hardware
  • RTX 2080ti - Nvidia "Turing" Upgrade

Yes, it is a baseline for a new leap; the next generation will surely be more efficient with a further die shrink. However, after an era of engineering they managed to put three different kinds of task-oriented cores, representing three technologies, in one chip. That is a leap: this tech was previously available only in very expensive, pro-level enterprise rendering hardware, and now we have a consumer graphics card like the RTX 2070 at ~$550 coming in on par with a previous $10,000+ triple Tesla V100 setup for ray tracing and deep learning calculations. Relatively speaking, the TDP of the 2080 Ti is not very high for the number of different cores it has.
AMD lacks performance and innovation in its graphics card line. They will keep the "catch me a year later" approach with the Navi and Vega 20 7 nm releases. Look at the Vega 64: it has ~13 TFLOPS of FP32, yet the 1080 Ti has ~11.3 and beats it in almost all AAA titles across all resolutions. AMD must work hard to change the core engineering of their chips.

Look at the 2080 Ti vs the 1080 Ti in raw rasterization rendering on paper:

~21% increase in CUDA cores
616 GB/s vs 484 GB/s memory bandwidth
~14 TFLOPS of FP32 vs ~11.3

That is not a big leap in paper specs, yet Nvidia, with the shift to 12 nm FF from 16 nm on Pascal, made a leap in efficiency and in the SMs. The streaming multiprocessors (SMs) are the part of the GPU that runs our CUDA kernels, hence the leap in rasterization performance even without RT or DLSS.
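As a rough sanity check on those paper numbers, here is a minimal sketch on my part, assuming the commonly quoted reference specs (4352 CUDA cores and ~1545 MHz boost for the 2080 Ti, 3584 cores and ~1582 MHz for the 1080 Ti, a 352-bit memory bus at 14 vs 11 Gbps; partner cards will differ): peak FP32 is just cores x 2 FLOPs per FMA x clock, bandwidth is bus width x data rate, and the toy kernel at the end only illustrates the kind of work the SMs actually schedule.

// sketch.cu - build with: nvcc sketch.cu -o sketch (file name is made up)
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: the SMs execute these threads in warps of 32.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];   // one FMA = 2 FLOPs per element
}

// Peak FP32 TFLOPS ~= cores * 2 FLOPs (FMA) * clock in MHz / 1e6
static double peak_tflops(int cores, double boost_mhz) {
    return cores * 2.0 * boost_mhz / 1.0e6;
}

int main() {
    // Assumed reference-spec figures, not measured values.
    printf("2080 Ti: ~%.1f TFLOPS FP32\n", peak_tflops(4352, 1545.0)); // ~13.4
    printf("1080 Ti: ~%.1f TFLOPS FP32\n", peak_tflops(3584, 1582.0)); // ~11.3

    // Memory bandwidth (GB/s) = bus width in bytes * data rate in Gbps
    printf("2080 Ti: ~%.0f GB/s\n", 352 / 8.0 * 14.0);  // ~616
    printf("1080 Ti: ~%.0f GB/s\n", 352 / 8.0 * 11.0);  // ~484

    // Launch the toy kernel on a million elements.
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();
    printf("y[0] = %.1f\n", y[0]);  // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}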
Raficoo wrote: So Nvidia finally gave some information on what the new RTX 2080 can do compared to the GTX 1080 at 4K resolution:
https://cdn.wccftech.com/wp-content/uploads/2018/08/NV-GeForce-RTX-2080-Performance-Games.jpg
https://cdn.wccftech.com/wp-content/uploads/2018/08/NV-GeForce-RTX-2080-Performance.jpg

And there's this video of a user claiming it's the RTX 2080 Ti vs the GTX 1080 Ti in games using Deep Learning Super Sampling (DLSS); it might be real or not, and the video quality is crappy though:
https://www.youtube.com/watch?v=WMbh6zFrzto
If you take a linear performance trend from the recent graph, the 2080 is faster than the 1080 by:

~1.45-1.5x at 2160p without DLSS and ~2x in DLSS-compatible games; the same should follow for the 2080 Ti vs the 1080 Ti, so the video has some credibility, backed by the demo Nvidia showed at their Gamescom conference. In a demo called "Infiltrator" at Gamescom, a single Turing-based GPU was able to render it in 4K at high-quality graphics settings at a steady 60 FPS using DLSS; it actually ran at ~78 FPS but was capped by the 60 Hz panel it was played on. Huang noted that it actually runs at 78 FPS and the demo was just limited by the stage display. On a GTX 1080 Ti, that same demo runs in the 30 FPS range.
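To see where that ~2x could come from, here is a back-of-the-envelope sketch; the 1440p internal resolution is my assumption, since Nvidia has not published the exact render resolution DLSS uses, and the real gain also depends on the tensor-core upscaling cost. If shading cost scales roughly with pixel count, rendering at 1440p and reconstructing to 2160p shades only ~44% of the pixels.

// dlss_estimate.cu - host-only arithmetic, no GPU needed (file name is made up)
#include <cstdio>

int main() {
    // Assumed internal render resolution for DLSS; the real value is not public.
    const double native_px   = 3840.0 * 2160.0;  // 8,294,400 pixels at 2160p
    const double internal_px = 2560.0 * 1440.0;  // 3,686,400 pixels at 1440p

    const double speedup = native_px / internal_px;  // ~2.25x fewer pixels shaded
    printf("pixels shaded: ~%.0f%% of native 2160p\n", 100.0 * internal_px / native_px);
    printf("ideal shader-bound speedup: ~%.2fx\n", speedup);

    // Purely illustrative: ~35 fps native at 2160p would land near 35 * 2.25 ~= 79 fps,
    // in the same ballpark as the Infiltrator numbers, though that may be coincidence.
    printf("hypothetical 35 fps native -> ~%.0f fps with upscaling\n", 35.0 * speedup);
    return 0;
}

In practice the reconstruction pass and non-shader bottlenecks eat into that, which is why ~2x rather than the full 2.25x is the realistic expectation.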
DLSS = Rendering in lower resolution and then upscaling
The performance chart provided by Nvidia was inspected by Hardware Unboxed. For some reason Nvidia also tested with HDR enabled, which is known to hurt performance on the 10-series cards and which they may have solved on the 20 series. The benchmark chart Nvidia provided, which came late for some weird reason, still does not give clear information about 20-series performance.

https://www.youtube.com/watch?v=QoePGSmFkBg
anayman_k7 wrote: DLSS = Rendering in lower resolution and then upscaling
The performance chart provided by Nvidia was inspected by Hardware Unboxed. For some reason Nvidia also tested with HDR enabled, which is known to hurt performance on the 10-series cards and which they may have solved on the 20 series. The benchmark chart Nvidia provided, which came late for some weird reason, still does not give clear information about 20-series performance.

https://www.youtube.com/watch?v=QoePGSmFkBg

"Just Buy It: Why Nvidia RTX GPUs Are Worth the Money"

https://www.tomshardware.com/…/nvidia-rtx-gpus-worth-the-mo…

Done. I got rid of the 1080 Ti Strix and am waiting for the 2080 Ti Strix to drop into my case. No time for "fake convincing" myself that there is no need to upgrade, that I should skip to 7 nm, and feeding myself assumptions: RT is a gimmick, no great leap in performance, DLSS is a gimmick, 30 fps in Tomb Raider with RT at 1080p, etc., just because I don't want to lose on resale value while trash-talking the new gen.

Turing is a leap in a fast-moving tech.

My next post will be a mocking post for those who still perceive that Turing = 1.3x Pascal in pure rasterization (without DLSS/RT). Benchmarks will be revealed sooner or later.

You are just like an Apple fan: they saw the iPhone X as a gift from god, and it's the same for you. Not even a single benchmark is out, yet you sold your 1080 Ti and rode first class on the hype train. You might find yourself in the end rebuying the 1080 Ti you sold, for more :)
I really hope the RTX will perform as well as all the positive hype suggests. After all, this hype is the reason I, and many others, will finally be able to afford a 1080 Ti. So it's all good.
As @enthralled said,
I'm not falling for any real-time shadows, lighting, or dynamic reflections.
A game is a game; it doesn't have to look SO real.

Personally I'm still on my 980, and when the real benchmarks of the 2080 Ti, without RTX, come out, I will compare it to the 1080 Ti.
If there is a huge improvement over the 1080 Ti, say +50%, I would get the new 2080 Ti.
Otherwise, if the performance increase is only 20-30%, I would hopefully get a cheap 1080 Ti.
For the skeptics:

"Gaijin’s Enlisted was running with over 90fps in 4K resolution with NVIDIA’s real-time ray tracing effects" -DSOGaming

https://www.dsogaming.com/news/gaijins-enlisted-was-running-with-over-90fps-in-4k-resolution-with-nvidias-real-time-ray-tracing-effects/

" 'Tomb Raider’ devs respond to RTX 2080 Ti, ray tracing performance concerns" -Digital Trends

https://www.digitaltrends.com/computing/tomb-raider-dev-responds-to-rtx-2080-ti-performance-concerns/

PC & Parts preorder availability

RTX 2080ti
https://pcandparts.com/video-card/?filter_p-type=rtx-2080ti

RTX 2080
https://pcandparts.com/video-card/?filter_p-type=rtx-2080&orderby=price-desc




~$1,273 including 11% VAT, plus $5 delivery, comes to ~$1,278 for the Zotac RTX 2080 Ti OC, which is a relatively good price compared to the official Nvidia FE at ~$1,200 on their website.

I will be aiming for the

ROG-STRIX-RTX2080TI-O11G-GAMING, due to its quality PCB, VRM, and cooling.

I am optimistic about the technology Nvidia injected into Turing, and more and more games will utilize RT as time goes on. I will not wait for 7 nm since I like to enjoy the top specs in my system.

Waiting for:

Apples-to-apples performance in games (old and new):

Without DLSS
Without RT
With DLSS
With RT
With DLSS & RT

Promising tech and a good leap: things that used to be done on expensive enterprise-level computers are now brought to the end gamer. Replacing the previous-gen Ti with the next-gen one has never failed me.

With a 1080 Ti Strix OC'd to a 2,025 MHz core clock and 11.6 Gbps memory, an i7-8700K OC'd to 5 GHz on all 6 cores (negative AVX offset set to 0), and 3,200 MHz RAM, the 1080 Ti definitely struggles at 2160p to deliver a locked 60 fps experience at very high settings.

Far Cry 5
Monster Hunter World
Ghost Recon Wildlands
Mass Effect Andromeda
Watch Dogs 2
AC Origins
Kingdom Come: Deliverance
Middle-earth: Shadow of War 

& other titles I tested

To me, as a 3440×1440 gamer at 100 Hz who also plays at native 2160p even at 60 Hz, the 1080 Ti is obsolete in core rasterization rendering, RT and DLSS aside. Yes, I need AA (not FXAA, since it's the worst), because I play 2160p HDR at 60 fps on a 55" Sony X930E TV where the PPI is ~80, compared to ~163 on a 27" 2160p gaming monitor where AA is mostly not needed thanks to the small screen giving a high pixels-per-inch count.
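For reference, those PPI figures fall straight out of the standard diagonal-pixel formula; a quick sketch of the arithmetic, nothing panel-specific, just geometry:

// ppi.cu - host-only arithmetic (file name is made up)
#include <cmath>
#include <cstdio>

// Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
static double ppi(int width_px, int height_px, double diagonal_in) {
    double diag_px = std::sqrt((double)width_px * width_px + (double)height_px * height_px);
    return diag_px / diagonal_in;
}

int main() {
    printf("55-inch 2160p TV:      ~%.0f PPI\n", ppi(3840, 2160, 55.0));  // ~80
    printf("27-inch 2160p monitor: ~%.0f PPI\n", ppi(3840, 2160, 27.0));  // ~163
    return 0;
}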
I'm a full HD 60 fps strategy gamer (ahem, Factorio). Integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to using it 100% with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada :D
user wrote: I'm a full HD 60 fps strategy gamer (ahem, Factorio). Integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to using it 100% with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada
OK, keep it to anything related to Turing; this is off-topic to the subject.
Tech Guru wrote
user wrote: I'm a full HD 60 fps strategy gamer (ahem, Factorio). Integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to using it 100% with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada
OK, keep it to anything related to Turing; this is off-topic to the subject.
I am merely questioning why anyone would upgrade, outside the 4K 144 Hz crowd I mean. It does seem like the 10 series already dominates all things full HD and even 1440p; there is no further point.
The thing with early adoption is the risk. The first batch of announced games that support RTX is great, AAA titles, but most of these games are in late development stages. So RTX might be an extra option patched in later, not built in from the ground up.

As with any tech, DX10, DX11, DX11.1, and DX12 (no performance increase or graphical marvel), it all comes down to developer support.

So regarding the performance metrics of the upcoming cards, I am not hyped about RTX, especially given that the consoles are what truly drives innovation by developers. And they are AMD-based and limited (fast, but limited).
Let's wait for benchmarks on all the games that are already released.

I am disappointed that Nvidia's Founders Edition is not blower-style anymore, and by the power consumption. A lot of times the circuit breaker in my house trips due to the load from my 980 Ti, plus I have a mini-ITX case...
You gotta justify your purchase somehow, amirite?
Tech Guru wrote
user wrote: I'm a full HD 60 fps strategy gamer (ahem, Factorio). Integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to using it 100% with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada
OK, keep it to anything related to Turing; this is off-topic to the subject.
It IS on topic!

He means that he doesn't need Turing, RTX or whatever to play his favorite games that barely push his 1070. You're just so focused on justifying your upgrade, and the WOW effect "Hey look, RTX, DLSSRDLL* ,LDSSR*, <input some random acronym here that looks cool> ... " that you forget to actually enjoy this hardware.

We have been there a 1000000 times already. NVidia releases some "new technology", we expect it to be a huge game changer. You get hyped and post this thread where you're selling your old card, copying articles from here and there**, trying to justify the new purchase, bashing anyone who tries to tell you that you should wait. And in the end it just ends up being, like every release, some 25-30% generation performance increase.
anayman_k7 wrote: You are just like an Apple fan: they saw the iPhone X as a gift from god, and it's the same for you. Not even a single benchmark is out, yet you sold your 1080 Ti and rode first class on the hype train. You might find yourself in the end rebuying the 1080 Ti you sold, for more :)

If you like having the latest hardware (I do too, I own a water-cooled 1080Ti after all...), then that's fine. But if you need to shove it down everyone's throat, and not accept any arguments about it, then you're just an attention seeking idiot.

* I made these up.
**We know how to browse the internet ourselves!
vlatkozelka wrote
Tech Guru wrote
user wrote: I'm a full HD 60 fps strategy gamer (ahem, Factorio). Integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to using it 100% with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada
OK, keep it to anything related to Turing; this is off-topic to the subject.
It IS on topic!

He means that he doesn't need Turing, RTX or whatever to play his favorite games that barely push his 1070. You're just so focused on justifying your upgrade, and the WOW effect "Hey look, RTX, DLSSRDLL* ,LDSSR*, <input some random acronym here that looks cool> ... " that you forget to actually enjoy this hardware.

We have been there a 1000000 times already. NVidia releases some "new technology", we expect it to be a huge game changer. You get hyped and post this thread where you're selling your old card, copying articles from here and there**, trying to justify the new purchase, bashing anyone who tries to tell you that you should wait. And in the end it just ends up being, like every release, some 25-30% generation performance increase.
anayman_k7 wrote: You are just like an Apple fan: they saw the iPhone X as a gift from god, and it's the same for you. Not even a single benchmark is out, yet you sold your 1080 Ti and rode first class on the hype train. You might find yourself in the end rebuying the 1080 Ti you sold, for more :)

If you like having the latest hardware (I do too, I own a water-cooled 1080Ti after all...), then that's fine. But if you need to shove it down everyone's throat, and not accept any arguments about it, then you're just an attention seeking idiot.

* I made these up.
**We know how to browse the internet ourselves!
I do not know why you are triggered :). I have been on this forum for more than 7 consecutive years, and all my posts are technically oriented and straight.

A hyped person is a noob who posts things and does not know what they mean. I am clarifying all the points in a simple technical way, trying to summarize what Turing offers. Besides, I am not justifying anything; the technology itself justifies what it offers. Hate it or not, technology in the PC world is evolving, and a new gen, whatever it offers, is better than the old gen. It is better to justify a purchase of the new gen than to try to "self-convince" yourself: hey, I have the previous gen, and everything the new gen offers seems like a gimmick and a marginal boost.

RT is a gimmick
DLSS is fake
I will wait for 7 nm and skip this one
Only a 30% performance boost
Oh, Tomb Raider doesn't have a locked 60 fps even at 1080p on the RTX!


These are the perceptions of either technology noobs or people who do not want to upgrade (since they do not want to lose money selling their old gen, or do not have the money to drop on the newest tech), used to "self-convince" themselves that this is all hype.

If you are triggered by my posts, simply do not write offensive feedback and take things to a personal level, which I surely do not like to do, and call you an "idiot", as you were indirectly trying to call me through the structure of your response.

Personal things are not solved here; if you have anything against me, seek other channels.
Tech Guru wrote
vlatkozelka wrote
Tech Guru wrote
OK, keep it to anything related to Turing; this is off-topic to the subject.
It IS on topic!

He means that he doesn't need Turing, RTX or whatever to play his favorite games that barely push his 1070. You're just so focused on justifying your upgrade, and the WOW effect "Hey look, RTX, DLSSRDLL* ,LDSSR*, <input some random acronym here that looks cool> ... " that you forget to actually enjoy this hardware.

We have been there a 1000000 times already. NVidia releases some "new technology", we expect it to be a huge game changer. You get hyped and post this thread where you're selling your old card, copying articles from here and there**, trying to justify the new purchase, bashing anyone who tries to tell you that you should wait. And in the end it just ends up being, like every release, some 25-30% generation performance increase.
anayman_k7 wrote: You are just like an Apple fan: they saw the iPhone X as a gift from god, and it's the same for you. Not even a single benchmark is out, yet you sold your 1080 Ti and rode first class on the hype train. You might find yourself in the end rebuying the 1080 Ti you sold, for more :)

If you like having the latest hardware (I do too, I own a water-cooled 1080Ti after all...), then that's fine. But if you need to shove it down everyone's throat, and not accept any arguments about it, then you're just an attention seeking idiot.

* I made these up.
**We know how to browse the internet ourselves!
I do not know why you are triggered :). I have been on this forum for more than 7 consecutive years, and all my posts are technically oriented and straight.

A hyped person is a noob who posts things and does not know what they mean. I am clarifying all the points in a simple technical way, trying to summarize what Turing offers. Besides, I am not justifying anything; the technology itself justifies what it offers. Hate it or not, technology in the PC world is evolving, and a new gen, whatever it offers, is better than the old gen. It is better to justify a purchase of the new gen than to try to "self-convince" yourself: hey, I have the previous gen, and everything the new gen offers seems like a gimmick and a marginal boost.

RT is a gimmick
DLSS is fake
I will wait for 7 nm and skip this one
Only a 30% performance boost
Oh, Tomb Raider doesn't have a locked 60 fps even at 1080p on the RTX!


These are the perceptions of either technology noobs or people who do not want to upgrade (since they do not want to lose money selling their old gen, or do not have the money to drop on the newest tech), used to "self-convince" themselves that this is all hype.

If you are triggered by my posts, simply do not write offensive feedback and take things to a personal level, which I surely do not like to do, and call me an "idiot", as you were indirectly trying to do through the structure and ending of your response.

Personal things are not solved here; if you have anything against me, seek other channels.
Tech Guru wrote: I do not know why you are triggered :)
Stupid content triggers me, I can't help it. You've been posting irrelevant crap for 6 days. And everyone's trying to tell you to wait for actual benchmarks, but you're too stupid to listen
Tech Guru wrote: I have been on this forum for more than 7 consecutive years
7 years and you still learned nothing :/
Tech Guru wrote: all my posts are technically oriented and straight.
You just COPY articles from here and there, and re-write them in bad English, with bad grammar, and random capitalization of letters. It is cancer to read your posts!
Tech Guru wrote: A hyped person is a noob who posts things and does not know what they mean
Which is exactly what you are. Do you even know what ray tracing is? Do you even know how huge the computation is to achieve rendering fidelity anywhere near the rendering that's done outside of games? You keep talking about tech that you have no idea about. Do you even know why they keep making transistors smaller and smaller? And what the complications of that are?
Tech Guru wrote: It is better to justify a purchase of the new gen than to try to "self-convince" yourself: hey, I have the previous gen, and everything the new gen offers seems like a gimmick and a marginal boost.
NO! Everyone else in this thread is telling you the same thing. Our "old" hardware is doing its job perfectly; we are happy with what we have. You are not :)
Tech Guru wrote: RT is a gimmick
DLSS is fake
I will wait for 7 nm and skip this one
Only a 30% performance boost
Oh, Tomb Raider doesn't have a locked 60 fps even at 1080p on the RTX!


These are the perceptions of either technology noobs or people who do not want to upgrade (since they do not want to lose money selling their old gen, or do not have the money to drop on the newest tech), used to "self-convince" themselves that this is all hype.
It's all true: marketing gimmicks exist. And tech is often sold at much higher prices than it should be. The problem is that the prices get inflated because fanbois like you will always pre-order.
Also don't go around calling people poor. Being smart with your money doesn't mean you're poor. We aren't all government thieves like yourself!

Tech Guru wrote: If you are triggered by my posts, simply do not write offensive feedback and take things to a personal level.
I am a software engineer, and so I take this stuff seriously. I understand what I read, and ask around when I don't, unlike you who bashes any post that doesn't conform with his opinions.
So when I see stupid crap like yours posted, I can't help it, like I already said in the beginning.

Conclusion:

Your threads are attention-seeking posts disguised as tech talk. Every single thread of yours is about some tech that you bought, followed by some copying of articles and benchmarks trying to justify why you bought it. Do not try to convince me, or anyone, that you actually are a tech person; you just get high on buying expensive tech and then bragging about it on the internet.
It is just sad to see LebGeeks become a buy and sell website like OLX + random Tech Guru crap. There was much more potential in this place :(
Wait for benchmarks. I'm not that hyped about it; Shadow of the Tomb Raider wasn't running that smooth on the new tech with the ray tracing stuff on.
DLSS is good though.
Still, I would upgrade since I like upgrading to new tech and I can do that. Still, this release isn't looking good, and the Nvidia CEO only spoke about how much faster than Pascal it is with ray tracing on; but I really don't care about ray tracing on or off, I want to see benchmarks and numbers.