• Hardware
  • RTX 2080ti - Nvidia "Turing" Upgrade

Yes, it is a baseline for a new leap; the next gen will of course be more efficient with a further die shrink. Still, after years of engineering they managed to put three different task-oriented core types, representing three technologies, on one chip. That is a leap: this tech used to be available only in very expensive pro-level enterprise rendering hardware, and now a consumer graphics card like the RTX 2070 @ ~$550 comes on par with a previous $10,000+ triple-V100 setup for ray tracing and deep learning workloads. Relatively speaking, the TDP of the 2080ti is not very high for the number of different cores it carries.
AMD lacks performance and innovation in its graphics card line, and will keep the "catch me a year later" approach with the Navi and Vega 20 7nm releases. Look at the Vega 64: it has ~13 TFLOPs of FP32, yet the 1080ti has ~11.3 and beats it in almost all AAA titles across all resolutions. AMD must work hard to change the core engineering of their chips.
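For reference, those FP32 figures come straight from cores × boost clock × 2 FLOPs per clock (one fused multiply-add per core per cycle). A quick sketch; the clocks below are the official boost clocks, so treat the outputs as approximations, since real sustained clocks differ:

```cuda
// Where the FP32 TFLOP figures above come from.
#include <cstdio>

double tflops(int cores, double boost_ghz) {
    // 2 FLOPs per core per clock: one fused multiply-add counts as two.
    return cores * boost_ghz * 2.0 / 1000.0;
}

int main() {
    printf("Vega 64 : %.1f TFLOPs\n", tflops(4096, 1.546)); // ~12.7
    printf("1080 Ti : %.1f TFLOPs\n", tflops(3584, 1.582)); // ~11.3
    printf("2080 Ti : %.1f TFLOPs\n", tflops(4352, 1.635)); // ~14.2 (FE boost)
    return 0;
}
```

Which is exactly the point about Vega: a TFLOP number on paper doesn't translate directly into game performance.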

Look at the 2080ti vs the 1080ti as raw rasterization on paper:

~21% increase in CUDA cores (4352 vs 3584)
616 GB/s vs 484 GB/s memory bandwidth
~14 TFLOPs of FP32 vs ~11.3

That is not a big leap in paper specs, yet Nvidia, with the shift from 16nm on Pascal to 12nm FF, made a leap in efficiency and in the SM design. The streaming multiprocessors (SMs) are the part of the GPU that runs our CUDA kernels, hence the leap in rasterization performance even without RT or DLSS.
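To ground what "SMs run our CUDA kernels" means, here is a minimal sketch (a generic SAXPY, nothing Turing-specific): the launch below produces 4096 thread blocks, and the hardware spreads those blocks across however many SMs the chip has. Turing moves to more, smaller SMs (68 on the 2080ti vs 28 larger ones on the 1080ti), which is where much of the per-clock gain comes from:

```cuda
#include <cstdio>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // 4096 blocks spread over the SMs
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // 5.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```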
Raficoo wrote: So Nvidia finally gave some information on what the new RTX 2080 can do compared to the GTX 1080 at 4K resolution:
https://cdn.wccftech.com/wp-content/uploads/2018/08/NV-GeForce-RTX-2080-Performance-Games.jpg
https://cdn.wccftech.com/wp-content/uploads/2018/08/NV-GeForce-RTX-2080-Performance.jpg

And there's this video of a user claiming it shows the RTX 2080 Ti vs GTX 1080 Ti in games using Deep Learning Super Sampling (DLSS). It might be real or not; the video quality is poor though:
https://www.youtube.com/watch?v=WMbh6zFrzto
If you take a linear performance trend from the recent graph, the 2080 is faster than the 1080 by ~1.45-1.5x at 2160p without DLSS and ~2x in DLSS-compatible games. The same should follow for the 2080ti vs the 1080ti, so the video has some credibility, backed by the demo Nvidia showed at Gamescom: in a demo called "Infiltrator", a single Turing-based GPU rendered 4K at high quality settings at a steady 60FPS using DLSS. Huang noted that it actually runs at ~78FPS and was simply capped by the 60Hz stage display. On a GTX 1080 Ti, that same demo runs in the 30FPS range.
DLSS = Rendering in lower resolution and then upscaling
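As a crude illustration of that idea: DLSS itself runs a trained network on the tensor cores, but even a plain bilinear upscale (hypothetical sketch below) shows the pixel budget, since rendering at 2560×1440 and upscaling to 3840×2160 shades only ~44% of the pixels each frame:

```cuda
#include <cstdio>

__global__ void upscale_bilinear(const float* src, int sw, int sh,
                                 float* dst, int dw, int dh) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= dw || y >= dh) return;

    // Map this output pixel back into low-res coordinates.
    float fx = fmaxf(0.f, (x + 0.5f) * sw / dw - 0.5f);
    float fy = fmaxf(0.f, (y + 0.5f) * sh / dh - 0.5f);
    int x0 = min(sw - 2, (int)fx), y0 = min(sh - 2, (int)fy);
    float tx = fx - x0, ty = fy - y0;

    // Blend the four surrounding low-res samples.
    float a = src[y0*sw + x0],     b = src[y0*sw + x0 + 1];
    float c = src[(y0+1)*sw + x0], d = src[(y0+1)*sw + x0 + 1];
    dst[y*dw + x] = (1-ty)*((1-tx)*a + tx*b) + ty*((1-tx)*c + tx*d);
}

int main() {
    const int sw = 2560, sh = 1440, dw = 3840, dh = 2160; // single channel for brevity
    float *src, *dst;
    cudaMallocManaged(&src, sw*sh*sizeof(float));
    cudaMallocManaged(&dst, dw*dh*sizeof(float));
    for (int i = 0; i < sw*sh; ++i) src[i] = (i % sw) / (float)sw; // test gradient

    dim3 block(16, 16), grid((dw+15)/16, (dh+15)/16);
    upscale_bilinear<<<grid, block>>>(src, sw, sh, dst, dw, dh);
    cudaDeviceSynchronize();

    printf("shaded pixels: %.0f%% of native 4K\n", 100.0*sw*sh/(dw*dh)); // ~44%
    cudaFree(src); cudaFree(dst);
    return 0;
}
```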
The performance chart provided by Nvidia was inspected by Hardware Unboxed. For some reason Nvidia also tested with HDR enabled, which has been shown to hurt performance on the 10-series cards and may have been solved on the 20 series. The benchmark chart Nvidia provided, which came late for some odd reason, still gives no clear information about 20-series performance.

https://www.youtube.com/watch?v=QoePGSmFkBg
anayman_k7 wrote: DLSS = Rendering in lower resolution and then upscaling [...]

"Just Buy It: Why Nvidia RTX GPUs Are Worth the Money"

https://www.tomshardware.com/…/nvidia-rtx-gpus-worth-the-mo…

Done. Got rid of the 1080ti Strix and waiting for the 2080ti Strix to drop into my case. No time for "fake convincing" myself that there is no need to upgrade, that I should skip to 7nm, and for feeding myself assumptions: RT is a gimmick, no great leap in performance, DLSS is a gimmick, 30fps in Tomb Raider with RT at 1080p, etc. That is what people do when they don't want to lose resale value, so they trash-talk the new gen.

Turing is a leap in a fast-moving tech landscape.

My next post will be a mocking post for those who still believe Turing = 1.3x Pascal in pure rasterization (without DLSS/RT). Benchmarks will be revealed sooner or later.

You are just like an Apple fan, they saw the iPhone X as a gift from god, same for you, not even a single benchmark out yet you sold your 1080Ti and you rode the first class of the hype train, you might find yourself at the end rebuying the 1080Ti you sold for more :)
I really hope the RTX will perform as good as all the positive hype. After all, this hype is the reason I, and many others will finally be able to afford a 1080Ti. So it's all good.
as @enthralled said,
I'm not falling for any real time shadows, lighting or dynamic reflections.
A game is a game, doesn't have to look SO real.

Personally I'm still on my 980, and when the real benchmarks of the 2080ti without RTX come out, I will compare it to the 1080ti.
If there is a huge improvement over the 1080ti, say +50%, I would get the new 2080ti.
Otherwise, if the performance increase is only 20-30%, I would hopefully get a cheap 1080ti.
For the skeptics:

"Gaijin’s Enlisted was running with over 90fps in 4K resolution with NVIDIA’s real-time ray tracing effects" -DSOGaming

https://www.dsogaming.com/news/gaijins-enlisted-was-running-with-over-90fps-in-4k-resolution-with-nvidias-real-time-ray-tracing-effects/

" 'Tomb Raider’ devs respond to RTX 2080 Ti, ray tracing performance concerns" -Digital Trends

https://www.digitaltrends.com/computing/tomb-raider-dev-responds-to-rtx-2080-ti-performance-concerns/

PC & Parts Pre-Order Availability

RTX 2080ti
https://pcandparts.com/video-card/?filter_p-type=rtx-2080ti

RTX 2080
https://pcandparts.com/video-card/?filter_p-type=rtx-2080&orderby=price-desc




~$1273 including 11% VAT, plus $5 delivery, nets ~$1278 for the Zotac RTX 2080ti OC, which is a relatively good price compared to the official Nvidia FE at ~$1200 on their website.

I will be aiming for the ROG-STRIX-RTX2080TI-O11G-GAMING, due to its quality PCB, VRM and cooling.

I am optimistic about the technology Nvidia injected into Turing, and more and more games will utilize RT over time. I will not wait for 7nm, since I like to enjoy the top specs in my system.

Waiting for:

Apples-to-apples performance in games (old and new):

Without DLSS
Without RT
With DLSS
With RT
With DLSS & RT

Promising tech and a good leap: things that used to require expensive enterprise-level computers are now brought to the end gamer. Replacing the previous gen Ti with the next gen has never failed me.

1080ti Strix OC'd to a 2025MHz core clock and 11.6GHz (effective) memory, with an i7 8700k OC'd to 5GHz on all 6 cores (negative AVX offset set to 0) and 3200MHz RAM: the 1080ti definitely struggles at 2160p to deliver a locked 60fps experience at very high settings.

Far Cry 5
Monster Hunter: World
Ghost Recon Wildlands
Mass Effect: Andromeda
Watch Dogs 2
AC Origins
Kingdom Come: Deliverance
Middle-earth: Shadow of War

& other titles I tested

To me, a 3440×1440 gamer @ 100Hz who also plays native 2160p even at 60Hz, the 1080ti is obsolete for core rasterization, leaving RT and DLSS aside. And yes, I need AA (not FXAA, which is the worst), since I play 2160p HDR @ 60fps on a 55" Sony X930E TV where the PPI is ~80, compared to ~163 on a 27" 2160p gaming monitor, where AA is mostly not needed because the small screen gives a high pixel density.
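Those PPI numbers check out, by the way: PPI is just the diagonal resolution in pixels divided by the diagonal size in inches. A quick sketch:

```cuda
// PPI = sqrt(width^2 + height^2) / diagonal-in-inches.
#include <cmath>
#include <cstdio>

double ppi(int w, int h, double diag_in) {
    return std::sqrt(double(w) * w + double(h) * h) / diag_in;
}

int main() {
    printf("55\" 2160p TV     : %.0f PPI\n", ppi(3840, 2160, 55.0)); // ~80
    printf("27\" 2160p monitor: %.0f PPI\n", ppi(3840, 2160, 27.0)); // ~163
    return 0;
}
```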
I'm a full HD 60 fps strategy gamer (ahem, Factorio); integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to 100% utilization with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada :D
user wrote: I'm a full HD 60 fps strategy gamer (ahem, Factorio); integrated graphics would have been enough for me, but I have a GTX 1070. I have not come close to 100% utilization with any game (not even The Witcher 3).

Now if I can only bring it with me to Canada
OK; anything not related to Turing is off-topic here.
Tech Guru wrote: OK; anything not related to Turing is off-topic here.
I am merely questioning why anyone would upgrade, outside of the 4K 144Hz people I mean. It does seem like the 10 series already dominates all things full HD and even 1440p; there is no further point.
The thing with early adoption is the risk. The first batch of announced games that support RTX is great: AAA titles that will support RTX. But most of these games are in late development stages, so RTX might be an extra option patched in later, not built in from the ground up.

As with any tech (DX10, DX11, DX11.1, and DX12, which brought no performance increase or graphical marvel), it all comes down to developer support.

So regarding the performance metrics of the upcoming cards, I am not hyped about RTX, especially given that consoles are what truly drives innovation by developers, and they are AMD-based and limited (fast but limited).
Let's wait for benchmarks on all the games that are already released.

I am also disappointed that Nvidia's Founders Edition is no longer blower-style, and by the power consumption. A lot of the time the circuit breaker in my house trips due to the load from my 980 ti, plus I have a mini-ITX case...
You gotta justify your purchase somehow, amirite?
Tech Guru wrote: OK; anything not related to Turing is off-topic here.
It IS on topic!

He means that he doesn't need Turing, RTX or whatever to play his favorite games that barely push his 1070. You're just so focused on justifying your upgrade, and the WOW effect "Hey look, RTX, DLSSRDLL* ,LDSSR*, <input some random acronym here that looks cool> ... " that you forget to actually enjoy this hardware.

We have been there a 1000000 times already. NVidia releases some "new technology", we expect it to be a huge game changer. You get hyped and post this thread where you're selling your old card, copying articles from here and there**, trying to justify the new purchase, bashing anyone who tries to tell you that you should wait. And in the end it just ends up being, like every release, some 25-30% generation performance increase.
anayman_k7 wrote: You are just like an Apple fan, they saw the iPhone X as a gift from god, same for you, not even a single benchmark out yet you sold your 1080Ti and you rode the first class of the hype train, you might find yourself at the end rebuying the 1080Ti you sold for more :)

If you like having the latest hardware (I do too, I own a water-cooled 1080Ti after all...), then that's fine. But if you need to shove it down everyone's throat, and not accept any arguments about it, then you're just an attention seeking idiot.

* I made these up.
**We know how to browse the internet ourselves!
vlatkozelka wrote: It IS on topic! [...]
I do not know why you are triggered :). I have been on this forum for more than 7 consecutive years, and all my posts are technically oriented and straight.

A hyped person is a noob who posts things and does not know what they mean. I am clarifying all the points in a simple technical way, trying to summarize what Turing offers. Besides, I am not justifying anything; the technology itself justifies what it offers. Hate it or not, technology in the PC world keeps evolving, and a new gen, whatever it offers, is better than the old gen. It is better to justify a purchase of a new gen than to try to "self convince" yourself: hey, I have the previous gen, and all that the new gen offers seems like a gimmick and a marginal boost.

RT is a gimmick
DLSS is fake
I will wait for 7nm and skip this one
Only a 30% performance boost
Oh, Tomb Raider doesn't even hold a locked 60fps at 1080p on the RTX!


These are the perceptions of either technology noobs or people who do not want to upgrade (since they do not want to lose money selling their old gen, or do not have the money to drop on the newest tech), "self convincing" themselves that this is all hype.

If you are triggered by my posts, simply do not write offensive feedback or take things to a personal level, which I do not like to do; I will not call you an "idiot" the way you indirectly called me one through the structure of your response.

Personal matters are not settled here; if you have anything against me, take it elsewhere.
Tech Guru wrote: I do not know why you are triggered :) [...]
Tech Guru wrote: I do not know why you are triggered :)
Stupid content triggers me, I can't help it. You've been posting irrelevant crap for 6 days, and everyone's trying to tell you to wait for actual benchmarks, but you're too stupid to listen.
Tech Guru wrote: I have been on this forum for more than 7 consecutive years
7 years and you still learned nothing :/
Tech Guru wrote: all my posts are technically oriented and straight.
You just COPY articles from here and there, and rewrite them in bad English, with bad grammar and random capitalization of letters. It is cancer to read your posts!
Tech Guru wrote: A hyped person is a noob who posts things and does not know what they mean
Which is exactly what you are. Do you even know what ray tracing is? Do you even know how huge the computation is to achieve rendering fidelity that is anywhere near the rendering done outside of games? You keep talking about tech you have no idea about. Do you even know why they keep making transistors smaller and smaller? And what the complications of that are?
Tech Guru wrote: It is better to justify a purchase of a new gen than to try to "self convince" yourself: hey, I have the previous gen, and all that the new gen offers seems like a gimmick and a marginal boost.
NO! Everyone else in this thread is telling you the same thing. Our "old" hardware is doing its job perfectly; we are happy with what we have. You are not :)
Tech Guru wrote: RT is a gimmick / DLSS is fake / I will wait for 7nm and skip this one / Only a 30% performance boost [...] "self convincing" themselves that this is all hype.
It's all true: marketing gimmicks exist, and tech is often sold at much higher prices than it should be. The problem is that prices get inflated because fanbois like you will always pre-order.
Also don't go around calling people poor. Being smart with your money doesn't mean you're poor. We aren't all government thieves like yourself!

Tech Guru wrote: If you are triggered by my posts, simply do not write offensive feedback or take things to a personal level.
I am a software engineer, and so I take this stuff seriously. I understand what I read, and ask around when I don't, unlike you who bashes any post that doesn't conform with his opinions.
So when I see stupid crap like yours posted, I can't help it, like I already said in the beginning.

Conclusion:

Your threads are attention seeking posts disguised as tech talks. Every single thread of yours is about some tech that you bought, then some copying of articles and benchmarks trying to justify why you bought it. Do not try to convince me, or anyone, that you actually are a tech person, you just get high on buying expensive tech, and then bragging about it on the internet.
It is just sad to see LebGeeks become a buy and sell website like OLX + random Tech Guru crap. There was much more potential in this place :(
Wait for benchmarks; I'm not that hyped about it. Shadow of the Tomb Raider wasn't running that smoothly on the new tech with ray tracing on.
DLSS is good though.
Still, I would upgrade, since I like upgrading to new tech and I can afford to. But this release isn't looking good, and the Nvidia CEO only talked about how much faster than Pascal it is with ray tracing on. I don't really care about ray tracing on or off; I want to see benchmarks and numbers.
Imagine racing games looking like this:
https://youtu.be/EAYfJckSEN0
Ray tracing is definitely not a gimmick, although we will have to wait for the next generation of consoles before PC gets a handful of eye-candy games, I suppose.
vlatkozelka wrote: Stupid content triggers me, I can't help it. [...]
Again, I will not slip to a personal level the way you have. This is my opinion, and in tech nothing is purely right or wrong, especially on tech forums, which are open to discussion and sometimes get "heated", since everyone has a different level of interest in any incoming technology, and some have none at all. If you do not have such an interest, or you think it is all marketing hype, you can simply articulate your responses in a more respectful way; even if you have an "aggressive nature", you can articulate them in an aggressive but tech-oriented way without slipping into personal issues with me. I respect your opinion, the technical part at least, but if you have been watching all my posts and accumulating hate towards me, resolving that outside this forum would be better.
Their pricing for the 2080ti is insane; I bought my 980ti for $700 on release. But I guess it's worth it if the card outperforms the 1080ti by almost twice the power. Glad Nvidia is spending so much money on R&D, because it's the only company that makes quality GPUs.
nas93 wrote: Their pricing for the 2080ti is insane; I bought my 980ti for $700 on release. But I guess it's worth it if the card outperforms the 1080ti by almost twice the power. Glad Nvidia is spending so much money on R&D, because it's the only company that makes quality GPUs.
Indeed, they added around ~$300 to the launch price of the non-OC Ti variant, 1080ti vs 2080ti. There is no competition in performance or innovation, especially from AMD's graphics card line.

Here is an interesting article that popped up recently:


NVIDIA GeForce RTX 2080 3DMark TimeSpy Score Leaked – Clocked At 2GHz And Beats A GTX 1080 Ti Without AI Cores

https://wccftech.com/nvidia-geforce-rtx-2080-3dmark-timespy-score-leaked-clocked-at-2ghz-and-beats-a-gtx-1080-ti-without-ai-cores/
If I had the budget I would have bought this 2080ti, but I have other things to pay for.
I understand Tech guru.
Life is short, live your passions when you can
Tech Guru wrote: Indeed, they added around ~$300 to the launch price of the non-OC Ti variant, 1080ti vs 2080ti. [...]
It will be interesting to see how this RTX plays out; it has potential for sure.

I should report you for getting me hyped up and ready to spend my hard earned money, every time new tech comes along!
vlatkozelka, too much hate in the air. First, excuse me for being off-topic: you need to cool it. Tech Guru's choices are his, and it's your choice whether to take them into consideration or not. I'm an AMD fan, unlike Tech G, but that doesn't put me in your shoes. Your own last posts are nothing but OLX-style listings or requests for help and advice. My advice to you: keep your negativity to yourself. Tech G is sharing his opinion and his readings, and I find it very helpful, even if I don't totally agree with him.
So behave.
"The RTX 2080 Ti Founder’s Edition will sell for $1,200 but will drop down to $999 after the initial launch. The RTX 2080 will sell initially sell for $799, and then drop down to $699. Both these cards will be available September 20th. The RTX 2070 is expected a little later in October and will carry a Founder’s Edition price of $599 that will eventually settle to $499 with third-party vendors’ models when those become available. NVIDIA’s prices typically start off high but come down closer to the originally stated MSRP in time. There are always factors that could prevent price normalization—for example, if another crypto boom happens".

Source: Forbes

So it seems prices will settle down after the initial launch. Good news indeed.

I will not preorder now; real reviews and benchmarks will start rolling out early-to-mid September from the tech community. Some tech reviewers already have samples, but Nvidia has not yet released the drivers to them. Very soon they will.
5 days later
Day after day the leaks are appearing, and for now they all point to around a 30% average performance difference between the 2080Ti and the 1080Ti. The 1080Ti today can cost you around $650 while the 2080Ti is at $1200; no wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K-HDR numbers for a few games, to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back half of the lost FPS.
anayman_k7 wrote: Day after day the leaks are appearing, and for now they all point to around a 30% average performance difference between the 2080Ti and the 1080Ti. The 1080Ti today can cost you around $650 while the 2080Ti is at $1200; no wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K-HDR numbers for a few games, to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back half of the lost FPS.
I agree. Even HairWorks doesn't run very well to this day.

Personally, I'm going to upgrade to a used 1080ti. A used card can go for as low as ~$450-$500, which gives me incredible bang for my buck. The more people upgrade, the higher the supply of used 1080tis.

Also, no one mentioned SLI. A pair of 1080tis would cost less than a 2080ti. SLI scaling isn't the best, but I'm confident that if you get at least 40-50% of the performance from the second card, you're still performing better than a 2080ti at a lower cost: 100% + 45% ≈ 145% of a single 1080ti, versus the ~130% the leaks suggest for the 2080ti (assuming you already have a setup that supports an SLI configuration).
anayman_k7 wrote: Day after day the leaks are appearing, and for now they all point to around a 30% average performance difference between the 2080Ti and the 1080Ti [...]

I will still wait for the official final benchmarks using final testing drivers. A Turkish guy released a benchmark of the 1080ti FE vs the 2080ti FE across 10 modern games, but the video was pulled from YouTube because of the confidentiality agreement with Nvidia not to disclose results yet. On the 12th/13th of September real results will start rolling in, away from speculation and leaks.

This Turkish guy has a very genuine channel for testing tech, and some people took screenshots of the graphs before the video was removed. Joker has an in-depth analysis of it:

https://youtu.be/dfGJpVEzUxo





Upgrading to an RTX 2080ti depends on your gaming needs.

If the 1080ti meets them = satisfaction, no need to upgrade.

For me, the 1080ti does not meet my gaming needs anymore = dissatisfaction.

(My 1080ti Strix + 8700k setup, and the titles where it struggles to hold a locked 60fps at 2160p, are listed in my earlier post above.)
mmk92 wrote: Also, no one mentioned SLI. A pair of 1080tis would cost less than a 2080ti. [...]

It will be interesting to see how a single 2080ti performs against two 1080tis in SLI in raw rasterization, no DLSS, no RT, when it is actually released. Still, you cannot take the leaked ~30% increase over a single 1080ti as the baseline for your decision:

First, there are no real benchmarks yet, only leaks and speculation. Usually, once third-party analyses and benchmarks against the old gen start publishing, cross-reading different tech websites and YouTube channels to minimize subjectivity gives a good baseline for a decision.

Second, the drivers are still immature.

And my personal opinion of SLI is plummeting slowly:

-Not a uniform experience; some games do not scale at all, especially DirectX 12 ones.
-Scaling is not uniform.
-Power draw.
-Selling one card after skipping a generation is hard, so selling two 1080tis once the gen after Turing arrives is harder.
-Memory bandwidth does not stack up (each card keeps its own copy of the frame data).
Personally I don't think the RTX justifies that price tag ($1200). I saw Jackfrags' Battlefield 5 gameplay with RTX, and the colors were too washed out; I think it looks better without RTX. Yes, the implementation of RTX is still early, with developers learning how to use it properly, which is the main point: why would a person pay $1200 when they will most likely not experience good RTX performance anytime soon? Waiting for the next series from Nvidia is wiser here; we will have more RTX games by then (if RTX really gets adopted, matures, and succeeds). And looking at games without RTX: with only one or two models of 4K 120Hz(144Hz) monitors, costing thousands of dollars and not yet on the market, and with the 1080Ti ($650) capable of 4K 60Hz, the 20xx series seems like the worst video card to invest in.
anayman_k7 wrote: Personally I don't think the RTX justifies that price tag ($1200) [...]
1- You cannot yet generalize that "the 20xx series seems like the worst video card to invest in" before the actual launch and apples-to-apples benchmarks of the new gen against the last gen.

2- The 1080ti is capable of 2160p 60Hz with compromises on image settings; as more demanding titles arrive, it will be more and more susceptible to a sub-60fps experience at 2160p.

3- RT is a true generational leap for the gaming industry (AMD will surely follow with their upcoming Navi cards). Some think it is mere reflections here and there adding to the graphical fidelity of the image. It is an enormous amount of rendering computation, the kind Hollywood movies usually throw render farms at. Now it is available to the end gamer, which is good.

Say you have a scene in your game with 1000 objects in it; no problem at all for a modern GPU. It takes each object, one after the other, and first finds out what area of the screen the object is visible in. For each of those pixels it executes a shader, which calculates the color of that pixel based on the object's material, textures and the lights in the scene. Relatively speaking, that is a very small amount of data needed for the calculation.

GPUs do this super quickly because the color of the pixel doesn't depend on anything except that one object's data and the lights in the scene, so you can calculate thousands of pixels at the same time using thousands of tiny processors. Modern engines then do all kinds of post-processing steps, where they take the finished image and combine it with other data for lots of neat effects like SSAO or bloom.

Ray tracing works completely differently. In ray tracing you shoot a ray into a scene with no idea beforehand what it will hit. So every ray needs access to all the objects in the scene, their materials and so on, at the same time. Even if you knew which object a ray would hit, what happens when that object has reflective properties? Or say you put a bright red object next to a white one: some of the red color will reflect onto the white one. So each time you hit something you need to shoot even more rays from there, hitting other objects, and then you need to combine the results of all of those to get the color of the pixel.

GPUs simply weren't designed to do this, and CPUs are too slow to do it in real time in most cases. What was needed was a new piece of hardware specially designed for ray tracing, but that is a huge investment with a lot of engineering challenges to work out. That used to be the situation for game developers; now it is in the hands of the end user/gamer/developer through a mainstream graphics card, a true leap. It is not hype manufactured by Nvidia like the GameWorks features; the technology already exists in movies, architecture, graphic design, etc., and now it reaches gaming thanks to a leap in GPU architecture and engineering.
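To make the contrast concrete, here is a toy, brute-force sketch of what was just described (hypothetical code: no BVH acceleration structure and no RT cores, which is exactly the work RT cores exist to remove). Every ray tests every object, and a reflective hit spawns a secondary ray that repeats the whole search:

```cuda
#include <cstdio>

struct Sphere { float3 c; float r; float3 color; float reflect; };

__host__ __device__ float3 f3(float x, float y, float z) { return make_float3(x, y, z); }
__device__ float3 add(float3 a, float3 b) { return f3(a.x+b.x, a.y+b.y, a.z+b.z); }
__device__ float3 sub(float3 a, float3 b) { return f3(a.x-b.x, a.y-b.y, a.z-b.z); }
__device__ float3 mul(float3 a, float s)  { return f3(a.x*s, a.y*s, a.z*s); }
__device__ float  dot3(float3 a, float3 b){ return a.x*b.x + a.y*b.y + a.z*b.z; }

// Ray/sphere intersection; dir must be unit length. Returns hit distance or -1.
__device__ float hit(const Sphere& s, float3 org, float3 dir) {
    float3 oc = sub(org, s.c);
    float b = dot3(oc, dir);
    float disc = b*b - (dot3(oc, oc) - s.r*s.r);
    if (disc < 0.f) return -1.f;
    float t = -b - sqrtf(disc);
    return t > 1e-3f ? t : -1.f;
}

__global__ void trace(const Sphere* scene, int n, float3* img, int w, int h) {
    int px = blockIdx.x*blockDim.x + threadIdx.x;
    int py = blockIdx.y*blockDim.y + threadIdx.y;
    if (px >= w || py >= h) return;

    float3 org = f3(0, 0, -3);                            // pinhole camera
    float3 dir = f3((px - w*0.5f)/h, (py - h*0.5f)/h, 1.f);
    dir = mul(dir, rsqrtf(dot3(dir, dir)));

    float3 color = f3(0, 0, 0);
    float weight = 1.f;
    for (int bounce = 0; bounce < 3; ++bounce) {          // primary ray + reflections
        int best = -1; float bestT = 1e30f;
        for (int i = 0; i < n; ++i) {                     // every ray scans the WHOLE scene
            float t = hit(scene[i], org, dir);
            if (t > 0.f && t < bestT) { bestT = t; best = i; }
        }
        if (best < 0) break;                              // ray escaped: leave as "sky"
        const Sphere& s = scene[best];
        color = add(color, mul(s.color, weight * (1.f - s.reflect)));
        if (s.reflect <= 0.f) break;
        org = add(org, mul(dir, bestT));                  // move to the hit point...
        float3 nrm = sub(org, s.c);
        nrm = mul(nrm, rsqrtf(dot3(nrm, nrm)));
        dir = sub(dir, mul(nrm, 2.f*dot3(dir, nrm)));     // ...and bounce off the normal
        weight *= s.reflect;
    }
    img[py*w + px] = color;
}

int main() {
    const int w = 640, h = 360;
    Sphere* scene; float3* img;
    cudaMallocManaged(&scene, 2 * sizeof(Sphere));
    cudaMallocManaged(&img, w * h * sizeof(float3));
    scene[0] = { f3(-0.6f, 0, 1), 1.f, f3(1, 0, 0), 0.5f };  // reflective red sphere
    scene[1] = { f3( 1.2f, 0, 2), 1.f, f3(1, 1, 1), 0.0f };  // diffuse white sphere

    dim3 block(16, 16), grid((w+15)/16, (h+15)/16);
    trace<<<grid, block>>>(scene, 2, img, w, h);
    cudaDeviceSynchronize();
    printf("center pixel r=%.2f\n", img[(h/2)*w + w/2].x);
    cudaFree(scene); cudaFree(img);
    return 0;
}
```

Even this toy version is O(rays × objects) per bounce; real scenes have millions of triangles, which is why dedicated traversal hardware matters.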

Read this article, published in 2009: "When Will Ray Tracing Replace Rasterization?" And now ray tracing has become a reality.

https://www.tomshardware.com/reviews/ray-tracing-rasterization,2351.html

Ray tracing explained: The future of hyper-realistic graphics
https://www.engadget.com/2018/04/16/the-future-of-ray-tracing-explainer/


There are no real benchmarks on a finished, optimized game with mature drivers (neither BF5 nor Shadow of the Tomb Raider has been released in its final state, nor has the RTX been fully reviewed with mature drivers across the different games that will support it).


4- Prices will normalize after launch.

5- The PG27UQ and Predator X27 still run on DP 1.4, whose ~32.4Gbps raw bandwidth (~25.9Gbps usable) limits them to 2160p 10-bit HDR at 120Hz/144Hz only with 4:2:2 chroma subsampling, not 4:4:4. Next year HDMI 2.1 TVs will launch, which makes DP 1.4 obsolete. Those monitors in particular are full of pitfalls: along with the limited DP 1.4 bandwidth, they are 8-bit + FRC panels, not native 10-bit. I will drop in the PG35VQ alongside the RTX 2080ti.
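A back-of-the-envelope check on point 5 (payload only, ignoring blanking overhead, which makes things slightly worse; DP 1.4 HBR3 carries 32.4Gbps raw, ~25.9Gbps after 8b/10b encoding):

```cuda
// payload Gbps = width x height x refresh x bits-per-pixel.
#include <cstdio>

double gbps(int w, int h, int hz, int bits_per_channel, double chroma) {
    // chroma: 1.0 for 4:4:4, 2.0/3.0 for 4:2:2 (20 vs 30 bits/pixel at 10-bit)
    return w * (double)h * hz * 3 * bits_per_channel * chroma / 1e9;
}

int main() {
    printf("2160p 120Hz 10-bit 4:4:4: %.1f Gbps\n", gbps(3840, 2160, 120, 10, 1.0));   // ~29.9: doesn't fit in 25.9
    printf("2160p 120Hz 10-bit 4:2:2: %.1f Gbps\n", gbps(3840, 2160, 120, 10, 2.0/3)); // ~19.9: fits
    printf("2160p  98Hz 10-bit 4:4:4: %.1f Gbps\n", gbps(3840, 2160, 98, 10, 1.0));    // ~24.4: roughly the 4:4:4 ceiling
    return 0;
}
```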
Tech Guru wrote: (Negative AVX offset set to 0)
What does AVX offset have to do with this thread? I thought we were talking about RTX cards...
AVX instructions are mostly used by the operating system, and are usually heavier than the more common SSE instructions. They are often left out of overclocking, by adding a negative offset to them, so that a CPU overclock can be stable.

What are you trying to tell us here?
vlatkozelka wrote: What does AVX offset have to do with this thread? [...] What are you trying to tell us here?
AVX negative offset set to zero means a locked 5GHz OC on AVX instructions too. With a non-zero offset, AVX code downclocks the processor to help keep core temperatures below the throttling point, so the CPU is not truly locked at 5GHz.

An AVX offset of 2 means 4.8GHz, down from 5GHz, on AVX code. The problem is that the AVX offset can kick in even during non-AVX workloads like gaming.

To ensure a proper 5GHz lock, I set the offset to 0 to eliminate any chance of a frequency drop, and to max out the 1080ti's fps and see the card's potential without a CPU bottleneck, especially at 100Hz ultrawide and for min-fps at 2160p.

The point is to prevent any bottlenecking from the CPU side when I test gaming on the 1080ti; BF1, for example, uses AVX instructions. These tests showed me that the 1080ti no longer meets my gaming needs; it has been struggling in a lot of AAA games at high to very high settings, so it is time to move to the next gen.
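For anyone wondering what an "AVX workload" looks like in practice, here is a hypothetical host-side stress loop (plain C++ with AVX2/FMA intrinsics; compile with -mavx2 -mfma). Running something like this on each core while watching clocks shows whether a 5GHz overclock actually holds once an AVX offset would normally kick in:

```cuda
#include <immintrin.h>
#include <cstdio>

int main() {
    __m256 a = _mm256_set1_ps(1.0001f);
    __m256 b = _mm256_set1_ps(0.9999f);
    __m256 c = _mm256_set1_ps(0.0f);
    // Sustained 256-bit fused multiply-adds: the heavy AVX work that
    // triggers the offset downclock on Intel CPUs.
    for (long i = 0; i < 2000000000L; ++i) {
        c = _mm256_fmadd_ps(a, b, c);
    }
    float out[8];
    _mm256_storeu_ps(out, c);
    printf("%f\n", out[0]); // keep the loop from being optimized away
    return 0;
}
```

Note this pins only one core; dedicated stress tools (Prime95 etc.) hammer all cores at once, which is the worst case for an overclock.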
3DCenter.org has compiled a list of the relative performance of the GeForce RTX 2080 Ti against the GeForce GTX 1080 Ti. The list matches the games and numbers from the Turkish tech channel on YouTube (video since taken down) that Joker and The Good Old Gamer covered.



An average increase of 37.5% looks good for rasterization rendering (no DLSS or RT), with some games tipping past a 50% increase. Sure, these are still leaks, and they were done on immature drivers, but overall Turing looks promising.


Read more: https://www.tweaktown.com/news/63041/geforce-rtx-2080-ti-37-5-faster-overall-gtx-1080/index.html
How did they test Battlefield 5? Even the beta isn't available yet...