• Hardware
  • RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote: Day after day the leaks keep appearing, and for now all of them confirm around a 30% average performance difference between the 2080Ti and the 1080Ti, while the 1080Ti today can be had for around $650 and the 2080Ti sits at $1200. No wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K HDR numbers for a few games to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back half of the lost FPS.
I agree. Even HairWorks doesn't run very well to this day.

Personally, I'm going to upgrade to a used 1080ti. A used card can go for as low as ~$450-$500, which gives me incredible bang for my buck. The more people upgrade, the more used 1080tis flood the market.

Also, no one mentioned SLI. A pair of 1080tis would cost less than a 2080ti. SLI scaling isn't the best, but I'm confident that if you get at least 40%-50% more performance from the second card, you're still performing better than a 2080ti at a lower cost (assuming you already have a setup that can support the SLI configuration).
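To put that back-of-the-envelope reasoning in numbers, here is a tiny sketch (plain Python) using only the figures quoted in this thread, i.e. the ~$450-$500 used price, the ~40-50% SLI scaling guess and the leaked ~30% uplift; none of these are measured benchmarks.

```python
# Rough cost/performance comparison using the numbers quoted above.
# A single GTX 1080 Ti is the 1.0 performance baseline; nothing here is measured.
used_1080ti_price = 475           # midpoint of the ~$450-$500 used estimate
rtx_2080ti_price = 1200           # launch price quoted in this thread

sli_perf = 1.0 + 0.45             # second card assumed to add ~40-50%
leaked_2080ti_perf = 1.30         # ~30% over a single 1080 Ti per the leaks

print(f"2x 1080 Ti SLI : ~{sli_perf:.2f}x for ${2 * used_1080ti_price}")
print(f"RTX 2080 Ti    : ~{leaked_2080ti_perf:.2f}x for ${rtx_2080ti_price}")
```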
anayman_k7 wrote: Day after day the leaks keep appearing, and for now all of them confirm around a 30% average performance difference between the 2080Ti and the 1080Ti, while the 1080Ti today can be had for around $650 and the 2080Ti sits at $1200. No wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K HDR numbers for a few games to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back half of the lost FPS.

I will still wait for the official final benchmarks using the final testing drivers. A Turkish guy released a benchmark of the 1080ti FE vs the 2080ti FE tested on 10 modern games, but the video was pulled from YouTube due to a confidentiality agreement with Nvidia not to disclose results yet. On September 12/13th the real results will start rolling in instead of speculation and leaks.

This Turkish guy has a very genuine channel for testing tech, and some people took screenshots of the graphs before the video was removed. Joker has an in-depth analysis on that.

https://youtu.be/dfGJpVEzUxo





Upgrading to an RTX 2080ti depends on your gaming needs.


If the 1080ti meets them = satisfaction, no need to upgrade.

Accordingly, the 1080ti does not meet my gaming needs anymore = dissatisfaction.

With a 1080ti Strix OC'd to a 2025 MHz core clock and 11.6 Gbps memory, an i7 8700k OC'd to 5 GHz on all 6 cores (negative AVX offset set to 0), and 3200 MHz RAM, the 1080ti definitely struggles at 2160p to deliver a locked 60 fps experience at high to very high settings:

Far Cry 5
Monster Hunter World
Ghost Recon Wildlands
Mass Effect Andromeda
Watch Dogs 2
AC Origins
Kingdom Come: Deliverance
Middle-earth: Shadow of War

& other titles I tested

To me, as a 3440×1440 gamer @ 100 Hz and a native 2160p gamer even at 60 Hz, the 1080ti is obsolete on pure rasterization rendering, leaving RT and DLSS aside. And yes, I need AA (not FXAA, since it's the worst), because I play 2160p HDR @ 60 fps on a 55" Sony X930E TV where the PPI is ~80, compared to ~163 on a 27" 2160p gaming monitor where AA is mostly not needed thanks to the small screen giving a high pixel density.
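For reference, the ~80 vs ~163 PPI figures above follow directly from the panel geometry; here is a quick sketch of the calculation (plain Python, a standard 3840×2160 panel assumed for both screens):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 55)))   # ~80 PPI on the 55" TV
print(round(ppi(3840, 2160, 27)))   # ~163 PPI on a 27" monitor
```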
mmk92 wrote:
anayman_k7 wrote: Day after day the leaks keep appearing, and for now all of them confirm around a 30% average performance difference between the 2080Ti and the 1080Ti, while the 1080Ti today can be had for around $650 and the 2080Ti sits at $1200. No wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K HDR numbers for a few games to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back half of the lost FPS.
I agree. Even HairWorks doesn't run very well to this day.

Personally, I'm going to upgrade to a used 1080ti. A used card can go for as low as ~$450-$500, which gives me incredible bang for my buck. The more people upgrade, the more used 1080tis flood the market.

Also, no one mentioned SLI. A pair of 1080tis would cost less than a 2080ti. SLI scaling isn't the best, but I'm confident that if you get at least 40%-50% more performance from the second card, you're still performing better than a 2080ti at a lower cost (assuming you already have a setup that can support the SLI configuration).

It will be interesting to see how a single 2080ti performs against two 1080tis in SLI in raw rasterization rendering, no DLSS, no RT, when it is actually released. Still, you cannot take the leaked 30% increase in performance over a single 1080ti as the baseline for your decision:

First, there are no real benchmarks yet, only leaks and speculation. Usually, once third-party analyses and performance benchmarks against the old gen start being published, cross-checking different tech websites and tech YouTube channels to minimize subjective bias makes a good baseline for a decision.

Second, drivers are still immature.

My personal opinion of SLI keeps sinking:

- Not a uniform experience; some games do not scale at all, especially DirectX 12 ones.
- Scaling is not uniform.
- Power draw.
- Selling one card after skipping a generation is hard, so selling two 1080tis once the generation after Turing arrives will be harder.
- Memory bandwidth does not stack up.
Personally I don't think RTX will justify that price tag ($1200). I saw Jackfrags' Battlefield 5 gameplay with RTX and the game's colors were too washed out; I think it looks much better without RTX. Yes, the implementation of RTX is still early, with developers learning how to use it properly, and that's the main point: why would a person pay $1200 when they most likely won't experience good RTX performance any time soon? Waiting for the next series from Nvidia is wiser here; by then we will have more RTX games (if RTX is really adopted, matures, and succeeds). And if we look at games without RTX: there are currently only one or two models of 4K 120Hz (144Hz) monitors, they will cost thousands of dollars and are not yet on the market, while the 1080Ti ($650) is already capable of 4K 60Hz, so the 20xx series seems to be the worst video card to invest in.
anayman_k7 wrote: Personally I don't think RTX will justify that price tag ($1200). I saw Jackfrags' Battlefield 5 gameplay with RTX and the game's colors were too washed out; I think it looks much better without RTX. Yes, the implementation of RTX is still early, with developers learning how to use it properly, and that's the main point: why would a person pay $1200 when they most likely won't experience good RTX performance any time soon? Waiting for the next series from Nvidia is wiser here; by then we will have more RTX games (if RTX is really adopted, matures, and succeeds). And if we look at games without RTX: there are currently only one or two models of 4K 120Hz (144Hz) monitors, they will cost thousands of dollars and are not yet on the market, while the 1080Ti ($650) is already capable of 4K 60Hz, so the 20xx series seems to be the worst video card to invest in.
1- You cannot generalize yet by saying "the 20xx series seems to be the worst video card to invest in" before the real launch and benchmarks of the new gen, comparing it with the last gen apples to apples.

2- The 1080ti is capable at 2160p 60Hz with compromises on image settings; as more demanding titles arrive, it will become more and more susceptible to a sub-60fps experience at 2160p.

3- RT is a true generational leap in the gaming industry (AMD will surely follow with their upcoming Navi cards). Some think it is mere reflections here and there that add to the graphical fidelity of the image, but it takes a lot of rendering computational power that Hollywood movies usually use render farms for. Now it is available to the end gamer, which is good.

Say you have a scene in your game with 1000 objects in it; no problem at all for a modern GPU. It takes each object, one after the other, and first finds out what area of the screen the object is visible in. For each of those pixels it executes a shader, which calculates the color of that pixel based on the object's material, textures and the lights in the scene. Relatively speaking, that's a very small amount of data needed to do the calculations.

GPUs do this super quickly because the color of the pixel doesn't depend on anything except that one object's data and the lights in the scene, so you can calculate thousands of pixels at the same time using thousands of tiny processors. Modern engines then do all kinds of post processing steps, where they take the finished image and combine it with other data to do lots of neat effects like SSAO or bloom.

Ray tracing works completely differently. In ray tracing you're shooting a ray into a scene and you have no idea beforehand what it will hit. So every ray needs access to all the objects in the scene, their materials and so on, at the same time. And even if you knew which object a ray would hit, what happens when that object has reflective properties? Or say you put a bright red object next to a white one: some of the red color will reflect onto the white one. So each time you hit something you need to shoot even more rays from there, hitting other objects, and then you need to combine the results of all those to get the color of the pixel.

GPUs simply weren't designed to do this, and CPUs are too slow to do it in real time in most cases. What was needed was a new piece of hardware specially designed for ray tracing, but that's a huge investment and there are a lot of engineering challenges to work out. That used to be a problem only for game developers; now it is in the hands of the end user / gamer / developer through a mainstream graphics card, a true leap. It is not hype created by Nvidia like the GameWorks features; the technology already exists in movies, architecture, graphic design, etc., and now it is in gaming thanks to a leap in GPU architecture and engineering.
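To make the rasterization vs ray-tracing contrast above concrete, here is a minimal, self-contained sketch of the recursive loop being described, in plain Python with a made-up two-sphere scene. It only illustrates the idea (every ray tests every object, and reflective hits spawn more rays); it has nothing to do with Nvidia's actual RT core hardware or any real engine.

```python
import math

class Sphere:
    def __init__(self, center, radius, color, reflectivity=0.0):
        self.center, self.radius = center, radius
        self.color, self.reflectivity = color, reflectivity

    def intersect(self, origin, direction):
        """Distance along the (unit-length) ray to the nearest hit, or None."""
        oc = sub(origin, self.center)
        b = 2.0 * dot(oc, direction)
        c = dot(oc, oc) - self.radius ** 2
        disc = b * b - 4.0 * c            # a == 1 because direction is normalized
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 1e-4 else None

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def normalize(a): return scale(a, 1.0 / math.sqrt(dot(a, a)))

def trace(origin, direction, scene, light_dir, depth=0):
    """Shade one ray: unlike a rasterizer, it must consider *every* object,
    and a reflective hit recursively shoots another ray into the scene."""
    if depth > 3:                         # cap the recursion
        return (0.0, 0.0, 0.0)
    hit, hit_t = None, float("inf")
    for obj in scene:                     # every ray tests all objects
        t = obj.intersect(origin, direction)
        if t is not None and t < hit_t:
            hit, hit_t = obj, t
    if hit is None:
        return (0.2, 0.2, 0.3)            # background color
    point = add(origin, scale(direction, hit_t))
    normal = normalize(sub(point, hit.center))
    diffuse = max(dot(normal, light_dir), 0.0)
    color = scale(hit.color, 0.1 + 0.9 * diffuse)
    if hit.reflectivity > 0:              # e.g. a red sphere bleeding onto its neighbour
        refl_dir = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
        refl_color = trace(point, refl_dir, scene, light_dir, depth + 1)
        color = add(scale(color, 1.0 - hit.reflectivity),
                    scale(refl_color, hit.reflectivity))
    return color

scene = [Sphere((0.0, 0.0, -3.0), 1.0, (1.0, 0.2, 0.2), reflectivity=0.4),
         Sphere((1.5, 0.0, -4.0), 1.0, (0.9, 0.9, 0.9))]
print(trace((0.0, 0.0, 0.0), normalize((0.1, 0.0, -1.0)), scene, normalize((1.0, 1.0, 0.0))))
```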

Read this article published in 2009: When Will Ray Tracing Replace Rasterization?

And now ray tracing has become a reality:
https://www.tomshardware.com/reviews/ray-tracing-rasterization,2351.html

Ray tracing explained: The future of hyper-realistic graphics
https://www.engadget.com/2018/04/16/the-future-of-ray-tracing-explainer/


There are no real benchmarks on a finished, optimized game with mature drivers yet (neither BF5 nor Shadow of the Tomb Raider has been released in its final state, nor has RTX been fully reviewed with mature drivers across the different games that will support it).


4- Prices will normalize after launch.

5- The PG27UQ and Predator X27 still work on DP 1.4 with a bandwidth of 32.4 Gbps, limiting them to 2160p 10-bit HDR at 120Hz/144Hz with 4:2:2 chroma subsampling instead of 4:4:4; next year HDMI 2.1 TVs will launch, which makes DP 1.4 obsolete history. Those monitors in particular are full of pitfalls: along with the limited DP 1.4 bandwidth, they are 8-bit + FRC, not native 10-bit. I will go with the PG35VQ and the RTX 2080ti.
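A rough bandwidth check of that DP 1.4 limitation, as an illustration only; the numbers below ignore blanking intervals and DSC, so treat them as ballpark figures rather than an exact link budget.

```python
# Why 2160p 120 Hz 10-bit HDR needs chroma subsampling over DP 1.4.
DP14_EFFECTIVE_GBPS = 25.92      # 32.4 Gbps raw HBR3 minus 8b/10b encoding overhead

def pixel_rate_gbps(width, height, hz, bits_per_channel, samples_per_pixel):
    """Uncompressed video data rate in Gbit/s, ignoring blanking."""
    return width * height * hz * bits_per_channel * samples_per_pixel / 1e9

full_444 = pixel_rate_gbps(3840, 2160, 120, 10, 3)   # ~29.9 Gbps: does not fit
sub_422  = pixel_rate_gbps(3840, 2160, 120, 10, 2)   # 4:2:2 averages 2 samples/pixel: ~19.9 Gbps
print(full_444, sub_422, DP14_EFFECTIVE_GBPS)
```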
Tech Guru wrote (Negative AVX offset set to 0)
What does AVX offset have to do with this thread? I thought we were talking about RTX cards...
AVX instructions are mostly used by the operating system, and are usually heavier than the more common SSE instructions. They are often left out of overclocking, by adding a negative offset to them, so that a CPU overclock can be stable.

What are you trying to tell us here?
vlatkozelka wrote
Tech Guru wrote (Negative AVX offset set to 0)
What does AVX offset have to do with this thread? I thought we were talking about RTX cards...
AVX instructions are mostly used by the operating system, and are usually heavier than the more common SSE instructions. They are often left out of overclocking, by adding a negative offset to them, so that a CPU overclock can be stable.

What are you trying to tell us here?
A negative AVX offset set to zero means a locked 5 GHz OC on AVX instructions too. With an offset, AVX code will downclock the processor to help keep core temperatures below the throttling point, which is not a real 5 GHz CPU lock.

An AVX offset of 2 means 4.8 GHz, down from 5 GHz, on AVX instruction code. The problem is that the AVX offset can kick in even during non-AVX workloads like gaming.

To ensure a proper 5 GHz lock, I set the offset to 0 to eliminate any chance of a frequency drop and to max out the 1080ti's fps, so I can see the potential of the card without a CPU bottleneck, especially when testing at 100 Hz on the ultrawide and checking minimum fps at 2160p.

This prevents any bottlenecking from the CPU side when I test gaming on the 1080ti (BF1 uses AVX instructions, for example). From this testing it turned out that the 1080ti no longer meets my gaming needs: it has been struggling with a lot of AAA games at high to very high settings, so it is time to move to the next gen.
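For anyone following the AVX tangent, the offset arithmetic above is just multiplier math; a tiny sketch, assuming the usual 100 MHz base clock on the 8700K:

```python
BCLK_MHZ = 100                    # base clock assumed here

def effective_ghz(core_multiplier, avx_offset, running_avx_code):
    """Clock the CPU actually runs at once a negative AVX offset kicks in."""
    multiplier = core_multiplier - (avx_offset if running_avx_code else 0)
    return multiplier * BCLK_MHZ / 1000

print(effective_ghz(50, 2, running_avx_code=True))    # 4.8 GHz under AVX load
print(effective_ghz(50, 0, running_avx_code=True))    # 5.0 GHz with the offset at 0
```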
3DCenter.org has compiled a list of the relative performance of the GeForce RTX 2080 Ti against the GeForce GTX 1080 Ti. This list matches the games and numbers tested by the Turkish tech channel on YouTube (video taken down) that Joker and The Good Old Gamer talked about on YouTube.



An average increase of 37.5% looks good for rasterization rendering, with no DLSS and no RT, and with some games tipping past a 50% increase (a quick sketch of how such an average is computed is below). Sure, these are still leaks and were done on immature drivers, but overall Turing looks promising.


Read more: https://www.tweaktown.com/news/63041/geforce-rtx-2080-ti-37-5-faster-overall-gtx-1080/index.html
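As a side note on how a single "+37.5%" figure comes out of a list of per-game results, here is a small sketch; the per-game ratios below are made up for illustration, the real numbers are in the 3DCenter.org / TweakTown link above.

```python
from statistics import mean
from math import prod

# Hypothetical 2080 Ti vs 1080 Ti per-game ratios (1.30 = +30%); not real data.
uplifts = [1.28, 1.31, 1.37, 1.43, 1.52]

arithmetic_avg = mean(uplifts)                        # simple average of the ratios
geometric_avg = prod(uplifts) ** (1 / len(uplifts))   # less sensitive to outlier games
print(f"arithmetic: +{(arithmetic_avg - 1) * 100:.1f}%  geometric: +{(geometric_avg - 1) * 100:.1f}%")
```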
How did they test Battlefield 5? Even the beta isn't available yet...
PowerPC wrote: How did they test Battlefield 5? Even the beta isn't available yet...
2nd Closed Alpha
Tech Guru wrote: RT is a true generational leap in the gaming industry (AMD will surely follow with their upcoming Navi cards). Some think it is mere reflections here and there that add to the graphical fidelity of the image, but it takes a lot of rendering computational power that Hollywood movies usually use render farms for. Now it is available to the end gamer, which is good.

Say you have a scene in your game with 1000 objects in it; no problem at all for a modern GPU. It takes each object, one after the other, and first finds out what area of the screen the object is visible in. For each of those pixels it executes a shader, which calculates the color of that pixel based on the object's material, textures and the lights in the scene. Relatively speaking, that's a very small amount of data needed to do the calculations.

GPUs do this super quickly because the color of the pixel doesn't depend on anything except that one object's data and the lights in the scene, so you can calculate thousands of pixels at the same time using thousands of tiny processors. Modern engines then do all kinds of post processing steps, where they take the finished image and combine it with other data to do lots of neat effects like SSAO or bloom.

Ray tracing works completely differently. In ray tracing you're shooting a ray into a scene and you have no idea beforehand what it will hit. So every ray needs access to all the objects in the scene, their materials and so on, at the same time. And even if you knew which object a ray would hit, what happens when that object has reflective properties? Or say you put a bright red object next to a white one: some of the red color will reflect onto the white one. So each time you hit something you need to shoot even more rays from there, hitting other objects, and then you need to combine the results of all those to get the color of the pixel.
https://www.reddit.com/r/explainlikeimfive/comments/459nz3/eli5_why_isnt_realtime_ray_tracing_used_in_video/czw9686/
You can email PC and Parts now and ask for full pricing on the Zotac RTX 2080 and 2080ti OC editions. The ETA is around mid to end of this month.

PC and Parts' strong partnership with Zotac lets them provide a genuine 3-year warranty on the graphics cards with a solid RMA process (they pick up the damaged graphics card for you within the warranty period and handle the whole RMA process); if the RMA is valid they issue you a credit note for the original purchase price (which is solid).

Prices are competitive, within ~80 USD of the current official Nvidia FE listings for the 2080ti and 2080, and PC and Parts expect prices to normalize and drop after the official market launch.

~1000-1100 USD for the RTX 2080ti Strix is a good investment to me.
If you got a good price on the 1080 ti (and it seems you did), sure, go for it. Obviously you're an avid 4K gamer. To be honest, if I find myself buying a new VGA from scratch I'd probably get it (and use the opportunity to get 4K displays). If nothing else it should be future proof for me for the next 10 years lol
7 days later
5 days later
The benchmarks are out. Based on price per performance it seems that the RTX 2080 is a bad buy when it has only slightly better performance than the GTX 1080Ti, with less VRAM and for $100 to $150 more!
anayman_k7 wrote: The benchmarks are out. Based on price per performance it seems that the RTX 2080 is a bad buy when it has only slightly better performance than the GTX 1080Ti, with less VRAM and for $100 to $150 more!
Once the drivers are mature enough and DLSS rolls out, the 1080 will not stand a chance. The 1080ti needs to be compared with the 2080ti, and the 2080 with the 1080.

The 1080ti is screwed deep, but people who paid hefty amounts for their 1080tis are trying to convince themselves that upgrading to Turing is not worth it, while deep inside they know that gen-to-gen architectural changes, an increase in CUDA cores, new memory, new optimizations and new features are always a win. The truth about evolving tech is that they are talking themselves into sticking with the old gen.
After watching most of the reviews, the thing I was afraid of happened: RTX is still non-existent, and when it does exist it will be so limited that we still have to see actual performance numbers for it; same for DLSS. The RTX 2080 is not even worth buying, and an RTX 2080 Ti will let you play 4K 60fps Ultra or have a consistent 1440p 144Hz experience, but for a very hefty price tag.
Tech Guru wrote
anayman_k7 wrote: The benchmarks are out. Based on price per performance it seems that the RTX 2080 is a bad buy when it has only slightly better performance than the GTX 1080Ti, with less VRAM and for $100 to $150 more!
Once the drivers are mature enough and DLSS rolls out, the 1080 will not stand a chance. The 1080ti needs to be compared with the 2080ti, and the 2080 with the 1080.

The 1080ti is screwed deep, but people who paid hefty amounts for their 1080tis are trying to convince themselves that upgrading to Turing is not worth it, while deep inside they know that gen-to-gen architectural changes, an increase in CUDA cores, new memory, new optimizations and new features are always a win. The truth about evolving tech is that they are talking themselves into sticking with the old gen.
I disagree. You compare the RTX 2080 with the GTX 1080Ti for now, because later on we will compare the RTX 2070 with the GTX 1080Ti. You compare "what you pay / what you get", not which model number matches which (generation by generation), unless these cards are provided for free; a quick perf-per-dollar sketch follows below.

Edit: And I don't think a sane person would buy something on the promise that it might prove itself later on (if it ever does); go check Reddit, 99% of users canceled their 2080 orders for the 1080Ti.
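A quick sketch of the "what you pay / what you get" ratio being argued over here, using the rough prices and uplifts quoted in this thread; the 2080 price and its uplift over the 1080Ti are ballpark assumptions, not benchmark results.

```python
# Performance normalized to a GTX 1080 Ti = 1.00; prices in USD as quoted above.
cards = {
    "GTX 1080 Ti": {"price": 650,  "perf": 1.00},
    "RTX 2080":    {"price": 800,  "perf": 1.05},   # "slightly better", ~$100-150 more (assumed)
    "RTX 2080 Ti": {"price": 1200, "perf": 1.375},  # ~37.5% average uplift from the leaks
}
for name, c in cards.items():
    print(f"{name}: {100 * c['perf'] / c['price']:.3f} performance per $100")
```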