LebGeeks

A community for technology geeks in Lebanon.


#51 September 3 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote:

Day after day the leaks keep appearing, and so far they all point to an average performance difference of around 30% between the 2080 Ti and the 1080 Ti, while the 1080 Ti today costs around $650 and the 2080 Ti sits at $1,200. No wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K HDR numbers for a few games to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back the lost FPS.


I will still wait for the official benchmarks on final drivers. A Turkish reviewer released a benchmark of the 1080 Ti FE vs the 2080 Ti FE across 10 modern games, but the video was taken down from YouTube due to a confidentiality agreement with Nvidia not to disclose results yet. On the 12th/13th of September real results will start rolling in, away from speculation and leaks.

This Turkish channel is a genuine tech-testing channel, and some people took screenshots of the graphs before the video was removed. Joker has an in-depth analysis of that:

https://youtu.be/dfGJpVEzUxo





Upgrading to an RTX 2080 Ti depends on your gaming needs:


If the 1080 Ti meets them = satisfaction, no need to upgrade.

Accordingly, the 1080 Ti does not meet my gaming needs anymore = dissatisfaction.

With a 1080 Ti Strix OC'd to a 2025 MHz core clock and 11.6 Gbps memory, an i7-8700K OC'd to 5 GHz on all 6 cores (Negative AVX offset set to 0) and 3200 MHz RAM, the 1080 Ti definitely struggles to deliver a locked 60 fps experience at 2160p on high to very high settings in:

Far Cry 5
Monster Hunter: World
Ghost Recon Wildlands
Mass Effect: Andromeda
Watch Dogs 2
AC Origins
Kingdom Come: Deliverance
Middle-earth: Shadow of War

and other titles I tested.

To me, as someone gaming at 3440×1440 @ 100 Hz and at native 2160p even at 60 Hz, the 1080 Ti is obsolete on core rasterization rendering alone, leaving RT and DLSS aside. And yes, I need AA (not FXAA, since it is the worst): I play 2160p HDR @ 60 fps on a 55" Sony X930E TV, where the PPI is ~80 compared to ~163 on a 27" 2160p gaming monitor, where AA is mostly not needed because the small screen gives a high pixel density.
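
For reference, a quick pixels-per-inch check of those two screens (a minimal Python sketch using the usual diagonal-pixels over diagonal-inches formula; the sizes and resolutions are the ones mentioned above):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # PPI = diagonal resolution in pixels / diagonal size in inches
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(3840, 2160, 55)))  # ~80 PPI on a 55" 4K TV
    print(round(ppi(3840, 2160, 27)))  # ~163 PPI on a 27" 4K monitor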

Last edited by Tech Guru (September 3 2018)


#52 September 3 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

mmk92 wrote:
anayman_k7 wrote:

Day after day the leaks keep appearing, and so far they all point to an average performance difference of around 30% between the 2080 Ti and the 1080 Ti, while the 1080 Ti today costs around $650 and the 2080 Ti sits at $1,200. No wonder Nvidia didn't release performance graphs during the announcement and instead put out some shady DLSS/4K HDR numbers for a few games to fool users into thinking the new series has double the performance.
In addition, RTX will be the new HairWorks that everyone turns off to gain back the lost FPS.

I agree. Even HairWorks doesn't run very well to this day.

Personally, I'm going to upgrade to a used 1080 Ti. A used card can go for as low as ~$450-500, which gives me incredible bang for my buck. The more people upgrade, the more used 1080 Tis saturate the market.

Also, no one mentioned SLI. A pair of 1080 Tis would cost less than a 2080 Ti. SLI scaling isn't the best, but I'm confident that if you get at least 40-50% extra performance from the second card, you're still performing better than a 2080 Ti at a lower cost (assuming you already have a setup that can support the SLI configuration).


It will be interesting to see how a single 2080 Ti performs against two 1080 Tis in SLI in raw rasterization rendering, no DLSS and no RT, once it is actually released. Still, you cannot take the leaked ~30% increase over a single 1080 Ti as the baseline for your decision:

First, there are no real benchmarks yet, only leaks and speculation. Once third-party analyses and benchmarks against the old generation start getting published, cross-checking different tech websites and tech YouTube channels to minimize subjective bias gives a good baseline for a decision.

Second, drivers are still immature.

My personal opinion of SLI keeps sinking:

- Not a uniform experience; some games do not scale at all, especially DirectX 12 ones.
- Scaling is not uniform even in games that do support it.
- Power draw.
- Selling one card after skipping a generation is hard enough, so selling two 1080 Tis once the generation after Turing arrives will be harder.
- Memory does not stack up (each card mirrors the same data, so neither capacity nor bandwidth effectively doubles).
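
As a rough back-of-the-envelope check on the SLI math quoted above (a minimal sketch, not a benchmark: the ~$450-500 used 1080 Ti price, ~$1,200 2080 Ti price, 40-50% scaling from a second card and the leaked ~30-37% generational uplift are all numbers taken from this thread):

    # Relative performance, with a single 1080 Ti = 1.0
    used_1080ti_price = 475        # midpoint of the ~$450-500 quoted above
    rtx_2080ti_price = 1200

    sli_1080ti_perf = 1.0 + 0.45   # second card assumed to add ~40-50%
    rtx_2080ti_perf = 1.35         # leaked ~30-37% uplift over one 1080 Ti

    print(sli_1080ti_perf / (2 * used_1080ti_price))  # perf per dollar, two used 1080 Tis
    print(rtx_2080ti_perf / rtx_2080ti_price)         # perf per dollar, one 2080 Ti

Under those assumptions the used SLI pair wins on performance per dollar, which is exactly why the real question is how well (and whether) a given game actually scales.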

Last edited by Tech Guru (September 3 2018)


#53 September 3 2018

anayman_k7
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Personally I don't think RTX will justify that price tag ($1,200). I saw Jackfrags' Battlefield V gameplay with RTX and the game's colors were too washed out; I think it looks much better without RTX. Yes, the implementation of RTX is still early, with developers learning how to use it properly, and that is the main point: why would a person pay $1,200 when he is most likely not going to experience good RTX performance any time soon? Waiting for the next series from Nvidia is wiser here, since by then we will have more RTX games (if RTX is really adopted and becomes mature and successful). And if we look at games without RTX: there are currently only one or two models of 4K 120 Hz (144 Hz) monitors, they will cost thousands of dollars, and they are not yet on the market, while the 1080 Ti ($650) is already capable of 4K 60 Hz. The 20xx series seems to be the worst video card to invest in.


#54 September 3 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote:

Personally I don't think RTX will justify that price tag ($1,200). I saw Jackfrags' Battlefield V gameplay with RTX and the game's colors were too washed out; I think it looks much better without RTX. Yes, the implementation of RTX is still early, with developers learning how to use it properly, and that is the main point: why would a person pay $1,200 when he is most likely not going to experience good RTX performance any time soon? Waiting for the next series from Nvidia is wiser here, since by then we will have more RTX games (if RTX is really adopted and becomes mature and successful). And if we look at games without RTX: there are currently only one or two models of 4K 120 Hz (144 Hz) monitors, they will cost thousands of dollars, and they are not yet on the market, while the 1080 Ti ($650) is already capable of 4K 60 Hz. The 20xx series seems to be the worst video card to invest in.

1- You cannot generalize yet by saying the "20xx series seems to be the worst video card to invest in" before the real launch and real benchmarks of the new generation, compared apples to apples with the last one.

2- The 1080 Ti is capable of 2160p 60 Hz only with compromises on image settings, and as more demanding titles come out it will be more and more susceptible to a sub-60 fps experience at 2160p.

3- RT is a true generational leap for the gaming industry (AMD will surely follow with their upcoming Navi cards). Some think it is mere reflections here and there that add to the graphical fidelity of the image, but it takes a huge amount of rendering computational power, the kind Hollywood movies usually rely on render farms for. Now it is available to the end gamer, which is good.

Say you have a scene in your game with 1000 objects in it, no problem at all for a modern GPU. It takes each object, one after the other, and first finds out what area of the screen the object is visible in. For each of those pixels it executes a shader, which calculates the color of that pixel based on the object's material, textures, lights and the lights in the scene. Relatively speaking that's a very small amount of data that you need to do the calculations.

GPUs do this super quickly because the color of the pixel doesn't depend on anything except that one object's data and the lights in the scene, so you can calculate thousands of pixels at the same time using thousands of tiny processors. Modern engines then do all kinds of post processing steps, where they take the finished image and combine it with other data to do lots of neat effects like SSAO or bloom.
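
As a toy illustration of that per-object, per-pixel flow (a self-contained Python sketch of the idea only; real GPUs run this massively in parallel in hardware, and the scene and "shader" here are made up):

    # Toy "rasterizer": objects are processed one after the other, and each covered
    # pixel is shaded using nothing but that object's own data plus the scene lights.
    WIDTH, HEIGHT = 16, 8
    framebuffer = [[0.0] * WIDTH for _ in range(HEIGHT)]

    objects = [  # (x0, y0, x1, y1, base_brightness) -- screen-space rectangles for simplicity
        (2, 1, 7, 5, 0.8),
        (6, 3, 14, 7, 0.5),
    ]
    light_intensity = 1.2

    for x0, y0, x1, y1, base in objects:        # one object at a time
        for y in range(y0, y1):
            for x in range(x0, x1):             # only the pixels this object covers
                # the "shader": depends only on this object and the lights (no depth test, for brevity)
                framebuffer[y][x] = base * light_intensity

    for row in framebuffer:
        print(" ".join(f"{v:.1f}" for v in row))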

Ray tracing works completely differently. In ray tracing you're shooting a ray into a scene and you have no idea beforehand what it will hit. So every ray needs to have access to all the objects in the scene, their materials and so on, at the same time. Even if you knew which object a ray would hit, what happens when that object has some reflective properties? Or say you put a bright red object next to a white one: some of the red color will reflect onto the white one. So each time you hit something you need to shoot even more rays from there, hitting other objects, and then you need to combine the results of all those to get the color of the pixel.
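
And a matching toy sketch of the ray-tracing side (again a self-contained, made-up Python example, not how RTX or any real renderer is implemented): every ray has to be tested against the whole scene, and a reflective hit recursively spawns another ray whose result feeds back into the pixel color:

    import math

    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def norm(v):
        length = math.sqrt(dot(v, v))
        return tuple(x / length for x in v)

    # (center, radius, brightness, reflectivity) -- every ray must consider ALL of these.
    spheres = [((0.0, 0.0, 5.0), 1.0, 0.9, 0.3),
               ((1.8, 0.3, 6.5), 1.0, 0.4, 0.6)]

    def intersect(origin, direction, center, radius):
        # Distance to the sphere along a normalized ray, or None if it misses.
        oc = sub(origin, center)
        b = 2.0 * dot(oc, direction)
        disc = b * b - 4.0 * (dot(oc, oc) - radius * radius)
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 1e-4 else None

    def trace(origin, direction, depth=0):
        if depth > 2:                        # stop bouncing after a few reflections
            return 0.05
        nearest = None
        for sphere in spheres:               # no way to know in advance what this ray hits
            t = intersect(origin, direction, sphere[0], sphere[1])
            if t is not None and (nearest is None or t < nearest[0]):
                nearest = (t, sphere)
        if nearest is None:
            return 0.05                      # background
        t, (center, _radius, brightness, refl) = nearest
        hit = tuple(o + t * d for o, d in zip(origin, direction))
        n = norm(sub(hit, center))
        bounce = tuple(d - 2.0 * dot(direction, n) * ni for d, ni in zip(direction, n))
        # combine this surface's own shading with whatever the reflected ray sees
        return (1.0 - refl) * brightness + refl * trace(hit, norm(bounce), depth + 1)

    print(trace((0.0, 0.0, 0.0), norm((0.1, 0.0, 1.0))))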

GPUs simply weren't designed to do this, and CPUs are too slow to do it in real time in most cases. What you need is a new piece of hardware specially designed for ray tracing, but that's a huge investment and there are a lot of engineering challenges to work out. That used to be the situation game developers faced; now it is in the hands of the end user/gamer/developer through a mainstream graphics card, which is a true leap. It is not hype created by Nvidia like the GameWorks features; the technology already exists in film, architecture, graphic design, etc., and now it reaches gaming thanks to a leap in GPU architecture and engineering.

Read this article, published in 2009: When Will Ray Tracing Replace Rasterization?

And now ray tracing has become a reality:
https://www.tomshardware.com/reviews/ra … ,2351.html

Ray tracing explained: The future of hyper-realistic graphics
https://www.engadget.com/2018/04/16/the … explainer/


There are no real benchmarks yet on a finished, optimized game with mature drivers (neither BF V nor Shadow of the Tomb Raider has been released in its final state, and the RTX cards have not been fully reviewed with mature drivers across the different games that will support the feature).


4- Prices will normalize after launch.

5- The PG27UQ and Predator X27 still run on DP 1.4, whose 32.4 Gbps link (about 25.9 Gbps effective) limits them to 2160p 10-bit HDR at 120/144 Hz with 4:2:2 chroma subsampling rather than 4:4:4. Next year HDMI 2.1 TVs will launch, which will make DP 1.4 obsolete. Those monitors in particular are full of pitfalls: on top of the limited DP 1.4 bandwidth, they are 8-bit + FRC panels, not native 10-bit. I will pick up the PG35VQ alongside the RTX 2080 Ti.
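
A rough bandwidth check of that limit (a minimal sketch; blanking overhead is ignored, so real requirements are slightly higher than these raw pixel-rate numbers):

    # DisplayPort 1.4 (HBR3): 32.4 Gbps raw link rate, ~25.92 Gbps effective after 8b/10b encoding
    effective_dp14_gbps = 25.92

    def video_gbps(width, height, refresh_hz, bits_per_sample, samples_per_pixel):
        return width * height * refresh_hz * bits_per_sample * samples_per_pixel / 1e9

    print(video_gbps(3840, 2160, 120, 10, 3))  # ~29.9 Gbps at 10-bit 4:4:4 -> exceeds DP 1.4
    print(video_gbps(3840, 2160, 120, 10, 2))  # ~19.9 Gbps at 10-bit 4:2:2 -> fits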

Last edited by Tech Guru (September 3 2018)


#55 September 4 2018

MrClass
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade


#56 September 4 2018

vlatkozelka
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:

(Negative AVX offset set to 0)

What does AVX offset have to do with this thread? I thought we were talking about RTX cards...
AVX instructions are heavier than the more common SSE instructions and are mostly used by heavily vectorized workloads rather than by typical games. They are often left out of an overclock by adding a negative offset to them, so that the CPU overclock can stay stable.

What are you trying to tell us here?

Last edited by vlatkozelka (September 4 2018)


#57 September 4 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

vlatkozelka wrote:
Tech Guru wrote:

(Negative AVX offset set to 0)

What does AVX offset have to do with this thread? I thought we were talking about RTX cards...
AVX instructions are heavier than the more common SSE instructions and are mostly used by heavily vectorized workloads rather than by typical games. They are often left out of an overclock by adding a negative offset to them, so that the CPU overclock can stay stable.

What are you trying to tell us here?

A negative AVX offset set to zero means a locked 5 GHz OC on AVX instructions too. With an offset, AVX code downclocks the processor to help keep core temperatures below the throttling point, which is not a real 5 GHz CPU lock.

An AVX offset of 2 means 4.8 GHz, down from 5 GHz, on AVX instruction code. The problem is that the AVX offset can kick in even during non-AVX-heavy workloads like gaming.

To ensure a proper 5 GHz lock, I set the offset to 0 to eliminate any chance of a frequency drop and to max out the 1080 Ti's fps, so I could see the card's potential without a CPU bottleneck, especially at 100 Hz ultrawide and for minimum fps at 2160p.

BF1, for example, uses AVX instructions. This testing showed me that the 1080 Ti no longer meets my gaming needs; it has been struggling with a lot of AAA games at high to very high settings, so it is time to pick up the next gen.
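
A quick illustration of how the offset maps to effective clocks (a minimal sketch assuming a 100 MHz BCLK and a 50x all-core multiplier, i.e. the 5 GHz overclock described above):

    BCLK_MHZ = 100
    CORE_MULTIPLIER = 50              # 5.0 GHz all-core overclock

    def effective_ghz(avx_offset):
        # Clock the CPU drops to while executing AVX code
        return BCLK_MHZ * (CORE_MULTIPLIER - avx_offset) / 1000

    print(effective_ghz(0))  # 5.0 -- offset 0, a "real" 5 GHz lock even under AVX
    print(effective_ghz(2))  # 4.8 -- offset 2, AVX code downclocks to 4.8 GHz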

Last edited by Tech Guru (September 4 2018)


#58 September 4 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

3DCenter.org has compiled a list of the relative performance of the GeForce RTX 2080 Ti against the GeForce GTX 1080 Ti. The list matches the games and numbers from the Turkish tech channel on YouTube (video since taken down) that Joker and The Good Old Gamer talked about.

[Image: 3DCenter.org relative performance chart, RTX 2080 Ti vs GTX 1080 Ti]

An average increase of 37.5% looks good for pure rasterization rendering, no DLSS and no RT, with some games tipping past a 50% increase. Sure, these are still leaks, obtained on immature drivers, but overall Turing looks promising.


Read more: https://www.tweaktown.com/news/63041/ge … index.html

Last edited by Tech Guru (September 5 2018)


#59 September 4 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

NVIDIA Launching Turing Mobility Lineup With ‘GeForce RTX 2080 Mobile’ GPU

https://wccftech.com/nvidia-turing-mobi … obile-gpu/

Last edited by Tech Guru (September 5 2018)


#60 September 4 2018

PowerPC
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

How did they test Battlefield V? Even the beta isn't available yet...


#61 September 4 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

PowerPC wrote:

How did they test Battlefield V? Even the beta isn't available yet...

2nd Closed Alpha

Last edited by Tech Guru (September 4 2018)


#62 September 5 2018

mmk92
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:

RT is a true generational leap for the gaming industry (AMD will surely follow with their upcoming Navi cards). Some think it is mere reflections here and there that add to the graphical fidelity of the image, but it takes a huge amount of rendering computational power, the kind Hollywood movies usually rely on render farms for. Now it is available to the end gamer, which is good.

Say you have a scene in your game with 1000 objects in it, no problem at all for a modern GPU. It takes each object, one after the other, and first finds out what area of the screen the object is visible in. For each of those pixels it executes a shader, which calculates the color of that pixel based on the object's material, textures, lights and the lights in the scene. Relatively speaking that's a very small amount of data that you need to do the calculations.

GPUs do this super quickly because the color of the pixel doesn't depend on anything except that one object's data and the lights in the scene, so you can calculate thousands of pixels at the same time using thousands of tiny processors. Modern engines then do all kinds of post processing steps, where they take the finished image and combine it with other data to do lots of neat effects like SSAO or bloom.

Ray tracing works completely differently. In ray tracing you're shooting a ray into a scene and you have no idea beforehand what it will hit. So every ray needs to have access to all the objects in the scene, their materials and so on, at the same time. Even if you knew which object a ray would hit, what happens when that object has some reflective properties? Or say you put a bright red object next to a white one: some of the red color will reflect onto the white one. So each time you hit something you need to shoot even more rays from there, hitting other objects, and then you need to combine the results of all those to get the color of the pixel.

https://www.reddit.com/r/explainlikeimf … o/czw9686/


#63 September 5 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

You can email PC and Parts now and ask for the full prices of the Zotac RTX 2080 and 2080 Ti OC editions. The ETA is around the middle or end of this month.

PC and Parts' strong partnership with Zotac lets them provide a genuine 3-year warranty on the graphics cards with a solid RMA process (they pick up the damaged card from you within the warranty period and handle the whole RMA); if the RMA is valid, they issue you a credit note for the original purchase price (which is solid).

Prices are competitive, within about 80 USD of the current official Nvidia FE listings for the 2080 Ti and 2080, and PC and Parts expect prices to normalize and come down after the official market launch.

~1,000-1,100 USD for the RTX 2080 Ti Strix is a good investment to me.

Last edited by Tech Guru (September 5 2018)


#64 September 7 2018

user
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

If you got a good price on the 1080 Ti (and it seems you did), sure, go for it. Obviously you're an avid 4K gamer. To be honest, if I found myself buying a new VGA from scratch I'd probably get it (and use the opportunity to get 4K displays). If nothing else it should be future-proof for me for the next 10 years lol


#65 September 14 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

[Image: key features of the Turing architecture]

Key Features of Turing

https://videocardz.com/77895/the-new-fe … chitecture

Enjoy the innovation embedded in the upcoming chip. Goodbye, Pascal.


#66 September 19 2018

anayman_k7
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

The benchmarks are out. Based on price per performance, the RTX 2080 seems like a bad buy: it offers only slightly better performance than the GTX 1080 Ti, with less VRAM, for $100 to $150 more!


#67 September 19 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote:

The benchmarks are out. Based on price per performance, the RTX 2080 seems like a bad buy: it offers only slightly better performance than the GTX 1080 Ti, with less VRAM, for $100 to $150 more!

When the drivers are mature enough and DLSS rolls out, the 1080 will not have a chance. The 1080 Ti needs to be compared with the 2080 Ti, and the 2080 with the 1080.

The 1080 Ti is in deep trouble, but people who paid hefty amounts for their 1080 Tis are trying to convince themselves that upgrading to Turing is not worth it, when deep inside they know that gen-to-gen architectural changes, more CUDA cores, new memory, new optimizations and new features are always a win. The truth about this evolving tech is that they are "self-convincing" themselves into sticking with the old gen.

Last edited by Tech Guru (September 19 2018)


#68 September 19 2018

anayman_k7
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

After watching most of the reviews, the thing I was afraid of happened: RTX is still non-existent, and when it does arrive it will be very limited, and we still have to see actual performance numbers for it. Same for DLSS. The RTX 2080 is not even worth buying; an RTX 2080 Ti will let you play 4K 60 fps Ultra or have a consistent 1440p 144 Hz experience, but for a very hefty price tag.


#69 September 19 2018

anayman_k7
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
anayman_k7 wrote:

The benchmarks are out. Based on price per performance, the RTX 2080 seems like a bad buy: it offers only slightly better performance than the GTX 1080 Ti, with less VRAM, for $100 to $150 more!

When the drivers are mature enough and DLSS rolls out, the 1080 will not have a chance. The 1080 Ti needs to be compared with the 2080 Ti, and the 2080 with the 1080.

The 1080 Ti is in deep trouble, but people who paid hefty amounts for their 1080 Tis are trying to convince themselves that upgrading to Turing is not worth it, when deep inside they know that gen-to-gen architectural changes, more CUDA cores, new memory, new optimizations and new features are always a win. The truth about this evolving tech is that they are "self-convincing" themselves into sticking with the old gen.

I disagree. You compare the RTX 2080 with the GTX 1080 Ti for now, because later on we will compare the RTX 2070 with the GTX 1080 Ti. You compare what you pay against what you get, not which model number matches which model number (generation by generation), unless these cards are handed out for free.

Edit: And I don't think a sane person will buy a thing that might only prove itself later on (if it ever does); go check Reddit, 99% of users canceled their 2080 orders in favor of the 1080 Ti.

Last edited by anayman_k7 (September 19 2018)


#70 September 19 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote:
Tech Guru wrote:
anayman_k7 wrote:

The benchmarks are out. Based on price per performance, the RTX 2080 seems like a bad buy: it offers only slightly better performance than the GTX 1080 Ti, with less VRAM, for $100 to $150 more!

When the drivers are mature enough and DLSS rolls out, the 1080 will not have a chance. The 1080 Ti needs to be compared with the 2080 Ti, and the 2080 with the 1080.

The 1080 Ti is in deep trouble, but people who paid hefty amounts for their 1080 Tis are trying to convince themselves that upgrading to Turing is not worth it, when deep inside they know that gen-to-gen architectural changes, more CUDA cores, new memory, new optimizations and new features are always a win. The truth about this evolving tech is that they are "self-convincing" themselves into sticking with the old gen.

I disagree. You compare the RTX 2080 with the GTX 1080 Ti for now, because later on we will compare the RTX 2070 with the GTX 1080 Ti. You compare what you pay against what you get, not which model number matches which model number (generation by generation), unless these cards are handed out for free.

Edit: And I don't think a sane person will buy a thing that might only prove itself later on (if it ever does); go check Reddit, 99% of users canceled their 2080 orders in favor of the 1080 Ti.

You are wrong with that logic

2080 replaces the 1080
2080ti replaces 1080ti
2070 replaces the 1070

Also ,

The new cards are not made to give you 300 fps at 4K. They are made to make ray tracing possible on an $800 card instead of a $68,000 one.

DLSS is still not tested, and DLSS + RT >>> the old-gen 1080 Ti; remember that in some titles the 2080 already pulled ahead of the 1080 Ti. Reviews are only starting to roll out; wait a couple of weeks or months and the gap will widen, 1080 vs 2080 and 2080 Ti vs 1080 Ti.


Hardware Unboxed was the first to admit that they were wrong in their perception of the new gen:

"Geforce RTX 2080 & 2080Ti Review , I was Wrong"
https://youtu.be/dLjQR0UFUd0

Last edited by Tech Guru (September 19 2018)


#71 September 19 2018

anayman_k7
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
anayman_k7 wrote:
Tech Guru wrote:

When the drivers are mature enough and DLSS rolls out, the 1080 will not have a chance. The 1080 Ti needs to be compared with the 2080 Ti, and the 2080 with the 1080.

The 1080 Ti is in deep trouble, but people who paid hefty amounts for their 1080 Tis are trying to convince themselves that upgrading to Turing is not worth it, when deep inside they know that gen-to-gen architectural changes, more CUDA cores, new memory, new optimizations and new features are always a win. The truth about this evolving tech is that they are "self-convincing" themselves into sticking with the old gen.

I disagree. You compare the RTX 2080 with the GTX 1080 Ti for now, because later on we will compare the RTX 2070 with the GTX 1080 Ti. You compare what you pay against what you get, not which model number matches which model number (generation by generation), unless these cards are handed out for free.

Edit: And I don't think a sane person will buy a thing that might only prove itself later on (if it ever does); go check Reddit, 99% of users canceled their 2080 orders in favor of the 1080 Ti.

You are wrong with that logic

2080 replaces the 1080
2080ti replaces 1080ti
2070 replaces the 1070

Also ,

The new cards are not made to give you 300 fps at 4K. They are made to make ray tracing possible on an $800 card instead of a $68,000 one.

DLSS is still not tested, and DLSS + RT >>> the old-gen 1080 Ti; remember that in some titles the 2080 already pulled ahead of the 1080 Ti. Reviews are only starting to roll out; wait a couple of weeks or months and the gap will widen, 1080 vs 2080 and 2080 Ti vs 1080 Ti.


Hardware Unboxed was the first to admit that they were wrong in their perception of the new gen:

"Geforce RTX 2080 & 2080Ti Review , I was Wrong"
https://youtu.be/dLjQR0UFUd0

You need to watch the whole video and not only the first few seconds. I find it weird that you are still insisting the DLSS/RTX features are something real, while for now there are no RTX games and the planned ones are very few. And even though the planned DLSS games outnumber the RTX ones (still not enough games to fit different tastes), the result of the shown example with the grass/details inside a car was underwhelming and blurry. Sorry to say it, but even though 100%, yes 100%, of reviewers and comments said the RTX 2080 is not worth buying at current market prices and with the current state of games, you still insist otherwise; I call that fanboyism toward anything Nvidia launches.


#72 September 19 2018

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote:
Tech Guru wrote:
anayman_k7 wrote:

I disagree. You compare the RTX 2080 with the GTX 1080 Ti for now, because later on we will compare the RTX 2070 with the GTX 1080 Ti. You compare what you pay against what you get, not which model number matches which model number (generation by generation), unless these cards are handed out for free.

Edit: And I don't think a sane person will buy a thing that might only prove itself later on (if it ever does); go check Reddit, 99% of users canceled their 2080 orders in favor of the 1080 Ti.

You are wrong with that logic

2080 replaces the 1080
2080ti replaces 1080ti
2070 replaces the 1070

Also ,

The new cards are not made to give you 300 fps at 4K. They are made to make ray tracing possible on an $800 card instead of a $68,000 one.

DLSS is still not tested, and DLSS + RT >>> the old-gen 1080 Ti; remember that in some titles the 2080 already pulled ahead of the 1080 Ti. Reviews are only starting to roll out; wait a couple of weeks or months and the gap will widen, 1080 vs 2080 and 2080 Ti vs 1080 Ti.


Hardware Unboxed was the first to admit that they were wrong in their perception of the new gen:

"Geforce RTX 2080 & 2080Ti Review , I was Wrong"
https://youtu.be/dLjQR0UFUd0

You need to watch the whole video and not only the first few seconds. I find it weird that you are still insisting the DLSS/RTX features are something real, while for now there are no RTX games and the planned ones are very few. And even though the planned DLSS games outnumber the RTX ones (still not enough games to fit different tastes), the result of the shown example with the grass/details inside a car was underwhelming and blurry. Sorry to say it, but even though 100%, yes 100%, of reviewers and comments said the RTX 2080 is not worth buying at current market prices and with the current state of games, you still insist otherwise; I call that fanboyism toward anything Nvidia launches.


I salute Nvidia for taking risks and moving technology forward, as I would any company that does so in the tech world.

You do not seem to fully understand how RT is done, the computational power needed to drive real-time RT, and the difference between regular rasterization rendering and real-time RT.

Digital Foundry : Metro Exodus RT Analysis:
https://youtu.be/lRO_BbrHFkQ

I will definitely pick up the RTX 2080 Ti: it is the real deal for delivering a true 60 fps at 2160p without any compromises, where my old, heavily OC'd 1080 Ti failed in my titles at 2160p with frequent drops under 60 fps. The 2080 Ti is even suited for above 60 fps at 2160p ultra settings on monitors like the PG27UQ and Predator X27 (2160p/144 Hz). It will be a nice upgrade for me, since I will be picking up the PG35VQ (3440×1440 @ 200 Hz) when it is released. I am talking here about raw rasterization rendering.

And DLSS, which uses Tensor cores, AI and neural networks, is still untested, while more titles that will support it keep being added (24 titles so far).



#74 September 19 2018

DNA
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

TechGuru, you are 100% right in your points about the tech, and everyone else is on point with their disappointment. The 2080 should have smoked the 1080 Ti under all conditions, just like the 1080 smokes the 980 Ti.

Guys, it's simple: new tech costs money, and since the 1080 Ti is still better than anything the competition offers, Nvidia made the 2080 Ti super expensive because it carries the NEW technology. If you find it expensive and you don't need the tech, you simply buy a 1080 Ti. Since there is no competition, it's a win-win for Nvidia: no company in the world has any reason to release a product that competes with its own products and pushes their prices down, which is why they slapped a premium price on the new tech rather than push 10-series prices down.
From a user perspective I am disappointed, but from a business perspective I salute Nvidia for their great business move and the market placement of their cards...


#75 September 19 2018

enthralled
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

DNA wrote:

TechGuru, you are 100% right in your points about the tech, and everyone else is on point with their disappointment. The 2080 should have smoked the 1080 Ti under all conditions, just like the 1080 smokes the 980 Ti.

Guys, it's simple: new tech costs money, and since the 1080 Ti is still better than anything the competition offers, Nvidia made the 2080 Ti super expensive because it carries the NEW technology. If you find it expensive and you don't need the tech, you simply buy a 1080 Ti. Since there is no competition, it's a win-win for Nvidia: no company in the world has any reason to release a product that competes with its own products and pushes their prices down, which is why they slapped a premium price on the new tech rather than push 10-series prices down.
From a user perspective I am disappointed, but from a business perspective I salute Nvidia for their great business move and the market placement of their cards...

That's a very good and mature interpretation. Hopefully it turns the debate more positive.

