I will not go into details, but I am very sure that by now we all know Ray Tracing was only a gimmick we didn't need and were forced to pay for by Nvidia; the number of games that have really adopted it is very low, and the performance impact is absurd.
RTX 2080ti - Nvidia "Turing" Upgrade
The 2080ti is serving me well for RTX-enabled games and nails it: CoD Modern Warfare with RTX on runs at ~60 fps with SMAA T2x Ultra at 2160p on the 2080ti. RTX alone, which RDR2 does not include, destroys all the old canned rasterization methods of rendering. Since it is real-time rendering, not baked rendering, it is very taxing, yet it still holds 60 fps at 2160p. CoD is very optimized, with a new engine that scales very well.

anayman_k7 wrote: I will not go into details but I am very sure that by now we all know that Ray Tracing was only a gimmick thing we didn't need and we were forced to pay for it by Nvidia, the number of the games that really adopted that is very low plus the performance impact is stupid
RTX + an optimized new engine destroys everything RDR2 offers. They should have moved away from the old rendering techniques.
Even an RTX 2070 / RTX 2070 Super can push decent frames at 1080p and 1440p with RT enabled. I do not want to repeat myself, but you need to grab an RTX 2080ti, or an RTX 2070 if you have a limited budget, to evaluate objectively what RTRT is all about. The whole gaming industry is shifting to real-time rendering (including AMD with the Navi successor and the next-gen consoles), away from old canned/baked rasterization techniques.
Calling it "a gimmick" is a bit naive and shows not knowing what real-time ray tracing is all about. It is not an Nvidia GameWorks feature like HBAO+, Turf Effects, HairWorks, PCSS, HFTS, etc. Ray tracing, as long used in Hollywood movies, is a rendering technique for generating an image by tracing the path of light through pixels in an image plane and simulating the effects of its encounters with virtual objects. That needs a lot of computational power, and now it is available at the gamer's end. You can check all the Digital Foundry analyses.
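That per-pixel light-path idea fits in a few lines of code. This toy sketch (camera at the origin and a single hard-coded sphere are purely illustrative assumptions) just marks which pixels' primary rays hit the sphere:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest ray-sphere intersection, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    Assumes `direction` is unit length.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c  # a = 1 because direction is normalized
    if disc < 0:
        return None       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# Trace one primary ray per pixel of a tiny 4x4 "image plane" at z = -1.
WIDTH = HEIGHT = 4
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        u = (x + 0.5) / WIDTH * 2 - 1
        v = (y + 0.5) / HEIGHT * 2 - 1
        length = math.sqrt(u * u + v * v + 1)
        direction = (u / length, v / length, -1 / length)
        hit = ray_sphere_hit((0, 0, 0), direction, (0, 0, -3), 1.5)
        row += "#" if hit else "."
    print(row)
# Prints:
# ....
# .##.
# .##.
# ....
```

A real renderer then shades each hit point and recursively spawns shadow, reflection, and refraction rays, which is exactly where the computational cost explodes.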
kareem_nasser wrote: The 3 examples you gave are normal for tech still in its infancy in real-time rendering, which is why it is still implemented mainly for reflections and illumination. Full-scene rendering in ray tracing will take years if not a decade (typical TFLOPs increases per generation are declining). The next generation of consoles, given that they will include some form of ray tracing hardware, will give the tech a boost on the PC market too, since cross-platform development is a thing and some AAA titles are primarily developed on consoles, along with their rendering technology and toolkits.

Tech Guru wrote: To me it is fake:

kareem_nasser wrote: Nothing beats hardware-based acceleration, but I wouldn't call it "fake real-time ray tracing".
They used canned cubemaps, like old rasterization. SVOGI traces cones, not rays, and has its own limitations; that's why it's considered a different thing from ray tracing. CryEngine does lighting and reflections with voxels (SVOGI). Voxels have been researched and used in other methods as well, like VXAO or VXGI. The way the voxel data structures are built and handled puts them halfway towards ray tracing in principle, but it has many limitations compared to the real-time ray tracing seen in:
BFV - Reflections
Metro Exodus - Global Illumination
Shadow of the Tomb Raider & Call of Duty Modern Warfare - Shadows
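To make the cone-vs-ray distinction concrete, here is a toy sketch (the dict-based voxel mip pyramid and every parameter are made up for illustration): the cone's footprint widens with distance, so the march samples progressively coarser mip levels and takes bigger steps. That is what makes SVOGI-style cone tracing cheap but blurry compared with tracing individual rays:

```python
import math

def cone_trace(mips, base_voxel, origin, direction, aperture, max_dist):
    """March one cone through a voxel mip pyramid (SVOGI-style, simplified).

    mips[i] maps integer voxel coords to occlusion at mip level i,
    with voxel size doubling at each level. `direction` is unit length.
    """
    t = base_voxel          # start one voxel out to avoid self-sampling
    occlusion = 0.0
    while t < max_dist and occlusion < 1.0:
        # Cone footprint grows linearly with distance.
        diameter = max(base_voxel, 2.0 * t * math.tan(aperture / 2.0))
        # Wider footprint -> coarser mip level.
        level = min(len(mips) - 1, int(math.log2(diameter / base_voxel)))
        size = base_voxel * 2 ** level
        point = tuple(int((o + t * d) // size) for o, d in zip(origin, direction))
        sample = mips[level].get(point, 0.0)
        occlusion += (1.0 - occlusion) * sample   # front-to-back compositing
        t += diameter * 0.5                       # step scales with footprint
    return occlusion

# One occupied voxel three units down +z at the finest level.
mips = [{(0, 0, 3): 1.0}, {}]
print(cone_trace(mips, 1.0, (0, 0, 0), (0, 0, 1), 0.1, 10.0))  # 1.0 (blocked)
print(cone_trace(mips, 1.0, (1, 0, 0), (0, 0, 1), 0.1, 10.0))  # 0.0 (clear)
```

A per-ray tracer instead intersects exact geometry at every bounce, which is why cone tracing misses the sharp reflections and contact shadows the games above show.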
The PS5 will have 2070-class performance, which technically makes it a 1440p machine, or native 2160p 30fps at medium settings and checkerboard 8K 30fps at medium settings. A little underwhelming for a Q4 2020 holiday-window release. Most probably it will take a hybrid software-hardware approach, as AMD is developing now, to tackle real-time rendering. I suspect a lot of shortcuts and tricks will be used to reduce the computational power needed.
@anayman_k7 Fully agree, which is why AMD currently doesn't care much about it.
@techguru You forgot what a 2013 console is currently doing in the graphics department? As in a closed box with specific hardware that developers work on and program to the metal. Which is why you can never directly compare your GPU (which is never programmed for) on PC to a GPU on a console.
Not to derail the subject being discussed but is it normal for my GeForce RTX™ 2080 Ti GAMING OC 11G to run at ~85 degrees on the following:
Overwatch
1440p 144hz
Everything Maxed
200% Scale
The air flow is supposedly correct in the case, I even kept the case open and nothing changed.
xterm wrote: Not to derail the subject being discussed but is it normal for my GeForce RTX™ 2080 Ti GAMING OC 11G to run at ~85 degrees on the following:
Overwatch
1440p 144hz
Everything Maxed
200% Scale
The air flow is supposedly correct in the case, I even kept the case open and nothing changed.
I think as long as it is at around 80 degrees then you are good. It is a beast and you are already playing at maxed settings 1440p144. It would be a problem if it is overheating and affecting the performance (stutter for example).
kareem_nasser wrote: @anayman_k7 Fully agree, which is why AMD currently doesnt care much about it.
@techguru You forgot what a 2013 console is currently doing in the graphics department? As in a closed box with specific hardware that developers work on and program to the metal. Which is why you can never directly compare your GPU (which is never programmed for) on PC to a GPU on a console.
On the contrary, they fully care, but they released the Navi architecture in a rush as a 7nm RDNA solution with the 5700XT to raise the bar and regain some market share against Nvidia by competing with Turing, after the GCN 5.0 Radeon VII total failure. Again, the 5700XT is a total fail in old rasterization given that it is based on 7nm. Ironically, the 5700XT is a short-life-cycle product: it will soon be replaced by a successor carrying the hybrid RT hardware Navi missed because it was not ready. Better to go with an RTX 2070. The 5700XT gets whipped by the 3-year-old 16nm Pascal 1080ti, which has no RT cores either. Being late to release RT hardware does not mean they do not care; on the contrary, they lack the R&D to compete, as the AMD trend goes.
kareem_nasser wrote: xterm wrote: Not to derail the subject being discussed but is it normal for my GeForce RTX™ 2080 Ti GAMING OC 11G to run at ~85 degrees on the following:
Overwatch
1440p 144hz
Everything Maxed
200% Scale
The air flow is supposedly correct in the case, I even kept the case open and nothing changed.
I think as long as it is at around 80 degrees then you are good. It is a beast and you are already playing at maxed settings 1440p144. It would be a problem if it is overheating and affecting the performance (stutter for example).
A bit high on temperatures.
My O11G Strix, OC'd to 2010MHz on the core, sits at 68 degrees at 100% load with no thermal throttling after prolonged gaming hours. Case being used:
Corsair Crystal Series 680X RGB with a couple of LL fans (120mm and 140mm) in a specific push-pull configuration I set up.
Resolutions I play at:
2160p 120hz (TV)
1440p 144hz (Monitor)
Both Max settings.
2 months later
AMD is talking about Big Navi competing, upon release, with two-year-old high-end Turing, while Nvidia will be announcing Ampere in March.
Always playing catch-up a year later with AMD.
https://videocardz.com/newz/rumor-first-nvidia-ampere-geforce-rtx-3080-and-rtx-3070-specs-surface
Yeah, AMD is more successful with CPUs than GPUs in terms of sales volume.
2 months later
Nvidia unleashed the power of Tensor cores and their Saturn V supercomputer for AI and deep learning in DLSS 2.0. The major differences between DLSS 1.0 and DLSS 2.0, in short:
Up to a 125% increase in fps, and it requires 400% zoom to see the difference; sometimes it even looks better than native resolution.
To remove the expensive per-game training and the many (many) problems that non-deterministic games presented in training, NVIDIA has moved to a single generic network for all games.
This newer neural network is trained on a fully synthetic training set rather than on individual games, which in turn is fully deterministic, allowing NVIDIA to extensively train the new network in exactly the fashion they need for it to iterate and improve over generations.
Digital Foundry:
https://youtu.be/YWIKzRhYZm4
This is potentially a revolution in graphics technology. RTX 2060 (the entry RTX) owners, you have good years ahead with DLSS 2.0 now, and soon DirectX 12 Ultimate.
The Tech Chap:
https://m.youtube.com/watch?feature=youtu.be&v=eS1vQ8JtbdM
19 days later
https://youtu.be/kqZbyPkYygs
The new Samsung memory modules, found only in the 2080 Super and rated at 15.5 Gbps, are insane; they have so much headroom to increase their frequency, letting the 2080 Super, a sub-$1000 card, reach 2080ti levels.
The entire RTX line, except the 2080 Super, has VRAM rated at 14 Gbps, including the RTX 2080ti.
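For perspective on those per-pin rates: peak memory bandwidth is simply bus width (in bytes) times the data rate. Using the cards' published bus widths (256-bit for the 2080 Super, 352-bit for the 2080ti), a quick calculation shows why the memory overclock matters; the 18 Gbps figure is a hypothetical overclock, not a spec:

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(256, 15.5))  # RTX 2080 Super stock: 496.0 GB/s
print(peak_bandwidth_gbs(352, 14.0))  # RTX 2080 Ti stock:    616.0 GB/s
# Hypothetical overclock of the Super's 15.5 Gbps modules to ~18 Gbps:
print(peak_bandwidth_gbs(256, 18.0))  # 576.0 GB/s, closing in on the Ti
```

So at stock the 2080ti's wider 352-bit bus still out-muscles the Super's faster modules; the headroom in the 15.5 Gbps chips is what narrows the gap.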