LebGeeks

A community for technology geeks in Lebanon.


#201 April 25 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

bobo619 wrote:
kareem_nasser wrote:
Elitism Guru wrote:

Would love to see this on OLED, especially when you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.

I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.

Your statement is true, but LCDs with full-array local dimming have come a long way and gotten close, though they still won't give you true black. The main drawback of OLED is burn-in, especially for gaming, where static images such as maps and the HUD stay on screen for long stretches. Another fact is that OLED right now cannot deliver higher peak brightness than LCD, at least in TVs, and OLED also suffers during the daytime; to truly enjoy it you must be in a dark to dimly lit room. The recommended peak brightness to enjoy true HDR right now is 1400+ nits, as more and more content is being mastered at that brightness and beyond. I have an 85-inch Sony X900F purchased about a year ago and love it, but it suffers with grey uniformity, being such a big panel. So yeah, IMO microLED is the way to go, but it is still extra expensive right now; it will take 2-3 years to reach economies of scale and become affordable.

LCD will always outlast OLED, but geeks talk numbers: since ~2013, OLED has had at least 15 thousand hours before burn-in starts, at least that's what Sharp's MTBF tests showed, while LCDs sit around 30 thousand hours. So if you use the OLED 12 hours a day, expect around 10 years before burn-in, and the LCD will last longer provided you do board repair and backlight LED replacement. Either way, fanboys will always avoid grey areas, which is what keeps this myth alive no matter how much OLED improves.
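
As a rough sketch of that arithmetic (Python): the rated-hour figures are the ones quoted above, while the hours-per-day values are illustrative assumptions, and the result is very sensitive to them.

def years_until(rated_hours, hours_per_day):
    # Rated panel hours divided by yearly usage.
    return rated_hours / (hours_per_day * 365)

for name, rated in (("OLED (to burn-in)", 15_000), ("LCD (backlight)", 30_000)):
    for daily in (4, 8, 12):
        print(f"{name}: {rated} h at {daily} h/day ~ {years_until(rated, daily):.1f} years")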

Last edited by Elitism Guru (April 25 2019)

Offline

#202 April 25 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

kareem_nasser wrote:
Elitism Guru wrote:
Tech Guru wrote:

https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60 Ultra IQ + Ultra Global Illumination real-time ray tracing + HDR10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.

Would love to see this on OLED, especially when you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.

I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.

OLEDs suck for gaming as they are prone to burn-in, and they aren't very bright either. OLED will be dead in about 5 years when MicroLED takes over: around 30% brighter with no burn-in, paired with OLED-level perfect blacks and contrast and, like OLED, a response time of around 0.1 ms. It would be the perfect panel for everything.

As a side note:

QLED and MicroLED were Sony's innovations first, not Samsung's, LG's, TCL's, or anyone else's.

Sony will always be the leader in AV.

http://www.trustedreviews.com/reviews/s … first-look

In 2012 Sony demonstrated the first micro-LED TV (55", Full-HD) which they termed Crystal-LED. Sony's Crystal-LED never reached the market, but in 2016 the company unveiled its large-area outdoor micro-LED displays which Sony calls Canvas Display

Sony pushed quantum dot technology in 2013 (Triluminos Display), which Samsung has been adopting in its high-end TVs since 2016 (the KS days and now QLED) and in its HG90, HG70, and FG70 monitors, along with LG's NanoCell TVs like the SJ8500 and gaming monitors from Asus like the PG35VQ and PG27UQ.

https://www.theverge.com/2013/1/16/3881 … antum-dots


OLED color reproduction is the most accurate, with infinite contrast and real blacks, and LG's OLEDs have the best HDR color bit mapping. However, whether it's LG's or the upcoming OLEDs, the technology itself has some serious drawbacks that are more than likely to show up:

Image Burning (Burn-In)
Image Retention

Not as serious as the above two, but something that can affect HDR peak scene brightness: OLED brightness struggles to cross 1000 nits.

I can tolerate LED plus efficient local dimming, with minimal blooming and flashlighting, no real blacks, and non-infinite contrast, over OLED's image retention and burn-in.


Rtings.com did comprehensive real-life content testing on OLED TVs (https://www.rtings.com/tv/learn/real-li … n-in-test), and in every OLED TV review they give image retention / burn-in a score of zero.


For a real HDR experience you need:

At least 1000 nits of real-scene HDR brightness at a 100% window, sustained.

At most 0.05 nits of black (on LED this is achievable with local dimming; the more dimmable zones, the more effective). From 2017 onward many LED TVs nail that, so it is not an issue at all now with LED TVs; they are usually VA-panel TVs because of their high native contrast versus IPS's low native contrast.

A native 10-bit color depth panel, not 8-bit + FRC, for proper display of color gradients and reproduction of real color combinations.

A wide color gamut with at least 90% coverage of the DCI-P3 color space, backed by proper color volume too.
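
As a minimal sketch of that checklist in code (Python; the function and field names are made up for illustration, the thresholds are the ones listed above):

def meets_real_hdr(peak_nits_sustained, black_nits, native_bits, dci_p3_coverage):
    # Thresholds taken from the list above.
    checks = {
        "peak >= 1000 nits (100% window, sustained)": peak_nits_sustained >= 1000,
        "black <= 0.05 nits": black_nits <= 0.05,
        "native 10-bit panel (not 8-bit + FRC)": native_bits >= 10,
        "DCI-P3 coverage >= 90%": dci_p3_coverage >= 0.90,
    }
    contrast = peak_nits_sustained / max(black_nits, 1e-6)  # e.g. 1000 / 0.05 = 20,000:1
    return checks, contrast

# Hypothetical FALD VA LCD spec, purely as an example.
checks, contrast = meets_real_hdr(1100, 0.05, 10, 0.93)
for name, ok in checks.items():
    print("PASS" if ok else "FAIL", name)
print(f"implied contrast ~ {contrast:,.0f}:1")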

Last edited by Tech Guru (April 25 2019)

Offline

#203 April 25 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
[…]

New OLEDs strobe static pixels; Samsung had a demo back in 2015 achieving 15k hours without any issues. If you intentionally run the flicker test you mentioned from the reviews, then yes, of course it will happen, even on an LCD if you leave it on for a month. Ever notice how shop display LCDs burn in?

I'd rather deal with an 8-year-old OLED with slight burn-in than with dirty screen effect, uneven backlight bleeding, uneven brightness distribution, and the jelly effect caused by the LCD differential bias changing with room temperature; Sharp IGZO panels use a temperature sensor for this, but very few panels even account for the jelly/tilt effect. Not to mention you get the advantage of 90 Hz+ on top of the input lag benefit, and smoother motion than LCD. Only a 240 Hz LCD with backlight strobing and black frame insertion (resulting in 120) can get close to OLED, but you lose brightness and color accuracy.
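
Rough sample-and-hold arithmetic behind that strobing/BFI point (Python; the scroll speed and pulse lengths are illustrative numbers, not measurements): perceived smear while eye-tracking is roughly scroll speed times how long each frame stays lit.

def smear_px(pixels_per_second, persistence_ms):
    # Perceived smear while eye-tracking ~ speed x time each frame stays lit.
    return pixels_per_second * persistence_ms / 1000.0

speed = 960  # px/s, a fast pan
for label, lit_ms in (
    ("60 Hz sample-and-hold", 1000 / 60),
    ("120 Hz sample-and-hold", 1000 / 120),
    ("240 Hz panel + BFI (lit every other refresh)", 1000 / 240),
    ("strobed backlight, ~1 ms pulse", 1.0),
):
    print(f"{label:45s} ~ {smear_px(speed, lit_ms):5.1f} px of smear")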

I'll add that Samsung said their Galaxy S3 panel would only last 4 years, which real-life tests and reports confirmed, and which in turn shows how well their lab tests hold up outside the lab.

Last edited by Elitism Guru (April 25 2019)

Offline

#204 April 25 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

.

Last edited by Tech Guru (April 25 2019)

Offline

#205 April 25 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Elitism Guru wrote:
[…]

My experience with OLED was kind of sad, away from the numbers and the technical websites' testing: practical ownership of a 65" LG OLED C6. It has breathtaking color reproduction, real inky blacks, and screen uniformity.

However, from gaming:

Static images (like health bars, stats, in-game maps, and the out-of-game main menu)

Peak HDR real-scene brightness of ~666 nits in the case of LG

made this set fall into burn-in / image retention after 6 months of usage as I shift from content to content. LG did introduce 'Pixel Refresh' and 'Screen Shift' options in its 2018 models to reduce this susceptibility, which is innate to the nature of OLEDs, but the risk still grows with the content being displayed and the hours of operation per day.

The Sony X930E is doing great now, a uniform experience overall without any risk. MicroLED with HDMI 2.1 and 8K will definitely be my next upgrade.

Last edited by Tech Guru (April 25 2019)

Offline

#206 April 25 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
[…]

I see, I forgot how each vendor implements mitigations against these issues, just like how the tilt effect remains unresolved in most LCDs except on 80+ inch TVs.
MicroLED will only be cheaper and reasonable at 8K/4K minimum for large displays; AFAIK the larger the pixels, the more expensive and slower they are to manufacture.

Offline

#207 May 19 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

AMD vs Nvidia Efficiency

Make it Nvidia's 7nm vs AMD's 7nm and you'll be rolling on your side laughing when you see Nvidia using about 50% of AMD's total power draw while possibly offering over 50% more performance than AMD.

"Nvidia’s 12nm vs AMD’s 7nm GPU efficiency is “incomparable” ".

Source: PCGamesn
https://www.pcgamesn.com/nvidia/12nm-vs … arable?amp
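
Spelled out, that headline claim is just performance-per-watt arithmetic; a toy calculation (Python) using the figures asserted above, which are claims rather than measurements:

def perf_per_watt_ratio(perf_ratio, power_ratio):
    # How many times better perf/W one card would be versus the other.
    return perf_ratio / power_ratio

# "about 50% of the power draw" and "over 50% more performance":
print(perf_per_watt_ratio(perf_ratio=1.5, power_ratio=0.5))  # -> 3.0x perf/W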

Offline

#208 May 28 2019

infiniteloop
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

I want to get an RTX 2080. Unfortunately there is a sort of shortage of the known brands like MSI and Gigabyte, so the most-sold brand now is Zotac, and in the Zotac line the basic 2080 is at $770 VAT included while the AMP is at $850!!
Prices are completely nuts.
I have found a Manli Gallardo 2080, which is like the AMP, a bit too long for my case but even cheaper than the basic Zotac, though I have no clue how good these cards are.
Does anyone have good addresses for an MSI RTX 2080 in Lebanon?

Offline

#209 May 29 2019

xterm
Moderator

Re: RTX 2080ti - Nvidia "Turing" Upgrade

IIRC Mojitech has the Gigabyte RTX 2080 in stock, you can try your luck with him.
You can also check if NewVision has MSI in stock since he's the official distributor.

Offline

#210 May 29 2019

infiniteloop
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Gigabyte cards are overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?

Offline

#211 May 29 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

infiniteloop wrote:

Gigabyte cards are overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?

Any idea if it's possible to cross-flash the vBIOS on RTX cards like on Pascal?
I did it for 3 client builds in the UAE, even on mini cards like the Zotac 1080 Ti Mini.

Offline

#212 May 29 2019

nefe_lpmk
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

infiniteloop wrote:

Gigabyte cards are overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?

I would go with the Zotac: cheaper, and a good warranty (5 years) from a well-known vendor. Plus, I think both use stock PCBs.

Offline

#213 June 1 2019

infiniteloop
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

I finally returned the Zotac and took a Gigabyte Gaming OC instead; the shop gave me a good price on it, so the difference is not huge.

Offline

#214 October 27 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

The Power of the RTX 2080ti

CoD Modern Warfare

4K - 3840 x 2160
Max Settings with Filmic SMAA T2X
Real Time Rendering - RTX On

https://youtu.be/UrNHMfU_BlI

Offline

#215 November 16 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

A new "fake real-time ray tracing" rendering method: a software mimic that doesn't need RT cores and works on DirectX 11. Even the top dogs from last gen,

1080 Ti (Pascal)
Radeon VII (GCN 5.0)

fail to deliver.

Sell them ASAP :)


https://youtu.be/efOR92n9mms

Offline

#216 November 19 2019

kareem_nasser
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:

A new "fake real-time ray tracing" rendering method: a software mimic that doesn't need RT cores and works on DirectX 11. Even the top dogs from last gen,

1080 Ti (Pascal)
Radeon VII (GCN 5.0)

fail to deliver.

Sell them ASAP :)


https://youtu.be/efOR92n9mms

Nothing beats hardware-based acceleration, but I wouldn't call it "fake real-time ray tracing".

Offline

#217 November 19 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

kareem_nasser wrote:
Tech Guru wrote:

[…]

Nothing beats hardware-based acceleration, but I wouldn't call it "fake real-time ray tracing".

To me it is fake:

They used canned cube maps, like old rasterization. SVOGI traces cones, not rays, and has its own limitations, which is why it's considered a different thing from ray tracing. CryEngine does lighting and reflections with voxels (SVOGI). Voxels have been researched and used in other methods as well, like VXAO or VXGI. The way the voxel data structures are built and handled puts them halfway towards ray tracing in principle. It still has many limitations compared to the real-time ray tracing seen in:

BFV - reflections
Metro Exodus - global illumination
Rise of the Tomb Raider & Call of Duty: Modern Warfare
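
To make the ray-versus-cone distinction concrete, here is a toy 2D sketch (Python). This is not CryEngine's SVOGI or any engine's real code; the grid, cone angle, and step sizes are invented purely to show that a ray samples exact cells while a cone samples ever-wider averaged footprints, which is why cone-traced results are softer approximations.

import math

GRID = [  # 1 = occupied voxel (cell), 0 = empty
    "0000000000",
    "0000001100",
    "0000001100",
    "0000000000",
    "0000000000",
]
H, W = len(GRID), len(GRID[0])

def occupancy(x, y):
    # Occupancy of a single cell, 0 outside the grid.
    xi, yi = int(x), int(y)
    return int(GRID[yi][xi]) if 0 <= yi < H and 0 <= xi < W else 0

def avg_occupancy(x, y, radius):
    # Average occupancy over a square footprint (stand-in for a pre-filtered mip lookup).
    r = max(0, int(radius))
    cells = [(xi, yi) for yi in range(int(y) - r, int(y) + r + 1)
                      for xi in range(int(x) - r, int(x) + r + 1)]
    return sum(occupancy(xi, yi) for xi, yi in cells) / len(cells)

def ray_march(ox, oy, dx, dy, step=0.25, max_dist=12.0):
    # Thin ray: return the distance of the first occupied cell hit, or None.
    t = 0.0
    while t < max_dist:
        if occupancy(ox + dx * t, oy + dy * t):
            return t
        t += step
    return None

def cone_march(ox, oy, dx, dy, half_angle_deg=15.0, step=0.5, max_dist=12.0):
    # Cone: accumulate occlusion from averaged footprints that grow with distance.
    tan_a = math.tan(math.radians(half_angle_deg))
    occlusion, t = 0.0, step
    while t < max_dist and occlusion < 1.0:
        radius = t * tan_a
        sample = avg_occupancy(ox + dx * t, oy + dy * t, radius)
        occlusion += (1.0 - occlusion) * sample  # front-to-back accumulation
        t += step
    return min(occlusion, 1.0)

# Shoot towards the 2x2 occupied block from the left edge.
print("ray hit distance :", ray_march(0, 2, 1, 0))
print("cone occlusion   :", round(cone_march(0, 2, 1, 0), 2))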

Offline

#218 November 19 2019

kareem_nasser
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
[…]

To me it is fake:

They used canned cube maps, like old rasterization. SVOGI traces cones, not rays, and has its own limitations, which is why it's considered a different thing from ray tracing. CryEngine does lighting and reflections with voxels (SVOGI). Voxels have been researched and used in other methods as well, like VXAO or VXGI. The way the voxel data structures are built and handled puts them halfway towards ray tracing in principle. It still has many limitations compared to the real-time ray tracing seen in:

BFV - reflections
Metro Exodus - global illumination
Rise of the Tomb Raider & Call of Duty: Modern Warfare

The 3 examples you gave are normal for tech still in its infancy in real-time rendering, which is why it is still mainly implemented for reflections and illumination. Full graphics rendering via ray tracing will take years if not a decade (the typical generational TFLOPS increases are declining). The next generation of consoles, given that they will include some form of hardware for ray tracing, will give the tech a boost on the PC market, since cross-platform development has been a thing since the previous generation, especially as some AAA titles are primarily developed on consoles, along with their rendering technology and toolkits.

Offline

#219 November 20 2019

anayman_k7
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

I will not go into details, but I am very sure that by now we all know that ray tracing was only a gimmick we didn't need and were forced to pay for by Nvidia; the number of games that really adopted it is very low, plus the performance impact is stupid.

Offline

#220 November 20 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

anayman_k7 wrote:

I will not go into details, but I am very sure that by now we all know that ray tracing was only a gimmick we didn't need and were forced to pay for by Nvidia; the number of games that really adopted it is very low, plus the performance impact is stupid.

The 2080 Ti is serving me well for RTX-enabled games and nails it: CoD Modern Warfare with RTX on runs at ~60 fps with SMAA T2x at Ultra in 2160p on the 2080 Ti. RTX alone, which RDR2 does not include, destroys all the old canned rasterization methods of rendering. Since it is real-time rendering, not baked rendering, it is very taxing, yet it still holds 60 fps at 2160p. CoD is very optimized with its new engine, which scales very well.

RTX + an optimized new engine destroys everything RDR2 offers. They should have moved away from the old rendering techniques.


Even an RTX 2070 / RTX 2070 Super can push decent frames at 1080p and 1440p with RT enabled. I do not want to repeat myself, but you need to grab an RTX 2080 Ti, or an RTX 2070 if you have a limited budget, to objectively evaluate what RTRT is all about... The whole gaming industry is shifting to real-time rendering (including AMD with the Navi successor and the next-gen consoles), away from the old canned/baked rasterization techniques.

Calling it "a gimmick" is a bit naive and shows not knowing what real-time ray tracing is all about. It is not an Nvidia GameWorks feature like HBAO+, Turf Effects, HairWorks, PCSS, HFTS, etc. What Hollywood movies did with ray tracing, a rendering technique that generates an image by tracing the path of light as pixels in an image plane and simulating the effects of its encounters with virtual objects, something that needs a lot of computational power, is now available at the gamer's end. You can check all the Digital Foundry analyses.
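
For readers who want the textbook core of that description, here is a minimal CPU ray-casting sketch in Python: one primary ray per pixel, a single hard-coded sphere, simple Lambert shading, written out as a PPM image. The scene and numbers are invented for illustration; real-time RTX adds acceleration structures, denoising, and dedicated hardware on top of this idea.

import math

W, H = 320, 240
CAM = (0.0, 0.0, 0.0)                          # camera at the origin, looking down -z
SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0     # one sphere 3 units in front

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def hit_sphere(origin, direction):
    # Solve |origin + t*direction - center|^2 = r^2 for the nearest t > 0.
    oc = tuple(o - c for o, c in zip(origin, SPHERE_C))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_R * SPHERE_R
    disc = b * b - 4.0 * c                     # direction is normalized, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

light = normalize((-1.0, 1.0, 1.0))            # direction towards the light
pixels = []
for y in range(H):
    for x in range(W):
        # One primary ray per pixel through an image plane at z = -1.
        px = (2.0 * (x + 0.5) / W - 1.0) * (W / H)
        py = 1.0 - 2.0 * (y + 0.5) / H
        d = normalize((px, py, -1.0))
        t = hit_sphere(CAM, d)
        if t is None:
            pixels.append((30, 30, 60))                        # background
        else:
            p = tuple(o + t * dc for o, dc in zip(CAM, d))     # hit point
            n = normalize(tuple(pc - cc for pc, cc in zip(p, SPHERE_C)))
            diffuse = max(0.0, sum(nc * lc for nc, lc in zip(n, light)))
            v = int(40 + 215 * diffuse)
            pixels.append((v, v // 2, v // 4))                 # shaded orange sphere

with open("sphere.ppm", "w") as f:             # open with any PPM-capable viewer
    f.write(f"P3\n{W} {H}\n255\n")
    for r, g, b in pixels:
        f.write(f"{r} {g} {b}\n")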

Last edited by Tech Guru (November 20 2019)

Offline

#221 November 20 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

kareem_nasser wrote:
[…]

The 3 examples you gave are normal for tech still in its infancy in real-time rendering, which is why it is still mainly implemented for reflections and illumination. Full graphics rendering via ray tracing will take years if not a decade (the typical generational TFLOPS increases are declining). The next generation of consoles, given that they will include some form of hardware for ray tracing, will give the tech a boost on the PC market, since cross-platform development has been a thing since the previous generation, especially as some AAA titles are primarily developed on consoles, along with their rendering technology and toolkits.


The PS5 will have 2070-level performance, which technically makes it a 1440p machine, or a native 2160p 30 fps machine at medium settings, with checkerboarded 8K at 30 fps on medium settings. A little underwhelming for a Q4 2020 holiday-window release. Most probably it will take a hybrid "software-hardware" approach, as AMD is developing now, to tackle real-time rendering. I suspect a lot of shortcuts and tricks will be used to reduce the computational power needed.
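
As a back-of-the-envelope sketch of the pixel throughput behind that "1440p machine" framing (Python): checkerboard rendering shades roughly half of the output pixels each frame and reconstructs the rest, and per-pixel cost differences from settings are ignored here, so the numbers are only illustrative.

MODES = {
    "native 1440p @ 60":    (2560 * 1440, 60, 1.0),
    "native 2160p @ 30":    (3840 * 2160, 30, 1.0),
    "checkerboard 8K @ 30": (7680 * 4320, 30, 0.5),  # ~half the pixels shaded per frame
}
for name, (pixels, fps, shaded_fraction) in MODES.items():
    mpix_per_s = pixels * fps * shaded_fraction / 1e6
    print(f"{name:22s} ~ {mpix_per_s:6.0f} Mpix shaded per second")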

Offline

#222 November 20 2019

kareem_nasser
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

@anayman_k7 Fully agree, which is why AMD currently doesn't care much about it.

@techguru Have you forgotten what a 2013 console is currently doing in the graphics department? It is a closed box with specific hardware that developers work on and program to the metal, which is why you can never directly compare a PC GPU (which is never specifically programmed for) to a GPU in a console.

Offline

#223 November 21 2019

xterm
Moderator

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Not to derail the subject being discussed but is it normal for my GeForce RTX™ 2080 Ti GAMING OC 11G to run at ~85 degrees on the following:

Overwatch
1440p 144hz
Everything Maxed
200% Scale

The airflow in the case is supposedly correct; I even kept the case open and nothing changed.

Offline

#224 November 21 2019

kareem_nasser
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

xterm wrote:

Not to derail the subject being discussed but is it normal for my GeForce RTX™ 2080 Ti GAMING OC 11G to run at ~85 degrees on the following:

Overwatch
1440p 144hz
Everything Maxed
200% Scale

The airflow in the case is supposedly correct; I even kept the case open and nothing changed.


I think as long as it is at around 80 degrees then you are good. It is a beast and you are already playing at maxed settings 1440p144. It would be a problem if it is overheating and affecting the performance (stutter for example).
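
If you want to confirm whether that ~85°C is a sustained load temperature or just a spike, here is a small logging sketch (Python) that polls nvidia-smi, which ships with the NVIDIA driver, once per second while you play; the one-minute sample count is arbitrary.

import subprocess
import time

def read_gpu_stats():
    # Query the first GPU's temperature, utilization and SM clock via nvidia-smi.
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,utilization.gpu,clocks.sm",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    temp, util, clock = (v.strip() for v in out.splitlines()[0].split(","))
    return int(temp), int(util), int(clock)

if __name__ == "__main__":
    for _ in range(60):  # one minute of samples
        temp, util, clock = read_gpu_stats()
        print(f"temp={temp}C util={util}% sm_clock={clock}MHz")
        time.sleep(1)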

Offline

#225 November 21 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

kareem_nasser wrote:

@anayman_k7 Fully agree, which is why AMD currently doesnt care much about it.

@techguru You forgot what a 2013 console is currently doing in the graphics department? As in a closed box with specific hardware that developers work on and program to the metal. Which is why you can never directly compare your GPU (which is never programmed for) on PC to a GPU on a console.


On the contrary, they fully care, but they released the Navi architecture in a rush as a 7nm RDNA solution with the 5700 XT to raise the bar and regain some market share against Nvidia by competing with Turing after the total failure of the GCN 5.0 Radeon VII. Even so, the 5700 XT falls short in old rasterization for a card built on 7nm. Ironically, the 5700 XT is a short-life-cycle product; it will be replaced soon by a successor with the hybrid RT hardware that Navi missed because it was not ready. Better to go with an RTX 2070. The 5700 XT also gets whipped by the 3-year-old 16nm Pascal 1080 Ti, while having no RT cores either. Being late to release RT hardware doesn't mean they don't care; on the contrary, they lack the R&D to compete, as the AMD trend goes.

Offline
