• Hardware
  • RTX 2080ti - Nvidia "Turing" Upgrade

bobo619 wrote
kareem_nasser wrote
Elitism Guru wrote Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.
I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.
Your statement is true, but LCDs with full-array local dimming have come a long way and get close; still, they won't give you true black. The main drawback of OLED is burn-in, especially for gaming, where static images such as maps and HUD elements stay on screen for long stretches. Another fact is that OLED right now cannot deliver higher peak brightness than LCD, at least in TVs, and OLED also suffers during the daytime: to truly enjoy it you must be in a dark to dimly lit room. The recommended peak brightness to enjoy true HDR right now is 1400+ nits, as more and more content is being mastered to that brightness and beyond. I have an 85-inch Sony X900F purchased about a year ago and I love it, but it struggles with grey uniformity, being such a big panel. So yeah, IMO microLED is the way to go, but it is still extra expensive right now; we will have to wait 2-3 years for it to reach economies of scale and become affordable.
LCD will always outlast OLED, but geeks talk numbers: OLED panels have been rated at around 15 thousand hours before burn-in starts since ~2013, at least that's what Sharp's MTBF tests showed, versus around 30 thousand hours for LCDs. So at roughly 4 hours of use a day expect around 10 years before burn-in starts (closer to 3-4 years at 12 hours a day), and an LCD will last longer still, assuming you do board repair and backlight LED replacement. Either way, fanboys will always avoid the grey areas, which is what keeps this myth alive no matter how much OLED improves.
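For the hours-to-years math, a quick back-of-the-envelope calculation using the figures quoted above (15,000 hours for OLED before burn-in, 30,000 hours for LCD; rough estimates, not guarantees):

OLED_HOURS_TO_BURN_IN = 15_000   # claimed hours before burn-in becomes visible
LCD_HOURS_TO_SERVICE = 30_000    # claimed hours before board/backlight work is needed

for hours_per_day in (4, 8, 12):
    oled_years = OLED_HOURS_TO_BURN_IN / (hours_per_day * 365)
    lcd_years = LCD_HOURS_TO_SERVICE / (hours_per_day * 365)
    print(f"{hours_per_day} h/day -> OLED ~{oled_years:.1f} y, LCD ~{lcd_years:.1f} y")

# 4 h/day -> OLED ~10.3 y, LCD ~20.5 y; 12 h/day -> OLED ~3.4 y, LCD ~6.8 y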
kareem_nasser wrote
Elitism Guru wrote
Tech Guru wrote https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60, Ultra IQ + Ultra Global Illumination real-time ray tracing + HDR10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.
Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.
I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.
OLEDs s..k for gaming as they are prone to burn-in, and they aren't very bright either. OLED will be dead in about 5 years when microLED takes over: around 30% brighter with no burn-in, paired with OLED-level perfect blacks and contrast, AND, like OLEDs, a response time of around 0.1 ms. It would be the perfect panel for everything.

As a side note:

QLED and microLED were Sony innovations first, not Samsung's, LG's, TCL's, or anyone else's.

Sony will always be the leader in AV.

http://www.trustedreviews.com/reviews/sony-crystal-led-tv-first-look

In 2012 Sony demonstrated the first micro-LED TV (55", Full HD), which it termed Crystal LED. Sony's Crystal LED never reached the market, but in 2016 the company unveiled its large-area outdoor micro-LED displays, which Sony calls Canvas Display.

Sony pushed quantum dot technology in 2013 (Triluminos Display), which Samsung has been adopting in its high-end TVs since 2016 (the KS days and now QLED) and in its gaming monitors (HG90, HG70, FG70), along with LG's Nano Cell TVs like the SJ8500, and gaming monitors from Asus like the PG35VQ and PG27UQ.

https://www.theverge.com/2013/1/16/3881546/sonys-new-triluminous-tvs-pursue-vibrant-hues-with-quantum-dots


OLED color reproduction is the most accurate, with infinite contrast and real blacks, and LG's OLEDs have the best HDR color tone mapping. However, whether it's LG's or the upcoming OLEDs, the technology itself has some serious drawbacks that are more than likely to happen:

Image Burning (Burn-In)
Image Retention

Also, not as serious as the two above but something that can affect HDR peak scene brightness: OLED struggles to cross 1000 nits.

I can tolerate LED plus efficient local dimming, with minimal blooming and flashlighting, no real blacks, and non-infinite contrast, over OLED's image retention and burn-in.


According to Rtings.com, which ran comprehensive real-life content testing on OLED TVs (https://www.rtings.com/tv/learn/real-life-oled-burn-in-test), and in every OLED TV review they give image retention / burn-in a score of zero.


For a real HDR experience you need:

At least 1000 nits of sustained brightness for a real HDR scene at a 100% window.

At most 0.05 nits for blacks (on LED this is achievable with local dimming; the more dimmable zones, the more effective it is). From 2017 onward many LED TVs nail that, so it's not an issue at all now; they are usually VA-panel TVs, thanks to their high native contrast versus IPS's low native contrast.

A native 10-bit color depth panel, not 8-bit + FRC, for proper display of color gradients and reproduction of real color combinations.

A wide color gamut with at least 90% coverage of the DCI-P3 color space, backed by proper color volume.
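To put those thresholds together, here is a minimal "is this real HDR?" check in Python; the cut-off values are the ones from the list above, not an official certification like VESA DisplayHDR, and the two example sets of numbers are purely illustrative:

def is_real_hdr(sustained_nits_100pct, black_level_nits, native_bit_depth, dci_p3_coverage):
    # Thresholds taken from the checklist above.
    contrast = sustained_nits_100pct / max(black_level_nits, 1e-6)
    ok = (sustained_nits_100pct >= 1000    # >= 1000 nits sustained at a 100% window
          and black_level_nits <= 0.05     # <= 0.05 nits blacks (needs local dimming on LCD)
          and native_bit_depth >= 10       # native 10-bit, not 8-bit + FRC
          and dci_p3_coverage >= 0.90)     # >= 90% DCI-P3 coverage
    return ok, contrast

print(is_real_hdr(1100, 0.04, 10, 0.94))   # good FALD VA set: True, contrast ~27500:1
print(is_real_hdr(450, 0.30, 8, 0.80))     # edge-lit 8-bit set: False, contrast ~1500:1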
Tech Guru wrote
New OLEDs strobe static pixels, and Samsung had a demo back in 2015 achieving 15k hours without any issues. If you intentionally run the flicker test you mentioned from the reviews, then yes, of course it will happen, even on LCD if you leave it on for a month; ever notice how shop LCD displays burn in?

I'd rather deal with an 8-year-old OLED with slight burn-in than with dirty screen effect, uneven backlight bleeding, uneven brightness distribution, and the jelly effect caused by the LCD differential bias changing with room temperature; Sharp IGZO panels use a temperature sensor for this, but very few panels even consider the jelly/tilt effect. Not to mention you get the advantages of 90 Hz+, beyond just input lag, and smoother, lower blur than LCD. Only 240 Hz WITH backlight strobing and black frame insertion (resulting in 120) can get close to OLED, and even then you lose brightness and color accuracy.
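On the strobing / black frame insertion point, the usual rule of thumb is that perceived motion blur is roughly persistence times scroll speed; a small sketch with illustrative numbers (960 px/s is just a typical test-pattern panning speed):

def blur_px(persistence_ms, speed_px_per_s=960):
    # Perceived blur (in pixels) ~= how long each frame stays lit * how fast content moves.
    return persistence_ms / 1000 * speed_px_per_s

print(blur_px(1000 / 60))    # 60 Hz sample-and-hold: ~16 px of smear
print(blur_px(1000 / 240))   # 240 Hz sample-and-hold: ~4 px
print(blur_px(1.0))          # 240 Hz + strobing/BFI with ~1 ms pulses: ~1 px, at the cost of brightness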

I'll add that Samsung said their Galaxy S3 panel would only last about 4 years, which real-life tests and reports confirmed; that in turn shows how well their lab tests hold up outside the lab.
Elitism Guru wrote
My experience with OLED was kind of sad, away from the numbers and technical-website testing: practical ownership of a 65" LG OLED C6. It has breathtaking color reproduction, real inky blacks, and great screen uniformity.

However, for gaming:

Static images (health bars, stats, in-game maps, the out-of-game main menu)

&

peak real-scene HDR brightness of ~666 nits in LG's case

made this set fall into burn-in / image retention after 6 months of use as I shift from one type of content to another. LG's 2018 models did introduce 'Pixel Refresher' and 'Screen Shift' options to reduce this innate susceptibility of OLEDs, but the risk still scales with the kind of content displayed and the hours of operation per day.

The Sony X930E is doing great now: a uniform experience overall, without any risk. MicroLED with HDMI 2.1 and 8K will definitely be my next upgrade.
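For what it's worth, the 'Screen Shift' style of mitigation is basically a pixel orbiter: the whole frame gets nudged by a pixel or two on a slow cycle so static HUD edges don't sit on the same subpixels for hours. A toy sketch of the general idea (not LG's actual firmware):

import itertools, time

ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

def screen_shift_offsets(period_s=180):
    # Yields a small (dx, dy) offset the compositor applies to the whole frame,
    # moving only every few minutes so the shift is not noticeable to the viewer.
    for dx, dy in itertools.cycle(ORBIT):
        yield dx, dy
        time.sleep(period_s)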
Tech Guru wrote
I see; I forgot about how each vendor implements mitigations against these issues, just like how the tilt effect remains unresolved in most LCDs except when you look at 80+ inch TVs.
MicroLED will only be cheap and reasonable at 4K/8K minimum for large displays; AFAIK the smaller the pixel pitch, the more expensive and slower it is to manufacture.
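The cost argument is easier to see with raw numbers: every pixel is three discrete LEDs that have to be mass-transferred onto the backplane, and the pitch shrinks fast at smaller screen sizes. A rough calculation, assuming a 16:9 panel (the sizes are just examples):

import math

def microled_stats(diag_inches, h_res, v_res):
    # Returns (number of individual LEDs to place, pixel pitch in mm).
    width_mm = diag_inches * 25.4 * 16 / math.hypot(16, 9)
    return h_res * v_res * 3, width_mm / h_res

print(microled_stats(55, 3840, 2160))    # ~24.9 million LEDs at a tight ~0.32 mm pitch
print(microled_stats(146, 3840, 2160))   # same LED count, but a much easier ~0.84 mm pitch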
I want to get an RTX 2080; unfortunately there is a sort of shortage of the known brands like MSI and Gigabyte, so the most-sold brand now is Zotac, and in the Zotac line the basic 2080 is at $770 VAT included while the AMP is at $850!!
Prices are completely nuts.
I have found a Manli Gallardo 2080, which is like the AMP, a bit too long for my case but even cheaper than the basic Zotac, though I have no clue how good these cards are.
Does anyone have good addresses for an MSI RTX 2080 in Lebanon?
IIRC Mojitech has the Gigabyte RTX 2080 in stock, you can try your luck with him.
You can also check whether NewVision has MSI in stock, since he's the official distributor.
Gigabyte is overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?
infiniteloop wrote Gigabyte is overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?
Any idea if it's possible to cross-flash the VBIOS on RTX cards, like on Pascal?
I did it for 3 client builds in the UAE, even on mini cards like the Zotac 1080 Ti Mini.
infiniteloop wrote Gigabyte is overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?
I would go with the Zotac: cheaper, and a good 5-year warranty from a well-known vendor. Plus, I think both use stock PCBs.
I finally returned the Zotac and took a Gigabyte Gaming OC instead; the shop gave me a good price on it, so the difference is not huge.
A New Rendering "Fake Real Time Ray Tracing" software mimick that doesnot need RT cores & works on Direct X 11 - Even top dogs from last gen

1080ti (Pascal)
Radeon VII (GCN 5.0)

Fails to Deliver

Sell Them ASAP :)


https://youtu.be/efOR92n9mms
Tech Guru wrote A new "fake real-time ray tracing" rendering technique: a software mimic that does not need RT cores and works on DirectX 11.

Nothing beats hardware-based acceleration, but I wouldn't call it "fake real-time ray tracing".
kareem_nasser wrote
Nothing beats hardware-based acceleration, but I wouldn't call it "fake real-time ray tracing".
To me it is fake:

They used canned cube maps, like old rasterization. SVOGI traces cones, not rays, and has its own limitations; that's why it's considered a different thing from ray tracing. CryEngine does lighting and reflections with voxels (SVOGI). Voxels have been researched and used in other methods as well, like VXAO or VXGI. The way the voxel data structures are built and handled puts them halfway toward ray tracing in principle, but it has many limitations compared to the real-time ray tracing seen in:

BFV - reflections
Metro Exodus - global illumination
Shadow of the Tomb Raider & Call of Duty: Modern Warfare
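To make the distinction concrete, here is the rough shape of the two loops in toy Python (illustrative only, not CryEngine SVOGI or DXR code): cone tracing marches a widening cone through prefiltered voxel data, while ray tracing intersects an exact ray against the scene geometry.

def cone_trace(sample_voxels, max_dist, aperture=0.6):
    # SVOGI-style: accumulate prefiltered radiance along a cone whose footprint
    # (and therefore sampling coarseness) grows with distance. Accuracy is bounded
    # by the voxel grid, so it is not true per-ray visibility.
    radiance, occlusion, dist = 0.0, 0.0, 0.1
    while dist < max_dist and occlusion < 1.0:
        radius = aperture * dist
        r, a = sample_voxels(dist, radius)     # coarser voxel mip the wider the cone
        radiance += (1.0 - occlusion) * r
        occlusion += (1.0 - occlusion) * a
        dist += max(radius, 0.1)               # step size grows with the cone
    return radiance

def ray_trace(closest_hit, shade, origin, direction):
    # DXR/RTX-style: find the exact nearest hit against the geometry (the BVH
    # traversal is what the RT cores accelerate) and shade that point.
    hit = closest_hit(origin, direction)
    return shade(hit) if hit is not None else 0.0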
Tech Guru wrote
The 3 examples you gave are normal for a tech still in its infancy in real-time rendering, which is why it is still mainly applied to reflections and illumination. Fully ray-traced game rendering will take years if not a decade (the typical generational TFLOPS increases are declining). The next generation of consoles, given that they will include some form of ray tracing hardware, will give the tech a boost on the PC market as well, since cross-platform development is a thing and some AAA titles are primarily developed on consoles, along with their rendering technology and toolkits.
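On the TFLOPS point, the usual first-order estimate is FP32 throughput ~ shader count x 2 ops (FMA) x clock; plugging in Nvidia's published reference boost clocks shows how modest the last jump was (rounded reference specs):

def fp32_tflops(cuda_cores, boost_mhz):
    # Peak FP32 = cores * 2 operations per FMA * clock in Hz.
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

print(f"GTX 1080 Ti: {fp32_tflops(3584, 1582):.1f} TFLOPS")   # ~11.3
print(f"RTX 2080 Ti: {fp32_tflops(4352, 1545):.1f} TFLOPS")   # ~13.4, only ~19% more raw FP32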
I will not go into details, but I am very sure that by now we all know ray tracing was only a gimmick we didn't need and that Nvidia forced us to pay for it; the number of games that really adopted it is very low, plus the performance impact is stupid.
anayman_k7 wrote I will not go into details, but I am very sure that by now we all know ray tracing was only a gimmick we didn't need and that Nvidia forced us to pay for it; the number of games that really adopted it is very low, plus the performance impact is stupid.
The 2080 Ti is serving me well in RTX-enabled games and nails it. CoD Modern Warfare with RTX on runs at ~60 fps with SMAA T2x Ultra at 2160p on the 2080 Ti. RTX alone enables things that RDR2 does not include and destroys all the old canned rasterization methods of rendering; since it is real-time rendering, not baked rendering, it is very taxing, and it still holds 60 fps at 2160p. CoD is very well optimised, with a new engine that scales very well.

RTX + an optimized new engine destroys everything RDR2 offers. They should have moved away from the old rendering techniques.

Even an RTX 2070 / RTX 2070 Super can push decent frames at 1080p and 1440p with RT enabled. I do not want to repeat myself, but you need to grab an RTX 2080 Ti, or an RTX 2070 if you have a limited budget, to evaluate objectively what real-time ray tracing is all about. The whole gaming industry is shifting to real-time rendering (including AMD with Navi's successor and the next-gen consoles), away from the old canned/baked rasterization techniques.

Saying " a gimmick" is a bit naive of not knowing what all Real Time Ray Tracing is all about. It is not an Nvidia Gamework feature like HBAO+ , Grass Turf , Hairwork , PCSS , HFTS etc.. Now what was done at Holywood movies of ray tracing as a rendering technique for generating an image by tracing the path of light as pixels in an image plane and simulating the effects of its encounters with virtual objects , that need a lot of compuational power - now it is available to the gamer end. You can check all Digital Foundry Analysis.