• Hardware
  • RTX 2080ti - Nvidia "Turing" Upgrade

vlatkozelka wrote: RTX cards are still not worth it. It means nothing to compare GTX with RTX in ray tracing when RT itself is a gimmick at this point.

It depends on each gamer's perspective. To me, real-time ray-traced global illumination plus real HDR10 in Metro Exodus adds a level of realism not witnessed before. Together on a capable screen, they create some of the most jaw-dropping images in gaming; you need to try it in person before reaching a final conclusion.

Also, real-time ray-traced shadows in Shadow of the Tomb Raider add very realistic depth, with shadows reacting in real time to objects and light instead of the "baked" and "canned" shadows of regular rasterization.

Real-time ray tracing is a solid move for the gaming industry, something only Hollywood render farms used to do. The upcoming AMD Navi cards will support RT, and the upcoming PS5, which includes a custom Navi GPU, will support it too. The whole gaming industry is moving away from rasterization-only rendering. It is not a gimmick at all. Yes, it has a performance impact due to the sheer number of bounced-ray calculations, but with capable hardware you can enjoy it: even on an RTX 2070, a $500 card, you can play at 1080p without any DLSS, or at 1440p with DLSS, both maxed out.
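To put rough numbers on that bounced-ray cost, here is a back-of-the-envelope sketch; the rays-per-pixel and bounce counts are invented for illustration, not figures from Metro Exodus or any real engine:

```python
# Back-of-the-envelope ray-budget arithmetic (illustrative assumptions only,
# not measured figures from any specific engine).

def rays_per_second(width, height, fps, rays_per_pixel, bounces):
    """Total ray segments traced per second for a simple ray-traced pass."""
    primary = width * height * rays_per_pixel
    # Each primary ray can spawn up to `bounces` secondary segments.
    return primary * (1 + bounces) * fps

native = rays_per_second(2560, 1440, 60, rays_per_pixel=2, bounces=2)
# DLSS quality mode renders internally at ~1/1.5 scale, then upscales.
dlss = rays_per_second(1707, 960, 60, rays_per_pixel=2, bounces=2)

print(f"Native 1440p60:          {native / 1e9:.2f} Gigarays/s")
print(f"DLSS-upscaled to 1440p:  {dlss / 1e9:.2f} Gigarays/s")
```

The point is simply that ray cost scales linearly with rendered pixels, which is why rendering internally at a lower resolution and upscaling with DLSS buys back so much performance.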

It is definitely a good idea to sell a card when new tech comes in, before the used price of its predecessor drops. People obsess over the difference between 1080p and 1440p from a meter away on a 24-inch screen; the same logic applies to RT. Otherwise, why doesn't everyone on this thread own a 1060 and run games at medium?
Elitism Guru wrote: It is definitely a good idea to sell a card when new tech comes in, before the used price of its predecessor drops…
RT is a totally new era of gaming realism, not just perceived pixel density or DPI.


"
It's not a gimmicky, flash-in-the-pan tech that will fail to gain a foothold and exit the conversation in under a year. It really is an important part of the future of games, of ensuring that the next generation of games look closer to reality than ever before, and being able to deliver it in real time really is a stunning innovation" .. Source: GamesRadar
7 days later
https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60 Ultra IQ + Ultra Global Illumination real-time ray tracing + HDR10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.
Tech Guru wrote: Metro Exodus: 2160p60 Ultra IQ + Ultra Global Illumination real-time ray tracing + HDR10 on a real HDR-capable screen = next level of realism…
Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.
Elitism Guru wrote: Would love to see this on OLED, especially since you have to lower brightness in dark rooms…
I guess we both know that no matter what PR terms they use, all LCD-based TV sets inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.
kareem_nasser wrote: I guess we both know that no matter what PR terms they use, all LCD-based TV sets inherently have bad contrast ratios…
Your statement is true, but LCDs with full-array local dimming have come a long way and gotten close; still, they won't give you true black. The main drawback of OLED is burn-in, especially in gaming, where static elements like maps and HUDs stay on screen for long stretches. Another fact is that OLED currently cannot deliver higher peak brightness than LCD, at least in TVs, and OLED also suffers during daytime: to truly enjoy it you must be in a dark to dimly lit room. The recommended peak brightness for true HDR right now is 1400+ nits, as more and more content is being mastered at that level and beyond. I have an 85-inch Sony X900F, purchased about a year ago, and I love it, but it struggles with grey uniformity at such a large panel size. So IMO microLED is the way to go, but it's still extra expensive right now; we'll have to wait 2-3 years for economies of scale to make it affordable.
bobo619 wrote: Your statement is true, but LCDs with full-array local dimming have come a long way…
LCD will always outlast OLED, but geeks talk numbers: OLED has managed at least 15 thousand hours before burn-in starts since ~2013, at least according to Sharp's MTBF tests; LCDs are around 30 thousand hours. So at 12 hours a day an OLED gives you roughly 3.5 years before burn-in starts; at a more typical 4 hours a day, around 10 years. An LCD will last longer, assuming you do board repair and backlight LED replacement. Either way, fanboys will always avoid grey areas, which is what keeps this myth alive no matter how much OLED improves.
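Spelling out that lifetime arithmetic (the panel-hour figures are the ones quoted above, not independently verified):

```python
# Years of use before burn-in onset, from a panel-hour rating and daily usage.
# 15,000 h (OLED) and 30,000 h (LCD) are the figures quoted above.

def years_to_burn_in(panel_hours: float, hours_per_day: float) -> float:
    return panel_hours / (hours_per_day * 365)

for hours_per_day in (4, 8, 12):
    oled = years_to_burn_in(15_000, hours_per_day)
    lcd = years_to_burn_in(30_000, hours_per_day)
    print(f"{hours_per_day:>2} h/day -> OLED ~{oled:.1f} years, LCD ~{lcd:.1f} years")
```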
kareem_nasser wrote: I guess we both know that no matter what PR terms they use, all LCD-based TV sets inherently have bad contrast ratios…
OLEDs s..k for gaming, as they are prone to burn-in. They aren't very bright either. OLED will be dead in about five years when microLED takes over: around 30% brighter, with no burn-in, paired with OLED-grade perfect blacks and contrast AND, like OLED, a response time of around 0.1 ms. It would be the perfect panel for everything.

As a side note:

QLED and microLED were Sony innovations first, not Samsung's, nor LG's, nor TCL's, nor anyone else's.

Sony will always be the leader in AV.

http://www.trustedreviews.com/reviews/sony-crystal-led-tv-first-look

In 2012 Sony demonstrated the first micro-LED TV (55", Full HD), which it termed Crystal LED. Sony's Crystal LED never reached the market, but in 2016 the company unveiled its large-area outdoor micro-LED displays, which Sony calls Canvas Display.

Sony pushed quantum-dot technology back in 2013 (Triluminos Display). Samsung has been adopting it in its high-end TVs since 2016 (the KS days, and now QLED) and in its monitors (HG90, HG70, FG70), along with LG's Nano Crystal TVs like the SJ8500 NanoCell and gaming monitors from ASUS like the PG35VQ and PG27UQ.

https://www.theverge.com/2013/1/16/3881546/sonys-new-triluminous-tvs-pursue-vibrant-hues-with-quantum-dots


OLED color reproduction is the most accurate, with infinite contrast and real blacks, and LG's OLEDs have the best HDR tone mapping. However, whether it's LG's or any upcoming OLED, the technology itself has some serious drawbacks that are more than likely to occur:

Image Burning (Burn-In)
Image Retention

Also, not as serious as the above two but affecting HDR peak scene brightness: OLED struggles to cross 1000 nits.

I can tolerate LED plus efficient local dimming, with its minimal blooming and flashlighting, no true blacks, and non-infinite contrast, over OLED's image retention and burn-in.


Rtings.com ran a comprehensive real-life content test on OLED TVs (https://www.rtings.com/tv/learn/real-life-oled-burn-in-test), and in every OLED TV review they give image retention / burn-in a score of zero.


For a real HDR experience you need (a sketch after this list turns the checklist into numbers):

At least 1000 nits of brightness in a real HDR scene at a 100% window, sustained.

At most 0.05 nits of black. On LED this is achievable with local dimming (the more dimmable zones, the more effective), and from 2017 on many LED TVs nail it, so it's no issue at all now; these are usually VA-panel TVs, thanks to their high native contrast versus the low native contrast of IPS.

A native 10-bit color depth panel, not 8-bit + FRC, for proper display of color gradients and true reproduction of color combinations.

A wide color gamut with at least 90% coverage of the DCI-P3 color space, backed by proper color volume as well.
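Note that 1000 nits of peak over 0.05 nits of black is a 20,000:1 effective contrast ratio, which is why high-native-contrast VA panels plus dimming zones can get there while IPS struggles. A minimal sketch of the checklist as code; the thresholds are the ones listed above, and the sample display specs are made up:

```python
# Check a display spec against the HDR checklist above.
# Thresholds come from the checklist; the sample specs below are invented.

def is_real_hdr(peak_nits, black_nits, native_bit_depth, dci_p3_coverage):
    contrast = peak_nits / black_nits if black_nits > 0 else float("inf")
    passes = (
        peak_nits >= 1000           # sustained 100%-window brightness
        and black_nits <= 0.05      # deep blacks => contrast >= 20,000:1
        and native_bit_depth >= 10  # native 10-bit, not 8-bit + FRC
        and dci_p3_coverage >= 0.90
    )
    return passes, contrast

# Hypothetical FALD VA TV: 1000 nits peak, 0.04 nits black, native 10-bit.
ok, contrast = is_real_hdr(1000, 0.04, 10, 0.95)
print(ok, f"{contrast:,.0f}:1")              # True 25,000:1

# An 8-bit + FRC edge-lit panel fails the checklist.
print(is_real_hdr(600, 0.10, 8, 0.88)[0])    # False
```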
Tech Guru wrote: OLEDs s..k for gaming, as they are prone to burn-in…
New OLEDs protect static pixels with shifting and dimming, and Samsung had a demo back in 2015 achieving 15k hours without any issues. If you intentionally run the flicker test mentioned in those reviews, then yes, of course it will happen; it would even happen on an LCD if you left it on for a month. Ever notice how shop-display LCDs burn in?

I'd rather deal with an 8-year-old OLED with slight burn-in than with dirty screen effect, uneven backlight bleed, uneven brightness distribution, and the jelly effect caused by the LCD differential bias changing with room temperature (Sharp IGZO panels use a temperature sensor for this, but very few panels even consider the jelly/tilt effect). Not to mention you get the advantage of 90 Hz+ beyond just input lag, and smoother motion than LCD blur. Only a 240 Hz LCD WITH backlight strobing and black frame insertion (effectively 120) can get close to OLED, and even then you lose brightness and color accuracy.
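The motion-clarity point can be made with simple persistence arithmetic: on a sample-and-hold display, perceived blur scales with how long each frame stays lit, and black frame insertion halves that hold time at the cost of brightness. A rough model; real strobing implementations vary:

```python
# Sample-and-hold persistence: each frame is held for 1000/refresh ms.
# Black frame insertion (BFI) halves the hold time, at the cost of brightness.

def persistence_ms(refresh_hz: float, bfi: bool = False) -> float:
    hold = 1000.0 / refresh_hz
    return hold / 2 if bfi else hold

for hz, bfi in [(60, False), (120, False), (240, False), (240, True)]:
    label = f"{hz} Hz" + (" + BFI" if bfi else "")
    print(f"{label:>12}: {persistence_ms(hz, bfi):.1f} ms held per frame")
```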

I'll add that Samsung said their Galaxy S3 panel would last only four years, which real-life tests and reports bore out, and which in turn shows how accurate their lab tests are outside the lab.
Elitism Guru wrote: New OLEDs protect static pixels with shifting and dimming, and Samsung had a demo back in 2015 achieving 15k hours without any issues…
My experience with OLED is kind of sad, away from the numbers and technical-site testing: practical ownership of a 65" LG OLED C6. It has breathtaking color reproduction, real inky blacks, and excellent screen uniformity.

However, for gaming:

Static images (health bars, stats, in-game maps, out-of-game main menus)

&

a peak HDR real-scene brightness of only ~666 nits in LG's case

made this set fall into burn-in / image retention after six months of use, as I shift from content to content. LG's 2018 models introduced 'Pixel Refresh' and 'Screen Shift' options to reduce susceptibility to this innate nature of OLED, but the risk still grows with the kind of content shown and the hours of operation per day.

The Sony X930e is doing great now: a uniform experience overall, without any risk. MicroLED with HDMI 2.1 and 8K will definitely be my next upgrade.
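For what it's worth, the 'Screen Shift' idea above is simple to sketch: periodically move the whole frame a pixel or two along a small orbit so static HUD elements don't always age the same subpixels. A toy illustration of the concept, not LG's actual firmware logic; the orbit path and period are invented:

```python
# Toy model of OLED "screen shift": orbit the frame around a small path of
# (dx, dy) offsets so static HUD pixels don't always light the same subpixels.
# Illustration of the concept only, not LG's actual implementation.

ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1),
         (-1, 0), (-1, -1), (0, -1), (1, -1)]

def shifted_position(x: int, y: int, minutes_elapsed: int, period_min: int = 5):
    """Where a nominally static pixel actually lands after `minutes_elapsed`."""
    dx, dy = ORBIT[(minutes_elapsed // period_min) % len(ORBIT)]
    return x + dx, y + dy

# A static HUD pixel at (100, 50) wanders over a 3x3 neighbourhood:
for t in range(0, 45, 5):
    print(t, shifted_position(100, 50, t))
```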
Tech Guru wrote: My experience with OLED is kind of sad, away from the numbers and technical-site testing…
I see; I forgot how each vendor implements mitigations against these issues, just like how the tilt effect remains unresolved in most LCDs outside of 80+ inch TVs.
MicroLED will only be cheap and reasonable at 4K/8K minimum for large displays; AFAIK the larger the pixels, the more expensive and slower they are to manufacture.
23 days later
9 days later
I want to get an RTX 2080. Unfortunately there is a shortage of the known brands like MSI and Gigabyte, so the most-sold brand now is Zotac, and in Zotac's line the basic 2080 is at $770 VAT included while the AMP is at $850!!
Prices are completely nuts.
I have found a Manli Gallardo 2080, which is like the AMP, a bit too long for my case but even cheaper than the basic Zotac, though I have no clue how good these cards are.
Does anyone have good addresses for the MSI RTX 2080 in Lebanon?
IIRC Mojitech has the Gigabyte RTX 2080 in stock, you can try your luck with him.
You can also check if NewVision has MSI in stock, since they're the official distributor.
Gigabyte is overpriced here, like +$200 compared to the UAE.
The advantage of ASUS is that their cards have two HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?
infiniteloop wrote: Gigabyte is overpriced here, like +$200 compared to the UAE…
Any idea if it's possible to cross-flash the VBIOS on RTX cards, like on Pascal? I did it for three client builds in the UAE, even on mini cards like the Zotac 1080 Ti Mini.
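For anyone attempting the same on Turing: the Pascal-era flow used NVIDIA's nvflash utility. A minimal sketch, assuming nvflash's commonly documented --save (back up) and -6 (override the subsystem-ID mismatch check) switches behave the same on RTX cards; verify that before flashing anything, as a bad flash can brick the card:

```python
import subprocess

# Sketch of a Pascal-style VBIOS cross-flash via nvflash (run as administrator).
# The --save and -6 switches are the commonly documented ones; confirm they
# behave identically on your Turing card and donor BIOS before flashing.

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["nvflash", "--save", "backup.rom"])   # 1. back up the current VBIOS first
run(["nvflash", "-6", "donor_bios.rom"])   # 2. flash the donor BIOS, overriding
                                           #    the PCI subsystem-ID mismatch check
```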
infiniteloop wrote: …It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; dunno if it's worth paying more for the MSI?
I would go with the Zotac: cheaper, and a good warranty (5 years) from a well-known vendor. Plus, I think both use the stock PCB.
I finally returned the Zotac and took a Gigabyte Gaming OC instead; the shop gave me a good price on it, so the difference is not huge.