LebGeeks

A community for technology geeks in Lebanon.


#176 February 23 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

True AI / neural networks / deep learning: the power of the RTX line's Tensor Cores.

Metro Exodus' DLSS Receives A Massive Quality Boost - Graphical Quality Tested:

"Nvidia's promise with DLSS was that the quality of images would improve over time, with Nvidia constantly finding new data to add to their deep learning program. More data that their supercomputer can crunch through to create better versions of their algorithm, both in terms of performance and image quality.

Thanks to Metro Exodus' latest patch, in conjunction with Nvidia's latest drivers, the quality of DLSS within the game has increased significantly, so much so that we can actually recommend using the feature now. This increase in graphical fidelity bodes well for the future of DLSS".



https://www.overclock3d.net/news/softwa … _tracing/1

Last edited by Tech Guru (February 23 2019)

Offline

#177 February 24 2019

ManOwaRR
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Metro Exodus is the best game of 2019. I rarely replay a single-player game after I finish it, but this one is so good that I'm playing it again after the new patch, now at 4K with RTX ultra + DLSS on. Pretty impressive.

Offline

#178 February 25 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Metro Exodus' latest patch improved DLSS image quality a lot.
So if "Deep Learning Super Sampling", using Nvidia's Saturn V supercomputers, is meant to learn and get better over time, I wonder whether this game will receive a couple more DLSS patches in the future; maybe at some point the difference in image quality will be barely noticeable. The RTX series is solid, but some people ride the hate wagon and mask that.

Offline

#179 March 8 2019

bobo619
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Does anyone know how one can check whether the Asus ROG RTX 2080 Ti has Micron or Samsung VRAM without installing it and checking GPU-Z? I found that Mojitech has it in stock, and I'm willing to buy it if and only if it has Samsung VRAM: https://mojitech.net/product/graphic-ca … 1g-gaming/. Also, does anyone know where I can find high-end RAM (3600 MHz+ with good latency) in Lebanon?

Offline

#180 March 9 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

bobo619 wrote:

Does anyone know how one can check whether the Asus ROG RTX 2080 Ti has Micron or Samsung VRAM without installing it and checking GPU-Z? I found that Mojitech has it in stock, and I'm willing to buy it if and only if it has Samsung VRAM: https://mojitech.net/product/graphic-ca … 1g-gaming/. Also, does anyone know where I can find high-end RAM (3600 MHz+ with good latency) in Lebanon?

Sadly, you cannot check without GPU-Z. Mojitech listed them on their website, but they have no actual stock. Generally, batches after November 2018 switched from Micron to Samsung memory across the Founders Edition and many third-party brands.

I can get you the ASUS ROG-STRIX-RTX2080TI-O11G-GAMING (ROG Strix GeForce RTX 2080 Ti Overclocked 11G GDDR6, HDMI, DP 1.4, USB Type-C graphics card) with a full 3-year warranty.

For the RAM, I can get you the Trident Z Royal edition, starting from 3600 MHz and up.

Last edited by Tech Guru (March 9 2019)

Offline

#181 March 28 2019

mmk92
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

The Omen X Emperium from HP promises HDR 1000, G-Sync, ultra-low latency, and a 120 Hz refresh rate on a 50-inch TV!
Finally a TV for enthusiasts!
This will surely make the 2080 Ti sweat!

Offline

#182 March 29 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

mmk92 wrote:

The Omen X Emperium from HP promises HDR 1000, G-Sync, ultra-low latency, and a 120 Hz refresh rate on a 50-inch TV!
Finally a TV for enthusiasts!
This will surely make the 2080 Ti sweat!

Yep, I've been following this. Nvidia introduced these last year as "Big Format Gaming Displays" (BFGDs). They start at 65", and they are gaming monitors, not TVs (a monitor with a built-in streamer, the Nvidia Shield).

The ROG Swift PG65 from Asus will launch soon, and other monitor manufacturers will follow.

The Omen X Emperium has one major pitfall: it uses an 8-bit panel + FRC to approximate a 10-bit panel, while one of the core requirements of HDR 10 is a native 10-bit color depth panel.

Hopefully they will start rolling out native 10-bit panels on high-end HDR gaming monitors (even the PG27UQ and the Predator X27 are 8-bit + FRC, i.e. dithering).

An 8-bit image means there are two to the power of eight shades for each of red, green, and blue, i.e. 256 values per channel. Combining the channels gives 256 x 256 x 256 possible combinations, roughly 16.7 million colors. A 10-bit image can display 1,024 shades per channel, or over a billion combinations. The 1,024 shades of a true 10-bit panel let it display the HDR 10 / Rec. 2020 and DCI-P3 color spaces properly, without the color banding and washed-out colors that crush detail in bright or dark HDR areas.
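For reference, here is a minimal sketch of the arithmetic above (my own illustration, not part of the original post):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Shades per channel for a given bit depth: 2^bits.
    for (unsigned bits : {8u, 10u}) {
        std::uint64_t perChannel = 1ULL << bits;                    // 256 / 1024
        std::uint64_t total = perChannel * perChannel * perChannel; // R * G * B
        std::printf("%2u-bit panel: %llu shades per channel, %llu colors\n",
                    bits,
                    static_cast<unsigned long long>(perChannel),
                    static_cast<unsigned long long>(total));
    }
    return 0;
}
// Output:
//  8-bit panel: 256 shades per channel, 16777216 colors
// 10-bit panel: 1024 shades per channel, 1073741824 colors
```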


As a side note, I sold my Asus ROG PG348Q.

Next week I will order the Alienware AW3418DW from Amazon as a replacement; it is currently the best ultrawide WQHD gaming monitor.

http://i.imgur.com/ZYHLlQI.jpg
http://i.imgur.com/wfOXq0n.jpg

Once the Asus ROG PG35VQ is released, I will upgrade again.

Offline

#183 March 29 2019

PowerPC
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

I have this monitor, and it's great, but I think LG came out with a slightly better one: Nano Cell tech + 120 Hz native.

Offline

#184 March 29 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

PowerPC wrote:

I have this monitor, and it's great, but I think LG came out with a slightly better one: Nano Cell tech + 120 Hz native.

Oh really? Where did you get it, and for how much? I am thinking of ordering from Amazon next week; delivered to my door sealed, it will be around ~1430 USD.

Should I pull the trigger or wait for the PG35VQ release? There is no ETA, and they have been pushing the release date back for a year and a half; I believe they are waiting for Computex 2019 at the end of May.


https://youtu.be/rvq4GT2qhS0

You are referring to the LG 34GK950G (G-Sync @ 120 Hz); the LG 34GK950F variant is FreeSync 2 with HDR and 144 Hz.

Last edited by Tech Guru (March 29 2019)

Offline

#185 March 29 2019

PowerPC
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

I'm UK-based and got it directly from their UK website; I paid around 1000 USD back in October.
I've heard rumors that the 35VQ has been cancelled due to manufacturing cost. We haven't heard anything about it in a while.

Yes, those are the LG monitors I'm referring to, with native 120 Hz, whereas the AW needs to be overclocked to hit that refresh rate.

Offline

#186 March 29 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

PowerPC wrote:

I'm UK-based and got it directly from their UK website; I paid around 1000 USD back in October.
I've heard rumors that the 35VQ has been cancelled due to manufacturing cost. We haven't heard anything about it in a while.

Yes, those are the LG monitors I'm referring to, with native 120 Hz, whereas the AW needs to be overclocked to hit that refresh rate.

In your experience, does overclocking the AW from its native 100 Hz to 120 Hz have any noticeable effect in games (any apparent horizontal or vertical scan lines)? When overclocking the PG348Q from its native 60 Hz to 100 Hz, I noticed some lines, though they were not very apparent unless you got very close to the monitor.

For the PG35VQ, check the recently released YouTube video above, and check out some tweets too.

https://twitter.com/hashtag/pg35vq?s=08

Choose "Latest Tweets".

I am in a dilemma now: should I pull the trigger on the AW? Please advise.

Last edited by Tech Guru (March 29 2019)

Offline

#187 March 30 2019

PowerPC
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

I don't think the AW will be a huge upgrade over your PG. I went from 27" 60 Hz to ultrawide 120 Hz, and that was a great change.
If you go AW you won't notice the extra 20 Hz, and the ultrawide experience is the same, so in my opinion it's not a worthwhile upgrade.
Better to wait for new tech like microLED or higher-resolution ultrawides (with high refresh rates), HDR 1000, etc. My 2 cents.

Offline

#188 April 16 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Real-time ray tracing performance is truly abysmal on non-RTX cards.

Digital Foundry:
Pascal Ray Tracing Tested! GTX 1080 Ti vs RTX 2080/ RTX 2060 + More

https://youtu.be/TkY-20kdXl0

Any graphics card could already do RT before this announcement, thanks to DXR, the DirectX 12 extension from March 2018.

It is just far worse to have the CUDA cores alone do rasterization and ray tracing calculations simultaneously, which is very taxing; that's why Nvidia added dedicated RT cores to handle the ray tracing calculations.

Claiming that RTX cards are not worth it based on such announcements is clickbait.

Offline

#189 April 16 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:

Real-time ray tracing performance is truly abysmal on non-RTX cards.

Digital Foundry:
Pascal Ray Tracing Tested! GTX 1080 Ti vs RTX 2080/ RTX 2060 + More

https://youtu.be/TkY-20kdXl0

Any graphics card could already do RT before this announcement, thanks to DXR, the DirectX 12 extension from March 2018.

It is just far worse to have the CUDA cores alone do rasterization and ray tracing calculations simultaneously, which is very taxing; that's why Nvidia added dedicated RT cores to handle the ray tracing calculations.

Claiming that RTX cards are not worth it based on such announcements is clickbait.

No one claimed such a thing; it is probably suited for accelerated (offscreen) rendering and non-real-time applications.

Offline

#190 April 16 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Elitism Guru wrote:
Tech Guru wrote:

Real-time ray tracing performance is truly abysmal on non-RTX cards.

Digital Foundry:
Pascal Ray Tracing Tested! GTX 1080 Ti vs RTX 2080/ RTX 2060 + More

https://youtu.be/TkY-20kdXl0

Any graphics card could already do RT before this announcement, thanks to DXR, the DirectX 12 extension from March 2018.

It is just far worse to have the CUDA cores alone do rasterization and ray tracing calculations simultaneously, which is very taxing; that's why Nvidia added dedicated RT cores to handle the ray tracing calculations.

Claiming that RTX cards are not worth it based on such announcements is clickbait.

No one claimed such a thing; it is probably suited for accelerated (offscreen) rendering and non-real-time applications.

Let me summarize a few things, mate:


1st:

Don't applaud this as if it were some brand-new announcement (it isn't); that reaction comes either from blind hate for Turing or from technically illiterate people who lack tech awareness (not you):

DXR is part of the DirectX 12 API feature set and is not hardware-limited; any graphics card with CUDA cores (Nvidia) or stream processors (AMD) can support ray tracing. This extension was announced back in March 2018.

https://www.anandtech.com/show/12547/ex … raytracing
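Since DXR support is exposed at the API level, you can query it directly. A minimal sketch (my own illustration, assuming a Windows 10 SDK 1809+ toolchain and linking d3d12.lib) that asks Direct3D 12 whether the installed GPU/driver exposes DXR:

```cpp
#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }
    // OPTIONS5 carries the RaytracingTier capability field.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        // Tier 1.0+ means the driver exposes DXR. On GTX (Pascal) cards this
        // is a software path on the CUDA cores; on RTX it is backed by RT cores.
        std::puts(opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0
                      ? "DXR supported"
                      : "DXR not supported");
    }
    return 0;
}
```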

For a performance impact analysis of the 1080 Ti (doing RT calculations solely on CUDA cores) vs the 2080 (with RT cores), Nvidia recently released comprehensive charts in conjunction with this driver update for GTX cards, to set expectations for the coming performance impact:

https://www.nvidia.com/en-us/geforce/ne … ming-soon/

The RT cores in the RTX series take over the RT calculations in a kind of hybrid rendering method, leaving the CUDA cores free for raw rasterization.

If you read it carefully, RT calculated on traditional CUDA cores has a much more pronounced performance impact without dedicated RT cores to help them.

2nd

In the end, RT is a feature: an RTX 2080 performs far better than a 1080 Ti in RT and equal or better in non-RT raw rasterization, so the 2080 is definitely the better choice.

Wasn't the Titan V tested in January, and wasn't it able to do RT without RT cores?

"The Titan V is probably this capable in ray tracing even without RT cores due to its sheer power. It has more shader units than the Titan RTX (5120 versus 4608), for instance. However, Battlefield V only features ray traced reflections. Other games will soon ship with several raytraced features, such as Metro Exodus (RT-powered GI and ambient occlusion), Control (RT-powered reflections, diffuse GI and contact shadows), Atomic Heart (RT-powered shadows, ambient occlusion and reflections) and more.

It’s possible that the Titan V will lose more ground in those scenarios, though we’ll only know for sure once these new titles will be available for benchmarking. Until then, stay tuned." -Wccftech

https://wccftech.com/titan-v-high-fps-b … eld-v-rtx/

But look at the performance gap when RT is calculated solely on CUDA cores.

Finally, for me, the RTX 2080 Ti is the first graphics card that decently handles 2160p Ultra across all AAA titles. RT aside, its raw rasterization was my first motivator to get rid of the 1080 Ti, with its sluggish 2160p performance (minimums around ~40 fps), not to mention HDR 10 latency.

Performance will be dreadful on the 10 series with no dedicated RT cores. Look on YouTube: many people tested RT on 1080 Tis well before this announcement.

~60 fps at 2160p in Metro Exodus (Ultra settings, RT High) on a single RTX 2080 Ti without DLSS: I bet a 1080 Ti won't do that even at low image quality.

It is not a marketing trap; it is a trap only for the technically illiterate. Any sound buyer could know that RT support has been part of the DXR DirectX 12 feature set since long before these "announcements". My card, say a 1080 or an RX 580, can support RT, but RT calculations are very taxing, so should I buy a card with dedicated RT cores or not?

None of this is about choosing between Nvidia and AMD products; it is all about having tech awareness.

CUDA cores + RT cores = much better RT calculations.

Stream processors + RT cores = much better RT calculations.

Whether Nvidia or AMD.

DXR is part of the DirectX 12 feature set, an extension added in March 2018, long before Turing was announced.

The buyer's decision: yes, my 1080 Ti supports RT, so shall I keep it and run RT on the CUDA cores, or buy hardware with dedicated RT cores?

Offline

#191 April 16 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
Elitism Guru wrote:
Tech Guru wrote:

Real-time ray tracing performance is truly abysmal on non-RTX cards.

Digital Foundry:
Pascal Ray Tracing Tested! GTX 1080 Ti vs RTX 2080/ RTX 2060 + More

https://youtu.be/TkY-20kdXl0

Any graphics card could already do RT before this announcement, thanks to DXR, the DirectX 12 extension from March 2018.

It is just far worse to have the CUDA cores alone do rasterization and ray tracing calculations simultaneously, which is very taxing; that's why Nvidia added dedicated RT cores to handle the ray tracing calculations.

Claiming that RTX cards are not worth it based on such announcements is clickbait.

No one claimed such a thing; it is probably suited for accelerated (offscreen) rendering and non-real-time applications.

Let me summarize a few things, mate:


1st:

Don't applaud this as if it were some brand-new announcement (it isn't); that reaction comes either from blind hate for Turing or from technically illiterate people who lack tech awareness (not you):

DXR is part of the DirectX 12 API feature set and is not hardware-limited; any graphics card with CUDA cores (Nvidia) or stream processors (AMD) can support ray tracing. This extension was announced back in March 2018.

https://www.anandtech.com/show/12547/ex … raytracing

For a performance impact analysis of the 1080 Ti (doing RT calculations solely on CUDA cores) vs the 2080 (with RT cores), Nvidia recently released comprehensive charts in conjunction with this driver update for GTX cards, to set expectations for the coming performance impact:

https://www.nvidia.com/en-us/geforce/ne … ming-soon/

The RT cores in the RTX series take over the RT calculations in a kind of hybrid rendering method, leaving the CUDA cores free for raw rasterization.

If you read it carefully, RT calculated on traditional CUDA cores has a much more pronounced performance impact without dedicated RT cores to help them.

2nd

In the end, RT is a feature: an RTX 2080 performs far better than a 1080 Ti in RT and equal or better in non-RT raw rasterization, so the 2080 is definitely the better choice.

Wasn't the Titan V tested in January, and wasn't it able to do RT without RT cores?

"The Titan V is probably this capable in ray tracing even without RT cores due to its sheer power. It has more shader units than the Titan RTX (5120 versus 4608), for instance. However, Battlefield V only features ray traced reflections. Other games will soon ship with several raytraced features, such as Metro Exodus (RT-powered GI and ambient occlusion), Control (RT-powered reflections, diffuse GI and contact shadows), Atomic Heart (RT-powered shadows, ambient occlusion and reflections) and more.

It’s possible that the Titan V will lose more ground in those scenarios, though we’ll only know for sure once these new titles will be available for benchmarking. Until then, stay tuned." -Wccftech

https://wccftech.com/titan-v-high-fps-b … eld-v-rtx/

But look at the performance gap when RT is calculated solely on CUDA cores.

Finally, for me, the RTX 2080 Ti is the first graphics card that decently handles 2160p Ultra across all AAA titles. RT aside, its raw rasterization was my first motivator to get rid of the 1080 Ti, with its sluggish 2160p performance (minimums around ~40 fps), not to mention HDR 10 latency.

Performance will be dreadful on the 10 series with no dedicated RT cores. Look on YouTube: many people tested RT on 1080 Tis well before this announcement.

~60 fps at 2160p in Metro Exodus (Ultra settings, RT High) on a single RTX 2080 Ti without DLSS: I bet a 1080 Ti won't do that even at low image quality.

It is not a marketing trap; it is a trap only for the technically illiterate. Any sound buyer could know that RT support has been part of the DXR DirectX 12 feature set since long before these "announcements". My card, say a 1080 or an RX 580, can support RT, but RT calculations are very taxing, so should I buy a card with dedicated RT cores or not?

None of this is about choosing between Nvidia and AMD products; it is all about having tech awareness.

CUDA cores + RT cores = much better RT calculations.

Stream processors + RT cores = much better RT calculations.

Whether Nvidia or AMD.

DXR is part of the DirectX 12 feature set, an extension added in March 2018, long before Turing was announced.

The buyer's decision: yes, my 1080 Ti supports RT, so shall I keep it and run RT on the CUDA cores, or buy hardware with dedicated RT cores?

Please don't take any offense; no one said GTX cards are superior or even advisable for RT, even when I said offscreen. It's just a brief description of Nvidia's intentions, and it is useful for existing owners doing offline renders.

It's the same recurring argument that you seem to use; take this example:
-I do video editing
-Export rendering of the finalized video should take at most 4 hours
-I own a circa-2016 quad-core processor; it still does the job in under 4 hours
-I could upgrade my setup and cut that to an hour
-Again, my requirement is met: the render takes less than 4 hours
-If my requirement drops below 4 hours, I will upgrade
Life is much simpler that way.
15 fps for a render isn't bad at all, for example, for VFX artists using Pascal.

Thank you for the clarification though; a good tl;dr without the need to visit AnandTech ;)

Last edited by Elitism Guru (April 16 2019)

Offline

#192 April 16 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:
mmk92 wrote:

The Omen X Emperium from HP promises HDR 1000, G-Sync, ultra-low latency, and a 120 Hz refresh rate on a 50-inch TV!
Finally a TV for enthusiasts!
This will surely make the 2080 Ti sweat!

Yep, I've been following this. Nvidia introduced these last year as "Big Format Gaming Displays" (BFGDs). They start at 65", and they are gaming monitors, not TVs (a monitor with a built-in streamer, the Nvidia Shield).

The ROG Swift PG65 from Asus will launch soon, and other monitor manufacturers will follow.

The Omen X Emperium has one major pitfall: it uses an 8-bit panel + FRC to approximate a 10-bit panel, while one of the core requirements of HDR 10 is a native 10-bit color depth panel.

Hopefully they will start rolling out native 10-bit panels on high-end HDR gaming monitors (even the PG27UQ and the Predator X27 are 8-bit + FRC, i.e. dithering).

An 8-bit image means there are two to the power of eight shades for each of red, green, and blue, i.e. 256 values per channel. Combining the channels gives 256 x 256 x 256 possible combinations, roughly 16.7 million colors. A 10-bit image can display 1,024 shades per channel, or over a billion combinations. The 1,024 shades of a true 10-bit panel let it display the HDR 10 / Rec. 2020 and DCI-P3 color spaces properly, without the color banding and washed-out colors that crush detail in bright or dark HDR areas.


As a side note, I sold my Asus ROG PG348Q.

Next week I will order the Alienware AW3418DW from Amazon as a replacement; it is currently the best ultrawide WQHD gaming monitor.

http://i.imgur.com/ZYHLlQI.jpg
http://i.imgur.com/wfOXq0n.jpg

Once the Asus ROG PG35VQ is released, I will upgrade again.

Temporal dithering is only an issue at low resolutions/pixel densities. It works by alternating the channel value shown on a subpixel from frame to frame (alternating color); the only difference between dithered 8-bit and native 10-bit at low PPI is gamma persistence, especially in HDR.
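A toy sketch of the idea (my own illustration, not from the post above): 2-bit FRC approximates a 10-bit level by alternating the two nearest 8-bit levels over a 4-frame cycle, so the temporal average the eye perceives lands on the target:

```cpp
#include <cstdio>

int main() {
    const int target10 = 517;          // desired 10-bit level (0..1023)
    const int base8 = target10 >> 2;   // nearest-below 8-bit level (0..255)
    const int frac = target10 & 0x3;   // frames (out of 4) showing base8 + 1
    int sum = 0;
    for (int frame = 0; frame < 4; ++frame) {
        // Alternate between the two adjacent 8-bit levels.
        int out8 = base8 + (frame < frac ? 1 : 0);
        sum += out8;
        std::printf("frame %d -> 8-bit level %d\n", frame, out8);
    }
    // The eye averages the cycle; on the 10-bit scale that average is
    // exactly the target, since 4 * base8 + frac == target10.
    std::printf("perceived 10-bit level: %d (target %d)\n", sum, target10);
    return 0;
}
```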

Last edited by Elitism Guru (April 16 2019)

Offline

#193 April 17 2019

vlatkozelka
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

RTX cards are still not worth it. It means nothing to compare GTX with RTX in RT when RT itself is a gimmick at this point.

Offline

#194 April 17 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

vlatkozelka wrote:

RTX cards are still not worth it. It means nothing to compare GTX with RTX in RT when RT itself is a gimmick at this point.


It depends on each gamer's perspective. To me, real-time ray-traced global illumination plus real HDR 10 in Metro Exodus adds a level of realism not witnessed before. The two together, on a capable screen, create one of the most jaw-dropping images; you need to try it in person before drawing a final conclusion.

Also, real-time ray-traced shadows in Shadow of the Tomb Raider add very realistic depth, with shadows reacting in real time to objects and light, instead of the "baked" and "canned" shadows of regular rasterization.

Real-time ray tracing is a solid move for the gaming industry; it is something only Hollywood render farms used to do. Accordingly, the upcoming AMD Navi cards will support RT, and the upcoming PS5, which includes a custom Navi GPU, will support RT too. The whole gaming industry is moving toward it, away from pure rasterization rendering. It is not a gimmick at all. Yes, it has a performance impact due to the sheer amount of ray-bounce calculation, but with capable hardware you can enjoy it; even on an RTX 2070, a 500 USD card, you can enjoy it at 1080p without DLSS, or at 1440p with DLSS, both maxed out.

http://i.imgur.com/LMkxiO3.jpg

Last edited by Tech Guru (April 17 2019)

Offline

#195 April 17 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

It is definitely a good idea to sell a card when new tech comes in, before the used price of the predecessor drops. Just as people obsess over the difference between 1080p and 1440p from a meter away at 24 inches, the same goes for RT; otherwise, why doesn't everyone on this thread own a 1060 and run games at medium?

Offline

#196 April 17 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Elitism Guru wrote:

It is definitely a good idea to sell a card when new tech comes in, before the used price of the predecessor drops. Just as people obsess over the difference between 1080p and 1440p from a meter away at 24 inches, the same goes for RT; otherwise, why doesn't everyone on this thread own a 1060 and run games at medium?

RT is a totally new era of gaming realism, not just perceived pixel density or DPI.


"
It's not a gimmicky, flash-in-the-pan tech that will fail to gain a foothold and exit the conversation in under a year. It really is an important part of the future of games, of ensuring that the next generation of games look closer to reality than ever before, and being able to deliver it in real time really is a stunning innovation" .. Source: GamesRadar

Offline

#197 April 24 2019

Tech Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60 Ultra IQ + Ultra global illumination real-time ray tracing + HDR 10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.

Offline

#198 April 24 2019

Elitism Guru
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Tech Guru wrote:

https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60 Ultra IQ + Ultra global illumination real-time ray tracing + HDR 10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.

Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.

Offline

#199 April 25 2019

kareem_nasser
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

Elitism Guru wrote:
Tech Guru wrote:

https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60 Ultra IQ + Ultra global illumination real-time ray tracing + HDR 10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.

Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.

I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.

Offline

#200 April 25 2019

bobo619
Member

Re: RTX 2080ti - Nvidia "Turing" Upgrade

kareem_nasser wrote:
Elitism Guru wrote:
Tech Guru wrote:

https://youtu.be/YA0jhM-l7uo

Metro Exodus: 2160p60 Ultra IQ + Ultra global illumination real-time ray tracing + HDR 10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.

Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.

I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.

Your statement is true, but LCDs with full-array local dimming have come a long way and gotten close, though they still won't give you true black. The main drawback of OLED is burn-in, especially for gaming, where static elements such as maps and HUDs stay on screen for long stretches. Another fact is that OLED currently cannot deliver as high a peak brightness as LCD, at least in TVs; OLED also suffers in daytime viewing, and to truly enjoy it you must be in a dark or dimly lit room. The recommended peak brightness for true HDR right now is 1400+ nits, as more and more content is being mastered at that brightness and beyond. I have an 85-inch Sony X900F purchased about a year ago and I'm loving it, but it suffers from grey uniformity issues, being such a big panel. So yeah, IMO microLED is the way to go, but it's still too expensive right now; we'll have to wait 2-3 years for it to reach economies of scale and become affordable.

Last edited by bobo619 (April 25 2019)

Offline
