Would love to see this on OLED, especially since you have to lower brightness in dark rooms for a more immersive experience; I have yet to see an HDR LCD that doesn't suck at low brightness.

Tech Guru wrote:
https://youtu.be/YA0jhM-l7uo
Metro Exodus: 2160p60 Ultra IQ + Ultra Global Illumination real-time ray tracing + HDR10 on a real HDR-capable screen (HDR 1000, proper local dimming, native 10-bit color depth panel) = next level of realism.
RTX 2080ti - Nvidia "Turing" Upgrade
I guess we both know that no matter what PR terms they use on any LCD-based TV set, they all inherently have bad contrast ratios. The transition to OLED, or at least to microLED/Crystal LED, is inevitable, and I would lean toward the former becoming the standard.
Your statement is true, but LCDs with full-array local dimming have come a long way and gotten close, though they still won't give you true black. The main drawback of OLED is burn-in, especially for gaming, where static images such as maps and HUD elements are displayed for long durations. Another fact is that OLED right now cannot deliver higher peak brightness than LCD, at least in TVs, and OLED also suffers during the daytime; to truly enjoy it you must be in a dark to dimly lit room. The recommended peak brightness to enjoy true HDR right now is 1400+ nits, as more and more content is being mastered at that brightness and beyond. I have an 85-inch Sony X900F purchased about a year ago and I'm loving it, but it suffers with grey uniformity, being such a big panel. So IMO micro-LED is the way to go, but it's still extremely expensive right now; it will take 2-3 years to reach economies of scale and become affordable.
LCD will always outlast OLED, but geeks talk numbers: OLED has offered at least 15 thousand hours before burn-in starts since around 2013, at least that's what Sharp's MTBF tests showed, while LCDs sit around 30 thousand hours. So if you use the OLED 12 hours a day, expect around 10 years before burn-in, and an LCD will last longer still, provided you do board repair and backlight LED replacement. Either way, fanboys will always avoid grey areas, which is what keeps this myth alive no matter how much OLED improves.
OLEDs s..k for gaming, as they are prone to burn-in, and they aren't very bright either. OLED will be dead in about 5 years when microLED takes over: around 30% brighter with no burn-in, paired with OLED-like perfect blacks and contrast, and, like OLED, a response time of around 0.1 ms. It would be the perfect panel for everything.
As a side note:
QLED and micro-LED were Sony innovations first, not Samsung's, LG's, or TCL's.
Sony will always be the leader in AV.
http://www.trustedreviews.com/reviews/sony-crystal-led-tv-first-look
In 2012 Sony demonstrated the first micro-LED TV (55", Full HD), which it termed Crystal LED. Sony's Crystal LED never reached the market, but in 2016 the company unveiled its large-area outdoor micro-LED displays, which Sony calls Canvas Display.
Sony pushed quantum-dot technology in 2013 (Triluminos Display), which Samsung has been adopting in its high-end TVs since 2016 (the KS series and now QLED) and in its gaming monitors (HG90, HG70, FG70), along with LG's nano-crystal TVs like the SJ8500 NanoCell and gaming monitors from Asus like the PG35VQ and PG27UQ.
https://www.theverge.com/2013/1/16/3881546/sonys-new-triluminous-tvs-pursue-vibrant-hues-with-quantum-dots
OLED color reproduction is the most accurate, with infinite contrast and real blacks, and LG's OLEDs have the best HDR color mapping. However, whether on LG's or upcoming OLEDs, the technology itself has some serious drawbacks that are more than likely to occur:
Image burning (burn-in)
Image retention
Also, not as serious as the above two but able to affect HDR peak scene brightness: OLED struggles to cross 1000 nits.
I can tolerate LED + efficient local dimming with minimal blooming, flashlighting, not-quite-real blacks, and non-infinite contrast over OLED's image retention and burn-in.
Rtings.com ran a comprehensive real-life content test on OLED TVs (https://www.rtings.com/tv/learn/real-life-oled-burn-in-test), and in every OLED TV review they give image retention / burn-in a score of zero.
For a real HDR experience you need (a rough sanity-check sketch follows this list):
At least 1000 nits of sustained brightness in a real HDR scene at a 100% window.
At most 0.05 nits of black (on LED this is achievable with local dimming; the more dimmable zones, the more effective it is). From 2017 on, many LED TVs nail that; it is no longer an issue with LED TVs, which are usually VA panels due to their high native contrast versus the low native contrast of IPS.
A native 10-bit color depth panel, not 8-bit + FRC, for proper display of color gradients and reproduction of real color combinations.
A wide color gamut with at least 90% coverage of the DCI-P3 color space, backed by proper color volume.
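Purely as an illustration of that checklist, here is a minimal Python sketch (with made-up example figures rather than measurements of any particular TV) that tests a panel spec against those thresholds and derives the implied contrast ratio and per-channel gradient steps:

# Hypothetical sanity check of a display spec against the HDR checklist above.
# The example figures are illustrative, not measurements of any real TV.

def check_hdr_spec(peak_nits, black_nits, bit_depth, dci_p3_coverage):
    """Return (criterion, passed) pairs for the thresholds listed above."""
    contrast = peak_nits / black_nits if black_nits > 0 else float("inf")
    shades = 2 ** bit_depth  # 10-bit -> 1024 steps per channel vs 256 on 8-bit
    print(f"implied contrast ~{contrast:,.0f}:1, {shades} shades per channel")
    return [
        ("sustained peak >= 1000 nits (100% window)", peak_nits >= 1000),
        ("black level <= 0.05 nits", black_nits <= 0.05),
        ("native bit depth >= 10 (no 8-bit + FRC)", bit_depth >= 10),
        ("DCI-P3 coverage >= 90%", dci_p3_coverage >= 0.90),
    ]

# Example: a hypothetical VA panel with full-array local dimming
for criterion, ok in check_hdr_spec(peak_nits=1100, black_nits=0.04,
                                    bit_depth=10, dci_p3_coverage=0.93):
    print(("PASS  " if ok else "FAIL  ") + criterion)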
New OLEDs strobe static pixels, and Samsung had a demo back in 2015 achieving 15k hours without any issues. If you intentionally run the flicker test mentioned in the reviews then yes, of course it will happen; it happens even on an LCD if you leave it on for a month. Ever notice how shop LCD displays burn in?
I'd rather deal with an 8-year-old OLED with slight burn-in than with dirty screen effect, uneven backlight bleed, uneven brightness distribution, and the jelly effect caused by the LCD differential bias changing with room temperature (Sharp IGZO panels use a temperature sensor for this, but very few panels even consider the jelly/tilt effect). Not to mention that, on top of better input lag, you get the advantage of 90 Hz+ and smoother motion blur than LCD. Only 240 Hz with backlight strobing and black frame insertion (resulting in 120) can get close to OLED, but you lose brightness and color accuracy.
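To put rough numbers on the strobing / black-frame-insertion point, here is a simplified back-of-the-envelope sketch (my own toy model, not a formal MPRT measurement): perceived motion blur on a sample-and-hold panel scales with how long each unique frame stays lit, so lighting each frame for only half of its interval roughly halves the blur while also roughly halving brightness.

# Toy persistence model: eye-tracking motion blur on a sample-and-hold display
# roughly scales with how long each unique frame stays lit. Strobing / black
# frame insertion shortens that lit time at the cost of brightness.
# Simplified illustration only, not an MPRT measurement.

def persistence_ms(content_fps, lit_fraction=1.0):
    """Time (ms) each unique frame is lit while the eye tracks motion."""
    return (1000.0 / content_fps) * lit_fraction

configs = [
    ("60 Hz sample-and-hold", 60, 1.0),
    ("120 Hz sample-and-hold", 120, 1.0),
    ("240 Hz panel + BFI (120 Hz content)", 120, 0.5),  # every other refresh is black
]
for name, fps, lit in configs:
    print(f"{name:<38} ~{persistence_ms(fps, lit):.1f} ms persistence, "
          f"~{lit * 100:.0f}% of full brightness")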
I'll add that Samsung stated their Galaxy S3 panel would only last about 4 years, which real-life tests and reports confirmed, which in turn shows how well their lab tests hold up outside the lab.
My experience with OLED was kind of sad, away from the numbers and technical-website testing: practical ownership of a 65" LG OLED C6. It has breathtaking color reproduction, real inky blacks, and screen uniformity.
However, for gaming:
Static images (like health bars, stats, in-game maps, and the out-of-game main menu)
&
A peak HDR real-scene brightness of ~666 nits in LG's case
made this set fall into burn-in / image retention after 6 months of usage as I shift from content to content. LG's 2018 models did introduce the 'Pixel Refresh' and 'Screen Shift' options to reduce susceptibility to this innate nature of OLED, but the risk still scales with the content being displayed and the hours of operation per day.
The Sony X930E is doing great now, a uniform experience overall without any risk. MicroLED with HDMI 2.1 and 8K will definitely be my next upgrade.
I see; I forgot about how each vendor implements mitigations against these issues, just like how the tilt effect remains unresolved on most LCDs except when you look at 80+ inch TVs.
MicroLED will only become cheaper and reasonable at 8K/4K minimum for large displays; AFAIK, the larger the pixels, the more expensive and slower they are to manufacture.
23 days later
AMD vs Nvidia Efficiency
Make it Nvidia's 7nm vs AMD's 7nm. You'll be on your sides laughing when you see Nvidia using about 50% of AMD's total power draw while offering possibly over 50% more performance than AMD.
"Nvidia's 12nm vs AMD's 7nm GPU efficiency is 'incomparable'."
Source: PCGamesN
https://www.pcgamesn.com/nvidia/12nm-vs-amd-7nm-gpu-efficiency-incomparable?amp
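Taking those rough ratios at face value (about half the power for about one and a half times the performance), the implied efficiency gap compounds to roughly 3x in performance per watt; a quick sketch using the numbers claimed above rather than any measured figures:

# Back-of-the-envelope performance-per-watt comparison using the rough ratios
# claimed above (illustrative only, not measured figures).

amd_power, amd_perf = 1.0, 1.0      # normalise AMD to 1.0 on both axes
nv_power = 0.5 * amd_power          # "about 50% of AMD's total power draw"
nv_perf = 1.5 * amd_perf            # "possibly over 50% more" performance

ratio = (nv_perf / nv_power) / (amd_perf / amd_power)
print(f"implied efficiency advantage: ~{ratio:.1f}x perf/W")  # ~3.0x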
9 days later
I want to get an RTX 2080; unfortunately there is a sort of shortage of known brands like MSI and Gigabyte, so the most sold brand now is Zotac, and in Zotac's line the basic 2080 is at $770 VAT included while the AMP is at $850!
Prices are completely nuts.
I have found a Manli Gallardo 2080, which is like the AMP, a bit too long for my case but even cheaper than the basic Zotac, though I have no clue how good these cards are.
Does anyone have good addresses for an MSI RTX 2080 in Lebanon?
IIRC Mojitech has the Gigabyte RTX 2080 in stock, you can try your luck with him.
You can also check if NewVision has MSI in stock, since he's the official distributor.
Gigabyte is overpriced here, like +$200 compared to the UAE.
The advantage of Asus is that they have 2 HDMI ports, but again overpriced.
It will be between the Zotac Gaming at $770 and the MSI Ventus at $820; I don't know if it's worth paying more for the MSI.
Any idea if it's possible to cross-flash the vBIOS on RTX cards, like on Pascal?
I did it for 3 client builds in the UAE, even on mini cards like the Zotac 1080 Ti Mini.
I would go with the Zotac: cheaper and a good warranty (5 years) from a well-known vendor. Plus, I think both use the stock PCB.
I finally returned the Zotac and took a Gigabyte Gaming OC instead; the shop gave me a deal on it, so the difference in price is not huge.
5 months later
The Power of the RTX 2080ti
CoD Modern Warfare
4K - 3840 x 2160
Max Settings with Filmic SMAA T2X
Real Time Rendering - RTX On
https://youtu.be/UrNHMfU_BlI
20 days later
A new rendering technique, "fake real-time ray tracing": a software mimic that doesn't need RT cores and works on DirectX 11. Even the top dogs from last gen,
1080ti (Pascal)
Radeon VII (GCN 5.0)
fail to deliver.
Sell them ASAP :)
https://youtu.be/efOR92n9mms
Nothing beats hardware-based acceleration, but I wouldn't call it "fake real-time ray tracing".
To me it is fake:
They used canned cube maps, like old rasterization. SVOGI traces cones, not rays, and has its own limitations; that's why it's considered a different thing from ray tracing. CryEngine does lighting and reflections with voxels (SVOGI). Voxels have been researched and used in other methods as well, like VXAO or VXGI. The way the voxel data structures are built and handled puts them halfway toward ray tracing in principle (a toy sketch of the difference follows the list below). It still has many limitations compared to the real-time ray tracing seen in:
BFV - Reflections
Metro Exodus - Global Illumination
Rise of the Tomb Raider & Call of Duty Modern Warfare
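To make the cone-versus-ray distinction concrete, here is a heavily simplified toy sketch in Python (my own illustration, not CryEngine's actual SVOGI code): a ray marcher samples the finest voxel level at a small fixed step and finds a thin occluder exactly, while a cone tracer samples pre-filtered (mip-mapped) voxels with a footprint that widens with distance, which is much cheaper but smears fine detail into a soft, approximate answer.

# Toy illustration of marching a ray through a fine voxel grid versus tracing
# a cone through pre-filtered (mip-mapped) voxels. Not CryEngine's SVOGI code,
# just the general idea in one dimension.

import math

# A 1D "scene": occlusion per voxel (level 0 = finest), with a thin occluder.
level0 = [0.0] * 12 + [1.0] * 2 + [0.0] * 2

def build_mips(base):
    """Pre-filter the voxel data by averaging pairs, like 3D texture mips."""
    mips = [base]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev) - 1, 2)])
    return mips

mips = build_mips(level0)

def ray_march(step=1.0):
    """Sample the finest level at a constant small step (ray-tracing-like)."""
    occlusion, t = 0.0, 0.0
    while t < len(level0) and occlusion < 1.0:
        occlusion = max(occlusion, level0[int(t)])
        t += step
    return occlusion

def cone_trace(aperture=0.3):
    """Sample coarser mips as the cone footprint grows with distance."""
    occlusion, t = 0.0, 1.0
    while t < len(level0) and occlusion < 1.0:
        radius = aperture * t                            # cone widens with distance
        lvl = min(int(math.log2(max(radius, 1.0))), len(mips) - 1)
        idx = min(int(t / (2 ** lvl)), len(mips[lvl]) - 1)
        occlusion += (1.0 - occlusion) * mips[lvl][idx]  # front-to-back blend
        t += max(radius, 1.0)                            # step grows with the footprint
    return occlusion

print(f"ray march  (fine samples)  : occlusion = {ray_march():.2f}")
print(f"cone trace (pre-filtered)  : occlusion = {cone_trace():.2f}")

The averaging in the mip chain is exactly why SVOGI-style voxel tracing gives plausible soft GI and rough reflections on DX11 hardware, but cannot resolve the sharp, per-ray results that the hardware-accelerated titles above can.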