• Hardware
  • Blue light filter Benq the best?

I am searching for a 24-25'' PC monitor, ideally with a hardware-based blue light filter. BenQ's Eye-Care series looks like the most ''pro'' at fighting blue light, but are they using a hardware filter, or is it software-based like LG's ''Reader'' mode or Samsung's ''Eye Saver'' mode?
What is the advantage of hardware filters vs. software ones like f.lux?
From what I understand, software-based filters reduce blue light from light pixels but not dark ones; only a hardware-based filter can reduce the blue light emission of all pixels.
infiniteloop wrote:
From what I understood, software based filters reduce blue light from light pixels but not dark ones, only a hardware based filter can reduce all pixels blue light emission
Software-based filters are essentially 1D LUTs that apply over the entire spectrum of colors shown on your display, regardless of darkness or lightness. Hardware-based filters are equivalent, the only difference being that such an LUT is implemented in the display firmware. Actual blue light reduction ought to occur at the backlight level, by not emitting excessive blue wavelengths in the first place. However, blue light reduction technologies are in essence overrated; all you really need to maintain an acceptable blue level is a proper and accurate calibration of your display to a color temperature of 6500K or lower. A hardware colorimeter such as the Spyder5 (not very accurate near black) or the i1 Display Pro (much more recommended) would help with that.
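To make the 1D LUT idea concrete, here is a minimal Python sketch of how a software filter like f.lux conceptually works; the 0.7 attenuation factor is an arbitrary illustration, not any product's actual curve:

```python
# Minimal sketch of a software blue-light filter as a 1D per-channel LUT.
# The same mapping is applied to every pixel, light or dark alike.
# The 0.7 scale factor is an arbitrary example value.

def build_blue_lut(scale=0.7):
    """256-entry lookup table that attenuates the blue channel."""
    return [round(v * scale) for v in range(256)]

def apply_blue_filter(pixel, lut):
    """Map an (R, G, B) pixel through the LUT; R and G pass through."""
    r, g, b = pixel
    return (r, g, lut[b])

lut = build_blue_lut()
print(apply_blue_filter((255, 255, 255), lut))  # bright pixel -> (255, 255, 178)
print(apply_blue_filter((20, 20, 40), lut))     # dark pixel   -> (20, 20, 28)
```

Note that both the bright and the dark pixel get their blue reduced by the same relative amount, which is why the "light pixels only" assumption doesn't hold.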

On a further note, most BenQ displays in the 24" - 27" range that have come out over the last few years are abysmal.
https://www.reddit.com/r/Monitors/comments/5klvmw/is_the_benq_xl2411_a_good_monitor_or_would_you/dbp8ji2
https://www.reddit.com/r/Monitors/comments/6jwvik/difference_between_benq_xl2411z_vs_zowie_xl2411z/djhmj98

The proliferation of BenQ high refresh rate displays in the Lebanese gaming market in the last few years is simply borne out of ignorance. Much better displays from other brands can usually be had at the same price points.

Let's not even talk about the overshoot caused by aggressive overdrive, DyAc, and other superfluous, often image-quality-ruining features of those monitors.
yasamoka wrote:
Hardware-based filters are equivalent, the only difference being that such an LUT is implemented in the display firmware.
Thanks for the clarification, yasamoka. It seems to me that there are no advantages of hardware over software blue light filters, except maybe not having to install any additional software. That seems pretty gimmicky.
samer wrote:
Hardware-based filters are equivalent, the only difference being that such an LUT is implemented in the display firmware.
Thanks for the clarification yasamoka. It seems to me that there are no advantages of hardware over software blue light filters, except maybe for not having to install any additional software. That seems pretty gimmicky.
There could be one exception, though. If the blue light filter works the same way as a monitor's RGB controls, which apply in the analog domain rather than as a digital-to-digital mapping like an LUT, then it could be applied at the hardware level without introducing banding. Banding is inherent in any LUT-based approach with a low number of bits per color component (e.g. 8-bit). However, such an implementation would have to be non-naive and non-linear to be any different from simply dialing down the blue slider. I would bet very little on these features, which are often just tacked onto relatively simple display controllers without any hardware modification, being anything beyond naive implementations, as evidenced by their sudden emergence on practically every monitor, even Korean monitors that undergo almost no R&D in their development.

Regardless, whether a hardware implementation is LUT-based or not can easily be verified by checking for banding here.
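As a quick illustration of why an 8-bit LUT causes banding, here is a small Python sketch (the 0.7 factor is an arbitrary example): scaling 256 input levels down collapses several inputs onto the same output level, so a smooth gradient develops visible steps.

```python
# Count how many distinct output levels survive an 8-bit LUT that
# scales the blue channel by 0.7 (arbitrary example factor).
scale = 0.7
lut = [round(v * scale) for v in range(256)]
distinct_out = len(set(lut))
print(distinct_out)  # 179 - roughly 30% of the 256 levels are lost
```

An analog gain stage, by contrast, involves no quantization step, which is why a genuinely analog implementation could avoid this.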

Here is Chief Blur Buster's take on the matter.
Here's what I could find about the BenQ 27'' model EW27752ZH; I don't know if LG and Samsung are using the same tech, or only BenQ:

''The backlight is flicker-free WLED, using modified blue diodes as part of the ‘Low Blue Light Plus’ feature of the monitor. These shift the peak of blue light from 420 – 455nm (short wavelength, relatively high-energy blue light) to 455 – 480nm (longer wavelength, lower energy blue light). This allows the monitor to cut out high energy blue light without upsetting the overall colour temperature or balance of the image. The downside is that this longer wavelength blue light is still stimulating, so should still be minimised when you should be winding down (i.e. before bed). Fortunately, the monitor offers some modes which will offer ‘Low Blue Light’ with a lower colour temperature as well (such as the ‘Dark Room’ setting). So this model offers the best of both worlds when it comes to blue light reduction''
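The ''high energy'' vs. ''lower energy'' distinction in that quote follows directly from the photon energy E = hc/λ; here is a quick sanity check in Python (the constants are standard physical values, the wavelengths are the band edges quoted in the review):

```python
# Photon energy at the wavelength band edges quoted in the review.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Photon energy E = h*c/lambda, converted to eV."""
    return h * c / (wavelength_nm * 1e-9) / eV

print(round(photon_energy_ev(420), 2))  # 2.95 eV (short-wavelength blue)
print(round(photon_energy_ev(480), 2))  # 2.58 eV (long-wavelength blue)
```

So shifting the backlight peak from the 420-455nm band toward 455-480nm trades away roughly 14% of the per-photon energy, which matches the review's ''lower energy'' claim.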
yasamoka wrote:
Regardless, it can easily be proven whether a hardware implementation is LUT-based or not by checking banding here.
Oh, I hadn't realized that banding would be an issue. I never noticed it, as I usually disable f.lux when doing color-sensitive work. Thanks for the explanation!