Leonedes wrote: The weird thing is, this shouldn't be this complicated. HDMI is becoming a standard; how should I explain this to my 80-year-old granny, for instance? :P
The real question is how Nvidia should explain this to users who (like me) have searched for months for a solution, found it, applied it for several months, and have now been unable to connect to their HDTVs over HDMI for almost a year.
Leonedes wrote: Also, using VGA is not really a solution, as we'd need an extra wire for sound. Plus, wouldn't picture quality be affected?
True, not really a solution. And with the EDID override, HDMI audio gets disabled anyway, because the TV is treated as a PC display. I compared the two sources (HDMI vs. VGA over a regular, not particularly well-shielded 1.8 m cable) from the same PC, showing the same Windows desktop: not a single visible difference, and I'm very picky about these things. Meanwhile, my 10 m HDMI cable gives green snow and wrong colors; that's signal corruption, and since it's a digital link, no analog noise can be introduced, only data can go missing. So now my 5 m VGA cable plus the old 1.8 m one, connected to the TV (along with 10 m RCA cables for the sound), is what I use, and I've yet to see anything wrong with the picture: as far as I remember the image quality being, it still is. Absolutely no reason to switch back (the only one being HDCP, but hey, there's a solution for that).
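By the way, if you're building a custom EDID file for the override, it's worth sanity-checking it first: a base EDID is a 128-byte block that starts with a fixed 8-byte header, and all 128 bytes must sum to 0 mod 256 (the last byte is the checksum). A minimal Python sketch of that check (the function name and the dummy block are just examples, not any official tool):

```python
# Sanity-check a 128-byte base EDID block before using it as an
# override: the fixed 8-byte header must be present, and the bytes
# must sum to 0 modulo 256 (byte 127 is the checksum).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block: bytes) -> bool:
    """Return True if `block` looks like a valid 128-byte base EDID."""
    if len(block) != 128:
        return False
    if block[:8] != EDID_HEADER:
        return False
    return sum(block) % 256 == 0

if __name__ == "__main__":
    # Example: build a dummy block, fix its checksum byte, validate it.
    edid = bytearray(128)
    edid[:8] = EDID_HEADER
    edid[127] = (-sum(edid[:127])) % 256  # set the checksum byte
    print(edid_is_valid(bytes(edid)))     # True for this dummy block
```

If that check fails on an EDID you dumped or edited, the driver will most likely reject or ignore the override, so it's a cheap test to run first.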
With a short, good-quality VGA cable, I doubt you'd even begin to suffer any image-quality loss, given that 7 m of cable with a male-to-male coupler in the middle isn't causing me any trouble.