Laptop HDMI output too sharp & ugly?

Hello,

I was trying to connect my Alienware m15x to my Samsung TV, around 40" I think. I really can't provide much information about the TV: I have no idea about its model and no experience at all with TVs, and its model number and specs are written on the back, which is really hard to get to right now. Regardless, I think the issue is on the laptop's side.

The Alienware uses DisplayPort, so I connected the DisplayPort adapter to the HDMI cable and then to the TV. The display on the TV is extremely sharp and ugly. I've tried almost all resolutions, with both 1080i and 1080p (the best I got was at 1920 x 1080), and I've even tried changing the display mode on the TV, getting almost the same result each time. The saddest part is that connecting the laptop using the VGA connector gives a cleaner, more bearable display, so there must be something wrong!

I just need advice on how to approach the problem. I'm sure it's not the TV, as I've used a PS3 on it several times. I'm not sure if it's the HDMI cable, but I will try another one once I get hold of another cable. Any suggestions?
Yes, I had this problem and was trying to fix it for almost a year, albeit with a desktop. You have an Nvidia graphics card, right? If so, I'll guide you through.

And no, it's not the HDMI cable. HDMI is a digital cable: if you get no wrong-colored pixels or any corruption, the cable is fine, whether it cost $3 or $100. A resolution that is, let's say, too demanding for the cable to pass through will cause corruption, not overscanning. What you're seeing looks like overscanning.

EDIT: In case you misunderstood my post, I meant to say that I did fix it, after hours of research. But if you have an ATI graphics card, it's a whole other issue: ATI doesn't have overscan problems as much as Nvidia does, if at all.
Yes, of course. First of all: do you have any XP system in the house, regardless of connectivity?
Yes? Well, get one ready, connect it to the TV, then use this software: http://www.softpedia.com/get/System/System-Info/EDID-Viewer.shtml

And save the resulting document. This retrieves a block of identification data from the TV called an EDID (Extended Display Identification Data). We'll use it in a registry file to override any settings Nvidia forces, and thus get crisp 1080p on your TV.

Reply back here once you get an XP system ready.
Ahh, thanks a lot dude. I'll have to wait until tomorrow to install the system; will reply back here. Really thankful there seems to be such a solution :)
Do you have an XP install disk, a Linux boot CD, any BartPE live boot CD, etc.?

EDIT: Well, you can also chance it and try the software on Windows 7 on the Alienware laptop, or on any other system. Try it, and if it works and gets an EDID, we can proceed. The rest of the solution takes 10 minutes, if not less.
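By the way, if the tool comes up empty, whatever EDIDs Windows has already cached can be dumped straight from the registry with a few lines of script, no dedicated tool needed. A rough sketch (Python 3 standard library; the Enum\DISPLAY key layout below is what I'd expect on a typical install, so treat it as an assumption):

```python
# Dump raw EDID blobs cached by Windows in the registry (assumed layout:
# HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\<monitor>\<instance>\
#   Device Parameters\EDID, a REG_BINARY value of 128+ bytes).
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def dump_edids():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            monitor_id = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, monitor_id) as monitor:
                for j in range(winreg.QueryInfoKey(monitor)[0]):
                    instance = winreg.EnumKey(monitor, j)
                    try:
                        with winreg.OpenKey(
                                monitor,
                                instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue  # this instance has no cached EDID
                    print(monitor_id, instance, edid.hex())

if __name__ == "__main__":
    dump_edids()
```

One catch: this only shows what Windows has cached, so the TV needs to have been connected over HDMI or DVI at least once for its EDID to show up.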
The problem is that I don't have the CD today, but I just thought of something: to get the EDID, do I need to connect the PC through HDMI, or can I do it with the VGA cable? I'm guessing HDMI; if so, I've got a problem.
I got my EDID over HDMI. I'm unsure, but I think it's unique to the TV, not to its connector. Anyway, if you have DVI you could get the EDID using a DVI-to-HDMI dongle, but try with VGA anyway. I'll be sure to try on my TV and see if I get the same EDIDs.
OK, thanks. I'll be getting home soon and I'll try.
Hey Yasamoka, the guy I was depending on to give me the CD forgot about me today :/ I tried installing the EDID reader on the Alienware running Windows 7. I pressed "Read EDID" and selected the only option available, "Read EDID from Windows registry". Four options were presented, three being PnP monitors and one non-PnP; I'm guessing the TV is the non-PnP one? Anyway, I guess it's not compatible with Win7, because whatever option I choose, all the values come out as "0". Do you think there's another program that might be compatible with Win7? I don't even know what to Google, because the EDID program seems to give lots of info regarding the display, and I couldn't find an "EDID" number/code among it.
10 months later
I hate to revive such an old post, but a couple of days ago I discovered, by accident, that this problem applies to all Samsung TVs/monitors and wasn't caused by the laptop, so I felt I should share!

The solution is extremely simple:
On your TV
1. Open MENU
2. Go to "Input"
3. Select "Edit Name" to open the "Edit Name" submenu.
4. Scroll down to "HDMI" and select it (watch carefully).
5. Now assign "PC" or "DVI PC" to the HDMI setting.

If it doesn't work on a certain HDMI port, try another one.

And there you go, the screen becomes as good as ****.

It's weird how this isn't mentioned anywhere!
This isn't about Samsung TVs only. My Hyundai E420D 42" HDTV does the same. It's to do with Nvidia's drivers. This is called overscanning.

When you connect to a TV through HDMI, Nvidia drivers detect that you are connected to an HDTV. They try to read a block of data from the TV called an EDID. However, the EDID is read incorrectly and the resulting list of resolutions is wrong. This means that when an EDID is read from a TV, Nvidia drivers CANNOT apply a 1920 x 1080p 60Hz resolution without overscanning.
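To put a rough number on what overscanning costs you: TVs typically crop somewhere around 2.5% from each edge (the exact amount varies per set, so the figure below is an assumption), then stretch what's left back over the full panel. A quick back-of-the-envelope:

```python
# Rough overscan arithmetic: assume ~2.5% cropped from each edge (a common
# ballpark; the real amount varies by TV). The remainder gets rescaled to
# the full panel, so there is no 1:1 pixel mapping and text turns mushy.
CROP_PER_EDGE = 0.025  # assumption, not a measured value

def visible_area(width, height, crop=CROP_PER_EDGE):
    """Return the portion of the source frame the panel actually shows."""
    return (round(width * (1 - 2 * crop)), round(height * (1 - 2 * crop)))

print(visible_area(1920, 1080))  # -> (1824, 1026): about 10% of the
                                 # desktop's pixels (taskbar edges, window
                                 # borders) simply fall off the screen
```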

Previously, up till the 19x.xx drivers, and before the R256 and R300 drivers, there was an option called OverrideEDIDFlags. Basically, this was a registry entry that you either added directly to the registry after installing the drivers, or baked in by modifying the .inf files of the driver you were about to install. This tweak no longer works.
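For the record, here is roughly where the tweak lived. The class key path is the standard "Display adapters" device GUID, but the value's binary payload encoded the monitor's vendor/product ID plus the EDID byte patches to apply, and I'm not reproducing that layout from memory, so this sketch only locates and inspects the value rather than writing one:

```python
# Sketch: where the old OverrideEDIDFlags tweak lived (historical interest;
# as noted above, R256+ drivers ignore it). This only checks for the value.
import winreg

# {4D36E968-...} is the standard "Display adapters" device class GUID;
# "0000" is the first adapter instance (may differ on your machine).
ADAPTER_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
               r"\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADAPTER_KEY) as key:
    try:
        blob, kind = winreg.QueryValueEx(key, "OverrideEDIDFlags")
        print("Override present:", blob.hex())
    except OSError:
        print("No OverrideEDIDFlags value under", ADAPTER_KEY)
```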

Solution? Use a video interface that does not cause the drivers to read the EDID (even if the interface itself can pass one): mainly VGA and DVI. These make the driver think it is driving a plain PC monitor, like any other. The EDID is not read, and the resolution you specify is applied, in this case 1920 x 1080p 60Hz, without any overscanning.

What the "PC" or "DVI PC" setting do is something equivalent. It makes the drivers think that the TV is actually a PC monitor. This would drop the EDID readings and would make the Nvidia driver apply the correct resolution.

Hope this helps explain this phenomenon.
Thanks for such an informative post!

The weird thing is, this shouldn't be this complicated. HDMI is becoming a standard; how should I explain this to my 80-year-old granny, for instance? :P

Also, using VGA is not really a solution, as we'd need an extra wire for sound. Plus, wouldn't picture quality be affected?
Leonedes wrote: The weird thing is, this shouldn't be this complicated. HDMI is becoming a standard; how should I explain this to my 80-year-old granny, for instance? :P
The real question is, how should Nvidia explain this to users (like me) who searched for months for the solution, found it, applied it for several months, and have now been unable to connect to their HDTVs through HDMI for almost a year?
Leonedes wrote: Also, using VGA is not really a solution, as we'd need an extra wire for sound. Plus, wouldn't picture quality be affected?
True, not really a solution. With the EDID overriding, HDMI audio gets disabled anyway, because the TV is treated as a PC display. I have compared the two sources (HDMI, and VGA over a regular 1.8m cable, not especially well shielded) from the same PC, displaying the same Windows desktop: not a single bit of difference, as far as my very picky perception goes. Meanwhile, my 10m HDMI cable is giving green snow and wrong colors (signal corruption; it's a digital cable, so no analog noise can be introduced, only missing data), so I'm now using my 5m VGA cable plus an old 1.8m VGA cable to the TV (along with 10m RCA cables for the audio), and I've yet to see anything wrong with the image quality. As far as I remember how it looked, it still looks the same. Absolutely no reason to switch back (the only one being HDCP, but hey, there's a solution for that).

With a short, good-quality VGA cable, I doubt you'd even begin to suffer in image quality, given that a 7m cable plus a male-to-male coupler in between isn't causing me any trouble.