HDMI vs analog



sandyd
September 25th, 2010, 07:05 PM
I got a new monitor (1080p) and found a VGA cable that came in the box, so I decided, "why not, I'll just use this!"

Now I'm wondering: is there any difference in quality between HDMI and analog?
I'm asking because cheap HDMI cables supposedly produce a crap signal, and good ones cost about 100 bucks (I'm planning on getting a good HDMI cable if I do buy one).

Npl
September 25th, 2010, 07:12 PM
I got a new monitor (1080p) and found a VGA cable that came in the box, so I decided, "why not, I'll just use this!"

Now I'm wondering: is there any difference in quality between HDMI and analog?
I'm asking because cheap HDMI cables supposedly produce a crap signal, and good ones cost about 100 bucks (I'm planning on getting a good HDMI cable if I do buy one).

Yeah, there is a quality difference between HDMI and analog; whether you notice it is another question. It's most pronounced with text, IMHO, but I think you're also limited to 720p (or an equivalent number of pixels) with analog.

And don't fall for the expensive-cable hype; it only matters for long cables (>2 m). Get a cheap 5€ HDMI cable and the quality will be flawless.

Mmmbopdowedop
September 25th, 2010, 07:19 PM
I use an RGB connector and I'm capable of 1080p. I'm probably confused; could you clear that up? =]

Npl
September 25th, 2010, 07:39 PM
I use an RGB connector and I'm capable of 1080p. I'm probably confused; could you clear that up? =]

Then ignore that; it seems to have been a limitation of a card I had some years ago. I haven't used VGA in ages.

Still, I'd prefer a cheap HDMI cable (or DVI) and get perfect screen geometry. As I said, signal integrity only becomes a problem on longer cables. There is, for example, a test of a couple of cables (http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests), where a ~2 m $10 cable passes everything while some ridiculously expensive but longer cables don't.

sandyd
September 25th, 2010, 07:58 PM
Then ignore that; it seems to have been a limitation of a card I had some years ago. I haven't used VGA in ages.

Still, I'd prefer a cheap HDMI cable (or DVI) and get perfect screen geometry. As I said, signal integrity only becomes a problem on longer cables. There is, for example, a test of a couple of cables (http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests), where a ~2 m $10 cable passes everything while some ridiculously expensive but longer cables don't.

Ah, thanks for clearing that up.

magmon
September 25th, 2010, 08:08 PM
I got a new monitor (1080p) and found a VGA cable that came in the box, so I decided, "why not, I'll just use this!"

Now I'm wondering: is there any difference in quality between HDMI and analog?
I'm asking because cheap HDMI cables supposedly produce a crap signal, and good ones cost about 100 bucks (I'm planning on getting a good HDMI cable if I do buy one).

HDMI cables are digital, so all cables will produce the same quality. Pricey cords are like pricey paper; the end result is the same.

Npl
September 25th, 2010, 08:20 PM
HDMI cables are digital, so all cables will produce the same quality. Pricey cords are like pricey paper; the end result is the same.

If a cable isn't shielded well, the signal can break down, digital or analog. Digital signals usually tolerate small interference better, but when the signal does break you get rather brutal errors; in other words, it's either lossless or really bad.

You can certainly get extremely crappy HDMI cables (no-name stuff from shady retailers) that just won't work at all, but good (short) cables don't have to be expensive.

chriswyatt
September 25th, 2010, 08:39 PM
I bought an HDMI cable and I couldn't get 1080p AT ALL with it. It was only a short cable as well, rubbish!

It was definitely the cable because I bought a better one (cheaper as well) and it worked fine.

Don't waste your money on a really expensive HDMI cable; it'll either work or it won't, and it will only fail if it's a really, really bad cable. Generally you only have to worry about cable quality if you're buying a very long HDMI cable.

formaldehyde_spoon
September 26th, 2010, 02:16 AM
HDMI cables are digital, so all cables will produce the same quality. Pricey cords are like pricey paper; the end result is the same.

Yes, a $4 digital cable will give you exactly the same picture as a $400 cable. (That's an exaggeration, I admit: I've actually only seen a $390 HDMI cable.)


If a cable isn't shielded well, the signal can break down, digital or analog. Digital signals usually tolerate small interference better, but when the signal does break you get rather brutal errors; in other words, it's either lossless or really bad.

You can certainly get extremely crappy HDMI cables (no-name stuff from shady retailers) that just won't work at all, but good (short) cables don't have to be expensive.

The signal doesn't "break down".
Analogue is immediately susceptible to any amount of noise.
Digital is completely, 100% impervious to it until you reach a certain threshold (which usually requires a very long cable); at that point the signal is simply lost. There is NEVER a gradual degradation of quality (there is no concept of 'quality' in digital signals), i.e. the data that is received is always correct.
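
To see that cliff in action, here is a minimal simulation sketch (purely illustrative assumptions: bits sent as +/-1 V levels with Gaussian noise and a sign-based receiver decision, not HDMI's actual TMDS signalling). The error rate stays at essentially zero until the noise grows large enough to cross the decision threshold, then climbs sharply:

import random

def bit_error_rate(noise_sigma, n_bits=100_000):
    # Send random bits as +/-1 V levels through Gaussian noise.
    errors = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        level = 1.0 if bit else -1.0              # transmit as +/-1 V
        received = level + random.gauss(0, noise_sigma)
        decoded = 1 if received > 0 else 0        # receiver decides by sign
        errors += (decoded != bit)
    return errors / n_bits

for noise in (0.1, 0.3, 0.5, 0.8):
    print(f"noise sigma {noise:.1f} V -> BER {bit_error_rate(noise):.5f}")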

Cam42
September 26th, 2010, 02:42 AM
Buy your cables from monoprice.com
That is all I have to say.

Legendary_Bibo
September 26th, 2010, 02:57 AM
After spending a few years on the PlayStation forums, I can tell you that there is no difference between a cheap HDMI cable and something like a Monster cable, as long as it's HDMI 1.3/HDCP compliant. I have a $60 Nyko (or something) HDMI cable and a $7 Amazon HDMI cable, both 9' long, for our 360 and PS3. There's no difference between them. The really expensive cables are just a scam. HDMI is basically DVI but with audio as well; it can also carry more accurate colour detail and uncompressed video and audio, which on a PC can give you better frame rates and buffering and all that jibber jabber.

Npl
September 26th, 2010, 05:14 AM
The signal doesn't "break down".
Analogue is immediately susceptible to any amount of noise.
Digital is completely, 100% impervious to it until you reach a certain threshold (which usually requires a very long cable); at that point the signal is simply lost. There is NEVER a gradual degradation of quality (there is no concept of 'quality' in digital signals), i.e. the data that is received is always correct.

Nope, there can be flipped bits (among other errors) in digital signals; just think of changing a few bits in a raw RGB image. The reason you rarely see "bad" digital transmission is the addition of error-correction schemes, which can detect and/or correct some errors (but not all).
The better the scheme, the more additional bandwidth it uses, so it's a tradeoff. For each transmitted packet, the received result can be:
* the original (possibly error-corrected) packet
* an erroneous packet that is detected but can't be corrected
* an erroneous packet that is not detected as such
(Of course, correctly transmitted packets should be by far the most common outcome.)

If you want to see such "digital degradation", this is what (extreme) transmission errors over HDMI can look like:
http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/sparkles-1.jpg/image
http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/fifth-element-glitch.jpg/image_preview
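
To make those three outcomes concrete, here is a minimal sketch using a classic Hamming(7,4) code (an illustration of the general error-correction tradeoff only, not the coding HDMI actually uses): three parity bits per four data bits buy single-bit correction, while a double error slips through and gets silently mis-corrected.

def encode(d):
    # d = [d1, d2, d3, d4]; parity bits go at positions 1, 2 and 4
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def decode(c):
    # Each check covers the positions whose 1-based index has that bit set,
    # so the syndrome is the position of a single-bit error (0 = no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if syndrome:
        c[syndrome - 1] ^= 1                      # correct the flagged bit
    return [c[2], c[4], c[5], c[6]]               # recover the data bits

word = [1, 0, 1, 1]
sent = encode(word)
one_err = sent[:]; one_err[4] ^= 1                    # single flip on the wire
two_err = sent[:]; two_err[1] ^= 1; two_err[5] ^= 1   # double flip on the wire
print(decode(one_err) == word)   # True: corrected transparently
print(decode(two_err) == word)   # False: silently wrong data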

Dustin2128
September 26th, 2010, 06:04 AM
Or if you've got the port, DVI cables are HD capable and have always worked well for me.


http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/fifth-element-glitch.jpg/image_preview
mm, HD fifth element...

formaldehyde_spoon
September 26th, 2010, 12:35 PM
Nope, there can be flipped bits (among other errors) in digital signals; just think of changing a few bits in a raw RGB image. The reason you rarely see "bad" digital transmission is the addition of error-correction schemes, which can detect and/or correct some errors (but not all).
The better the scheme, the more additional bandwidth it uses, so it's a tradeoff. For each transmitted packet, the received result can be:
* the original (possibly error-corrected) packet
* an erroneous packet that is detected but can't be corrected
* an erroneous packet that is not detected as such
(Of course, correctly transmitted packets should be by far the most common outcome.)

If you want to see such "digital degradation", this is what (extreme) transmission errors over HDMI can look like:
http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/sparkles-1.jpg/image
http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/fifth-element-glitch.jpg/image_preview

No, the reason you rarely see bad digital transmission is that noise has very little effect on it: most variations in the voltage are irrelevant.

Enough interference to flip a bit is *extremely* rare (except in long cables, of course), and for a flipped bit then to slip past error detection is ridiculously unlikely.
HDMI doesn't display corrupted data, so any data with detected errors that can't be corrected is thrown away. The HDMI link spec calls for a BER no worse than 1E-9 (one bad bit per billion), and if my memory serves me correctly HDMI uses 10 bits per pixel, so (I'm prepared to be corrected on this point) a maximum of about one bad pixel (discarded) per 48 frames at 60 fps. But with short cables you're more likely to see 100% perfection rather than even this tiny number of errors.
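
For what it's worth, that estimate is easy to reproduce; a back-of-envelope sketch using the post's own assumption of 10 bits per pixel (TMDS actually sends 10 bits per 8-bit colour symbol, and a pixel is three such symbols, which would make errors roughly three times as frequent):

ber = 1e-9                        # worst-case bit error rate per the spec
bits_per_pixel = 10               # the post's assumption (see note above)
pixels_per_frame = 1920 * 1080    # 1080p
fps = 60

errors_per_second = ber * bits_per_pixel * pixels_per_frame * fps
frames_per_error = fps / errors_per_second
print(f"~{errors_per_second:.2f} bad bits/s, i.e. one per {frames_per_error:.0f} frames")

That prints roughly one bad bit per 48 frames, matching the figure above.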

The data you are left with are all perfect. The data displayed will have NO degradation (although with a long cable you may be missing some of it). I'd be happy to bet a large percentage of my worldly possessions that the images you've posted came from very long cables (they show missing data, BTW, not corrupted data).

1 is 1 is 1. 0 is 0 is 0.
Use the cheapest cable that works (almost certainly the cheapest one available) - and it will be obvious if it doesn't work.

For anyone interested in reading up on digital cables, here's where NOT to get your info from: cable makers and sound/audio geeks (not using 'geek' in a negative sense - I'm a geek ;) ).

sandyd
September 26th, 2010, 02:31 PM
hmm....
having a problem here.
Monitor says it's receiving 1080p properly -> http://twitpic.com/2s7e8a
but it doesn't fill the screen??? -> http://twitpic.com/2s7fd3

NMFTM
September 26th, 2010, 02:35 PM
I have a $150ish LG 22" 1680x1050 monitor with both VGA and DVI inputs. To test it out, I installed Ubuntu on both my old and new PCs, the old one using VGA on a GeForce2 GPU and the new one using DVI on a GeForce GTX 260 GPU. I hit the 'switch' button to flip between them (with several seconds of lag) to compare signal quality, and I was unable to detect any difference. I also go to a school where the LCD monitors still use VGA, and I don't notice any difference in quality between those and my home DVI monitor.

Maybe if I had both right up against each other at the same time I'd notice something, but for the most part I don't really think it matters. I also didn't notice any real speed increase when I switched from an IDE to a SATA hard drive. And a friend of mine who plays COD:MW with a five-year-old General Electric serial mouse, bought new for $15, consistently beats almost everyone else at the weekly LAN party I go to, even though I have a SideWinder X5 laser gaming mouse that's usually set to 2000 DPI.

Brent0
September 26th, 2010, 02:41 PM
Buy your cables from monoprice.com
That is all I have to say.

I second that. Their cables are all very cheap and still match the quality of higher-end cables. I have bought at least ten cables from them, all flawless and under $5.00 with shipping.

matthewbpt
September 26th, 2010, 02:42 PM
In my experience there is normally no difference at all in picture quality between VGA and HDMI; the only real advantage I can see is digital audio passthrough, so you can send the audio and video to an external decoder and then on to high-quality speakers. HDCP is the scourge of the HD world. My 1080p monitor is perfectly capable of playing full HD video with no loss of quality over VGA.

sandyd
September 26th, 2010, 03:34 PM
hmm....
having a problem here.
Monitor says it's receiving 1080p properly -> http://twitpic.com/2s7e8a
but it doesn't fill the screen??? -> http://twitpic.com/2s7fd3
NVM. It turns out it was ATI and its hidden options again...
Still, they should do this automatically...

grahammechanical
September 26th, 2010, 06:18 PM
CRT (cathode ray tube) monitors were analogue. The output from the computer is digital; it is a digital computer, after all, not one built with vacuum valves. So the signal has to be converted to analogue.

Modern monitors are digital, so to use the VGA connector the signal has to be converted from digital to analogue in the computer, then converted from analogue back to digital in the monitor.

HDMI (High-Definition Multimedia Interface), like DVI (Digital Visual Interface), is digital. There is no need to convert the signal for the latest monitors. It is progress, of a sort.
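
A minimal sketch of what that double conversion can cost (purely illustrative assumptions: VGA's nominal 0-0.7 V video level, an ideal 8-bit DAC/ADC pair, and made-up amounts of cable noise, not measurements of real hardware). Unlike the digital cliff effect discussed earlier, the analog errors creep in gradually as the noise rises:

import random

def vga_round_trip(pixels, noise_sigma_volts):
    # Digital code -> DAC -> noisy analog wire -> ADC -> digital code.
    out = []
    for p in pixels:                                  # 8-bit codes 0..255
        volts = p / 255 * 0.7                         # DAC output voltage
        volts += random.gauss(0, noise_sigma_volts)   # noise on the cable
        code = round(volts / 0.7 * 255)               # monitor's ADC
        out.append(min(255, max(0, code)))
    return out

row = [random.randint(0, 255) for _ in range(10_000)]
for noise in (0.0005, 0.002, 0.005):                  # 0.5, 2 and 5 mV
    got = vga_round_trip(row, noise)
    changed = sum(a != b for a, b in zip(row, got))
    print(f"noise {noise * 1000:.1f} mV -> {changed / len(row):.1%} of pixel values altered")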

Regards

Legendary_Bibo
September 26th, 2010, 06:30 PM
NVM. It turns out it was ATI and its hidden options again...
Still, they should do this automatically...

Yeah, it always does this when I connect my laptop to my TV. It took me a year to figure out what the hell was wrong. For some unknown reason it's only ATI cards that do this.

kamaboko
September 26th, 2010, 09:29 PM
In the event you're looking for some used RCA cables:

http://www.audiogon.com/cgi-bin/cls.pl?cablintr&1287975327&/Siltech-G6-Compase-lake-2x2met

formaldehyde_spoon
September 27th, 2010, 01:30 AM
In my experience there is normally no difference at all in picture quality between VGA and HDMI; the only real advantage I can see is digital audio passthrough, so you can send the audio and video to an external decoder and then on to high-quality speakers. HDCP is the scourge of the HD world. My 1080p monitor is perfectly capable of playing full HD video with no loss of quality over VGA.

There will always be the possibility of some loss of quality with analogue; the only question is how much.
I have a monitor on a VGA cable next to one on a DVI cable, and there is a slight but perceptible difference. When a window straddles both screens, the lines and text on the VGA screen are less sharp.

Johnsie
September 27th, 2010, 12:36 PM
It's all well and good having the best multimedia hardware, but my question is: how well does Ubuntu make use of it? Software is important too. If you're forking out for good hardware, then it makes sense to make the most of it at the software level too.

SeijiSensei
September 27th, 2010, 03:13 PM
It's all well and good having the best multimedia hardware, but my question is: how well does Ubuntu make use of it? Software is important too. If you're forking out for good hardware, then it makes sense to make the most of it at the software level too.

Ubuntu generally has excellent software to make the most of multimedia hardware. Obviously it can't display material protected by proprietary DRM schemes, but that's not a limitation of Ubuntu; it's a consequence of decisions made by the media companies.

The most common complaints I see concern high-definition, multichannel audio formats and Adobe Flash (http://www.avsforum.com/avs-vb/showthread.php?p=15979742#post15979742). When Adobe removed the 64-bit Linux version of flashplayer a few months ago, some Linux users saw this as Adobe abandoning the Linux platform. Obviously, as we see now (http://labs.adobe.com/downloads/flashplayer10.html), this wasn't true.

I'd agree that support for high-definition audio formats has been a bit slow to develop, but it seems (http://www.halfgaar.net/surround-sound-in-linux) to have come a long way.

Support for video is pretty much all there with free implementations of H.264 encoding (x264) and decoding (libavcodec). Run "mplayer -vc help" and "mplayer -ac help" to see a list of the codecs it supports. If what you're looking for isn't there, try a daily build at http://www.mplayerhq.hu/.

I find that SMPlayer (http://smplayer.sourceforge.net/) plus my NVIDIA card fulfill all my needs, though I usually build the mplayer engine itself from the daily snapshots or this git repository (http://repo.or.cz/w/mplayer.git). I've needed support for things like MKV ordered chapters, the code for which is still seen as a bit dicey by the mainstream mplayer developers. (I'd agree after using the bleeding-edge version myself.)

ssam
September 27th, 2010, 05:37 PM
If you buy a cheap digital cable and it causes transmission errors, then it is defective, so take it back and get a replacement/refund.