Why Does HD Need Better Signal Than SD?
Posted 09 January 2011 - 12:36 PM
One thing I found was that HD channels would not work, but most SD channels did,
some breaking up occasionally, indicating the signal strength is borderline.
Curiously, one STB (DGTEC) shows the SNR as a hard zero on HD channels.
When I connected these STBs to a good antenna, all channels were fine.
Now why is this? They are just bitstreams on the same RF carrier. Does HD need more
error correction, and does the decoder just give up if the signal is marginal?
Posted 09 January 2011 - 01:53 PM
Obviously once you get close to (or past) the digital cliff, you're in the twilight zone (the one that existed before the vampire one).
But as a possible explanation, "SD" is both very well defined and involves a lot less data than HD. SD = 576i, while HD = a plethora of signal formats. It's possibly not surprising that a cheap (or even expensive) STB handles SD a bit better than it does HD.
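For what it's worth, here's a back-of-envelope sketch of the "lot less data" point, in Python. The bitrates (~5 Mbit/s for SD, ~12 Mbit/s for HD) and the residual bit error rate are assumptions picked for illustration, not broadcast specs:

# At the same residual bit error rate, an HD service sees
# proportionally more errored transport-stream packets per second
# than an SD service, simply because it moves more bits per second.

TS_PACKET_BITS = 188 * 8  # DVB transport-stream packet size in bits

def errored_packets_per_sec(bitrate_bps: float, ber: float) -> float:
    """Expected TS packets per second containing at least one bit error."""
    packets_per_sec = bitrate_bps / TS_PACKET_BITS
    p_packet_hit = 1 - (1 - ber) ** TS_PACKET_BITS
    return packets_per_sec * p_packet_hit

BER = 1e-7  # assumed residual bit error rate on a marginal signal
for name, rate_bps in [("SD ~5 Mbit/s", 5e6), ("HD ~12 Mbit/s", 12e6)]:
    print(f"{name}: {errored_packets_per_sec(rate_bps, BER):.2f} errored packets/s")

With those assumed figures, the HD service throws errored packets about 2.4 times as often as the SD one on the same marginal signal.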
Posted 09 January 2011 - 02:17 PM
Different boxes/tuners and different antennas.
Due to the higher processing required to decode & display HD signals compared to SD, marginal signals can impact HD sooner than they do SD (there's a rough numerical sketch of this at the end of this post).
Whilst some tuners may handle marginal signals better than others, the bottom line is that if you have adequate signal strength and quality, they should all work.
You have proved this yourself by changing from the "piss-weak antenna" to the "good antenna".
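To put a rough number on HD falling over first, here is a minimal sketch assuming i.i.d. bit errors and an average frame size of bitrate / frame rate (both simplifications: error correction is ignored, and the bitrates are assumed for illustration, not broadcast specs):

def p_frame_clean(bitrate_bps: float, fps: float, ber: float) -> float:
    """Probability that an average-sized frame contains no bit errors."""
    bits_per_frame = bitrate_bps / fps
    return (1 - ber) ** bits_per_frame

for ber in (1e-9, 1e-7, 1e-6):
    sd = p_frame_clean(5e6, 25, ber)   # assumed ~5 Mbit/s 576i service
    hd = p_frame_clean(12e6, 25, ber)  # assumed ~12 Mbit/s HD service
    print(f"BER {ber:.0e}: clean-frame probability SD {sd:.3f}, HD {hd:.3f}")

At a bit error rate of 1e-6, the SD service still delivers roughly 82% of frames untouched while HD is down around 62%, which fits what you saw: the HD channels break up (or drop out entirely) while the SD ones merely glitch.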