It has recently been suggested in another thread that this pinned thread could be a candidate for being unpinned:
...Not to mention the ones like Single Frequency Networks, which is both embarrassingly wrong and largely irrelevant now.
They should probably all be unpinned and allowed to die gracefully, rather than taking pride of place at the top of each subforum.
I myself think this particular thread still serves a purpose, though not quite the one intended by the opening poster. Numerous errors made by the opening poster are addressed in the course of this thread. Thus, this pinned thread has been serving (and could continue to serve) as a caution to new members of DTV Forum to take the views of the opening poster with a grain of salt.
In my opinion, a particularly helpful contribution in this thread is post #26
by James T Kirk, which provides an easy-to-digest and informative overview of how DVB-T can be operated as a Single Frequency Network.
One of the more surprising claims made in this thread by the opening poster is that the COFDM transmission is put through a limiter in the receiver prior to demodulation:
Both FM and COFDM use limiters to remove amplitude variations in the signal prior to demodulation although with COFDM this effect is less.
I thought bellotv gave an easy-to-follow explanation as to why using a limiter would be an odd thing to do:
I thought that a constellation display was showing the PHASE and AMPLITUDE variations of the QAM signal. 16 QAM has 4 phase and 4 amplitude possibilities per symbol; 64 QAM has 8 phase and 8 amplitude possibilities per symbol.
Surely if you limit the signal prior to demodulation you would get the phase differences but badly corrupted amplitude levels.
Secondly, I thought that COFDM deliberately has pilot carriers dispersed across the channel that carry no modulation, whose amplitudes are monitored and used as part of the Viterbi decoding process to predict the reliability of other nearby carriers that may be affected by reflected signals.
Surely limiting the signal prior to demodulation would render this part of COFDM useless too.
I may have this all cocked up but I think it's pretty close.
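bellotv's reasoning can be checked numerically. The sketch below is a deliberate simplification (it applies a hard limiter directly to 16-QAM constellation points rather than to a full RF waveform), but it shows the essential point: limiting preserves each symbol's phase while collapsing the three distinct amplitude rings of 16-QAM into a single ring.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1000 random 16-QAM symbols: I and Q each drawn from {-3, -1, 1, 3}.
levels = np.array([-3.0, -1.0, 1.0, 3.0])
symbols = rng.choice(levels, 1000) + 1j * rng.choice(levels, 1000)

# A hard limiter keeps each sample's phase but forces a constant magnitude.
limited = symbols / np.abs(symbols)

# Phase information survives the limiter ...
assert np.allclose(np.angle(limited), np.angle(symbols))

# ... but the three distinct magnitude rings of 16-QAM collapse to one,
# destroying the amplitude half of the constellation.
print(len(np.unique(np.round(np.abs(symbols), 6))))  # 3 rings before limiting
print(len(np.unique(np.round(np.abs(limited), 6))))  # 1 ring after limiting
```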
And as M'bozo stated:
In circuits of DVB-T receivers I see no limiting function being applied in the recovery of baseband signals.
Why alanh thought that limiting prior to demodulation would be used in a receiver is not immediately clear, given that the overall power envelope of a DVB-T signal varies markedly as a function of time, and that the multiplicity of individual carriers contributing to that envelope are modulated in both amplitude and phase. I note that a spectrum analyser measuring the level at any particular sub-carrier frequency would not be a particularly helpful tool for determining whether that sub-carrier were being amplitude modulated.[1]
I note that DVB-T transmitters are required to be linear, both to comply with the broadcast authority's requirements on spurious emissions and to provide a clean signal with high MER for receivers. The ratio between the peak power and the RMS power (the crest factor) is a statistical function, and its average value varies depending on the modulation constellation in use. Rarely, however, extremely high peaks occur when the subcarriers by chance instantaneously reinforce one another. In practical transmitters these extreme peaks are clipped, as mentioned in this 700 KB pdf about issues that arise when combining signals from different DVB-T transmitters: http://cdn.rohde-sch...02/7TS02_2E.pdf
A technique for reducing the basic crest factor of a DVB-T transmission (thus reducing the need for clipping) is mentioned in this 987 KB pdf: http://www.rohde-sch...rty_Flyer_e.pdf
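The statistical nature of the crest factor is easy to illustrate. The sketch below (with assumed parameters: one OFDM symbol of 2048 QPSK-modulated carriers, loosely modelled on the DVB-T 2K transmission mode) builds the time-domain signal with an IFFT and measures its peak-to-average power ratio; a fresh random seed would give a somewhat different peak, which is exactly the point.

```python
import numpy as np

rng = np.random.default_rng(1)
n_carriers = 2048  # loosely modelled on the 2K transmission mode

# Random QPSK data on every carrier, unit power per carrier.
data = (rng.choice([-1.0, 1.0], n_carriers)
        + 1j * rng.choice([-1.0, 1.0], n_carriers)) / np.sqrt(2)

# The IFFT sums all the carriers into one time-domain OFDM symbol;
# the sqrt(n) factor keeps the average power at 1.
tx = np.fft.ifft(data) * np.sqrt(n_carriers)

papr_db = 10 * np.log10(np.max(np.abs(tx) ** 2) / np.mean(np.abs(tx) ** 2))
print(f"peak-to-average power ratio: {papr_db:.1f} dB")
```

The many independently modulated carriers sum to a near-Gaussian waveform, so occasional peaks far above the RMS level are a statistical certainty rather than a fault.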
Intentional clipping in DVB-T transmissions affects such a small percentage of the transmission that the MER at the receiver can be maintained at an acceptable level. A slight drop in the level at which clipping commenced would affect a higher proportion of the transmission, and if the clipping level dropped further still (taking the transmission MER below the permitted specifications), a point of massive, uncorrectable error in receivers would be reached.

Alanh,
here is an (admittedly very late) opportunity for you to provide a reference source to back up your claim that a DVB-T receiver uses a limiter "to remove amplitude variations in the signal prior to demodulation"; or, alternatively, you may now wish to withdraw that claim.
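For what it is worth, the trade-off described above (light clipping barely denting MER, deeper clipping degrading it rapidly) can be sketched in simulation. This is an idealised, critically sampled model with QPSK carriers and no channel, not a model of any real exciter:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2048

# One idealised OFDM symbol: QPSK on every carrier, unit average power.
data = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)
tx = np.fft.ifft(data) * np.sqrt(n)

def mer_after_clipping(clip_db):
    """Clip the envelope clip_db above the RMS level, demodulate, measure MER."""
    threshold = 10 ** (clip_db / 20)            # amplitude threshold (RMS = 1)
    mag = np.abs(tx)
    clipped = np.where(mag > threshold, tx / mag * threshold, tx)
    rx = np.fft.fft(clipped) / np.sqrt(n)       # ideal demodulation
    err = rx - data
    return 10 * np.log10(np.mean(np.abs(data) ** 2) / np.mean(np.abs(err) ** 2))

for clip_db in (6, 4, 2, 0):
    print(f"clipping {clip_db} dB above RMS -> MER {mer_after_clipping(clip_db):5.1f} dB")
```

Lowering the clipping threshold clips an ever larger fraction of samples, so the demodulated MER falls monotonically, consistent with the point that only sparing clipping is tolerable.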
Edit 1/8/13: This claim now appears as entry 006 in the alanh "facts".

[1] In a different context: using a spectrum analyser on the output of a broadcast band AM transmitter with a carrier frequency of, say, 1 MHz and a 1 kHz test tone at 99% modulation, the analyser would read a constant amplitude of RF energy at 1 MHz whether the amplitude modulation were on or off. The spectrum analyser averages over time, and in any event its effective Q is high so that it can distinguish between 1.000 MHz and a sideband at 1.001 MHz. For readings of cyclic variations in the radio frequency amplitude of a broadcast band AM transmitter, an oscilloscope is the traditional, and more appropriate, tool. The suggestion of using a spectrum analyser on the extremely closely spaced sub-carriers of a DVB-T transmission is not helpful. It would, however, be a simple matter to view the overall envelope of the transmission; and it is that overall envelope that a pre-demodulation limiter in a DVB-T receiver would actually limit. If a practical demonstration of the effects of limiting were desired, a low power DVB-T exciter could be programmed to limit at a progressively lower than usual threshold, and the progressively decreasing MER in a nearby receiver monitoring the exciter output could be observed; the point at which the receiver's audio output failed could also be noted.
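The footnote's point about the spectrum analyser can be demonstrated with an FFT: the carrier bin of an AM signal reads the same level whether the modulation is on or off, with the modulation appearing only in the sidebands. The sample rate and one-second record length below are chosen purely so that the carrier and sidebands fall on exact FFT bins; they are assumptions for the demo, not broadcast parameters.

```python
import numpy as np

fs = 4_000_000                 # sample rate for the demo (Hz)
n = fs                         # one second of signal -> 1 Hz FFT bins
t = np.arange(n) / fs
fc, fm = 1_000_000, 1_000      # 1 MHz carrier, 1 kHz test tone

carrier = np.cos(2 * np.pi * fc * t)
am = (1 + 0.99 * np.cos(2 * np.pi * fm * t)) * carrier  # 99% modulation

def bin_level(x, f):
    """Spectrum magnitude at frequency f, like one narrow analyser bin."""
    return np.abs(np.fft.rfft(x))[int(f)] / len(x)

print(bin_level(carrier, fc))  # unmodulated carrier
print(bin_level(am, fc))       # carrier bin is unchanged by the modulation
print(bin_level(am, fc + fm))  # the modulation shows up in the sideband
```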
Edited by MLXXX, 02 August 2013 - 12:01 AM.