Summary

Oak Technology's Peter Claydon analyzes the technical feasibility of adopting DVB-T COFDM for US broadcasting, arguing that existing ATSC tuner hardware could be reused and that chip availability from multiple vendors makes a transition practical. He addresses phase noise, impulse noise, 2K versus 8K carrier modes, and receiver development timelines, concluding that DVB-T could be deployed in 6 MHz US channels with minimal additional cost.

Source document circa 1997-2000, preserved as-is

Peter Claydon
Oak Technology Ltd

 

Which COFDM?

 


    COFDM comes in all shapes and sizes, and if the FCC were to allow COFDM to be used without specifying DVB-T, there could be a period of years before a system could be put in place. Technology marches forward, and if a completely new system were developed today, it would probably be an improved COFDM system. This is essentially what has been done in Japan with ISDB-T. Note, however, that Japan will not start broadcasting using ISDB-T until at least 2003, and there is no silicon available.

    Silicon availability is one of the key factors here. Chips don't just grow on trees: history shows that there is at least a year between the definition of a new standard and the date when the first chips are available, and the first chips (or chip sets) are usually expensive and perform worse than later generations. This was true for both ATSC and DVB-T. From the availability of chips, there is then an additional delay before products reach consumers.

    Here are some (technical) issues:

    1. There is a competitive market for DVB-T chips, with proven devices available from Motorola, LSI Logic and Oak Technology, new devices appearing almost as I write from companies such as Infineon and Philips (ex-VLSI Technology), and a raft of others in development. As far as I know (although I can only really speak for Oak Technology), all of these chips can handle 6 MHz channels as well as 7 and 8 MHz.

    2. Tuners. Currently, DVB-T receivers tend to use single-conversion tuners, whereas ATSC receivers use double-conversion tuners. The double-conversion type is a little more expensive, but is necessary if the entire frequency range from low VHF up to UHF is to be covered. Oak Technology successfully used an ATSC double-conversion tuner for capturing DVB-T data at Baltimore, and I think there is a very good chance that the existing tuners used in ATSC receivers could be used with DVB-T demodulator chips. Almost universally, double-conversion tuners have better phase noise than single-conversion ones, so although COFDM is more sensitive to phase noise than 8-VSB, using tuners designed for ATSC should work in its favour.

    3. Phase noise. A word of explanation following on from the above. The closer the COFDM carriers are together, the more sensitive a demodulator is to phase noise. In Europe, we have seen comparable performance between 2K and 8K carrier DVB-T systems in an 8 MHz channel, although the 8K carriers are spaced at about 1 kHz as opposed to about 4 kHz in the 2K system. If the bandwidth is reduced to 6 MHz, the carriers are closer together, but 2K carriers in a 6 MHz channel are still a lot further apart than 8K carriers in an 8 MHz channel (a rough carrier-spacing calculation is sketched after this list). I imagine that US broadcasters would adopt the 2K mode of DVB-T to start with, as 8K is only of any advantage in a Single Frequency Network (SFN).

    4. 2K or 8K DVB-T? I think that if the FCC does allow DVB-T, then it should specify that all receivers be capable of receiving all DVB-T modes. This would give broadcasters the flexibility to do what they want (standard HDTV broadcasting from a single tower, SFN or mobile); the second sketch after this list illustrates why the 8K mode suits SFNs. Most chips do all modes anyway, so there is no extra cost involved.

    5. Impulse noise. Although the first receiver on the UK market had problems with impulse noise, those problems have been fixed. It cannot be denied that DVB-T is inherently more susceptible to impulse noise than ATSC, and that broadcasting in the UK is only on UHF, where impulse noise is not so much of a problem; the approach used in DVB-T receivers, however, is to stop as much of the impulse noise as possible from reaching the demodulator in the first place. This is an issue which can only be resolved by experiment, preferably with field trials. Gary Sgrignoli's (sorry if I've spelt his name incorrectly) detailed ICCE paper provides an excellent source of information on the magnitude of the problem.

    6. Receiver development time. Most of the delay in bringing receivers to market is caused by software development, mainly related to interpreting the MPEG stream and putting the correct thing on the screen. Demodulator chips require very little software (in the case of my own company's, almost none), so provided nothing else was changed, swapping the ATSC/8-VSB front-end for a DVB-T front-end would be about as simple a change as could possibly be made. Some manufacturers would clearly take 6 months to a year over this, but the door would be open for those who could move quickly. I am not qualified to comment on the time taken to modify things at the transmitter end.

    7. NTSC co-channel interference. Oak's chip, and I believe others, does not differentiate between PAL and NTSC interference. Typically, these chips can tolerate PAL co-channel interference at 0 dB (an interferer as strong as the wanted signal), and there is no technical reason why they should not tolerate 0 dB NTSC.
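    On the carrier-spacing point in item 3, the small calculation below is my own illustration (not part of the original posting). It assumes the nominal DVB-T elementary period T = 7/(8 x bandwidth), so the figures it prints are approximate:

        # Rough sketch: approximate DVB-T carrier spacing for 2K and 8K modes.
        # Assumes the nominal DVB-T elementary period T = 7 / (8 * bandwidth);
        # the carrier spacing is 1 / T_U, where T_U = N_FFT * T.

        def carrier_spacing_hz(bandwidth_mhz: float, n_fft: int) -> float:
            """Spacing between adjacent OFDM carriers, in Hz."""
            elementary_period_s = 7e-6 / (8 * bandwidth_mhz)   # T, in seconds
            useful_symbol_s = n_fft * elementary_period_s      # T_U = N_FFT * T
            return 1.0 / useful_symbol_s

        for bw in (8.0, 6.0):
            for n_fft in (2048, 8192):
                print(f"{bw:.0f} MHz channel, {n_fft // 1024}K mode: "
                      f"~{carrier_spacing_hz(bw, n_fft):.0f} Hz between carriers")

    This prints roughly 4.5 kHz and 1.1 kHz for the 2K and 8K modes in an 8 MHz channel, and roughly 3.3 kHz and 0.8 kHz in a 6 MHz channel, which is the basis for saying that 2K carriers in a 6 MHz channel are still much further apart than 8K carriers in an 8 MHz channel.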
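    And on the SFN point in item 4, this second sketch (again my own, using the same nominal DVB-T timing and the standard 1/4 guard-interval fraction) shows why the 8K mode suits single-frequency networks: its longer useful symbol gives a longer guard interval in absolute time, so echoes from more distant transmitters still fall inside it.

        # Rough sketch: guard-interval length and the longest echo path it can
        # absorb, for 2K and 8K modes in a 6 MHz channel with a 1/4 guard fraction.
        # Assumes the same nominal DVB-T timing as the previous sketch.

        SPEED_OF_LIGHT_KM_PER_S = 3.0e5

        def guard_interval_us(bandwidth_mhz: float, n_fft: int, guard_fraction: float) -> float:
            """Guard interval (a fraction of the useful symbol duration), in microseconds."""
            elementary_period_us = 7.0 / (8 * bandwidth_mhz)
            useful_symbol_us = n_fft * elementary_period_us
            return useful_symbol_us * guard_fraction

        for n_fft in (2048, 8192):
            guard_us = guard_interval_us(6.0, n_fft, 1 / 4)
            echo_range_km = SPEED_OF_LIGHT_KM_PER_S * guard_us * 1e-6
            print(f"{n_fft // 1024}K mode, 1/4 guard: {guard_us:.0f} us "
                  f"-> echoes up to ~{echo_range_km:.0f} km stay within the guard interval")

    The 2K figure comes out at roughly 75 us (about 22 km of path difference), against roughly 300 us (about 90 km) for 8K, which is why 8K is the mode of interest for wide-area SFNs.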

    I'm sure that there are other issues, but I cannot think of them right now. I'd be happy to comment on any technical issues that other people raise on this subject concerning receivers. In view of comments I received on a posting a few weeks ago, I have done my best to address the issues in an unbiased manner, and I have tried to avoid making flippant comments, so I will be a little disappointed if this posting is treated merely as pro-COFDM propaganda :-).

    Peter Claydon
    Oak Technology Ltd
