A quiet revolution is taking place in the digital corner of the consumer electronics industry: the rapid adoption of the MPEG-4* based AVC (Advanced Video Coding) standard. Driven by the increasing commercial demand for more channels of programming at the same or improved video quality within service providers' limited bandwidth, AVC is the natural solution.

AVC's parent, MPEG-4, was adopted as an International Organization for Standardization (ISO) standard in 2000, and has two major attributes over its predecessor, MPEG-2: 1) Most importantly, much greater coding efficiency, allowing quality video in less bandwidth and resulting in more available channels. 2) Secondarily, the ability to encode text and graphics within the compressed bit-stream, facilitating synchronous interactivity without separate files.

Further improvements to MPEG-4 resulted in even better coding efficiency along with improved video quality. The result was the adoption of the MPEG-4 Part 10 variant as the new International Telecommunication Union (ITU) teleconferencing standard, H.264. Consumer electronics service providers essentially use the H.264 standard for AVC.

MPEG-4 is not simply an updated version of MPEG-2. Although it employs the same basic mathematical philosophies for motion picture data compression, it is really a "clean sheet" approach. Improved algorithms are applied to motion prediction, adaptive sampling, and other elements of visual compression to yield efficiencies of 50% to 60% over MPEG-2. This means that quality video can be obtained from bit rates as low as 50 kbps, generating a 176 x 144 pixel image at 15 fps, up to full HDTV (1920 x 1080/24p) at 7 Mbps. However, this super "squeezing" doesn't come without cost. The increased computational complexity of these algorithms requires additional computing power and memory.
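The two operating points quoted above can be sanity-checked with a little arithmetic. The sketch below is illustrative only (it assumes 24-bit raw RGB video for the compression-ratio comparison, which the article does not specify); what it shows is that both ends of the range spend a similar number of coded bits per pixel:

```python
# Back-of-envelope check of the AVC bit rates quoted above (illustrative only).

def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average coded bits spent per pixel per frame at a given bit rate."""
    return bitrate_bps / (width * height * fps)

# Low end: 176 x 144 (QCIF) at 15 fps from 50 kbps
low = bits_per_pixel(50_000, 176, 144, 15)

# High end: 1920 x 1080 at 24p from 7 Mbps
high = bits_per_pixel(7_000_000, 1920, 1080, 24)

# Compression ratio vs. hypothetical raw 24-bit RGB video at the HD point
raw_hd_bps = 1920 * 1080 * 24 * 24  # width * height * fps * bits per pixel
ratio = raw_hd_bps / 7_000_000

print(f"QCIF: {low:.2f} bits/pixel, HD: {high:.2f} bits/pixel, "
      f"HD compression ratio about {ratio:.0f}:1")
```

Roughly 0.13 to 0.14 coded bits per pixel at both extremes, against 24 bits per raw pixel, which is where the dramatic "squeezing" the article describes comes from.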
But these additional costs are being quickly mitigated by good ole Moore's Law.** As a result, AVC is being quickly adopted as the video compression system of choice, and is now being, or shortly will be, employed by the majority of US and European service providers.

Out front, and the most aggressive adopter of AVC, is Apple with its QuickTime 7 codec (encoder/decoder). Next comes Sony with its PlayStation Portable (PSP) and its upcoming PlayStation 3 (PS3). The PS3 application is driven by the requirement to decode HDTV Blu-ray discs. Indeed, both HDTV videodisc systems (Blu-ray and HD DVD) specify AVC as well as Windows Media 9 (WM9), in addition to MPEG-2. On the broadcast side, AVC is the basis for the emerging multi-channel HDTV DBS, cable, and DSL offerings, as well as Apple's video iPod system. Further, AVC is expected to be the codec selected for most of the world's mobile media systems.

It is interesting to note that the only US broadcast entities not participating in the AVC revolution are the traditional over-the-air broadcasters, i.e., the local TV stations. That's too bad. (See my article titled "LDTV" posted in HDTV Magazine, September 2005.)

It is quite possible that the built-in tuners of the next generation of HDTV television receivers will include an AVC codec to facilitate cable "plug and play" capability. But this is not advisable, as the overall HDTV and DTV signal delivery and processing technology is advancing much too rapidly to track with the comparatively long life of today's HDTV display systems. (See my article titled "Boxes" posted in HDTV Magazine, July 2005.)

The benefit to HDTV viewers of high-quality, efficient compression systems such as AVC is more channels of HDTV programs from the major television service providers. But will this ever-increasing stream of technology ever end, or at least be stable for any reasonable period of time? I don't think so, and I hope not.
For here comes MPEG-21, which adds digital rights management (DRM) solutions to the compressed digital stream. This is necessary to assure the continuing availability of high-quality HDTV programming content. It will only get better. Ed

* MPEG - Moving Picture Experts Group

** Moore's Law is the prediction that the number of transistors that can be placed in an integrated circuit of any given size will double every 18 to 24 months. That is why microprocessors and other integrated circuits become cheaper and more efficient year by year.
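The doubling described in the Moore's Law footnote compounds quickly. A minimal sketch of that arithmetic, taking the footnote's two endpoints (18 and 24 months) as the assumed doubling periods:

```python
# Compound growth implied by Moore's Law as stated in the footnote:
# transistor counts double every 18 to 24 months.

def growth_factor(years, doubling_months):
    """How many times transistor counts multiply over a span of years."""
    return 2 ** (years * 12 / doubling_months)

# Over six years: an 18-month doubling gives 16x the transistors;
# a 24-month doubling still gives 8x.
fast = growth_factor(6, 18)
slow = growth_factor(6, 24)
print(f"6-year growth: {fast:.0f}x (18-mo doubling), {slow:.0f}x (24-mo doubling)")
```

An 8x to 16x jump in transistor budget over six years is what makes the added computational cost of AVC decoding a shrinking concern.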