I guess it's a sign of the times. With TV shows like "Survivor" and "The Bachelor" espousing the virtues of deceitful behavior and getting big ratings, why should I be surprised that specifications from A/V manufacturers occasionally stretch the truth? I'm not, and they do... and they do it on an ever more frequent basis. Still, one has to draw the line somewhere. When they sold us "1080p" sets that wouldn't accept a 1080p signal, when they advertised (and sold) tens of thousands of "ED" sets as "HDTV compatible" to consumers, many of whom to this day still think they are watching high definition images on those screens, and even when they told us the new CMS (Color Management System) controls would give us "better reds" (there is only ONE correct red), I chalked it all up to "aggressive" marketing. More recently, Samsung sold two generations of BD players knowing full well that they wouldn't play many recently released BD discs. I consider that the most egregious fraud ever perpetrated on the CE public. Thankfully, they were rewarded with a class action suit. But none of these is the prevarication I'm after today.

Today I'd like to talk to you about CONTRAST RATIO and why almost any number offered to you on a spec sheet is meaningless and intentionally deceptive. Indeed, contrast ratio IS all-important, which is why marketeers jack the numbers up beyond anything meaningful (or measurable). Several double-blind tests in recent years have confirmed that uninitiated viewers (read that: non-HDTVMagazine subscribers) will pick, out of a large sample of sets, the TV that in fact has the best contrast ratio (CR), eschewing other important metrics like color saturation, color accuracy and resolution.

Technically, CR is the light output at peak white (100 IRE) divided by the light output at black (0 IRE). Empirically, it translates to what we often call image "pop". Some liken it to a 3-D effect. It's the dynamic range of video, the counterpart to 20 to 20,000 hertz in the audio world. It's also the reason good (low) black levels are so important: the denominator has a lot of leverage in the CR fraction.

Two years ago, at CES in Las Vegas, I was appalled to see Sharp claiming a CR of 1,000,000:1. Today, I've seen claims north of 10,000,000:1. Here's why, as my Texan friends would say, that dog don't hunt. When questioned by knowledgeable reviewers, manufacturers defend these numbers by describing the method by which they were obtained. The numerator (peak white) is often measured with the brightness and contrast controls turned all the way up and, truth be known, probably a line voltage of about 135 volts applied to the power supply to squeeze every last lumen out of the lamp. In contrast (pun intended), measuring the denominator (black level) involves a pitch-black room and pulling the display's power plug out of the wall (I actually had one major manufacturer tell me they measured an unused input)!

Obviously, neither of the above conditions has anything to do with the way you or I would watch a movie. If the display were left in the settings used to measure either side of the CR fraction, the set would be unwatchable. Even so, let's do some high school math. The highest post-calibration light output you are ever likely to see is about 70 to 75 foot-lamberts - probably from a D-ILA front projector suited for a commercial theatre, but focused on a much smaller window than a commercial screen.
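You can run the division yourself. Here's a minimal sketch in Python; the 70 foot-lambert peak and the claimed ratios are just the illustrative numbers from this article, not measurements of any real display:

```python
# Sanity-check a contrast ratio claim: CR = peak white / black level,
# so a claimed CR implies a black level of (peak white / CR).
# The 70 ft-L peak and the claimed ratios are illustrative only.

PEAK_WHITE_FTL = 70.0  # about the best post-calibration peak white

for claimed_cr in (1_000_000, 10_000_000):
    implied_black = PEAK_WHITE_FTL / claimed_cr
    print(f"{claimed_cr:>10,}:1 implies a black level of {implied_black:.8f} ft-L")

# Output:
#  1,000,000:1 implies a black level of 0.00007000 ft-L
# 10,000,000:1 implies a black level of 0.00000700 ft-L
```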
If this device truly had a 1,000,000:1 CR, the black level would have to be an immeasurable .00007 foot-lamberts. The very best equipment we have available today to measure light output goes down to about .005 foot-lamberts, and at that level the accuracy is about plus or minus 100%.

For the record, the only honest way to measure CR - the way you would experience it watching a movie - is to put up a full-screen checkerboard test pattern, take the average light output of the two center black squares, and divide that number into the average light output of the two center white squares. This accounts for the audacity some cinematographers have of putting light and dark images on screen at the same time. The measurement, of course, requires a precision light meter, and the quotient on a very good display will be in the vicinity of 600:1 to 800:1. Even 400:1 or 500:1 is pretty good. This method (often called modified ANSI) is the one most often used by competent reviewers, and was even adopted by Runco, a leading maker of high-end front projectors, as it most closely approximates a real viewing experience (a sketch of the arithmetic appears at the end of this piece). Hey, isn't that all that really matters?

Other examples of untruths in print are easy to find, even from otherwise "credible" sources. Consumer Reports once stated "stick with low cost cables - you won't likely see a difference", which simply isn't true if you have good equipment. You don't need the $100/ft. stuff, but Monster grade 2 or equivalent WILL improve the picture. A recent article in USA TODAY offered a "trick" to improving color out of the box: "turn contrast almost all the way up and brightness up to at least half way". Contrast and brightness are, of course, luminance controls and have nothing to do with color. Patently wrong statements like these surely have more to do with ignorance than deceit, but they are just as dangerous to home theatre newbies.

Perhaps one day we can hope for a new wave of truth-in-advertising legislation to protect the masses, but the chances of that happening any time soon are, well... 1,000,000:1.
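A postscript for the measurement-minded: here's a minimal sketch, again in Python, of the modified ANSI arithmetic described above. The function name is my own shorthand, and the meter readings are hypothetical stand-ins for what a precision light meter might report from the four center squares of the checkerboard - not measurements of any actual display:

```python
def modified_ansi_cr(white_readings, black_readings):
    """Modified ANSI CR: the average light output of the two center
    white squares divided by the average of the two center black
    squares (all readings in foot-lamberts)."""
    avg_white = sum(white_readings) / len(white_readings)
    avg_black = sum(black_readings) / len(black_readings)
    return avg_white / avg_black

# Hypothetical meter readings from a very good display, taken off a
# full-screen checkerboard test pattern:
whites = [34.2, 33.8]    # two center white squares, ft-L
blacks = [0.048, 0.052]  # two center black squares, ft-L

print(f"Modified ANSI CR: {modified_ansi_cr(whites, blacks):.0f}:1")
# -> Modified ANSI CR: 680:1
```

Notice, once more, how much leverage the denominator has: halve those black readings and the ratio doubles, which is exactly why honest black-level measurement matters.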