California Monkey Wrench

Started by Dale Apr 12, 2006 21 posts
Read-only archive
#1 (edited May 17, 2006)
<em>I had planned on bringing you the details of the new Mitsubishi line tonight, but this story (see below) broke today and is important enough to preempt the product discussions. I am working on an expanded version of this press release, which will be posted on our site tomorrow. The reason this story is important is not that the consumer electronics industry cannot in the end meet the demands, but that the trade-offs for doing so can backfire in several ways. First, the people needing these final transition boxes may be unhappy with the performance of lower-power designs; second, the clock is ticking. It is hard to redesign and bring a product to market in time for the February 17, 2009 deadline that ends analog broadcasting. So, tomorrow, all things willing, I will bring you both my comments and research on this story, and then a discussion of the products from Mitsubishi.</em> _Dale Cripps

Read the Full Article
#2
I guess they are bored with all of the rain we are getting here. All they will do is open the door to eBay and mail-order channels for getting these devices. The consumers won't care if they meet energy requirements or not, and the government won't be able to stop it.
The amount of energy that they would save in a year is insignificant compared to the amount that is wasted by them concocting such laws or the analog stations all broadcasting their redundant signals.
I haven't seen the whole article, but how much energy can these things use? I know they don't exist yet, so it isn't easy to answer, but someone must have an idea how much added energy use there will be when they are turned on in addition to the TV set.
Energy Star. Aren't they the ones that made it possible for my computer to remain partly on all the time burning up energy when I think that it is off?
Next... :shock:
#3
Bold headline. Except these regulations have been on the books since 2004 and the latest actions on the part of the California Energy Commission that I see were to delay the effective dates by 6 months. Did the CEA just wake up from a long nap?

The article clearly wants to instill fear while being short on facts. What specific requirement is the CEA having issue with? 3 years is a looooooong time to respond to a power supply redesign issue, IME and IMO.

BTW, it's projected in a Federal study that digital STBs will account for 4% of US power consumption by 2010. That's not an insignificant number, hence the reason for power consumption standards on both the Federal and State levels.

From what I know about the issue, it has nothing to do with the tuner performance of the boxes, but rather the power used during standby mode which makes up 75% of the time the DTV adapter is plugged into the wall. Claiming it will somehow result in poorer performance is not accurate.
#4
First, it's not unusual for some aspects of a regulation to go unnoticed for many months.

Next: 4% for all digital set-top boxes (DSTBs), so I assume the way it's worded that means *all* of them, including my satellite TV receiver, cable decoder, and anything associated with the addition of HDTV. Knowing stats, they are probably calling the DVR, the DVD recorder/player, and VHS recorders STBs as well. To me, we need to separate out the digital-to-analog (D/A) converters, which are what we are talking about here, and I think those are going to be a small percentage of the STBs.

But let's look at power. It should be safe to assume that a simple D/A converter is going to take considerably less power than, say, my satellite receiver, which draws a maximum of 35 watts and typically runs at half that. If the D/A converter is full-featured (IOW, it downloads menus and other information as my satellite receiver does), then it needs to remain on most of the time. If I leave the satellite receiver unplugged for more than a few minutes, it has to reboot when it's plugged in, and that can take 3 to 5 minutes.

As to that 4% power figure, we need to stop and think a bit. The typical new TV draws on the order of 150 to 200 watts, with the old, large CRTs drawing much more. I have some large CRT computer monitors that draw that much. So if we figure a continuous STB power use of, say, 20 watts when the peak is 35 watts, and the typical TV draws 10 times that much, then the TVs are going to be drawing 40% of the power off the grid on top of the STBs' 4%? That seems a bit unrealistic to me.
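For anyone who wants to check that back-of-the-envelope reasoning, here it is as a quick Python sketch. All the figures are the assumptions from the posts above (the 20 W continuous STB draw, the ~200 W TV, the 4% federal projection), not measurements:

```python
# Rough sanity check of the ratio argument above.
stb_avg_watts = 20        # assumed continuous draw of a converter (peak 35 W)
tv_avg_watts = 200        # assumed draw of a typical new TV, ~10x the STB
stb_share_of_grid = 0.04  # the 4% figure from the cited Federal study

# If TVs draw ~10x what STBs do, and STBs are 4% of consumption,
# the same logic would put TVs at ~40% of the grid -- the number
# the poster finds unrealistic.
implied_tv_share = stb_share_of_grid * (tv_avg_watts / stb_avg_watts)
print(f"Implied TV share of grid: {implied_tv_share:.0%}")
```

The sketch only shows that the 40% figure follows from those assumed wattages; whether the assumptions themselves hold is exactly what the thread is arguing about.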

BTW, when my computers are off, they are off. Nothing is running, and a battery keeps the BIOS settings. OTOH, they are typically running 24 x 7.
#5
Roger, I'll let you read the piece for yourself. In the first page or two they define what "STB" was for this particular study.

http://www.iea.org/textbase/papers/2004/am_stb.pdf

Without seeing more on what specifically the CEA is complaining about, it's hard to determine if the complaint has any merit or is just a veiled attempt to allow sloppy STB designs into the market. I'm all for designing more power efficient, savvy code equipped hardware. I think there is too much "throw it over the fence" mentality when it comes to what the consumer electronics manufacturers put out for product these days.

It should also be noted that countries like the UK and Australia are addressing these very same issues today. You can find some of their legislative efforts with a Google search.
#6
As a sidebar to these proposed regs: Motorola made some comments in response to the Legislature's solicitation, and their only objection seemed to be with the wording that required testing at both 115 V and 230 V. Maybe they saw an unintentional requirement for dual power supplies? Later revisions of the proposal (Feb. '06) seemed to address their concerns.

I find that sorta interesting because, IME, Motorola is not exactly known for efficient power budgeting with their products. My Comcast box (Moto 6412) is known to have marginal cooling issues and resulting performance problems. If you design these products from the get-go with better power management and budgeting, some of these problems will go away. It's that penny-wise, pound-foolish thing biting them on the butt.
#7
Without seeing more on what specifically the CEA is complaining about, it's hard to determine if the complaint has any merit or is just a veiled attempt to allow sloppy STB designs into the market. ...

Virtually all DVB-T COFDM receivers sold in the UK can meet the California energy usage requirements today. The CEA tried to sabotage the California energy requirements in Congress in the recent hearings on the DTV transition and the need for a converter box.

The CEA attempt is not a veiled attempt, more like a blatant attack on efficiency at the behest of some of their members who would rather sell us junk receivers for the US junk modulation, 8-VSB.

The main cause of delay in the US digital transition is our lousy 8-VSB modulation. Where was the CEA when 8-VSB was being chosen, and later when it was challenged as being inadequate?

The UK has now sold over 11 million energy-efficient COFDM STBs in the last three years, and sales are now accelerating. In January, the slowest sales month of the year for STBs, they were selling at a 70,000-a-week clip. In the US that would translate to 420,000 per week, or almost 22 million a year. And the UK has no mandate. UK citizens freely and enthusiastically buy COFDM receivers because they work plug and play. They are not foisted on them with a mandate that saddles 75% of the public with 8-VSB receivers they did not want and do not need.
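The UK-to-US scaling in that paragraph works out as follows (the 6x market-size multiplier is the implicit assumption in the post, not an official figure):

```python
# Scaling the UK sales figure to the US, as in the post above.
uk_weekly_sales = 70_000   # January UK sales rate cited in the post
us_scale = 6               # assumed: US market is roughly 6x the UK's

us_weekly = uk_weekly_sales * us_scale   # 420,000 per week
us_yearly = us_weekly * 52               # "almost 22 million a year"
print(f"{us_weekly:,} per week, {us_yearly:,} per year")
```

52 weeks at 420,000 gives 21,840,000, which is where the "almost 22 million" comes from.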

The CEA has got it all wrong from the beginning for the US public and for their members.
#8
I am always puzzled by those who say that COFDM is the reason other nations are doing so well in SDTV and we are doing so poorly in HDTV when our HDTV market is the most successful in the history of consumer electronics. Sure it got off to a bumpy start. While there are glitches to this very day the pain of a difficult start is well behind us. Ask the retailers. They are showing record years in video sales due entirely to H/DTV.

Your comment that the COFDM DVB-T box meets the California spec of 1 watt standby and 8 watts active is correct. It should be added that the DVB-T box does not decode 19.2 Mb/s MPEG-2 signals, nor does it have the GEMSTAR program guide, which we manage in standby.

I do think that had we the luxury of 7 and 8 MHz channels, the COFDM choice might have been attractive. But we operate, and were mandated to live, in a 6 MHz channel. I was there when COFDM was first brought forward at the IBC in Amsterdam and knew all of the U.S. team who were assigned the task of evaluating it. Some of those people have said in later years that given the wider bandwidth, such as in Europe, it would have been a good choice, but none have said that 8-VSB was a decidedly bad choice, though all were disappointed in the early iterations of the hardware. It has its trade-offs, and in the U.S. environment, where reach is a stated importance, it serves...how well
#9
The COFDM vs. 8-VSB argument is decided and dead. Bob, I give you the persistence award, but that dog won't hunt. We won't be switching to COFDM in the US. Period. End of story.

I totally disagree with the idea of a subsidy for power supply design! If a vendor wants to compete in this space, then pony up the R&D and play the game like all the rest. Forget the gov't handout idea. Are these companies expecting handouts on the front end also planning to hand over their profits on the back end when they get to market? I'm guessing no. If you don't believe in capitalism and the inherent risk-taking, then sit on the sidelines and let someone else do it! But don't sit there and whine that you can't make enough money off the deal, because I can assure you that someone else will.

We're not talking about designing a space shuttle here. 3 years to get it done is plenty of time, IMO and IME. The fact that only selected vendors are complaining about the timetable should tell you something. This is not a "one time to market" thing, and if they are using this excuse as justification for some subsidy, they should be slapped!! 5 years is the average usable life expectancy for electronics. Future-generation product will be needed. Using taxpayer dollars to fund R&D now that the Corporation purely profits from later is flawed in so many ways it makes my blood boil. The $200 M "admin" fee is precisely the sort of welfare situation the program was sure to spawn once subsidy was even brought up. Another case of the gov't doing it all wrong. If they had let the private sector figure it out, I can assure you that 50 million STB units would get someone's attention enough that a power-efficient design that still made the company a profit would be forthcoming. The gov't should set the standard with reasonable input from the private sector and then GET THE HELL OUT OF THE WAY.

Just another demonstration of how the lobbyists manage to screw it all up and in the end the taxpayer simply pays for Corporate greed.
#10
Here is the problem with applying free market thinking throughout this particular situation:

WHAT IF THERE IS NO MARKET?

THAT IS WHY A SUBSIDY IS IN READINESS.

The subsidy is there as a reserve in the very likely event that there is no market demand for the products that need to be installed on existing analog sets still dependent upon over-the-air signals, so that the shutoff of analog frequencies can occur without severe political "noise" or repercussion.

As far as being competitors, these companies compete all day long every day of the week with all kinds of products that we want. To label them as non-competitive seems short sighted.

The problem we anticipate is in the last phase of the transition, in the form of non-responsiveness from the poorer markets. There is a market segment of over-the-air television viewers who neither care about nor want digital television services at any price. This segment will not, and often cannot, do anything about the transition on their own using their own very limited funds. So they will NOT demand (the first cause of a free market) any part of this final outfitting unaided.

How do you gear up a bunch of competitors for such a non-lucrative market? You don't.
#11
Bold headline. Except these regulations have been on the books since 2004 and the latest actions on the part of the California Energy Commission that I see were to delay the effective dates by 6 months. Did the CEA just wake up from a long nap? ...

If you go to the following link:

http://www.energy.ca.gov/appliances/documents/2006-01-30_WORKSHOP_TRANSCRIPT.PDF

You will see a meeting taking place on Monday, January 30, 2006. The PDF file is a bit long, but if you go to page 199 and read on from there, you will see that it was a second chance to fix an impossible standard.

The estimate for the number of homes in California is 12,507,767 as of 2002. In the meeting notes, the estimate was that 13% of homes use just an antenna; that comes to around 1.6 million California homes needing DTV adapters. See:

http://quickfacts.census.gov/qfd/states/06000.html

Assume that the power savings is 7 watts per home; that comes to an 11.2-megawatt savings. This sounds like a big number until you consider that when ALL analog TV stations shut off their transmitters, that is going to save a lot more than 11.2 megawatts. Also, given that the estimated life of the adapters is about 7 years (see the meeting notes), why spend all the extra money for something that won't be around that long?
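The arithmetic behind those figures is straightforward; here it is as a quick Python check (the 13% antenna share and the 7-watt savings are the meeting-notes estimates cited above, not measured data):

```python
# Back-of-envelope check of the California savings estimate.
ca_homes = 12_507_767            # Census estimate for CA homes, 2002
ota_share = 0.13                 # meeting-notes estimate: antenna-only homes
watts_saved_per_home = 7         # assumed per-adapter savings

ota_homes = round(ca_homes * ota_share)           # ~1.63 million homes
savings_mw = ota_homes * watts_saved_per_home / 1e6
print(f"{ota_homes:,} antenna-only homes, ~{savings_mw:.1f} MW saved")
```

Using the exact Census count gives about 11.4 MW; the post's 11.2 MW comes from first rounding down to 1.6 million homes. Either way, the order of magnitude is the same.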

Please take the time to read some of the notes from the meeting; there are a lot of complex problems created by the 8-watt-on / 1-watt-standby standard.

See also:


http://www.energy.ca.gov/appliances/documents/2006-02-10_CEA_DT_ADAPTERS.PDF

and

http://www.energy.ca.gov/appliances/documents/index.html


Bob Diaz
#12 (edited Apr 14, 2006)
Good info, Bob. But personally I'll take the techie over the lobbyist when it comes to objectivity. You'll note he was fairly specific about how the stream can be demodulated and decoded with minimal parts: $24 for the BOM. That should quiet the guys insisting that these boxes are somehow expensive and complex.

The CEA guy wanted to go on and on about current STBs (where are these, btw?), and kept harping on HDTV. You can't compare the currently used STBs any more than you can use the Pace example they cited, because it's apples and oranges. Current STBs are dated technology-wise and overly complex for the intended task. As the consultant noted, you pick the simplest low-power single-chip solution you can find to decode. In fact, given the parameters laid down, it's not that difficult to spin an existing design, bring it up to current silicon technology, and reduce power consumption there even further. When there are potentially 40 or 50 million units to be produced, spending a couple of million to rework a chip is peanuts in comparison.

7 watts per household isn't exactly correct, since the average number of TVs per household is, I believe, ~2.5 units. Don't hold me to that, though.
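If that ~2.5-sets-per-household recollection is right, and if every set got its own adapter (an assumption that surely overstates things, since cable- and satellite-fed sets need no adapter), the earlier 11.2 MW estimate would scale roughly like this:

```python
# Hypothetical scaling of the one-adapter-per-home estimate.
ota_homes = 1_600_000       # rounded antenna-only CA homes from the thread
sets_per_home = 2.5         # the poster's "don't hold me to that" figure
watts_per_adapter = 7       # assumed per-adapter savings, as before

savings_mw = ota_homes * sets_per_home * watts_per_adapter / 1e6
print(f"~{savings_mw:.0f} MW if every set gets an adapter")
```

That would put the ceiling nearer 28 MW than 11 MW, though the true number lies somewhere in between.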

No long-term use for the product, according to the CEA. Except the digital tuner mandate has already been botched and you can still buy analog sets today. How many 15-20 year old TV sets still in service do you know of? I know of several. The point being, these converters will have a longer lifespan than the short one the CEA would have you believe.

The CEA literature you linked to states as a bullet:

"Industry dialog has already disproved the outdated 1W/8W"

Apparently not written by the two CEA guys who attended that meeting or perhaps their hearing is a bit selective?

Newer chip designs with equal functionality can REDUCE power requirements, so what flawed logic are they using to suggest lower power requirements are outdated? Did you read that within the same legislation are new, lower power requirements for all sorts of devices? So while technology marches forward on other consumer products and machinery, the CEA would have you believe high tech is going in reverse???

Sorry, not buying it. I work with EEs on a daily basis, including some working in this specific field.

Why do you think the Motorola comments I mentioned earlier had nothing directed toward the power requirements? Think they're hiding behind the CEA? Asking because I'm trying to understand who's really pushing the agenda.

EDIT: LOL, the more I read the CEA stuff, the funnier it gets. Read the bullet about 5x in the datastream for HD. And they care about HD for this adapter for what reason?? You're converting for legacy analog sets. IOW, there will be no need for a power hungry, complex decoder chip.

Test data from one retailer?? A little short on facts there. No engineering or economic reason for producing a higher-power device? Oh, how about a CEA member sitting on a bunch of dated, power-inefficient component inventory?? Is that a possibility?

Pretty easy to see through this stuff.
#13
Dale, I'll make this sound more simplistic than it is, but let's assume 40-50 million of these boxes are required and there's a profit of $20 per unit to be made.

You still want to insist no one will step in to fill that need?

Especially since the R&D required or "barrier to entry" is minimal. All of the technology already exists, it's simply a matter of shaving the few percentage points of profit out of the design and running with it.

This perception that we're somehow re-inventing the wheel to make this thing happen is wrong, wrong, wrong. Where I get pissed is the $1 B subsidy and immediately hear that 20% is used for admin costs. That's ridiculous and the poster child for how the gov't can screw up a subsidy program. It should have been free market from the start.

I don't know who the CEA is fronting for, but I say let 'em eat cake! There are players who will step in to fill the need, one time product or not.
#14
To Bob Mankin,

There are too many things you covered for me to address everything, so I'll just focus on the engineering details of building a low power HDTV Tuner.

First, the COFDM system used in Europe is a dual-speed system. That is, the HD (High Definition) image is sent using a high-speed data rate and an SD (Standard Definition) image is sent using a lower-speed data rate. The idea behind this is that if the signal is weak or the noise level is high, it's easier to receive the signal with the lower data rate. So, viewers on the fringe of the signal could at least see an SD image.

This also makes building a converter box a lot easier. The high-speed data is ignored and only the lower-speed data is used. There is no need to convert from HD to SD, because the lower-speed data is already in SD. The number of pixels required per second is: 720h x 576v x 25 frames per second = 10.37 mega-pixels per second.

With the US 8-VSB, there is NOT always an SD image being sent. A converter box must deal with 1 of 18 possible resolutions and frame rates. If you watch CBS, NBC, or PBS, they could be sending 1920h x 1080v x 30 frames per second = 62.2 mega-pixels per second.

With COFDM there are fewer pixels to process and no conversion is required. With 8-VSB, there are a lot more pixels to process AND conversion from HD to SD is required.
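The pixel-rate comparison above can be checked in a couple of lines (the resolutions and frame rates are the ones given in the post):

```python
# Pixel rates from the post: PAL SD vs. 1080-line HD at 30 frames/s.
sd_rate = 720 * 576 * 25     # 10,368,000 pixels/s (~10.37 mega-pixels/s)
hd_rate = 1920 * 1080 * 30   # 62,208,000 pixels/s (~62.2 mega-pixels/s)
print(f"HD carries {hd_rate / sd_rate:.1f}x the pixels of SD")
```

The HD stream carries exactly 6x the pixel rate of the SD stream, which is the data-handling gap the argument turns on.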

With a COFDM converter, any data processing can be performed at a lower clock rate compared to 8-VSB. The higher the clock rate, the more power a device, even a single-chip device, will consume. Add to that the additional processing required for conversion from HD to SD, and any EE should be able to understand that an 8-VSB converter WILL consume more power than a COFDM converter.

When the consultant used the COFDM converter as his example, he was using flawed logic. Likewise, when he pointed out that there are 8-VSB USB tuners that consume 1 to 2.5 watts, this is REALLY deceptive. The USB devices have the tuner and decoder, BUT lack the power hungry processing that follows. That processing is performed on the computer and the computer power requirement is NOT being added to the total power required.

In order to reach the $50 to $60 price point for converters, manufacturers will need to use as many single chip solutions as possible. In general, the higher the part count, the higher the price.

I really doubt that these manufacturers have some huge inventory of older, outdated boxes they are going to shove onto the market. First, to sell them at a $50 to $60 price point would be to take a huge loss, and the manufacturers swore before Congress that the future converters will be around that price. Second, they said in the meeting that the boxes could go as low as 15 watts, but none went under 10 watts. That sounds like they are running the prototype converters with as many single-chip solutions as possible. Third, NONE of the current 8-VSB HDTV tuners out there offer an RF output. The RF output is REQUIRED by the FCC and Congress; so, dumping old inventory is not an option for manufacturers.


In reading your comments, it appears to me that there are several things you are unaware of:

First, as of March 1, 2006, ALL TVs with screen sizes 24" and larger must have a digital tuner built into it. You will still find stores with 24" and larger TVs with an analog tuner, because retailers are allowed to sell off existing inventory.

Second, as of March 1, 2007, ALL TVs of any size must have a digital tuner.

Third, the first 4 generations of 8-VSB HDTV tuners were pathetic performance-wise. Unless the signal was perfect or nearly perfect, the HDTV tuner wouldn't work. NO ONE was going to commit to a single-chip design with such poor performance. The fifth-generation tuners finally gave the performance required, but there was a delay before this could be put into a single-chip design. It is only recently that we have seen the sixth-generation 8-VSB HDTV tuners using single-chip designs.


The troubling part of the energy requirements is that a bad decision was made using bogus information. If you don't believe what I have written here, show this message to the EEs you work with and ask them if what I have written is correct or not. Does a faster clock speed to process more data result in a higher power draw or not? Any EE or tech knows the answer.

With an impossible requirement, expect about 15% of California homes to be left in the dark when analog TV is shut down.


Sincerely,

Bob Diaz
#15
Bob, not sure where the idea got started that COFDM is lower power, when in fact COFDM requires MORE power to transmit equal distances when compared to 8VSB. As for the reception circuit, show me a completed circuit and let's analyze the power budget. You're making the circuitry sound MUCH more difficult than it really is in practice. Until then, we have the word of what appears to be an independent engineer vs. a lobbyist organization. A bit of a stretch to say that's an objective discussion taking place.

The suggested end product will not be a "single chip" solution. That term is overused in this discussion. For signal integrity reasons, the tuner chip will most likely remain separate from the decoder chip. It's not important, as it will not significantly impact the final cost.

When speculating on obsolete parts, I was referring to components, not completed units. No one in their right mind is sitting on billions of dollars in completed pre-spec hardware.

I'm well aware of the tuner phase-in. The problem is that all the way into 2007 you will still be able to purchase a brand-new analog set that in two years will need additional hardware to continue to function. This is the fault of the government for not mandating the tuner requirement further in advance of the hard-date switchover to digital.

The entire COFDM argument centers around multipath rejection. That was based on the first gen tests. 8VSB has closed that gap. We have 8VSB here for the standard, so let's work towards developing it further rather than holding onto the past. That ship has sailed.

The requirement is not what I consider impossible. Apparently Motorola doesn't consider it impossible because they made no mention in their comments. It was noted early in the discussion that only a subset of vendors were complaining. That should tell you something.
#16
To Bob Mankin,

Please read the following:

DVB, Sinclair to Demo COFDM in 6MHz Channel at NAB2000
3/30/2000 Sinclair Broadcast Group (Baltimore), in conjunction with DVB and Acrodyne (Blue Bell, PA), will conduct a demonstration at NAB 2000 transmitting a dual service of HDTV and SDTV using COFDM in a single 6 MHz channel. ...

What is Hierarchical Modulation?

In hierarchical modulation, two separate data streams are modulated onto a single DVB-T stream. One stream, called the "High Priority" (HP) stream is embedded within a "Low Priority" (LP) stream. Receivers with "good" reception conditions can receive both streams, while those with poorer reception conditions may only receive the "High Priority" stream.

Broadcasters can target two different types of DVB-T receivers with two completely different services. Typically, the LP stream is of higher bit rate, but lower robustness than the HP one. For example, a broadcaster could choose to deliver HDTV in the LP stream, while delivering an independent SDTV service in the HP stream.


Hierarchical Modulation


The European DVB-T standard includes a large number of transmission modes able to adapt the COFDM signal to a wide variety of broadcasting services. Among them, the hierarchical modulation mode separates the RF channel into two virtual channels, each able to carry a transport stream (MPEG-TS) with a dedicated protection.

Some countries intend to use DVB-T to broadcast HDTV; accordingly they have a need to simulcast the DVB multiplex in its high and standard definitions, to address the two categories of receivers which will be used during the introduction phase of Digital TV services,...


http://www.broadcastpapers.com/whitepap ... chical.pdf


There are many other links I could show you, but most are very technical and they all say about the same thing. The bottom line is that in Europe you will find a dual-speed (hierarchical) transmission system with the same program being sent in both HD and SD at the same time.

It is an engineering fact of life that HD video has MORE data to deal with than SD video. In processing data, either the clock speed must be increased to deal with the increased data, which increases power consumption, _OR_ parallel circuits must be added to deal with the increased data, which ALSO increases power consumption. This simple fact was ignored by the consultant when he stated that in Europe COFDM receivers operate with 8 watts of power.
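That clock-speed point is usually stated via the standard CMOS dynamic-power approximation, P = C x V^2 x f. A quick sketch with purely illustrative numbers (the capacitance, voltage, and clock figures below are made up for the example, not taken from any real decoder chip):

```python
# Standard CMOS dynamic-power approximation: P = C * V^2 * f,
# where C is switched capacitance, V supply voltage, f clock frequency.
def dynamic_power(c_farads, v_volts, f_hz):
    return c_farads * v_volts**2 * f_hz

# Illustrative values only -- not from any datasheet.
p_slow = dynamic_power(1e-9, 3.3, 27e6)      # an SD-rate clock
p_fast = dynamic_power(1e-9, 3.3, 6 * 27e6)  # 6x the clock for HD-rate data
print(f"Power scales {p_fast / p_slow:.0f}x with a 6x clock")
```

At fixed voltage and capacitance, dynamic power scales linearly with clock frequency, which is the heart of the argument that an HD-capable decoder draws more than an SD-only one.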

Just use "COFDM hierarchical modulation" in any search engine and you'll be able to read LOTS of pages about this feature. Hierarchical Modulation was specified for E-VSB for the USA, but currently no one is using it.

I am not trying to make this sound overly complex; I am just trying to show you that there is a BIG difference in data rates and receiver power consumption between the two systems.

You put a great deal of trust into the consultant's comments, but he has forgotten to take into account critical details and his information on 8-VSB USB tuners is VERY misleading at best.

Apparently Motorola doesn't consider it impossible because they made no mention in their comments.


That's a mighty big ASSUMPTION. Motorola never said, "8 watts, no problem." You are reading more into it than what was there, and it is also possible that, because they knew the CEA was covering that issue, there was no need to repeat the same thing again.

It was noted early in the discussion that only a subset of vendors were complaining. That should tell you something.


I do not consider the CEA group "a subset of vendors". Also, the NAB (National Association of Broadcasters) has spoken out against the California power requirements. Given that many times both groups have been on opposite sides during the DTV transition, it should say that this is more than just "a subset of vendors" or a bunch of marketing people who don't know what they are talking about.

SEE:
http://www.tvtechnology.com/dailynews/one.php?id=3852

I have no love for the CEA, which has worked as hard as possible to stop the DTV tuner mandate. However, in this case, I understand the engineering behind the problem and agree that an 8-VSB converter must require more power than a COFDM DVB-T converter.

Bob Diaz
#17
Bob, adding EPG features to an otherwise unsophisticated box will require more complex software, memory and some form of battery backup to maintain the guide. I believe that is talked about in the testimony. Understanding that, is there really any mystery why the NAB would be crying that they don't get to make your life more "convenient" with bloated enhanced guide code embedded in the box? Next it'll be someone complaining they can't do VOD with 8W/1W. You have to consider their motivations because when anyone suggests their viewers will simply be cut off if they don't have their way, I tune out immediately because I know it's pure BS.

Rather than continue to argue the supposed merits of COFDM, let's keep the discussion limited to 8VSB. Show me a bare bones box circuit and let's do the math. It's all speculation beyond that.

If Motorola is going to the trouble of writing and submitting a response, wouldn't it make sense they would raise the flag if they had issue with the new power spec? I made that assumption, but somehow your reverse logic holds more merit? How so?

From your tvtechnology.com link:

"...The associations said that consumer electronics manufacturers 'are committed to energy conservation' and that new technologies combining multiple operations in one box will help reduce energy consumption."

From that statement it sounds like they want to rewrite what Congress intended and offer more features in the box than are really needed, just as I speculated above. It's no wonder they don't like the low power requirement. Their concern has little to do with assuring that Grandma has TV beyond the drop dead date and everything to do with adding in features that are not necessary for this application.

Forget the clock speed argument. None of this stuff runs at very fast clock speeds, comparatively speaking. Posted to the HDTV-in-SFBay mailing list this morning was an OTA tuner, the Epson LSDT2. The user actually MEASURED his power consumption in on and standby modes and TA-DA...

11W/1W.

That's for an older-design box with HDTV capability. Use a less complex processor and drop that even more. I'm looking for a picture of it online and not finding one, but I'm guessing you could slice out a couple of features (LED or LCD display, if it has one) and make the spec quite easily.

I guess the CEA wasn't looking too hard. For them to suggest vendors are already paying close attention to power consumption numbers is laughable. Previously it was only about low-cost components. Now someone will actually have to do some intelligent design, and they don't like it. Tough, I say.

If I still had my OTA boxes, I would hook them up and offer you test data of SD vs. HD tuning to show the difference is insignificant.
#18
To Bob Mankin,

I can see from your last message that you missed the whole point of my going to great lengths to show the difference between a COFDM decoder and an 8-VSB decoder. I was NOT trying to promote COFDM, but trying to show that one can NOT say that if a COFDM decoder takes 8 watts, an 8-VSB decoder must take 8 watts. You can say it, but the engineering says, "NO!"


We are on two different playing fields here, because many of your arguments are what I call emotional, non-engineering arguments. For example, in your first post, you stated, regarding the article by Dale Cripps:

The article clearly wants to instill fear while being short on facts.


This attacks the credibility of the author, so I guess you want us to take what he has to say with a grain of salt. However, given that he writes articles for HDTV Magazine, I believe we should give him a bit more credit for knowing what he's talking about than you wanted to give him. Later you attack the credibility of the CEA people:

Just another demonstration of how the lobbyists manage to screw it all up and in the end the taxpayer simply pays for Corporate greed. ...

(other message)
Good info, Bob. But personally I'll take the techie over the lobbyist when it comes to objectivity. ...

(other message)
I don't know who the CEA is fronting for, but I say let 'em eat cake! There are players who will step in to fill the need, one time product or not. ...


These types of attacks are called "Attack the Speaker," and while it sounds good on the surface, it is really a logical fallacy. I can tell you, having worked at Epson America and EIZO Nanao, that NO marketing person would ever make a presentation like this without having all the technical facts checked by the Engineering Department. However, when you suggest they are just lobbyists, this tells me you don't want to even look at the merit of their arguments.

When I responded to your comment "It was noted early in the discussion that only a subset of vendors were complaining" by pointing out that the CEA and NAB (National Association of Broadcasters) are more than just a "subset of vendors," your reply was:

...Understanding that, is there really any mystery why the NAB would be crying that they don't get to make your life more "convenient" with bloated enhanced guide code embedded in the box? Next it'll be someone complaining they can't do VOD with 8W/1W. You have to consider their motivations because when anyone suggests their viewers will simply be cut off if they don't have their way, I tune out immediately because I know it's pure BS.


This attack on the credibility of NAB and the engineering people within NAB is so off the wall, it's funny. Never mind the fact that if 8-VSB converter boxes can NOT be built to fit the 8 watt California requirement, all California Broadcasters are screwed. Heck, no, you'd rather paint them as being part of the "Dark Side Of the Force". Another comment that amazes me is:

Forget the clock speed argument. None of this stuff runs at very fast clock speeds comparatively speaking. ...


Unless you happen to be an EE, which by your comments you do not appear to be, you have no idea what you are talking about. Nor are you really qualified to judge how easy or hard it is to meet the 8 watt requirement.

Test data from one retailer?? A little short on facts there. No engineering or economic reason for producing a higher power device? Oh, how about a CEA member sitting on a bunch of dated, power-inefficient component inventory?? Is that a possibility?


Do you honestly believe that some vendor is going to stockpile a bunch of parts for boxes that won't be sold until 2008? Other than your personal opinion, what proof of this do you have?


I could go on, but I doubt there is anything I could say to change your mind.


Bob Diaz
#19
You got me, Bob. I do, in fact, question the true motivations of the CEA and the NAB in their comments. I guess you buy into their claim that it's just all about making sure the viewer still has TV in 2009? You act like "lobbyist" is a dirty word when it's just an acknowledgment of the facts.

I respect Dale's accomplishments and what he has written. But in this case just as with this forum thread, the key facts are missing and you're swallowing the flawed logic and analysis of not one, but now two lobbyist organizations who are commenting based on an agenda. You could have settled this quickly by producing the circuit so we can run the numbers. You would rather we just take your word for it while ignoring some rather obvious flaws in the supposed "facts" as presented by these people. No comment on the Epson box example I gave, huh?

Got me again, Bob, I'm not an EE. Are you? But I do have some familiarity with circuit designs, having done a few hundred of them in a 20+ year career of PCB layout. I'm the guy who takes the EE's concept and turns it into a physical reality. Somehow I don't think the EE for the design I'm doing at the moment did his power budgeting study for the product by driving down to the local electronics retailer and noting what's on the shelf. I believe real engineering is a bit more technical than that.

Maybe the CEA just has a lower bar set? Remember back in the CEC proceedings how the CEA responded to the Engineer's testimony? They simply stated that COFDM was different than 8VSB and that they couldn't find a product on the shelves today that would meet the spec. That was it! The sum total of their "technical" response.

Somewhere, I don't know where, you got locked onto this idea that the box simply CANNOT be engineered to meet this spec. I welcome and challenge you to find within this joint PR the specific statement that says so:

http://www.nab.org/newsroom/pressrel/Re ... erters.htm

I see claims of "more expensive" and perhaps "limited availability", but nowhere do I see the claim it's technologically impossible. Please point to what in that release I'm missing. And from near the bottom of that PR we have:

"ABOUT THE NAB: The National Association of Broadcasters is a trade association that advocates on behalf of more than 8,300 free, local radio and television stations and also broadcast networks before Congress, the Federal Communications Commission and the Courts."

Don't know about you, but that sorta nails the idea they are a lobbyist organization so I'm confused why you're taking offense to acknowledging the obvious.
#20
To Bob Mankin,

My background is that I received my degree from CSULB in Industrial Technology - Electronics Option. This is similar to being half an EE and half a Manufacturing Engineer. I worked for TRW as an Engineer for 4 1/2 years in the Microelectronics Group, moved on to Epson America for 9 years as a Senior Engineer, worked at EIZO/Nanao (a computer monitor company) as the Technical Manager for 9 years, and have been teaching Electronics and Manufacturing Technology at El Camino College for the last 6 years.

I notice that in your last message, you continue to resort to the name-calling of "lobbyist," as if this somehow declares all their arguments null and void.

...But in this case just as with this forum thread, the key facts are missing and you're swallowing the flawed logic and analysis of not one, but now two lobbyist organizations who are commenting based on an agenda....


In the above quote, you have ASSUMED their logic to be flawed, and you ASSUMED they have some sort of evil hidden agenda. To me it looks like you paint some sort of dark, evil image of these people to justify your position. When you ASSUME all sorts of wild things about these people, we are no longer dealing with any reality here. SEE BELOW:

#1...Oh, how about a CEA member sitting on a bunch of dated, power inefficient component inventory?? Is that a possibility? ...

#2 Bob, adding EPG features to an otherwise unsophisticated box will require more complex software, memory and some form of battery backup to maintain the guide. I believe that is talked about in the testimony. Understanding that, is there really any mystery why the NAB would be crying that they don't get to make your life more "convenient" with bloated enhanced guide code embedded in the box? ...

#3 ... You have to consider their motivations because when anyone suggests their viewers will simply be cut off if they don't have their way, I tune out immediately because I know it's pure BS.


#1: I pointed this out before, but I'll say it again: you assumed that some vendors were sitting on a stockpile of old, obsolete parts. This makes NO sense; why would any vendor hold onto parts that won't be sold until 2008?

#2: You ASSUMED that the driving force behind NAB was to promote power-hungry features in the box. This makes NO sense; NAB (National Association of Broadcasters) is worried that if a low-power box can't be built OR if the cost is excessive, they stand to lose LOTS of viewers in California. Broadcasters make their money from advertising, so a loss in viewers = a loss in income.

#3: You ASSUMED that NAB's motivation could NOT be loss of viewers; you just wrote that off as BS. If you read TV Technology, a trade publication for Broadcast Professionals, on a regular basis (I do), you would see that the potential loss of viewers is a major concern for the Broadcast Industry. Both Cable and Satellite services have diluted their market share. BUT, you just pretend that this could NOT be an issue for them...

Let me spell this out in simple terms: the California Energy Commission's actions are a crap shoot. IF chip designs already existed that fit within the 8 watt power limit, there would be no need for any manufacturer to put up a fight, unless the cost of the chips were excessive. Thus, there is the potential for one of two bad outcomes:

NUMBER 1: Manufacturers find that the 8 watt limit is impossible or impractical to meet. In this case, no converter box could legally be sold in California, and the 1.6 million California households that need the box are screwed big time.

NUMBER 2: Manufacturers cut some sort of deal with the chip makers to make a low-power version of the chips. (This assumes that it can be done, and no one can say for sure.) One thing I learned at TRW, Epson, and EIZO: CUSTOM CHIPS COME AT A PREMIUM PRICE!!! The higher price is unknown at this time. This would mean that the CEC has signed a blank check that a lot of LOW INCOME families in California are going to have to pay.


In answer to your comment about the EPSON OTA tuner, the LSDT2: I did a search using GOOGLE and only got 6 hits. Only two hits were of any use. The first was a Product Support Bulletin dated October 19, 2004, which spoke about the part numbers. This does not prove that the product was shipping back in October 2004 (about 1.5 years ago), because when I worked at Epson, we wrote many such bulletins months BEFORE the product shipped. The second link was a posting in a forum about the Epson Converter dated August 26, 2005 (about 8 months ago). This suggests that this "older box" might have been shipping for about a year, give or take, but I can't be 100% sure.

Anyway, you ASSUME that newer technology, newer being a year give or take, MUST have lower power requirements. True, going from discrete parts to single-chip subsystems saved a lot of power, but Moore's Law only predicts increasing chip density. A lot of people think that things like RAM access time and power requirements should also follow the same curve as Moore's Law, but they don't.

There is also another group that has spoken out against the CEC mandate:

CEA and NAB aren't alone in their concerns. David Donovan, president of the Association for Maximum Service Television, which is collaborating with the NAB on a low-cost A/D converter design, also opposes the mandate.

The regulations, Donovan said, may result "in establishing a different, perhaps more expensive, converter box just for California residents. As a result, California viewers may not receive the full benefits of the federal (subsidy) program. In turn, this may have the unintended consequence of delaying the availability of these converter boxes to the citizens of California. Consumer acceptance is the key to the digital transition, and any delay or impediment to the roll out of digital to analog converter boxes could slow down the digital transition."


http://www.tvtechnology.com/dailynews/one.php?id=3852

Before you call them just a bunch of lobbyists, look at their engineering committee and you see there are a lot of engineers on the committee:

http://www.mstv.org/engmem.html


I live in California, and what "boils my blood" about the California Energy Commission's actions is that they are unwilling to accept converter boxes using 10 to 15 watts; no, they want 8 watts. A savings of 2 to 7 watts per box. For this energy savings, they are willing to gamble with the outcome. At risk are 1.6 million California homes that stand to suffer if things don't work out. Why risk a major screw-up for just 2 to 7 watts per box?
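To put numbers on what's actually at stake statewide, here's a rough back-of-envelope. The 24-hours-a-day duty cycle is my own worst-case assumption, and it's the most generous case for the CEC's side of the argument:

```python
# Statewide savings from shaving 2 to 7 W off each converter box.
# Assumptions (mine, not from the CEC): 1.6 million boxes, each
# powered 24 hours a day, 365 days a year.

BOXES = 1_600_000
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh_saved(watts_saved_per_box):
    """Statewide kWh saved per year for a given per-box wattage reduction."""
    return BOXES * watts_saved_per_box * HOURS_PER_YEAR / 1000

low  = annual_kwh_saved(2)   # 28,032,000 kWh/year
high = annual_kwh_saved(7)   # 98,112,000 kWh/year
print(f"{low:,.0f} to {high:,.0f} kWh per year statewide")
```

Whether that total justifies gambling with 1.6 million households' TV reception is exactly the question.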

IF the CEC were really interested in saving energy, let's start with the real energy hog: light bulbs. A 100 watt light bulb eats 100 watts of energy, but a 25 watt screw-in compact fluorescent light requires only 25 watts and provides the same light. I'm running out of space, but I can show the engineering math proving that fluorescent lights cost less in the long run. I consider 75 watts a heck of a better energy savings than 2 to 7 watts, and EVERYONE saves. BUT it appears that the CEC wants to go after the high-risk, minimal energy savings. Thanks, guys...
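The lifetime-cost math I'm referring to goes roughly like this. The bulb prices, lifetimes, and the $0.12/kWh electricity rate below are my own illustrative assumptions, not official figures:

```python
# Lifetime-cost comparison of a 100 W incandescent vs. a 25 W compact
# fluorescent over 10,000 hours of use. Bulb prices, lifetimes, and the
# electricity rate are my own illustrative assumptions.

RATE = 0.12  # dollars per kWh (assumed)

def lifetime_cost(watts, bulb_price, bulb_life_hrs, total_hrs=10_000):
    """Total cost of replacement bulbs plus electricity over total_hrs."""
    bulbs_needed = -(-total_hrs // bulb_life_hrs)   # ceiling division
    energy_kwh = watts * total_hrs / 1000
    return bulbs_needed * bulb_price + energy_kwh * RATE

incandescent = lifetime_cost(100, 0.50, 1_000)   # about $125
cfl          = lifetime_cost(25, 3.00, 10_000)   # about $33
print(f"incandescent ${incandescent:.2f} vs CFL ${cfl:.2f}")
```

Even with conservative numbers, the fluorescent comes out far ahead over its life, which is my point about where the real savings are.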


Bob Diaz
#21
Bob, it's quite clear that you're passionate on the issue and we will just have to agree to disagree.

You have issues with calling these organizations what they are: lobbyists. You automatically take that as a negative connotation when it doesn't have to be. That's your call. Again and again your posts seem to suggest that these organizations are purely out to protect the interests of the consumer, and I can't take that suggestion with any amount of seriousness.

You can't produce, for the sake of this discussion, a simple schematic of a viable tuner box, so the arguments are heavy in speculation rather than fact. Much like the testimony from the CEA before the State committee.

Yes, I have made some speculation (i.e., possibly obsolete inventory), but it was a casual suggestion, not something I'd hang my hat on. If the issue were that important to me, I'd dig deeper into researching it. I am purely speculating on why the CEA and the NAB are taking the stance that they are, and I don't happen to share your beliefs on the motivations. No big deal.

If you feel as strongly as you seem to on the matter, then write the State and let your voice be heard. Explain to them your qualifications so it carries more weight. If you want to label ALL manufacturers as believing this mandate is impossible, as you did in your last post, then so be it. I don't happen to think that's an accurate statement.

The Epson reference was from an HDTV-related mailing list. I believe I stated such earlier. I believe the thread was even discussing the CEC requirement. You won't find it on the web. It was a real live end user taking real live power consumption measurements. Please don't try to paint this as a made-up fact. I don't own that box and I don't know all the details about it, but common sense should tell you that 8W is quite possible based on that existing piece of hardware alone, possibly even less with a purpose-built design.

I'm done. Have the last word if you must.