sockeatingdryer:
I find it unlikely that a sub-$1000 computer of any kind would have a graphics card that offers pixel-level programming, which is required for the full effect of CoreImage. Dell doesn't. I'm pretty sure HP doesn't. Alienware certainly doesn't.
The point is that CoreImage requires pixel shading to render CoreImage's real-time effects to their fullest. Cards with pixel shaders are expensive.
For that matter, Tiger doesn't REQUIRE such a graphics card to run. Indeed, CoreImage itself doesn't require that graphics card to run; it simply scales down the effects on weaker video cards.
Furthermore, the eMac, starting at $799, has a Radeon 9200 in it. That's VERY GOOD for a sub-$1000 computer. The iBook, starting at $999, offers the Mobility Radeon 9200. Again, that's quite good--most (all?) desktops and laptops in these price ranges on the PC side have integrated graphics.
Apple decided to include a feature in the next operating system that is a serious boon to those whose graphics hardware can handle the load. However, not all current models have that graphics hardware, because it's NOT CHEAP. Why shouldn't those who have the hardware be able to use it? The only other options would have been:
a) Don't include CoreImage, even though there are a lot of folks that have the graphics hardware to run it at full throttle;
b) Include graphics hardware in all new computers that can fully exploit CoreImage, while either raising prices or lowering profit margins.
Does either of those look like a good solution to you?