Hey, I'm not trying to bash you here, but I think you are sidestepping the issue. It is about the potential of the cards: the mobile versions are not going to be more powerful than the regular cards, and if the regular cards are choking (both AMD and NVIDIA), do you really think the mobile versions are going to do better?
No, I don't. My point has largely been that there is uncertainty until we see what these chips actually do in Macs, and specifically what the mobile versions do with Apple's drivers, etc.
Mobile chips have always been worse in my experience, but they can often use a different architecture as well. NVIDIA has done this in the past, rebranding old technology. I imagine the same is possible in reverse: if a mobile version comes out after the original, it could have a refined architecture, but not necessarily higher specs.
If you look at my original post, you'll see that I myself link to comparisons of the desktop-class chips, since mobile comparisons weren't available.
But the reviews, and especially the benchmarks, that were posted are different. These are PC tech sites comparing PC graphics cards on high-end gaming systems most of the time, and that is where it gets unfair to toss the iMac into the mix.
Like I said in the OP, iMacs are essentially "desktop laptops," and if you want to know how an iMac compares to others in its class, it needs to be pitted against laptops and mobile-class chips. Pit an HD 2600 Pro against an 8600 GT and you'll get different results than if you pit a Mobility HD 2600 Pro against an 8600M GT.
So we can look at the relative technical differences in the full desktop-class chips, but lengthy reviews and benchmarks of those chips aren't going to translate over directly. NVIDIA's mobile chips are not up to par with their desktop cousins either, so there could easily be a smaller gap between them and ATI's offerings in the mobile arena.
Therein lies the problem so many people have when they think of the iMac: they act as if these machines are standard mid-sized tower PCs, when they aren't. I tried to make that clear in the OP.
iMacs are aimed at being efficient, quiet, low-power, all-in-one solutions for general computing and media use. They are great at what they are.
If people want to complain, like I said, complain that there is no consumer-level Mac Pro. Don't bash the iMacs just because they don't compete with your huge PC tower.
The truth is that iMacs could play games before this update, and they still can. The iMac has never been a powerhouse for gaming and never will be. People are acting as if these new iMac GPUs are the end of the world, as if this is the worst iMac ever. That isn't true; they should easily outclass the previous offerings (minus that 2400 XT).
I switched to an iMac from my three-year-old gaming PC and have loved it. I even ended up with better gaming performance! But I knew what I was getting when I bought it; I wasn't silly enough to think an all-in-one solution like an iMac was going to be full of desktop-class hardware.
Playing the latest and greatest games, especially at high settings, was always something the gamers with endless pockets got to enjoy. If you didn't want to fork over for a new card every six months, the only games you were likely to run at max settings and max resolution were maybe three years old.
Fair enough. Drivers can make a difference. But in my experience, no amount of driver tweaking is going to make a $79 card perform like a $400 card. At 1024x768 with no AA and no AF, the 2400 runs Oblivion at 6.2 FPS; I think I saw the 2600 Pro at 19 FPS and the 2600 XT at 23 FPS. They barely run CoD2 any better. Do you really expect driver tweaking to add 30-40 FPS to these scores?
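To put rough numbers on that, here's a back-of-the-envelope sketch (just my own arithmetic, assuming the benchmark figures quoted above are accurate):

    # How big a speedup would a driver need to deliver to add
    # 30-40 FPS to the Oblivion scores quoted above?
    baseline_fps = {"HD 2400": 6.2, "HD 2600 Pro": 19.0, "HD 2600 XT": 23.0}
    for card, fps in baseline_fps.items():
        for gain in (30, 40):
            speedup = (fps + gain) / fps
            print(f"{card}: {fps} -> {fps + gain} FPS would need a {speedup:.1f}x speedup")

For the 2600 Pro, going from 19 to 49-59 FPS would take a 2.6x to 3.1x speedup. Driver updates have historically bought single-digit to maybe 20% gains, not multiples, so I think the skepticism is warranted.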
Actually, what I found interesting about that was that even at those low framerates, the HD 2600 Pro was still beating the 7600 GT: Oblivion at 14.1 FPS on the 7600 GT, compared to 16.9 FPS on the HD 2600 Pro. Neither is a good framerate, and it makes me wonder if Oblivion just has a horrible engine!
Of course, I probably won't be running any games beyond 1280x768, which should perform better than the 1280x1024 they are using to benchmark. And these framerates... are they just showing averages? Interesting, nonetheless.
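For what it's worth, here's the quick pixel math behind that resolution point (a rough sketch that assumes framerate scales with the number of pixels rendered, which is only approximately true since games are often CPU- or geometry-bound too):

    # Compare the pixel counts of the two resolutions mentioned above.
    res_mine = (1280, 768)    # what I'd actually run games at
    res_bench = (1280, 1024)  # what the benchmarks used
    pixels_mine = res_mine[0] * res_mine[1]      # 983,040
    pixels_bench = res_bench[0] * res_bench[1]   # 1,310,720
    print(f"1280x768 is {pixels_mine / pixels_bench:.0%} of the pixels of 1280x1024")

That works out to exactly 75%, so a purely fill-rate-limited game would have roughly a quarter more headroom at 1280x768 than at the benchmarked resolution.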