
TheSilencer

macrumors regular
May 27, 2007
111
0
DirectX 10 is Vista-only, yes, but Shader Model 4.0 is available to OpenGL too, via an OGL Shader 4 bridge. So you get the same effects, the same lighting, the same particles and so on in OGL as in DX10, and that is why you want a good DX10 card: not for Vista, but for Shader 4.0.
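In practice those Shader Model 4.0 features show up in OpenGL as extensions rather than a new API version. As a minimal sketch (my own illustration, assuming Python with PyOpenGL and GLUT installed, not anything Apple ships), you could check whether a given driver exposes them like this:

Code:
# Minimal sketch: query the driver's extension string for the SM4.0-level
# OpenGL extensions (GL_EXT_gpu_shader4 / GL_EXT_geometry_shader4).
from OpenGL.GL import glGetString, GL_EXTENSIONS, GL_RENDERER
from OpenGL.GLUT import glutInit, glutCreateWindow

glutInit()
glutCreateWindow(b"sm4-check")  # a GL context must exist before querying strings

print("Renderer:", glGetString(GL_RENDERER).decode())
extensions = glGetString(GL_EXTENSIONS).split()
for ext in (b"GL_EXT_gpu_shader4", b"GL_EXT_geometry_shader4"):
    print(ext.decode(), "->", "yes" if ext in extensions else "no")

If the driver reports those extensions, the card can do the SM4.0-style effects under OpenGL regardless of the operating system.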
 

oingoboingo

macrumors 6502a
Jul 31, 2003
988
0
Sydney, Australia
Hey Haoshiro, how about we quote some reviews? :)

...

Thanks for pointing all these reviews out. Ultimately, I believe real world game benchmarks are what counts. I spent the morning after the iMac announcement reading a lot of these articles, and unfortunately came to the conclusion that as much as I'd like a new iMac on my desk, it didn't make sense to pay ~ AU $2000 for a machine which can't run even older gaming titles at full native panel resolution at acceptable frame rates (when the occasional spurt of gaming was one of the things I would be buying the iMac for).

I haven't followed the graphics card market all that closely for the last couple of years, so it was a real eye-opener to see how poorly ATI's new low and mid-range GPUs were being rated. It's almost like it's ATI's turn to experience some of the pain that nVidia must have felt when it released the FX 5xxx series chips a few years back.
 

fblack

macrumors 6502a
May 16, 2006
528
1
USA
I haven't followed the graphics card market all that closely for the last couple of years, so it was a real eye-opener to see how poorly ATI's new low and mid-range GPUs were being rated. It's almost like it's ATI's turn to experience some of the pain that nVidia must have felt when it released the FX 5xxx series chips a few years back.

I think you're exactly right in your comparison. Several reviews stated that they were late to market with their chips, and also that there's a giant hole between their 2600 and 2900 offerings. It's just how technology works: one company leapfrogs the other, and so on. Not too long ago AMD was taking a bite out of Intel in CPUs, but now the tables have turned.

I don't think Apple had too many choices with regard to new mid-range GPUs, and might just have gotten a better deal from AMD/ATI. But maybe they will surprise us and offer a BTO option for a better card down the road. Being a Mac fan means you always get to hope... :D
 

Mac.Jnr

macrumors member
May 26, 2007
97
0
I don't think the ATI HD 2600 Pro is a mobile GPU. I believe it's the desktop version, but at the end of the day it still sucks balls.
 

DoFoT9

macrumors P6
Jun 11, 2007
17,586
99
London, United Kingdom
thanks so much for this post H-man. it's now perfectly clear to me what this card is about and where it performs best. people (including me) jumped to conclusions based on phony reviews i read and immediately posted bad comments. i now know that it's not a bad card for the newest games, though some older games will not run anywhere near the best they could

once again thanks :)
 

Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
Hey, I'm not trying to bash you here, but I think you are sidestepping the issue. It is about the potential of the cards: the mobile versions are not going to be more powerful than the regular cards, and if the regular cards are choking (both AMD and NVIDIA), do you think the mobile versions are going to do better?

No, I don't. My point has largely been that there is uncertainty until we see what these chips do in Macs specifically, and in particular the mobile versions with Apple's drivers, etc.

Mobile chips have always been worse in my experience, but they can often also be a different architecture. nVidia has done this in the past, rebranding old technology. I imagine the same is possible in reverse as well: if a mobile version comes out after the original, it could have a refined architecture, but not necessarily higher specs.

If you look at my original post, you'll see that I myself link to comparisons that are for the desktop class chips since mobile comparisons weren't available.

But the reviews, and especially the benchmarks, that were posted are different. These are PC tech sites comparing PC graphics cards on high-end gaming systems most of the time, and that is where I felt it gets unfair to toss the iMac into the mix.

Like I said in the OP, iMacs are essentially "desktop laptops", and if you want to know how an iMac compares to others in its class, it needs to be pitted against laptops and mobile-class chips. Pit an HD 2600 Pro against an 8600 GT and you'll get different results than if you pit a Mobility HD 2600 Pro against an 8600M GT.

So we can look at relative technical differences in the full desktop-class chips, but when it comes to lengthy reviews and benchmarks, that isn't going to translate over directly. The mobile chips from nVidia are not up to par with their desktop cousins, so there could easily be a smaller gap between them and ATI's offerings in the mobile arena.

Therein lies the problem so many people have when they think of the iMac: they act as if these machines are standard mid-sized tower PCs, when they aren't. I tried to make that clear in the OP.

iMacs are aimed at being efficient, quiet, low-power, all-in-one solutions for general computing and media use. They are great at what they are.

If people want to complain, like I said, complain there is no consumer level Mac Pro. Don't bash the iMacs just because they don't compete with your huge PC tower.

The truth is, iMacs could play games before they were updated, and they still can. The iMac has never been a powerhouse for gaming and never will be. People are acting as if these new iMac GPUs are the end of the world, as if these are the worst iMacs ever. That isn't true; they should easily outclass the previous offerings (minus that 2400 XT).

I switched to an iMac from my 3-year-old gaming PC and have loved it. I even ended up with better gaming performance! But I knew what I was getting when I bought it; I wasn't silly enough to think an AIO solution like an iMac was going to be full of desktop-class hardware.

Playing the latest and greatest games - especially at high settings - was always something the gamers with endless pockets got to enjoy. If you didn't want to fork over for a new card every 6 months, the only games you were likely to run at max settings + max resolution were 3 years old, maybe.

Fair enough. Drivers can make a difference. But in my experience no amount of driver tweaking is going to make a $79 card perform like a $400 card. At 1024x768, no AA, no AF, the 2400 runs Oblivion at 6.2 FPS; I think I saw the 2600 Pro at 19 FPS and the 2600 XT at 23 FPS. They barely run COD2 better. Do you really expect driver tweaking to add 30-40 FPS to these scores?

Actually, what I found interesting about that was that even at those low framerates, the HD 2600 Pro was still beating out the 7600 GT: Oblivion at 14.1 FPS on the 7600 GT, compared to 16.9 FPS on the HD 2600 Pro. Neither is a good framerate, and it makes me wonder if Oblivion just has a horrible engine!

Of course, I probably won't be running any games beyond 1280x768, which should perform better than the 1280x1024 they are using to benchmark. And these framerates... are they just showing averages? Interesting, nonetheless.
 

takao

macrumors 68040
Dec 25, 2003
3,827
605
Dornbirn (Austria)
Therein lies the problem so many people have when they think of the iMac: they act as if these machines are standard mid-sized tower PCs, when they aren't. I tried to make that clear in the OP.

iMacs are aimed at being efficient, quiet, low-power, all-in-one solutions for general computing and media use. They are great at what they are.

If people want to complain, like I said, complain there is no consumer level Mac Pro. Don't bash the iMacs just because they don't compete with your huge PC tower.

i said it in 2003 and i say it today: apple needs a small consumer-level mac with a user-replaceable graphics card, accessible ram slots and 1-2 hard disks (no extra pci card slots or so), with no built-in screen

is it so difficult to create? looking at how much money and how many devs apple is throwing at gadgets like phones, mp3 players, apple tv etc., i'm still worried that apple ignores a market where money can be easily made with less effort

in 1.5 to 2 years i want to replace my g4 mac mini (with which i'm quite happy except for its crappy performance at anything 3d and its problems with large-resolution LCDs through dvi), and so far i'm rather split... on the one side i would love an apple desktop to replace my mini and my windows computer, but going with the minimal configuration (mac pro, since the mac mini will very likely be discontinued / no replaceable graphics card / single screen) i would start at more than 2290 €, which for me is not feasible, since it would be cheaper to get a macbook with crappy 3d performance plus a gaming PC with better 3d performance, which would again be annoying

seriously, apple's ignorance towards people who happen to want 3d performance is staggering, only topped by their "let's lock them in on one model which is highly priced and total overkill in everything besides the graphics card, which they have to upgrade for hundreds of dollars on top of it" way of marketing
 

Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
I think that's because they are really just going after the casual user and the Pro user.

That mid-range is where a lot of the home PC system builders are. I just don't think it's a market Apple wants to be in. We're talking about people who, by and large, want to buy a cheap Mac they can run Windows on and swap their GPUs out to their hearts' content.

And Mac-compatible GPUs aren't in huge abundance, are they? More support costs, more hassles for Apple. People who want to open their machines and tinker with them, and expect everything to work... then call Apple when they break something (by, say, installing a PC-only GPU).

As a consumer I might love to have something like that, but it just doesn't seem like a great idea on a business level. Those people can go to newegg and tinker all they want; they can have their Windows and their $600 hardcore gaming GPUs.

I just don't think that is what Apple is about, nor who Macs are for. They are trying to make an encompassing experience that is far better for the majority of users (the every day non-geek), and satisfy the needs of professionals at the same time.

Who complains about this "gap" in their lineup anyway? Gamers. A small subset of Gamers who also want to be Mac users in their own way.

As more and more of my friends get older, they seem to be wrapped up in hardware less and less. Several of my hardcore gaming friends are planning on moving to Apple soon, and a low-to-mid-range GPU is something they've decided is worth it... they don't have as much time to sink into games, and a lot less pride about whether they can max out the settings or not.

A couple are like me and have decided to switch the majority of their gaming time to consoles. It's worked out great so far! :D
 

TheSilencer

macrumors regular
May 27, 2007
111
0
Well, the basis of this discussion is the announcement of games for the Mac. Now people ask: for which Mac did they announce it? Only the Pro series and Pro customers? That is the real question behind all this. If they really want games back on the Mac, they have to give something for it IMO, because right now only the Mac Pro is an option for games like Command & Conquer 3 - if it comes out before Duke Nukem Forever :D - and NFS Carbon, not to mention the upcoming NFS Pro Street or other games this year like ET: Quake Wars.
 

takao

macrumors 68040
Dec 25, 2003
3,827
605
Dornbirn (Austria)
well i also like strategy games, and thus consoles-only is out of the question

so apple would have extra hassle because of replaceable graphics cards? they already _have that_ on the mac pro

and what sense does it make that stock hard disks/RAM work but stock graphics cards don't ... more difficult for apple, but i as a customer want to do something like that ... for me it would be the same as os x only working with models from one printer vendor

personally i want to have a mid-range GPU anyway... a mid-range desktop GPU with a normal amount of VRAM for the year 2007
with apple that means 2500+ with a mac pro

_that's_ why there is only a _small_ subset and not a _bigger one_ ... it's not the customers' fault ...

personally i know 4-5 who would buy an apple desktop if they could upgrade graphics cards/harddrives to their needs below a 1200-1500 price point

i don't care about maxing out settings either... i want to run new games "ok" 2-3 years down the line, which you simply can't do with entry-level cards

also, about gamers being a small subset: is that seriously coming from somebody owning a mac? ;) ... believe it or not, macs have a huge draw, _especially_ with nerds... nowhere are you going to find as many macbooks here as at university in a computer science course...

also, gaming folks are willing to spend more money on hardware than the average user would ... guess what: i know more console/pc gamers (3) who additionally bought macs (especially laptops) than casual computer users, of whom _all_ (5) without exception rejected the idea of a mac when i proposed it to them
in fact i know easily 4 guys who dropped the idea of buying a mac just because of the "no good graphics card below 1500" problem

edit: oh, and you talked about "media use": personally i consider games media too, just like movies
 

Barham

macrumors regular
Feb 5, 2004
166
37
Ok, this thread has become a lost cause IMO, but I think it's important that I say this.

We are forgetting something very important here about what Apple has done to the iMac line. Last rev, Apple made it possible for us to have a low, mid, mid-plus, and high-end iMac. It seems to me that Apple has now cut our options down (low-mid-high).

What I'm saying is that we should be comparing the 2400 to the old 17" which had integrated graphics. So, in that light, the new iMacs are a huge upgrade.

Now as for the 24", we have more of an argument.


Apple still has the same problem they always have: too much resolution for the GPUs that are available to the iMac form factor.

What confuses me is that they parade EA around at WWDC and then refuse to give us a "Gamer's Mac". We could run around that topic forever building possible configs, but it seems as though Apple is determined to stick with Consumer and Pro in their desktops, leaving the Prosumer (read: Gamer) out of the equation.
 

aliquis-

macrumors 6502a
May 20, 2007
680
0
Um, aren't both vertex and pixel units "shaders"? And in that case, what do the other graphics cards use, if not shaders?

I doubt it will be much faster than the 8600 GT either.

Also, OS X won't have any use for modern GPU technologies until its OpenGL stack is updated, I guess.

Are you sure that they will offer more shader performance? What is the basis for that claim? I hope it's not only the number of stream processors, because clock rate and what each one does per cycle matter as well.
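For what it's worth, a rough way to see why stream-processor count alone doesn't settle it: peak shader throughput is roughly units x clock x work per unit per clock, and the two designs differ on all three. A back-of-envelope sketch in Python (the figures below are approximate, from memory, for illustration only, not verified spec-sheet values):

Code:
# Back-of-envelope peak shader throughput: units * clock * ops per unit per clock.
# The numbers are rough, from-memory figures, for illustration only.
cards = {
    # name: (shader units, shader clock in MHz, scalar ops per unit per clock)
    "HD 2600 Pro (desktop)": (120,  600, 1),
    "8600 GT (desktop)":     ( 32, 1180, 1),
}

for name, (units, clock_mhz, ops) in cards.items():
    peak_gops = units * clock_mhz * ops / 1000.0  # billions of shader ops/sec
    print(f"{name:22s} ~{peak_gops:.1f} G shader ops/s theoretical peak")

Fewer units at a higher clock can close a lot of the gap on paper, and real frame rates depend on drivers and memory bandwidth on top of that.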
 

aliquis-

macrumors 6502a
May 20, 2007
680
0
Just like DirectX 9 doesn't benefit when run on a DirectX 10 video card, am I right?
I don't know how it works actually; obviously DirectX 9.0 games (Shader Model 3?) work on DX10 cards, so the unified shader model seems to work anyway. But say you wrote a game for DX7 which didn't have support for SM3.0 and all the functions in the graphics card: just because you bought a new card or updated DX to 9.0, you wouldn't get better graphics; you need the game to use those features as well.

And in the case of OS X games and their OpenGL functions, I've read earlier that Apple is slow in updating their OpenGL versions and that the implementation is also quite slow, so I would assume it doesn't make use of all the "tricks" on the most modern graphics cards, and that some things (say lighting) can get better later once they start using the newer, better functions for them.
 

Eidorian

macrumors Penryn
Mar 23, 2005
29,190
386
Indianapolis
I don't know how it works actually; obviously DirectX 9.0 games (Shader Model 3?) work on DX10 cards, so the unified shader model seems to work anyway. But say you wrote a game for DX7 which didn't have support for SM3.0 and all the functions in the graphics card: just because you bought a new card or updated DX to 9.0, you wouldn't get better graphics; you need the game to use those features as well.

And in the case of OS X games and their OpenGL functions, I've read earlier that Apple is slow in updating their OpenGL versions and that the implementation is also quite slow, so I would assume it doesn't make use of all the "tricks" on the most modern graphics cards, and that some things (say lighting) can get better later once they start using the newer, better functions for them.
Actually you can force new tricks onto old games. They're limited to anti-aliasing and anisotropic filtering from what I've seen though.

16x AA & 16x AF make old games look a little better.
 

fblack

macrumors 6502a
May 16, 2006
528
1
USA
No, I don't. My point has largely been that there is uncertainty until we see what these chips do in Macs specifically, and in particular the mobile versions with Apple's drivers, etc.

I'm OK with you saying there's uncertainty and maybe wanting some benchmarks specifically using the new iMac. I think that's perfectly fine. Overall I think this is a good thread, because if people were looking to these Macs as a gaming alternative they may pause and wait a few weeks to see some benchmarks. :)

However, I just want to point out that you did not seem to be operating from uncertainty at the beginning of this thread. I might have mistaken the tone, but it sounded to me like you were positive about what these cards can deliver. Like in your next statement:

Mobile chips have always been worse in my experience, but they can often also be a different architecture. nVidia has done this in the past, rebranding old technology. I imagine the same is possible in reverse as well: if a mobile version comes out after the original, it could have a refined architecture, but not necessarily higher specs.

That sounds to me like hoping for an outcome that could be a long shot. You also stated at the beginning of the thread:
performance is almost guaranteed to improve as both drivers and game software mature.
Uncertainty works both ways: it may be good, it may be bad. Your position seemed to me to be generally positive, and that is why I responded with the reviews: it may not be all positive, and it really depends on what you want to do with your iMac, gaming included.

But the reviews, and especially the benchmarks, that were posted are different. These are PC tech sites comparing PC graphics cards on high-end gaming systems most of the time, and that is where I felt it gets unfair to toss the iMac into the mix.

If the cards don't do well on machines with faster specs, how do you think they will do on an iMac? I think the reviews have value in helping us determine if it's a good investment for casual or hardcore gaming. I think this is different from when people try to match the iMac up against a gaming rig and then say it bites. That is, of course, like you said, unfair; an iMac is more of a family machine, not made solely for gaming.

Like I said in the OP, iMacs are essentially "desktop laptops", and if you want to know how an iMac compares to others in its class, it needs to be pitted against laptops and mobile-class chips. Pit an HD 2600 Pro against an 8600 GT and you'll get different results than if you pit a Mobility HD 2600 Pro against an 8600M GT.

I like your term, and I think it's reasonable to compare an iMac to laptops. I'm not saying compare it to a high-end gaming rig; what I'm saying is that if the 2600 gets 16.9 FPS vs. an 8600's 12 FPS, they both bite, and if we get this from the regular cards, can we expect better out of the mobile versions? Should we not be cautious in our praise?

If people want to complain, like I said, complain there is no consumer level Mac Pro. Don't bash the iMacs just because they don't compete with your huge PC tower.
I am not bashing the iMac. I like the new look; heck, I kind of like the new keyboard (not the BT one). I think it's a good machine, I am just skeptical of the GPU. I've been a long-time Mac user and I love them, but GPU selection hasn't ever been great. I mean, a 7300 on the Mac Pro? Two years of the same Radeon 9700 on the G4 PowerBooks? In contrast, the 7600 GT BTO at the time was one of the better offerings I've seen from Apple.

I wish there was a consumer tower; I'd get in line to buy one.

The truth is, iMacs could play games before they were updated, and they still can. The iMac has never been a powerhouse for gaming and never will be. People are acting as if these new iMac GPUs are the end of the world, as if these are the worst iMacs ever. That isn't true; they should easily outclass the previous offerings (minus that 2400 XT).
Absolutely. It just depends on what you want to play and at what settings. But it would be nice to have choices, no? A BTO option would help a lot. Say, has anybody done a poll on what people consider satisfactory FPS? I know GFLPraxis said in another thread 60 FPS at low-to-medium settings. I'd be curious what most people would say.

Playing the latest and greatest games - especially at high settings - was always something the gamers with endless pockets got to enjoy. If you didn't want to fork over for a new card every 6 months, the only games you were likely to run at max settings + max resolution were 3 years old, maybe.
I absolutely agree with you there. I'm happy running games at 1024x768 as long as I get decent frame rates.

Actually, what I found interesting about that was that even at those low framerates, the HD 2600 Pro was still beating out the 7600 GT: Oblivion at 14.1 FPS on the 7600 GT, compared to 16.9 FPS on the HD 2600 Pro. Neither is a good framerate, and it makes me wonder if Oblivion just has a horrible engine!

LOL. What I find interesting is Apple's games page:
http://www.apple.com/games/hardware/
The iMac is listed for intermediate gamers, and under suggested games you find RollerCoaster Tycoon, The Movies, The Sims 2: Pets, and Ratatouille. Be still my beating heart! :D

Strategy gamers should take heart, though: Civ4 and Star Wars: Empire at War are listed.
 

GFLPraxis

macrumors 604
Mar 17, 2004
7,152
460
I absolutely agree with you there. I'm happy running games at 1024x768 as long as I get decent frame rates.

Exactly; I've always been in the camp willing to turn down some settings if I get a good framerate and don't have to spend ridiculous amounts of money for it. :)

My old CRT was 1600x1200 but I usually gamed at 1024. Everyone called my card horrid on the internet (Geforce FX 5200, 128 MB) but with a 20% overclock and medium settings I could run new releases just fine (like when Star Wars Battlefront came out, I picked it up day one and had medium settings, 1024x768, perfect).

Heck, I game on a GMA950 Macbook now w/2 GB of RAM. Of course, no Oblivion, but I'd be perfectly happy with Oblivion at Medium settings on this new iMac.
 

Eidorian

macrumors Penryn
Mar 23, 2005
29,190
386
Indianapolis
Exactly; I've always been in the camp willing to turn down some settings if I get a good framerate and don't have to spend ridiculous amounts of money for it. :)

My old CRT was 1600x1200 but I usually gamed at 1024. Everyone called my card horrid on the internet (Geforce FX 5200, 128 MB) but with a 20% overclock and medium settings I could run new releases just fine (like when Star Wars Battlefront came out, I picked it up day one and had medium settings, 1024x768, perfect).

Heck, I game on a GMA950 Macbook now w/2 GB of RAM. Of course, no Oblivion, but I'd be perfectly happy with Oblivion at Medium settings on this new iMac.
That would be fine on a CRT but this is LCD land.
 

aliquis-

macrumors 6502a
May 20, 2007
680
0
HD 2600 Pro:
Shader Operations: 72000 Operations/sec
7600GT:
Shader Operations: 6720 Operations/sec
10.71x the Shader Operations (1,071%)
But it's hard to compare cards on specs alone, so only real FPS numbers will tell you the truth. And in this case the unified shaders only work on one pixel at a time, where the old method worked on four components each time, if I remember correctly. That might mean that in comparable shader operations the 7600 GT did 6720*4 = 26,880 operations, which is still only about 1/3 as much, but much more than your original claim anyway.
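Quick sanity check of those numbers in Python (my own sketch; the four-components-per-operation correction is the assumption from the paragraph above, not a documented figure):

Code:
# Re-deriving the comparison above. The scalar-vs-vec4 correction is an
# assumption carried over from the text, not a spec-sheet fact.
hd2600_ops = 72000   # shader ops/sec, as quoted for the HD 2600 Pro
gf7600_ops = 6720    # shader ops/sec, as quoted for the 7600 GT

print(hd2600_ops / gf7600_ops)      # ~10.7x on the raw figures
gf7600_scalar = gf7600_ops * 4      # 26,880 if each old-style op covers 4 components
print(hd2600_ops / gf7600_scalar)   # ~2.7x, i.e. the 7600 GT lands near 1/3, not 1/10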

I guess stuff like:
Pixel Fill Rate: 4480 MPixels/sec
Texture Fill Rate: 6720 MTexels/sec
Vertex Operations: 700 MVertices/sec

Explains why it's hard/stupid to go by specs alone.

Anyway, the card is probably a similar performer to the old X1600, but it supports DX10 and, more importantly for Apple, HD decoding, and that is probably why they switched. (Also it's newer; why would they stay with an old generation of GPUs?)
 

contoursvt

macrumors 6502a
Jul 22, 2005
832
0
Well, mine are the Dell units which I purchased used like 4 years ago, but they still work fantastically (knock wood), and right now I run both at 1600x1200 (on two separate machines). I think I tried 1800x-something once, but man, things started getting kinda small :)

Anyway, they are not as sharp or bright as LCD monitors, but I actually find them pretty easy on the eyes, and the colour quality is amazing, so when I work with my digital photos I have no issues at all.

Oh and of course gaming rocks on them :)


High Five!

I have the exact same monitor! 76 lbs. of joy and stupid high resolutions over VGA.
 

aliquis-

macrumors 6502a
May 20, 2007
680
0
Hm, yes. But the DX10 performance is very low for all mid-range cards; the fun starts with the GeForce 8800 GTS 640MB or ATI HD 2900 XT.

http://www.anandtech.com/video/showdoc.aspx?i=3029&p=3
The 8600 GT is better across the board in both DX9 and DX10 there.

I think those are decent framerates anyway, more than I expected, and it makes me a little more confident that SC2 might really be playable. I guess they had more VRAM than 128MB though; **** you Apple one more time ;D
 