
ArkSingularity

macrumors 6502a
Mar 5, 2022
925
1,122
Ehh, I don't really care too much about energy efficiency for its own sake when it comes to gaming. Computers use only a tiny percentage of the power used by a typical household.

Throttling is another issue, and that's where Macs shine. My PC laptop throttles down after about 30 minutes or so and I lose about a third of the framerate. I've never had this issue on my Mac, whether on battery or plugged in.
 

Homy

macrumors 68020
Jan 14, 2006
2,137
1,994
Sweden
ArkSingularity said:
Ehh, I don't really care too much about energy efficiency for its own sake when it comes to gaming. Computers use only a tiny percentage of the power used by a typical household.

Throttling is another issue, and that's where Macs shine. My PC laptop throttles down after about 30 minutes or so and I lose about a third of the framerate. I've never had this issue on my Mac, whether on battery or plugged in.

The 4090 Ti is rumored to have a TDP of up to 900W, and the i9-13900K is rumored to have a performance mode with up to 350W TDP. Combined that's over 1200W. With electricity prices here in Europe reaching $0.5/kWh in the coming winter, it will definitely be something to care about. Even in Hawaii and San Diego it now costs over $0.4/kWh.

Gaming 4 hours a day for a month would consume 144 kWh, costing $72 a month or $864 a year for the CPU/GPU alone. An M1 Ultra draws a quarter of that.
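For anyone checking the maths, a quick back-of-the-envelope sketch (all inputs are the rumored wattages and prices quoted above, not measurements):

```python
# Back-of-the-envelope gaming energy cost, using the figures quoted above.
total_watts = 1200      # rumored 4090 Ti (900 W) + i9-13900K (350 W), rounded down
price_per_kwh = 0.50    # European winter price quoted above, in $/kWh
hours_per_day = 4
days_per_month = 30

kwh_per_month = total_watts / 1000 * hours_per_day * days_per_month  # 144 kWh
monthly_cost = kwh_per_month * price_per_kwh                         # $72
print(f"{kwh_per_month:.0f} kWh/month, ${monthly_cost:.0f}/month, ${monthly_cost * 12:.0f}/year")
```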
 
Last edited:

Homy

macrumors 68020
Jan 14, 2006
2,137
1,994
Sweden
You're assuming that they need that wattage all the time, but you're almost never maxing both out at the same time while gaming. Or, simply put: Save the world, use VSync. 😄

That's the worst case, but it can happen. Even with normal usage, there's no doubt you'll feel the difference in your wallet with these new upcoming cards and CPUs.
 
  • Like
Reactions: Irishman

MandiMac

macrumors 65816
Feb 25, 2012
1,431
882
Homy said:
That's the worst case, but it can happen. Even with normal usage, there's no doubt you'll feel the difference in your wallet with these new upcoming cards and CPUs.
What day-to-day compute load would max out both GPU and CPU at the same time?
And who would buy a 4090 Ti graphics card and an i9-13900K for normal usage like surfing the web or Word, while simultaneously worrying about the financial impact of energy costs? Not to mention that these two items cost thousands of dollars on their own; that's a lot of energy one could use with the hardware already available...
 
  • Like
Reactions: Flint Ironstag

dimme

macrumors 68040
Feb 14, 2007
3,054
28,164
SF, CA
I think this is an important topic. The point here is that Intel/AMD/Nvidia are just using more power to boost performance, while Apple is reinventing the CPU/GPU to increase performance. The energy savings you get with Apple are a nice benefit. Intel/AMD/Nvidia are going to have to rethink their chips sooner or later.
 
  • Like
Reactions: Irishman and Homy

MandiMac

macrumors 65816
Feb 25, 2012
1,431
882
dimme said:
Intel/AMD/Nvidia are going to have to rethink their chips sooner or later.
That is true, but as long as gamers and everyone else are just looking for the best performance, the Big Three will cater to them. For others who favor efficiency, however, Apple is the most attractive option without a shadow of a doubt.
 
  • Like
Reactions: Irishman

ArkSingularity

macrumors 6502a
Mar 5, 2022
925
1,122
Indeed. But strangely upgrading your mobile phone each year isn't....
Don't worry though, at least there will be fewer chargers that end up in the landfill. Now you just have to buy a separate charger when you buy a phone, and that charger will even come with its own packaging that creates even more waste than before. 😂
 

mi7chy

macrumors G4
Oct 24, 2014
10,495
11,155
Why would anyone waste money and electricity on a Mac that can't play games when a portable $400 Steam Deck can, at ~24W total system power consumption? Whoever says you need a 4090 Ti/13900K is talking nonsense.

 

orionquest

Suspended
Mar 16, 2022
871
788
The Great White North
MandiMac said:
What day-to-day compute load would max out both GPU and CPU at the same time?
And who would buy a 4090 Ti graphics card and an i9-13900K for normal usage like surfing the web or Word, while simultaneously worrying about the financial impact of energy costs? Not to mention that these two items cost thousands of dollars on their own; that's a lot of energy one could use with the hardware already available...
Are you new to the forums? There are plenty of examples around here of people buying new hardware "just because", with little to no real usage requirements. I'm sure the same can be said on the other side.....
 

Mr47

Suspended
May 21, 2022
38
55
ArkSingularity said:
Ehh, I don't really care too much about energy efficiency for its own sake when it comes to gaming. Computers use only a tiny percentage of the power used by a typical household.

Throttling is another issue, and that's where Macs shine. My PC laptop throttles down after about 30 minutes or so and I lose about a third of the framerate. I've never had this issue on my Mac, whether on battery or plugged in.
What BS. I play World of Warcraft: Shadowlands on my Studio and also use it for work. It's funny comparing it with my Intel + Nvidia setup and seeing how much power I'm saving (especially now, with the prices here). 35W compared to 250W is definitely noticeable if you play for hours, with the same or better performance. Plus no Windows, what a relief.
It's the way to go, but games on macOS have a long way to go before they're viable, playable, and ported.
 
  • Like
Reactions: dimme

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
Mr47 said:
What BS. I play World of Warcraft: Shadowlands on my Studio and also use it for work. It's funny comparing it with my Intel + Nvidia setup and seeing how much power I'm saving (especially now, with the prices here). 35W compared to 250W is definitely noticeable if you play for hours, with the same or better performance. Plus no Windows, what a relief.
It's the way to go, but games on macOS have a long way to go before they're viable, playable, and ported.
I believe them. I'm not super worried about power consumption when playing games on my PC, other than to make sure I'm getting all the performance I paid for, lol.

If I had the option to play the same games with the same settings and performance on my Mac I would. But so far that isn't really an option.
 
  • Like
Reactions: Flint Ironstag

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,072
2,650
MandiMac said:
What day-to-day compute load would max out both GPU and CPU at the same time?
Running graphical simulations and doing some lightweight machine learning with pre-processing on the CPU can do that. 100% all the time? No, but pretty close.

I have a Xeon Platinum and an RTX 8000 running in my desktop workstation (actually two of those in my office; the Mac Pros are retired), which makes it warm (hot) and cozy, especially on summer days. The AC runs to compensate. Come winter, I can just leave my office windows open and it'll be warm enough to wear a t-shirt. ;)

I'm looking forward to the 4090 Ti mainly for the rumoured 96MB of cache. That should be a real game changer (no pun intended). I'll probably get one or two. Peak power consumption isn't that much of an issue if it's faster. The M1 Ultra might consume less power, but a rusty old 1080 Ti is still almost twice as fast as the 48-core M1 Ultra when training a simple VGG16 on CIFAR-10 (https://sebastianraschka.com/blog/2022/pytorch-m1-gpu.html). So a quick burst to even 900W might still be cheaper if the running time is much longer on a different GPU. It depends on the use case.
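For context, the linked benchmark runs PyTorch on the M1 GPU via the MPS backend (PyTorch 1.12+). A minimal device-selection sketch looks like this; the tiny Linear model is just a stand-in for whatever you'd actually train:

```python
import torch

# Pick the fastest available backend: CUDA on an Nvidia box,
# MPS (Metal) on Apple Silicon, otherwise plain CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

model = torch.nn.Linear(32, 10).to(device)   # stand-in for VGG16
x = torch.randn(8, 32, device=device)
print(device, model(x).shape)
```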
 
  • Like
Reactions: Flint Ironstag

Lihp8270

macrumors 65816
Dec 31, 2016
1,119
1,590
Homy said:
The 4090 Ti is rumored to have a TDP of up to 900W, and the i9-13900K is rumored to have a performance mode with up to 350W TDP. Combined that's over 1200W. With electricity prices here in Europe reaching $0.5/kWh in the coming winter, it will definitely be something to care about. Even in Hawaii and San Diego it now costs over $0.4/kWh.

Gaming 4 hours a day for a month would consume 144 kWh, costing $72 a month or $864 a year for the CPU/GPU alone. An M1 Ultra draws a quarter of that.
Maybe. But I did the maths yesterday as our contract was up.

At £0.68/kWh it would still take over 100 hours of use at 100% power draw to make the money back if I sold my Alienware and spent £2k on Apple Silicon.

If you're in the market for a new machine Apple is a good option, but even with insane energy prices it's not economical to switch.
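A rough breakeven sketch under stated assumptions: the £2k cost and £0.68/kWh price are from the post above, while the 1 kW PC-vs-Mac draw difference is an illustrative guess, not a measured figure:

```python
# Rough breakeven: hours of gaming needed for the energy savings to
# repay the switching cost.
switch_cost_gbp = 2000    # net cost of moving to Apple Silicon (from the post)
price_per_kwh = 0.68      # GBP/kWh (from the post)
power_saving_kw = 1.0     # assumed PC-vs-Mac draw difference while gaming

breakeven_hours = switch_cost_gbp / (price_per_kwh * power_saving_kw)
print(f"{breakeven_hours:.0f} hours")  # ~2941 hours, i.e. about 2 years at 4 h/day
```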
 

Homy

macrumors 68020
Jan 14, 2006
2,137
1,994
Sweden
dimme said:
I think this is an important topic. The point here is that Intel/AMD/Nvidia are just using more power to boost performance, while Apple is reinventing the CPU/GPU to increase performance. The energy savings you get with Apple are a nice benefit. Intel/AMD/Nvidia are going to have to rethink their chips sooner or later.

Yeah, many people seem to be missing the bigger picture. They only look at their personal usage and say they can afford it, that it's worth it, or that they're not going to use the maximum TDP. These CPUs and GPUs still use much more power than Apple Silicon even during normal usage. What happens when hundreds of thousands or millions of people buy and use these power-hungry systems?

Even at half the TDP, 600W, gaming 4 hours a day would consume 72 kWh a month, costing $36 a month or $438 over a 365-day year, for the CPU/GPU of just ONE person. 10 million people would use 8,760 million kWh a year, costing $4,380 million.

In the US the average annual household electricity consumption is 10,715 kWh, so the gamers above would consume the equivalent of 817,546 households. Producing that energy takes around 1,460 onshore wind turbines, or 3.5 average nuclear power plants, every year. Think big, think different!
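Spelling out the arithmetic behind those figures (all inputs are the post's own numbers):

```python
# Scaling the per-gamer figure up, using the numbers from the post.
watts = 600                 # half the combined rumored TDP
hours_per_year = 4 * 365
gamers = 10_000_000
us_household_kwh = 10_715   # average annual US household consumption

kwh_per_gamer = watts / 1000 * hours_per_year   # 876 kWh per gamer per year
total_kwh = kwh_per_gamer * gamers              # 8.76 billion kWh
print(f"{total_kwh / 1e6:,.0f} million kWh/year")
print(f"= {total_kwh / us_household_kwh:,.0f} US households")
```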
 
  • Like
Reactions: dimme

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
Homy said:
Yeah, many people seem to be missing the bigger picture. They only look at their personal usage and say they can afford it, that it's worth it, or that they're not going to use the maximum TDP. These CPUs and GPUs still use much more power than Apple Silicon even during normal usage. What happens when hundreds of thousands or millions of people buy and use these power-hungry systems?

Even at half the TDP, 600W, gaming 4 hours a day would consume 72 kWh a month, costing $36 a month or $438 over a 365-day year, for the CPU/GPU of just ONE person. 10 million people would use 8,760 million kWh a year, costing $4,380 million.

In the US the average annual household electricity consumption is 10,715 kWh, so the gamers above would consume the equivalent of 817,546 households. Producing that energy takes around 1,460 onshore wind turbines, or 3.5 average nuclear power plants, every year. Think big, think different!
I think Apple's hardware is fine for gaming, assuming the games you want to play are on it.

I still feel that Apple Arcade is a poor substitute for Game Pass PC though.
 

R!TTER

macrumors member
Jun 7, 2022
58
44
Mr47 said:
35W compared to 250W is definitely noticeable if you play for hours, with the same or better performance. Plus no Windows, what a relief.
At what settings and resolution, 720p perhaps? Besides, you can also make Intel/AMD/Nvidia products a lot more efficient with a little bit of tuning; even something like limiting power on a GPU can save a lot of energy. The traditional PC would still be less efficient, but you're exaggerating quite a bit here!
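For what it's worth, the power limiting mentioned here is scriptable on Nvidia cards with nvidia-smi; a minimal sketch (the 250 W cap is just an example value):

```python
import subprocess

# The GPU power-limiting mentioned above, scripted via nvidia-smi.
# '-pl' sets the board power limit in watts; it needs admin rights, and the
# value must fall within the range reported by 'nvidia-smi -q -d POWER'.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)  # inspect current limits
subprocess.run(["nvidia-smi", "-pl", "250"], check=True)         # e.g. cap at 250 W
```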
 

mi7chy

macrumors G4
Oct 24, 2014
10,495
11,155
R!TTER said:
At what settings and resolution, 720p perhaps? Besides, you can also make Intel/AMD/Nvidia products a lot more efficient with a little bit of tuning; even something like limiting power on a GPU can save a lot of energy. The traditional PC would still be less efficient, but you're exaggerating quite a bit here!

What games though?

 

kpluck

macrumors regular
Oct 8, 2018
148
475
Sacramento
Homy said:
Yeah, many people seem to be missing the bigger picture. They only look at their personal usage and say they can afford it, that it's worth it, or that they're not going to use the maximum TDP. These CPUs and GPUs still use much more power than Apple Silicon even during normal usage. What happens when hundreds of thousands or millions of people buy and use these power-hungry systems?
Pointless posts on the internet consume far more power. ;)
 
  • Like
Reactions: ArkSingularity

ArkSingularity

macrumors 6502a
Mar 5, 2022
925
1,122
Homy said:
Yeah, many people seem to be missing the bigger picture. They only look at their personal usage and say they can afford it, that it's worth it, or that they're not going to use the maximum TDP. These CPUs and GPUs still use much more power than Apple Silicon even during normal usage. What happens when hundreds of thousands or millions of people buy and use these power-hungry systems?

Even at half the TDP, 600W, gaming 4 hours a day would consume 72 kWh a month, costing $36 a month or $438 over a 365-day year, for the CPU/GPU of just ONE person. 10 million people would use 8,760 million kWh a year, costing $4,380 million.

In the US the average annual household electricity consumption is 10,715 kWh, so the gamers above would consume the equivalent of 817,546 households. Producing that energy takes around 1,460 onshore wind turbines, or 3.5 average nuclear power plants, every year. Think big, think different!
That's 817,546 households out of the roughly 130 million households in the US, or less than 1% of household energy usage alone. And household/consumer usage doesn't even account for the majority of energy consumption to begin with: industry uses about 70% of the world's energy.

Gamers use a tiny, tiny percentage of that. Going after gamers for "using too much energy" is like trying to bail out a tsunami with buckets. It's not practical, and it's not the kind of solution that would make any real difference in the end.
 