Using Nvidia and Intel hardware seems like a crime at the moment
Ehh, I don't really care too much about energy efficiency for its own sake when it comes to gaming. Computers use only a tiny percentage of the power used by a typical household.
Throttling is another issue, and that's where Macs shine. My PC laptop throttles down after about 30 minutes and I lose about 1/3 of the framerate. Never had this issue on my Mac, whether on battery or plugged in.
You're assuming that they need that wattage all the time, but you're almost never maxing both out at the same time while gaming. Or, simply put: Save the world, use VSync. 😄
That's the worst case, but it can happen. Even with normal usage there's no doubt you'll feel the difference in your wallet with these new upcoming cards and CPUs.

What day-to-day compute load would max out both the GPU and the CPU at the same time?
That is true, but as long as gamers and everyone else are just looking for the best performance, the Big Three will cater to them. For those who favor efficiency, though, Apple is without a shadow of a doubt the most attractive option.
Driving a car seems like a crime to the planet. So does taking a cruise, or a flight to anywhere. Eating meat every day of the week is killing the planet.
Sadly, with its smaller game library, using Apple hardware to game isn't 100% bulletproof.
Indeed. But strangely upgrading your mobile phone each year isn't...
...and so on. ;-)
Don't worry though, at least there will be fewer chargers ending up in the landfill. Now you just have to buy a separate charger when you buy a phone, and that charger even comes with its own packaging, creating more waste than before. 😂
Are you new to the forums? There are plenty of examples around here of people buying new hardware "just because", with little to no real usage requirement. I'm sure the same can be said on the other side...
And who would buy a 4090 Ti graphics card and an i9-13900K for normal usage like surfing the web or Word while simultaneously worrying about the financial impact of energy costs? Let alone that these two items would cost thousands of dollars on their own; that's a lot of energy one could waste with the hardware already available...
What BS. I play World of Warcraft: Shadowlands on my Studio and also use it for work. It's funny comparing it with my Intel + Nvidia machine and seeing the power I'm saving (especially with the prices here now). 35 W compared to 250 W is definitely noticeable if you play for hours and get the same or better performance. Plus no Windows, what a relief.
I believe them. I'm not super worried about power consumption when playing games on my PC, other than making sure I'm getting all the performance I paid for, lol.
It is the way to go, but games on macOS have a long way to go before they're viable, playable, and properly ported.
Running graphical simulations and doing some lightweight machine learning with pre-processing on the CPU can do that. 100% all the time? No, but pretty close.
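To make that concrete, here's a hypothetical sketch of the pattern: worker processes keep every CPU core busy with pre-processing while the main process hammers the GPU with matrix math. PyTorch and the FFT stand-in are purely illustrative choices, not anything from the post.

```python
# Sketch: saturate the CPU cores with pre-processing while the GPU runs compute.
import multiprocessing as mp

import numpy as np
import torch

def cpu_preprocess(_) -> float:
    # Stand-in for real pre-processing work; keeps one core busy for a while.
    x = np.random.rand(2048, 2048)
    return float(np.fft.fft2(x).real.sum())

def gpu_work(steps: int = 1000) -> None:
    a = torch.randn(4096, 4096, device="cuda")
    b = torch.randn(4096, 4096, device="cuda")
    for _ in range(steps):   # back-to-back matmuls keep the GPU near 100%
        a = a @ b
        a = a / a.norm()     # renormalise so values stay finite
    torch.cuda.synchronize() # wait for the queued GPU work to finish

if __name__ == "__main__":
    with mp.Pool(mp.cpu_count()) as pool:  # one worker per CPU core
        pending = pool.map_async(cpu_preprocess, range(256))
        gpu_work()                         # GPU stays busy in the main process
        pending.wait()
```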
Maybe. But I did the maths yesterday as our contract was up. The 4090 Ti is rumored to have a TDP of up to 900 W, and the i9-13900K is rumored to have a performance mode with up to 350 W TDP. Combined, that's over 1,200 W. With electricity prices here in Europe reaching $0.50/kWh in the coming winter, it will definitely be something to care about. Even in Hawaii and San Diego it now costs over $0.40/kWh.

Gaming 4 hours a day would consume 144 kWh a month, costing $72, or $864 a year, for just the CPU/GPU. An M1 Ultra draws a quarter of that.
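To make that arithmetic explicit, here's a minimal sketch of the same back-of-the-envelope calculation. The 1,200 W combined draw, 4 h/day of gaming, and $0.50/kWh price are the post's own figures; everything else is just unit conversion.

```python
# Back-of-the-envelope energy cost: watts -> kWh -> dollars.

def monthly_cost(watts: float, hours_per_day: float = 4,
                 price_per_kwh: float = 0.50, days: int = 30):
    """Return (kWh used, cost in dollars) for one month of gaming."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh, kwh * price_per_kwh

kwh, cost = monthly_cost(1200)  # rumored worst-case CPU+GPU draw
print(f"PC worst case: {kwh:.0f} kWh, ${cost:.0f}/month, ${cost * 12:.0f}/year")
# -> 144 kWh, $72/month, $864/year

kwh, cost = monthly_cost(1200 / 4)  # "An M1 Ultra draws a quarter of that."
print(f"M1 Ultra:      {kwh:.0f} kWh, ${cost:.0f}/month, ${cost * 12:.0f}/year")
# -> 36 kWh, $18/month, $216/year
```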
I think this is an important topic. The point here is that Intel/AMD/Nvidia are just using more power to boost performance, while Apple is reinventing the CPU/GPU to increase performance; the energy savings you get with Apple are a nice benefit. Intel/AMD/Nvidia are going to have to rethink their chips sooner or later.
Yeah, many people seem to be missing the bigger picture. They only look at their personal usage and say they can afford it, it's worth it, or they're not going to use maximum TDP. These CPUs and GPUs still use much more power than Apple Silicon even during normal usage. What happens when hundreds of thousands or millions of people buy and use these power-hungry systems?

Even with half the TDP, 600 W, gaming 4 hours a day would consume 72 kWh a month, costing $36, or $438 a year, for just the CPU/GPU of ONE person. 10 million people would use 8,760 million kWh yearly, costing $4,380 million a year.

In the US, the average annual household electricity consumption is 10,715 kWh, so the gamers above would consume the equivalent of 817,546 households. It takes around 1,460 onshore wind turbines, or 3.5 average nuclear power plants, to produce that much energy every year. Think big, think different!

I think Apple's hardware is fine for gaming, assuming the games you want to play are on it.
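For anyone who wants to sanity-check the scaled-up numbers in the "bigger picture" post above, here's a rough sketch. The 600 W draw, 4 h/day, $0.50/kWh price, and 10,715 kWh household average are the post's figures; the per-turbine and per-plant yearly outputs are simply the values implied by its 1,460-turbine and 3.5-plant claims, not independent data.

```python
# Scaling one gamer's draw up to 10 million gamers, using the post's own figures.

WATTS = 600                       # assumed average CPU+GPU draw while gaming
HOURS_PER_DAY = 4
PRICE_PER_KWH = 0.50              # $/kWh
GAMERS = 10_000_000
HOUSEHOLD_KWH_YEAR = 10_715       # US average annual household consumption
TURBINE_KWH_YEAR = 6_000_000      # per-turbine output implied by the post
PLANT_KWH_YEAR = 2_500_000_000    # per-plant output implied by the post

kwh_per_person_year = WATTS / 1000 * HOURS_PER_DAY * 365  # ~876 kWh
total_kwh = kwh_per_person_year * GAMERS                  # ~8,760 million kWh

print(f"Per person: {kwh_per_person_year:.0f} kWh/yr, "
      f"${kwh_per_person_year * PRICE_PER_KWH:.0f}/yr")
print(f"All gamers: {total_kwh / 1e6:,.0f} million kWh/yr, "
      f"${total_kwh * PRICE_PER_KWH / 1e6:,.0f} million/yr")
print(f"Households:     {total_kwh / HOUSEHOLD_KWH_YEAR:,.0f}")
print(f"Wind turbines:  {total_kwh / TURBINE_KWH_YEAR:,.0f}")
print(f"Nuclear plants: {total_kwh / PLANT_KWH_YEAR:.1f}")
# Reproduces the post's figures, give or take rounding.
```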
At what settings and resolution, 720p perhaps? Besides, you can also make Intel/AMD/Nvidia products a lot more efficient with a little bit of tuning; even something like limiting power on a GPU can save a lot of energy. The traditional PC would still be less efficient, but you're exaggerating quite a bit here!
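As a concrete example of that tuning: on Nvidia cards the board power limit can be lowered with `nvidia-smi -pl <watts>` (administrator rights required, and only within the card's allowed range), usually at a small performance cost. Here's a quick sketch of what such a cap is worth, with the 320 W stock and 250 W capped figures chosen purely for illustration:

```python
# Illustrative savings from capping GPU board power, e.g. via `nvidia-smi -pl 250`.

STOCK_W, CAPPED_W = 320, 250      # made-up but plausible high-end GPU figures
HOURS_PER_DAY, PRICE_PER_KWH = 4, 0.50

saved_kwh_month = (STOCK_W - CAPPED_W) / 1000 * HOURS_PER_DAY * 30
print(f"Saved: {saved_kwh_month:.1f} kWh/month, "
      f"${saved_kwh_month * PRICE_PER_KWH:.2f}/month, "
      f"${saved_kwh_month * PRICE_PER_KWH * 12:.0f}/year")
# -> 8.4 kWh/month, $4.20/month, $50/year for the GPU alone
```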
Unfortunately, with Mac gaming you live in a mansion, but you're locked in the bathroom, and every so often someone chops up some gaming food and slides it to you under the door…
Pointless posts on the internet consume far more power.
That's 817,546 households out of hundreds of millions of households in the US, or less than 1% of household energy usage alone. And household/consumer usage doesn't even account for the majority of energy consumption to begin with: industry uses about 70% of the world's energy.