
GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
Which would be better served by going with the Mac Studio right?
Depends on what you're doing and what software you're using. For most people working at home on small and mid-size projects, sure. Others might need x86 compatibility (Rosetta 2 can't emulate all x86/Intel instructions), need to dual-boot into Windows, or need more RAM than 128GB. You're also missing out on the installation of PCIe cards for additional storage, I/O and so on, if needed. No dual 10Gbit Ethernet, and no ECC memory on the Studio (ECC is a killer feature). Also, the NVMe drives in non-MacBook Macs are bound to cause trouble right now: https://threadreaderapp.com/thread/1494213855387734019.html (another killer)

Or it could simply be for compute work, but that could probably be done on some cloud server running Nvidia GPUs.

The Mac Studio is not a Mac Pro replacement, it's an alternative to low-end Mac Pro configurations. It caps out at around $8k vs. $50k+ for a fully loaded Mac Pro. Those are two completely different machines with different use cases. Apple has yet to introduce its full Mac Pro replacement.
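On the Rosetta 2 point: a process can at least detect that it's being translated, via the sysctl.proc_translated flag Apple documents for exactly this. A rough Swift sketch; note it won't tell you which instructions are missing (an AVX binary simply won't run), only whether translation is active:

```swift
import Darwin

// A sketch: ask the kernel whether this process is being translated.
// "sysctl.proc_translated" returns 1 under Rosetta 2, 0 when native, and
// the call fails on systems that have no concept of translation.
func isRunningUnderRosetta() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return nil  // sysctl unavailable: no Rosetta on this system
    }
    return translated == 1
}

if let translated = isRunningUnderRosetta() {
    print(translated ? "x86_64 binary running under Rosetta 2" : "running natively")
} else {
    print("translation status unavailable")
}
```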
 

diamond.g

macrumors G4
Mar 20, 2007
11,191
2,496
OBX
Depends on what you're doing and what software you're using. For most people working at home on small and mid-size projects, sure. ...
Is it enough of a performance boost over what was offered before for Apple to bother writing drivers for compatibility, knowing the current Mac Pro is going to be the last/only Mac to support the hardware?
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
I’m not sure of the source for your numbers, so I decided to do a bit of digging:
The numbers vary wildly, but mobile usually comes in at around 2x the PC segment, typically somewhere in the mid-to-high 50% range of the total market.

I dunno where you got such a low revenue figure for the PC market.

Either way, the reports (which are all either paywalled or have ridiculously long links) on Statista, Businesswire, and others typically point to mobile as the dominant segment, with larger growth as well.
 
  • Like
Reactions: Irishman

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
I think people miss a fundamental difference between “Apple should do something to get more games” and “I would like more games on Mac”.

Simply put, there’s inherent bias in assuming one’s hobby of desktop video games is a big deal. And assuming that by not catering to one’s hobby, Apple is making a big mistake.

I see no problem in wanting more games to come to Mac, but one shouldn’t assume that it’s imperative to the platform.
 
  • Like
Reactions: GrumpyCoder

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
Is it enough of a performance boost over what was offered before for Apple to bother writing drivers for compatibility, knowing the current Mac Pro is going to be the last/only Mac to support the hardware?
With the W6800X Duo and W6900X? Sure, some people need the performance. As far as drivers go, that's a question for AMD, not Apple; Apple just supports their work.
 

diamond.g

macrumors G4
Mar 20, 2007
11,191
2,496
OBX
With the W6800X Duo and W6900X? Sure, some people need the performance. As far as drivers go, that's a question for AMD, not Apple; Apple just supports their work.
Wait, AMD writes macOS drivers? I thought Apple was handling it in house. I wonder why they don't do monthly updates to the drivers like they do in Linux/Windows (mainly Windows, lol).
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
Wait, AMD writes macOS drivers? I thought Apple was handling it in house. I wonder why they don't do monthly updates to the drivers like they do in Linux/Windows (mainly Windows, lol).
AMD does "base drivers" and Apple integrates into macOS, they're working together. You can't download a driver directly from AMD, it comes with macOS, hence the lack of frequent updates. I think you can download bootcamp drivers from AMD directly, but it's been ages since I've last used bootcamp, so not sure.
 
  • Like
Reactions: Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,191
2,496
OBX
AMD does "base drivers" and Apple integrates into macOS, they're working together. You can't download a driver directly from AMD, it comes with macOS, hence the lack of frequent updates. I think you can download bootcamp drivers from AMD directly, but it's been ages since I've last used bootcamp, so not sure.
I wonder why Apple doesn't allow the vendor to do more frequent updates.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
I wonder why Apple doesn't allow the vendor to do more frequent updates.
Quality control... if there is such a thing with Apple these days. ;)

Back in the day Apple used to write their own drivers for Nvidia cards. They also had their own implementation of Java. Updates arrived with new versions of OS X. At some point they switched over to having Nvidia and Sun (now Oracle) do the work. Things got bad, more issues, more bugs, more people complaining. So I guess Apple having control over AMD is a good thing for them. Switching from Intel/AMD to AS allows them to do the same thing again (at least for the GPU).
 

Imhotep397

macrumors 6502
Jul 22, 2002
350
37
I don’t understand why they bothered releasing the W6000 series at all on the Mac Pro.
What's even the purpose of all the Thunderbolt ports on the Mac Studio if the intent is for it to be a completely closed appliance? Are they there essentially for cosmetic purposes?

It does seem fairly foolish to have built the Mac Pro and the W6000 module if they were going to do an about-face on that path by going completely counter to standards and deprecating users' machines a few short months or years later, and then staying as stubborn as possible by thumbing their nose at the Mac 3D community and prospective 3D Mac users. The Mac Pros were predictably going to sell poorly given the price point and the continued lack of Nvidia hardware, and anyone working in any area of media knew they were extreme overkill for video and photo work. It seems like it was all a pointless sham rather than a reorienting and broadening of use cases for Macs in other neglected areas like 3D.

Apple forming a strategic alliance with Sony/PlayStation would be ideal, as I doubt Apple has any high-level 3D tool or content developers on the macOS team. Tim Cook also doesn't really seem technically inclined, so he's not an advocate for any of it. If you work in 3D in any area every day, you're always left wondering whether the reason Apple hasn't made significant headway with 3D in any category, relative to 3D on Windows, is that they just don't have the people for it.

If a strategic alliance with Sony were created, Apple could have Mark Cerny contribute to Mac development and help craft a more inclusive set of graphics APIs, ones that would help content that is already made, and currently dependent on other industry-standard hardware, get to the Mac, and that hardware run on the Mac via Thunderbolt-based eGPUs. This move alone would quickly re-engage third-party software that's been abandoned, getting it actually ported to the Mac, while still maintaining macOS's future goals outside of 3D and building towards complete Metal-native operation. Apple could also hire Richard Marks from Google, and he could contribute to hardware development in a number of areas, including Apple wearable tech in general.

For the people who claim their only concern about "AAA" games being on the Mac is that they are risky and costly: PlayStation is the number one brand in gaming. A strategic alliance with Sony/PlayStation would increase opportunities for more hardware sales and mitigate any potential risk that might be incurred, since PlayStation finances all its own projects. If there were any losses (which wouldn't happen), the alliance could be terminated, the parties would go their separate ways, and minimal money would be lost.

Apple could also hire Shawn Layden, Shannon Studstill and Alan Becker as "old hands" in the gaming business who could push 3D content and OS development at Apple past the low-hanging-fruit Apple Arcade level, which would be informative for games development as well as for general development enhancements in other 3D areas at Apple.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
What's even the purpose of all the Thunderbolt ports on the Mac Studio if the intent is for it to be a completely closed appliance? Are they there essentially for cosmetic purposes?
External storage and the Thunderbolt monitor they just released; please pay attention.
It does seem fairly foolish to have built the Mac Pro and the W6000 module if they were going to do an about-face on that path by going completely counter to standards and deprecating users' machines a few short months or years later
They haven’t deprecated support for the 7,1?
and then staying as stubborn as possible by thumbing their nose at the Mac 3D community and prospective 3D Mac users.
They’ve assigned people to contribute to the Blender project and routinely show off Maya in their keynotes.
The Mac Pros were predictably going to sell poorly given the price point and the continued lack of Nvidia hardware, and anyone working in any area of media knew they were extreme overkill for video and photo work.
They’re popular in audio work, especially with room for AVID cards.
It seems like it was all a pointless sham rather than a reorienting and broadening of use cases for Macs in other neglected areas like 3D.
See above.
Apple forming a strategic alliance with Sony/PlayStation would be ideal, as I doubt Apple has any high-level 3D tool or content developers on the macOS team.
So the people making the Metal frameworks are idiots?
Tim Cook also doesn't really seem technically inclined, so he's not an advocate for any of it. If you work in 3D in any area every day, you're always left wondering whether the reason Apple hasn't made significant headway with 3D in any category, relative to 3D on Windows, is that they just don't have the people for it.
Or that in making OptiX, Nvidia has formed a moat around the segment.
If a strategic alliance with Sony were created, Apple could have Mark Cerny contribute to Mac development and help craft a more inclusive set of graphics APIs, ones that would help content that is already made, and currently dependent on other industry-standard hardware, get to the Mac, and that hardware run on the Mac via Thunderbolt-based eGPUs.
So you suggest making an entirely new API? That’s beyond asinine. Moreover, the AMD GPUs in the PlayStation share very few hardware similarities with Apple Silicon GPUs. There’s no expertise to be found.

And even then, AMD doesn’t have any sort of performance lead in 3D or gaming. Not only does this idea lead nowhere, but a common source of frustration for Mac users was the lack of Nvidia options!
This move alone would quickly re-engage third-party software that's been abandoned, getting it actually ported to the Mac, while still maintaining macOS's future goals outside of 3D and building towards complete Metal-native operation. Apple could also hire Richard Marks from Google, and he could contribute to hardware development in a number of areas, including Apple wearable tech in general.
Ah yes, Google Glass, that huge wearable success. Has Google had that much success in hardware? I know I haven’t seen anything that convinces me they have a leg up on Apple.
For the people who claim their only concern about "AAA" games being on the Mac is that they are risky and costly: PlayStation is the number one brand in gaming.
Microsoft begs to differ.
A strategic alliance with Sony/PlayStation would increase opportunities for more hardware sales and mitigate any potential risk that might be incurred, since PlayStation finances all its own projects.
Give a single good reason Sony would do this.
If there were any losses (which wouldn't happen), the alliance could be terminated, the parties would go their separate ways, and minimal money would be lost.
Lol, “minimal”. You clearly have no knowledge of these costs.
Apple could also hire Shawn Layden, Shannon Studstill and Alan Becker as "old hands" in the gaming business who could push 3D content and OS development at Apple past the low-hanging-fruit Apple Arcade level, which would be informative for games development as well as for general development enhancements in other 3D areas at Apple.
See above.

It’s patently obvious you need to do some critical thinking before throwing names around and writing paragraphs that have no basis in reality. This reads more like a disorganized wishlist for Santa Claus than an actual grounded strategy for getting games to the Mac.
 
  • Like
Reactions: GrumpyCoder

Imhotep397

macrumors 6502
Jul 22, 2002
350
37
I’m not sure of the source ...
Just to be clear, I never questioned the money coming from the mobile sector. ALL my numbers came from Statista. I question the sustainability of the market given the extremely low number of contributors to it relative to the whole market (only 10%).

More specifically, I question the long-term viability of that revenue stream, considering Apple won't be getting its customary 30% take from that minuscule group, as litigation is forcing Apple to open secondary purchase sales to outside groups it can't collect all that money from.

Above all of that, I question the ethics of mobile game sales in general, as the games that do actually make money are designed to target and psychologically manipulate people with obsessive/compulsive tendencies and disposable income, so-called "whales", to spend, spend, spend.

Apple has always presented itself as an ethical company, both directly and indirectly. The nature of the mobile games business is going to be further scrutinized publicly at some point, as to whether it should be regulated under the rules of gambling proper, and gambling has always existed in an ethical grey area.

If Apple were to be drawn into any legal battles involving the mobile games business and gambling directly, it could hurt the company's reputation. Considering how ruthless Apple was about not wanting to foster competition of any kind in the Epic vs. Apple litigation in the mobile games space, that stance could be viewed as a green light to all manner of psychological manipulation for the purpose of generating profit.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Just to be clear, I never questioned the money coming from the mobile sector. ALL my numbers came from Statista. I question the sustainability of the market given the extremely low number of contributors to it relative to the whole market (only 10%).

More specifically, I question the long-term viability of that revenue stream, considering Apple won't be getting its customary 30% take from that minuscule group, as litigation is forcing Apple to open secondary purchase sales to outside groups it can't collect all that money from.
That’s a possibility, but I’d still bet on that remaining the lion’s share of the market, and it remaining a bigger source of revenue.
Above all of that, I question the ethics of mobile game sales in general, as the games that do actually make money are designed to target and psychologically manipulate people with obsessive/compulsive tendencies and disposable income, so-called "whales", to spend, spend, spend.
Apparently you haven’t been paying attention to the desktop game market, which adopted those strategies a long time ago.
Apple has always presented itself as an ethical company, both directly and indirectly. The nature of the mobile games business is going to be further scrutinized publicly at some point, as to whether it should be regulated under the rules of gambling proper, and gambling has always existed in an ethical grey area.
If this ever comes to fruition it’ll be a shakeup of the entire market. And the entire gaming market is ethically bankrupt. See: sexual harassment lawsuits, regular employee burnout, and other fun things like day-one paywalled content, which was a big deal back in the day but which gamers were simply conditioned to accept!

Playing in that market at all is just whaling; luckily, gamers are stupid.
If Apple were to be drawn into any legal battles involving the mobile games business and gambling directly, it could hurt the company's reputation. Considering how ruthless Apple was about not wanting to foster competition of any kind in the Epic vs. Apple litigation in the mobile games space, that stance could be viewed as a green light to all manner of psychological manipulation for the purpose of generating profit.
Am I reading this correctly and you’re taking Epic’s side?
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
Unless a person needs to run Windows (and not even then really) the value to performance to price proposition with what the Mac can run well isn't there to even consider buying a Mac Pro right now, with the Mac Studio available.
The Studio is not a Mac Pro replacement.
1.) Memory is limited to 128GB, which is nothing compared to the Mac Pro. It is more than enough for what most people do at home and for the average YouTuber making videos. It is nowhere near enough for professional users in the industry working on $100M+ projects.

2.) No ECC memory in the Studio, which is a deal killer. Search for that old story where Apple sold non-ECC professional machines for compute jobs running for weeks: hundreds of machines kept crashing and had to be replaced with ECC machines later. No ECC in the enterprise world is a no-go.

3.) Data integrity part 2: Read this: https://threadreaderapp.com/thread/1494213855387734019.html
This is another of those "look at unlabelled Apple marketing graphs" stories; they use specific benchmarks to push their hardware while ignoring the bigger picture, because that would look bad for Apple.

4.) The Studio can't be easily upgraded. Thunderbolt ports are for "home users" and do their job, but how can one add additional NVMe controllers, low-latency I/O, HD-SDI, etc.?

5.) Raw power: as usual, Apple picked specific benchmarks to let the Studio shine, while for other things it is much slower than a Mac Pro. You don't have to go as far as Hollywood dub stages/studios; right here in this forum is enough: https://forums.macrumors.com/threads/mkbhd-switching-to-mac-studio-from-mac-pro-but-why.2340919/
So a W6800X Duo equipped Mac Pro is 60% faster than a fully maxed-out Studio Ultra when working with R3D RAW. If one is making money with such a machine, I'd sure know what I'd buy right now. And I'd also know what I'd buy as a hobbyist doing some work at home, even if it makes me a little money.

The full capacity of the Mac Pro exceeded its usefulness in audio and video production using the applications it was best suited to run, for most well-equipped studios. The Mac Pro has pretty much been the Bizarro of creative workstations.
The RED/ARRI community begs to differ, as the link above to the forum post also suggests.
Nope, never stated that. Metal, macOS and Apple Silicon can be expanded/extended to fully interoperate with Nvidia and/or AMD GPUs at any time.
Uhm, no. Maybe start here: https://www.raywenderlich.com/books/metal-by-tutorials and then compare to other APIs and hardware. You can always add translation layers, though at the cost of performance. Also, Apple isn't interested in this, and the hardware doesn't support it.
 

Imhotep397

macrumors 6502
Jul 22, 2002
350
37
The Studio is not a Mac Pro replacement.
1.) Memory is limited to 128GB, which is nothing compared to the Mac Pro. It is more than enough for what most people do at home and for the average YouTuber making videos. It is nowhere near enough for professional users in the industry working on $100M+ projects.

Case by case. (This is going to be a recurring motif.)
On the content creation side there are a lot of different roles and a lot of different levels of computational requirements, so the Mac Studio can find a lot of desk homes. Sure, there are circumstances where more memory is preferable, but the combination of SSD optimization and Apple's virtual RAM scheme reduces the "absolutely gotta have it" cases by a lot.

Outside of final frame rendering in a server room, and even then, when you eliminate the server room because you're networked to a NAS or have multiple modern NAS clusters, a lot of those data integrity concerns are diminished in intervals after work is handed off. In work groups of 100 or less (which is going to be most of them) there are a lot of good use cases for these machines, and there would be more if they had access to CUDA-core GPUs. The computers are also replaced every 3-5 years or less, so you don't often get involved with breakdowns due to physical degradation.

In larger operations it's just not going to be a server room "Run by Mac Pros" scenario.

2.) No ECC memory in the Studio, which is a deal killer. Search for that old story where Apple sold non-ECC professional machines for compute jobs running for weeks: hundreds of machines kept crashing and had to be replaced with ECC machines later. No ECC in the enterprise world is a no-go.

Another case-by-case situation. On the content creation side where Macs are concerned (and as long as you avoid Gen 1 of anything Apple), Apple tends to source dramatically higher-quality memory and drives to begin with (there are always exceptions with certain models, years, etc.). Generally you don't run into the kinds of constant breakdowns due to bit errors and other hardware-level faults that you get on the PC side, where historically QA for RAM and other hardware components has been either non-existent or as thin as the shoddy marketing.

Once those kinds of rampant breakdowns start happening, like what you talked about on the PC side of things (knowing the history there), then it's time to shell out for ECC RAM machines en masse. Macs generally wouldn't run well for long on compatible retail RAM intended for PCs, either.

For enterprise, where there are high levels of data storage and retrieval and where they currently have to be on Windows or Linux or both, I can see where they must use ECC. These are not, never have been, and probably never will be Mac Pro scenarios.

4.) The Studio can't be easily upgraded. Thunderbolt ports are for "home users" and do their job, but how can one add additional NVMe controllers, low-latency I/O, HD-SDI, etc.?
I go back to Magma PCIe-to-PCIe enclosures, and I was actually ecstatic about their early adoption of Thunderbolt. Thunderbolt is very robust as a pro solution and a modernization of FireWire. It's really Intel's handiwork of artificially impeding Thunderbolt's adoption, after Apple turned it over to them, that has kept it from being widely adopted as a replacement in a lot of data transfer cases. SDI and XLR, for instance, are stone-age, jumbo-sized "pro" ports that should have been phased out for locking Thunderbolt ports and adapters years ago, but that's a different discussion for a different time.

Faster CPUs/GPUs and fast NAS clusters that got easier to maintain, along with highly optimized OSes running on SSDs, made those Magma enclosures obsolete, but Thunderbolt has continued to progress well and is far beyond anything most home customers need for external peripherals.

5.) Raw power: as usual, Apple picked specific benchmarks to let the Studio shine, while for other things it is much slower than a Mac Pro. You don't have to go as far as Hollywood dub stages/studios; right here in this forum is enough: https://forums.macrumors.com/threads/mkbhd-switching-to-mac-studio-from-mac-pro-but-why.2340919/
So a W6800X Duo equipped Mac Pro is 60% faster than a fully maxed-out Studio Ultra when working with R3D RAW. If one is making money with such a machine, I'd sure know what I'd buy right now. And I'd also know what I'd buy as a hobbyist doing some work at home, even if it makes me a little money.
These are optimization issues, where outside of the codecs Apple is (hopefully) at the end of a blood feud with RED and Blackmagic over RAW codec patents (a recurring motif for Apple). As a result, RED and Blackmagic have been slow to optimize their RAW decoders for M1. They're going to have to do it, and soon, though. Too many of their customers are on Macs and will switch to Assimilate/Final Cut/ProRes as a full end-to-end replacement for RedRAW, BRAW, etc., where that lagging performance doesn't exist, at the drop of a dime.

Uhm, no. Maybe start here: https://www.raywenderlich.com/books/metal-by-tutorials and then compare to other APIs and hardware. You can always add translation layers, though at the cost of performance. Also, Apple isn't interested in this, and the hardware doesn't support it.

LOL...

Ok, I'm always open to digging in and learning new things, as a lot of people are, but the mere fact that everybody has to toss out years of experience and knowledge just to get back to baseline, low-level functionality on the Mac is offensive. The performance hits of translation layers are preferable to a draconian system that eliminates choice.

More importantly, Otoy, the company behind the Octane renderer, has been talking about CUDA code translation for a number of years. Apple brought many of Otoy's engineers into Apple for some publicly unspecified development before the M1 was released. I never could suss out what they were doing then, but looking back on it, it wouldn't surprise me if Apple had Otoy build them a super-efficient CUDA translation layer for Mx SoC processors. That would explain a lot of things, actually. We'll have to wait and see.

"According to Otoy’s CEO, Jules Urbach, the point of developing this CUDA translation layer is so that the company’s high-end Octane Render software can run as easily on AMD GPUs as their Intel counterparts. “We have been able to do this without changing a line of CUDA code, and it runs on AMD chips,” Urbach said. “You can now program once and take CUDA everywhere. AMD has never really been able to provide an alternative.”

WE WILL HAVE ACCESS TO THE CUDA CODES...bwa..hahahahahaha

"...the hardware doesn't support it."

That's what new generations of hardware are all about.
 
  • Like
Reactions: Irishman

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
So we're back to you posting stuff you googled and don't even understand... ok, let's see...
Case by case. (this is going to be a recurring motif)
Case by case in the sense that the end user reading emails at home and browsing the web won't need more than 128GB RAM? Sure. Same for the hobbyist cutting home videos and touching up photos from grandma's last birthday party.

Professional users working with large datasets on $$$ projects, not so much. They need more than 128GB when their data is larger, period. This just shows you've never worked on such large projects. Simply put, you can't fit five gallons of water in a two-gallon jug. The whole "8GB on M1 = 16GB on Intel" line is marketing nonsense that fanboys came up with, and it's been debunked. This is something first-semester computer science students learn right away.
Sure, there are are circumstances where more memory is preferable, but the combination of SSD optimization and Apple's Virtual RAM scheme reduces the "Absolutely gotta have it" cases by a lot.
No, it doesn't. You just don't understand the technology, that's all. Normally I would ask what this "Apple's Virtual RAM scheme" that supposedly reduces it is, and would expect you to explain the algorithm. But since you have so far failed to answer a single technical question, replying with marketing nonsense instead, it's pointless; by now everyone knows you don't have the technical background to answer it. You probably don't need it, and that's OK. It's just that normally, when someone has no clue about something, they choose not to argue about things they don't understand. In psychology it's called selective response.
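In case you do want to look under the hood: the "virtual RAM scheme" is plain demand paging backed by memory compression and swap files, and macOS reports both physical RAM and swap usage through sysctl. A rough Swift sketch using the standard hw.memsize and vm.swapusage keys; push a working set past physical RAM and you can watch the swap number climb (and performance drop):

```swift
import Foundation

// A sketch: hw.memsize is physical RAM, vm.swapusage reports the swap
// files backing the paging that gets marketed as "virtual RAM".
var memSize: UInt64 = 0
var memLen = MemoryLayout<UInt64>.size
sysctlbyname("hw.memsize", &memSize, &memLen, nil, 0)

var swap = xsw_usage()
var swapLen = MemoryLayout<xsw_usage>.size
sysctlbyname("vm.swapusage", &swap, &swapLen, nil, 0)

let gib = { (bytes: UInt64) in Double(bytes) / Double(1 << 30) }
print(String(format: "physical RAM: %.1f GiB", gib(memSize)))
print(String(format: "swap used:    %.2f GiB of %.2f GiB",
             gib(swap.xsu_used), gib(swap.xsu_total)))
```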
In larger operations it's just not going to be a server room "Run by Mac Pros" scenario.
No one said anything about servers. The Mac Pro is what it is, a professional workstation, similar to what Dell, HP and Lenovo offer.
Another case by case situation. On the content creation side where Macs are concerned (and as long as you avoid Gen 1 of anything Apple) Apple tends to source a dramatically higher quality memory and drives to begin with.
No, they don't. Check the modules used and compare datasheets of the off-the-shelf chips you can order in bulk. Also, higher quality (by what metric? things can be measured, you know) has nothing to do with ECC.
For Enterprise where there's the high levels of data storage and retrieval and where they have to be on Windows or Linux or both currently I can see where they must use ECC.
Enterprise data storage and retrieval have nothing to do with the need for ECC. Again, you don't understand the technology of ECC, which is taught to first-semester computer science students. Normally people would try to understand what ECC actually does.
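Since we're on first-semester material anyway, here's the principle in miniature: ECC stores extra check bits next to the data so the memory controller can correct a single flipped bit on its own, regardless of what the software is doing. A toy Hamming(7,4) sketch (real ECC DIMMs use a wider SECDED code across 64 data bits, but the mechanism is the same):

```swift
// Toy Hamming(7,4): 4 data bits -> 7-bit codeword that survives any single
// bit flip. The real thing lives in the memory controller, invisible to
// software; without the check bits, a flip silently corrupts the job.
func encode(_ d: [Int]) -> [Int] {                 // d = [d1, d2, d3, d4]
    let p1 = d[0] ^ d[1] ^ d[3]                    // parity over positions 1,3,5,7
    let p2 = d[0] ^ d[2] ^ d[3]                    // parity over positions 2,3,6,7
    let p3 = d[1] ^ d[2] ^ d[3]                    // parity over positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]    // codeword positions 1...7
}

func correct(_ c: [Int]) -> [Int] {
    // The recomputed parities spell out the 1-based position of the flipped
    // bit (the "syndrome"), or 0 if the word is clean.
    let s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    let s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    let s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    let syndrome = s1 | (s2 << 1) | (s3 << 2)
    var fixed = c
    if syndrome != 0 { fixed[syndrome - 1] ^= 1 }  // repair the single flip
    return fixed
}

var word = encode([1, 0, 1, 1])
word[4] ^= 1                                       // simulate a cosmic-ray bit flip
print(correct(word) == encode([1, 0, 1, 1]))       // true: the flip was corrected
```

That silent, workload-independent correction is exactly what the crashing non-ECC compute machines in the old story were missing.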
SDI and XLR are stone aged jumbo sized "pro" ports for instance that should have been phased for locking thunderbolt ports and adapters years ago, but that's a different discussion for a different time.
What utter nonsense. Where do you come up with these Google results? It's quite clear you've never been in the broadcast and movie industry, where this is the standard. And yes, the hobbyist and YouTuber at home doesn't care about it. Professionals on $$$ projects do.
Faster CPUs/GPUs and fast NAS clusters that got easier to maintain along with with highly optimized OSes running on SSDs made those Magma enclosure obsolete, but thunderbolt has continued to progress well and is far beyond anything most home customers need for external peripherals.
And this has what to do with what I said? Nothing, right. A NAS in any form doesn't replace local storage, let alone I/O for interfaces. As for PCIe enclosures... you're taking a massive performance hit, and since you specifically mention Magma as a manufacturer, check what they do to PCIe lanes (datasheets and measurements, you know...). Funny that you choose this manufacturer of all of them for the professional market, when it's all about performance. ;)
These are optimization issues where outside of the codecs Apple is (hopefully) at the end of a blood feud with RED and Blackmagic over RAW codec patents (A recurring motif for Apple).
No, they're not. I've worked at the research institute that came up with H.264, around the same time Amir Majidimehr worked on VC-1 at Microsoft. Again, normally I'd ask for a technical explanation (and not marketing material from fanboys/sales staff) of why you think that is the case. But since you're not able to answer it, I'll just leave it at this: there is always room for optimization, but not across a performance gap this large. M-series is not the magic some people make it out to be; it's all about performance-per-watt. ProRes is so much faster than on x86 machines because M-series chips have dedicated hardware for ProRes; they lack such dedicated hardware for other codecs. The same is true for photo work; there's a nice article and thread over at dpreview (for the end user). That doesn't mean Apple won't introduce optimized hardware with M2+ in the future, but for M1/Pro/Max/Ultra that ship has sailed.
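You don't have to take my word on the dedicated-hardware part either; VideoToolbox will tell you, per codec, whether hardware decode exists on the machine you're sitting at. A small Swift sketch (Apple-side codecs only; as far as I know, R3D decoding goes through RED's own SDK rather than VideoToolbox, so it can't borrow the ProRes blocks):

```swift
import VideoToolbox

// A sketch: query hardware decode support per codec. On M1 Pro/Max/Ultra
// the ProRes entries hit dedicated silicon; codecs without such blocks
// fall back to software, which is where the speed gap lives.
let codecs: [(String, CMVideoCodecType)] = [
    ("ProRes 422",  kCMVideoCodecType_AppleProRes422),
    ("ProRes 4444", kCMVideoCodecType_AppleProRes4444),
    ("H.264",       kCMVideoCodecType_H264),
    ("HEVC",        kCMVideoCodecType_HEVC),
]
for (name, type) in codecs {
    print("\(name): hardware decode \(VTIsHardwareDecodeSupported(type) ? "yes" : "no")")
}
```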
Indeed...
Ok, I'm always open to digging in and learning new things, as a lot of people are, but the mere fact that everybody has to toss out years of experience and knowledge just to get back to baseline, low-level functionality on the Mac is offensive.
When all you do is make claims, never answer a single technical question, and choose selective responses instead to avoid delivering answers, that's what you get. You don't even understand the difference between a driver and a full-blown toolkit, going by your Google search. That alone unfortunately shows you don't have any understanding of the underlying software or hardware. What else would one recommend other than reading up on the very basics of the technology?
The performance hits of translation layers are preferable to a draconian system that eliminates choice.
So you're willing to take a performance hit for GPU applications that makes them slower than a native CPU implementation? I'm not, because you know... then it's even slower than not using a GPU at all.
More importantly, Otoy, the company behind the Octane renderer, has been talking about CUDA code translation for a number of years. Apple brought many of Otoy's engineers into Apple for some publicly unspecified development before the M1 was released. I never could suss out what they were doing then, but looking back on it, it wouldn't surprise me if Apple had Otoy build them a super-efficient CUDA translation layer for Mx SoC processors. That would explain a lot of things, actually. We'll have to wait and see.

"According to Otoy’s CEO, Jules Urbach, the point of developing this CUDA translation layer is so that the company’s high-end Octane Render software can run as easily on AMD GPUs as their Intel counterparts. “We have been able to do this without changing a line of CUDA code, and it runs on AMD chips,” Urbach said. “You can now program once and take CUDA everywhere. AMD has never really been able to provide an alternative.”
As you said above, LOL indeed. Is that all you could come up with in your Google search?

First, let me state that people (myself included) have tried this for years in professional research, with published results, and failed. Basic tech demos work; the full feature set doesn't. Every project has been abandoned or is making no progress.

What you failed to mention is that you took this from an over-six-year-old interview/article here: https://www.extremetech.com/computi...company-claims-to-run-cuda-on-non-nvidia-gpus

What you failed to mention is that Otoy had a tech demo running with basic functionality that was so slow no one would ever use it in production, let alone the missing features.

What you failed to mention is that Otoy has abandoned this project because it didn't work as expected.

What you failed to mention is that Otoy's recent work/software for macOS is based on Metal, which they adopted as the only way. See their press release: https://home.otoy.com/octane-x-launch/

Let me quote the relevant part:
The latest release of Octane X is the capstone of years of development to rebuild the industry’s leading unbiased GPU path tracer from the ground up, optimized for maximum performance with the Apple Metal API. Octane X features full pixel parity with OctaneRender 2020 and later, making state of the art cinematic rendering available natively for the first time in macOS.
So yes, LOL indeed to what you posted. LOL indeed.
WE WILL HAVE ACCESS TO THE CUDA CODES...bwa..hahahahahaha
This sentence alone shows you don't understand the technology. Maybe do a search on what actually happens to CUDA code when using translation. ;)

So to sum it up, CUDA is dead on macOS. Even Apple is not making any effort; they've begun porting CUDA-heavy software to native Metal in some cases. Why don't you check out their work with TensorFlow (they tried translation first and failed, hence the native Metal version).

Facebook, or more precisely FAIR, is doing the same with PyTorch... going Metal native.

So no, you won't have access to anything, except in the marketing dreamworld you find with Google searches. ;)
The real-world Apple systems will remain Metal, period.
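And for anyone wondering what "Metal native" actually means at the code level, here's a bare-bones compute dispatch in Swift, the programming model CUDA kernels have to be rewritten into (a minimal sketch, error handling omitted):

```swift
import Metal

// Minimal Metal-native GPU compute: double every element of a float buffer.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void double_values(device float *data [[buffer(0)]],
                          uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!        // the GPU macOS exposes
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "double_values")!)

var values: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride)!

let queue = device.makeCommandQueue()!
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
// One thread per element; Metal splits the grid into threadgroups.
encoder.dispatchThreads(MTLSize(width: values.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 4, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()

let result = buffer.contents().bindMemory(to: Float.self, capacity: values.count)
print((0..<values.count).map { result[$0] })        // [2.0, 4.0, 6.0, 8.0]
```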
 

Imhotep397

macrumors 6502
Jul 22, 2002
350
37
Case by case in the sense that the end user...

One area of difference between an old university comp-sci professor without day-to-day, hands-on creative industry experience and someone who actually works in the creative industry (and has for years, in a number of different capacities, with multiple degrees) is this: the person with industry experience knows that coders are told to shoot for the stars, while the person working in the trenches, seeing how the sausage is made, realizes that 80% of the specialty areas contributing to the final "$$$" products don't need as lofty a spec as what the coders are told to build in order to get their day-to-day tasks done.

I guess you can keep projecting your personal anecdotal descriptions onto me like you know anything about me, since it seems to help you out in some way, weird as that is. Your black-and-white perspective on how commercial creative industries work may be indicative of some other things, but I'm not going to speculate any further than that.


it's just that normally...
Yeah, since I'm not responding in the way you would prefer, it's probably indicative of those descriptions not being useful in a practical applications discussion.

is taught to first-semester computer science students...
I know what ECC is, how the coding works, what its uses are, and that it originated in radio communication. Frankly, that's more than adequate as it relates to practical use of the hardware with pipeline-specific software and specific groupings of hardware, to practically determine whether it's a worthwhile purchase option for your use and for your group's uses.

What utter nonsense. ...
Sounds like you've never run cable, or transported heavy cable knowing there are modern, lighter-weight digital cables that can easily be re-purposed for most uses (as they are in other areas), and asked the question, "Why are we still using these dinosaur analog cables from the 1980s and earlier?" (and that question could have been asked anywhere from the early 2000s all the way to today). It's very easy not to ask questions when you just take orders, see these things, and leave them for other people to contend with.


I'll go through the rest of this at some point later in the week
 
Last edited by a moderator:

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
One area of difference between an old university comp-sci professor without day-to-day, hands-on creative industry experience and someone who actually works in the creative industry (and has for years, in a number of different capacities, with multiple degrees) is this: the person with industry experience knows that coders are told to shoot for the stars, while the person working in the trenches, seeing how the sausage is made, realizes that 80% of the specialty areas contributing to the final "$$$" products don't need as lofty a spec as what the coders are told to build in order to get their day-to-day tasks done.
Don't worry, I work with industry partners all the time. You on the other hand do not, not on a professional level as you have demonstrated so often now by posting false stuff. Nice try though.

I guess you can keep projecting your personal anecdotal descriptions onto me like you know anything about me, since it seems to help you out in some way, weird as that is.
I know you have yet to post/answer anything technical that is remotely correct. That's good enough for this forum: you know nothing and are wrong all the time, as demonstrated. That driver/toolkit thing was on a new level, but a really good laugh.

Yeah, since I'm not responding in the way you would prefer, it's probably indicative of those descriptions not being useful in a practical applications discussion.
You're not responding because you don't know what you're talking about and have no answers, because you don't even understand the questions to begin with, just as you don't understand the difference between a driver and a toolkit, as demonstrated. As I said, it's called selective response in psychology, and it speaks volumes.

I know what ECC is, how the coding works, what its uses are, and that it originated in radio communication.
As you have evidently demonstrated, no you don't.

Sounds like you've never run cable, or transported heavy cable knowing there are modern, lighter-weight digital cables that can easily be re-purposed for most uses (as they are in other areas), and asked the question, "Why are we still using these dinosaur analog cables from the 1980s and earlier?"
Because SDI is so analogue, and Thunderbolt cable lengths allow for long distances... huh? That's another demonstration of your lack of knowledge. It's also odd that none of the major broadcast facilities, studios and dub stages want this. But hey, nice try again at going off topic.


I'll go through the rest of this at some point later in the week
Don't bother. It's quite clear you need time to come up with new Google results you don't understand, without answering any of the questions about your claims (which the forum rules actually require you to do), only to go further off topic and draw people away from the false stuff you've posted so far. As you have demonstrated multiple times now that you have no clue about this stuff, the conclusion is simple: you're wrong, and you're not getting anything you want, no matter how much you whine about it. @JMacHack has already nailed exactly why you're posting this. ;)
 
Last edited:
  • Like
Reactions: ipponrg

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
The forum is called "Mac and PC Games". Also, where is anyone anti Mac gaming? No one ever said anything anti Mac gaming. Some of us are stating facts; some of us don't live in a dream world like you do, comparing drivers to toolkits, posting quotes from over-six-year-old articles, and claiming the technology will arrive when in reality it has long been abandoned.

If studios want to bring games to the Mac, they should do it. Accepting the reality that it is only financially feasible for simple and quick-to-make games is not anti Mac gaming... it's reality.

Some people act like kids in a candy store that don't get what they want, which in this case are games on Macs. Here's a more graphical visualization:
[GIF: mini-golf-kid.gif]


Other who are more reasonable and accept facts react more like this:
[GIF: i-dont-care-friends.gif]

When I want to play a game, I simply buy the system it's on. I've owned pretty much every major gaming system (and maybe also the not-so-major; PC Engine, Neo Geo anyone?) since the 8-bit days. I just act and make things work to solve a problem; I don't whine and tell others that long-abandoned technology will save graphics on the Mac to make myself feel better because I'm not getting things or because I bet money on the wrong horse. And as far as non-console, desktop-only games go... I just use a PC with Windows for that purpose, and if a game is available for the Mac I play it there: StarCraft, Counter-Strike: Source, Diablo 3, etc. Just like I use PCs with Linux for scientific work, and a Lenovo and a Razer laptop dual-booting Windows/Linux for all the stuff I can't do with my Mac mini, Mac Pro and Hackintosh desktops or my MacBook Pros. My Macs/iPads are daily drivers: reading/writing, some photo and video work, and so on.
 
Last edited by a moderator:
  • Like
Reactions: ipponrg

diamond.g

macrumors G4
Mar 20, 2007
11,191
2,496
OBX
Mac is getting Resident Evil Village. Seems like Apple talked to Capcom (maybe threw some money their way?).

Wonder if Capcom will do Monster Hunter on macOS.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
416
266
Mac is getting Resident Evil Village. Seems like Apple talked to Capcom (maybe threw some money their way?).

Wonder if Capcom will do Monster Hunter on macOS.

Capcom is trying to test the market with RE VIII. If it does well on macOS, then it is a possibility.
 
  • Like
Reactions: Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,191
2,496
OBX
Capcom is trying to test the market with RE VIII. If it does well on macOS, then it is a possibility.
Yeah, that game didn't do well on the other platforms, so this feels like it is destined to do poopily.

That came across badly. I meant that across all the platforms it was released on, it seemed to only sell a few million units. Which could be good, depending on how much they spent to make the game. But as far as popularity is concerned, MH pulls in more players and would have made more sense as a port to test the waters. IMHO
 