Back in the day, Quadro cards really were a cut above in everything. Today's workstation Quadro cards I've personally found not to be as far ahead across the board, especially in GPU rendering with Redshift or Arnold vs. GeForce cards, so it really depends on what kind of work you're doing whether a Quadro card is going to be a good purchase for you. Of course, once you get to the server-based cards and they start throwing mountains of clustered CUDA and Tensor cores at the problem, if the functions you need are in the CUDA libraries, those cards/servers are going to be boss.

Same technology in silicon, sure. Same architecture, not so much: just compare their gaming cards with the Quadro/server cards and you'll see a massive difference in performance between those two fields.
And that worked so well for Apple in the past... to the point that they discontinued their server line. You need a full lineup, something that Dell, Lenovo, and HP offer. Apple really isn't interested in this, because they won't be able to offer the services.
Apple was a fraction of its current size back when it discontinued the Xserve. I already mentioned that fewer, more industry-specific server deployments would be better... that would be because there would be fewer large clients to support.
This has been discussed up and down on this forum, but once more... even if Nvidia owned ARM, there would be no way to block Apple from developing ARM chips.
Of course they could; Nvidia would own the company outright. Nvidia could essentially freeze releases of general ARM advancements for Apple specifically, give them only the absolute minimum information necessary to satisfy the license, and kill all publicly distributed materials on other major annually released advancements. They could then give Apple the runaround on getting updated specs until the license runs out, refuse to let Apple renew, and keep going that way until Apple's M-series advancements deviated enough from the standard to create cost problems for getting chips fabbed relative to other ARM fab work.
Apple may try to switch to an open-source alternative to ARM to avoid mounting royalties, but if Nvidia had been allowed to buy ARM, Apple would likely have gone open source faster, or maybe even gone back to AMD for x86 CPUs.
Grace is similar to what Nvidia has been doing for years. The use case for Grace is improved I/O for the GPU, nothing else.
Funny, you tried to insinuate the opposite when I stated that the roots of ALL Nvidia hardware are essentially the same, back when I mentioned gaming GPUs and video games being the roots of Nvidia. I guess those filthy gaming GPUs are just the red-headed mutant stepchildren of Nvidia to your deluded mind... lol.
Yes really.
This surely wouldn't have been possible if Apple was so behind back in the day? (and yes, I worked on parts of that, that's also what Jobs used for benchmarks on stage)
Those case studies were from the early 2000s, reported in 2006-2009... The tech/methodologies are like 20 years old. In fact, that was right around the time Apple shut the door on Nvidia, and Nvidia ran away with the major advancements in big-data-set crunching using CUDA, miles beyond what Macs were capable of. They were testing with high-level gaming and data-centric science processing like this for weather forecasting.
In game development there are a multitude of games and VFX projects that use scanning to create point-cloud data and rebuild photo-realistic assets with photogrammetry, acquiring far denser clouds and more varied types of data, with far more real-time interaction, than medical visualization. Most of those separate pipeline stages for games and VFX are GPU accelerated, to great effect.
The MedViz projects could be lagging behind in development advances, which wouldn't surprise me given they're in the medical area, where performance is not really prioritized. But if they have kept up, they have likely been moved over to GPU acceleration on the Windows side, and maybe even Linux, so those versions have been running circles around the Mac versions for years. Those GPU-accelerated processes were pioneered by constant, ever-expanding games and VFX development, which prioritize hardware purchases for performance first.
I have no idea what you're saying, maybe because you still haven't answered a simple question about parallel programming paradigms. Any dataset on the PC side has also been available on the Mac side for developers.
(data set, not dataset)
Ok, so initially I'm thinking, "With all his vast stores of knowledge, awards lavished on him and such... SURELY he already knows all this stuff and is just being a wiseass." Now that this is the third or fourth time you've brought it up, I'm not sure anymore. Maybe that's the case, I don't know, but here goes:
Towardsdatascience.com
GPUs render images more quickly than a CPU because of their parallel processing architecture, which allows them to perform multiple calculations across streams of data simultaneously. The CPU is the brain of the operation, responsible for giving instructions to the rest of the system, including the GPU(s).
ACM version:
GPUs support coarse-grain task-level parallelism via concurrent execution of different tasks on different powerful cores. A GPU kernel comprises multiple individual threads. A GPU thread forms a basic sequential unit of execution. At the lowest level, the hardware scheduler manages threads in small cohorts.
_______________
Early on, Nvidia found it was easier to design and fab simpler cores, and many of them, focused on calculating simple math matrices in short parallel chunks simultaneously. In the beginning those isolated code chunks were exclusively pieces of the imaging/rendering processes they were built for. Essentially a RISC-like approach, so RISC won the war after all. The catch has been that all of these isolated imaging-function code chunks have had to be coded as individual features that essentially bypass the CPU and go to the GPU for parallel evaluation. At least that's the way it was explained to me, like 10 years ago. As it turned out, quite a lot of programming tasks can be broken down this way, so the approach has been extended to predictive simulation, machine learning/AI, etc.
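To make that concrete, here's a minimal sketch in Swift using Metal (the kernel name "scale", the doubling math, and the 1024-element buffer are purely illustrative, not anyone's production code). The CPU's only job is to issue the dispatch; the GPU's hardware scheduler then fans the work out across its many simple cores:

import Metal

// One of those "isolated code chunks": a tiny data-parallel kernel the
// CPU hands off to the GPU, where every element is processed at once.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint i [[thread_position_in_grid]]) {
    data[i] *= 2.0;   // each GPU thread handles exactly one element
}
"""

let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)

let values = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: values, length: values.count * MemoryLayout<Float>.stride, options: [])!

let queue = device.makeCommandQueue()!
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
// The CPU issues this one command; the 1024 threads then run in parallel
// cohorts on the GPU rather than looping serially on the CPU.
encoder.dispatchThreads(MTLSize(width: 1024, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()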
The problem for Apple is that this was all originally done in proprietary CUDA code for CUDA cores. AMD has fewer features, but theirs are open; even that doesn't help Apple now, though, because Apple has eschewed the open standards exclusively in favor of its own proprietary ones.
As much disdain as I have for MS and their practices, at least when they copied OpenGL and added audio to spawn DirectX, they mostly kept the development up via Xbox game development and competition. They didn't block AMD and Nvidia functionality like Apple has.
Nvidia CUDA driver updates for the Mac ended 3 years ago, and actual functional feature advancements stopped back in 2014, so the updates since have basically been bug fixes and security patches.
Random FYI: the Mac download of CUDA is 15 MB; the Windows download of CUDA is 2.5 GB.
Like the Shareware licensing scheme was oh so profitable.

Then they will change the license model, just like other manufacturers, and Apple had a different model in the past as well. I really don't see Microsoft providing all those tools (again, maybe you could specify what exactly you need) for developers out there, and yet somehow developers get their stuff running on Windows for Intel/Nvidia/AMD graphics.
Yeah, a platform company isn't going to make a decision about which standards they will or won't support based on the profits of an individual studio or two. They will make the decision based on what collection of AAA, AA, and indie games combined would financially work out best for them. The "AAA" games attract players to the platform, and the independent games and "AA" games make the profits.

So looking at actual numbers from AAA game development studios is the wrong place to look when trying to make money with games?
"...People in the Professional World...do not play games on their Macs, they do actually work""A lot" of hardware? For that lot of hardware, they sure have a very small market share. And the gaming market is even smaller. And no, people in the professional world, be it film/music studios, dub stages, etc. do not play games on their Macs, they do actually work. Youtubers? Sure, but that's even a smaller target market.
Did you practice that in a mirror with a snobbish fake French or fake English accent to make yourself feel better before you posted?
"Raises Hand, Hey I'm one of them." Many of us game on the weekends, even a weeknight or two on occasion (God forbid...what if the neighbors found out I was playing Elden Ring? The Shock, the Terror!) Some of us game with our kids in fact and wouldn't find it objectionable in the least if we didn't have to maintain multiple OS environments to do so.
As much as you might want to scoff at YouTubers, they are a significant part of Apple's professional user base, and Apple takes them VERY seriously, believe me.
So maybe we're getting somewhere here... why exactly is it that Metal is such a pain in the ass for developers? Be technical here; let us know what parts of the API are problematic and what parts are not. I hope this isn't another one of your statements we'll never get an answer for. Please, no marketing talk.
I was hoping you'd ask... (a couple of brief horror stories below)
The version of OpenGL that macOS still ships alongside Metal is stuck at 4.1, a spec from 2010, and it has been deprecated since macOS 10.14. This is obviously a major impediment.
_____________________
"Most games built with Unity 2018 and Unity 2019 do not work on Mac OS X 10.9. They will launch initially, and occasionally even get as far as the main menu, but invariably crash before entering gameplay.”
_______________
Apple's licensing was such that it didn't allow Unity to support non-Metal based Apple hardware at that point.
Imagine spending "as little as" $5M developing a Mac game and a Mac app in Unity one year, and having them not even salable two years after launch... Then you contact Apple support and, long story short, get the equivalent of an eye roll. This happened to a friend of mine I've known since elementary school. He knows I work primarily on Macs and has told me more than once that he will NEVER develop anything for macOS again in his life. You don't realize how many Mac owners were still running their Intel Mac Pro towers, on an OS Apple stopped updating, from when Apple quit making desktop towers until recently; I still run one. (I used to think I was an anomaly before this situation.)
______________________
“Apple cuts off Epic from its tools, endangering future Unreal Engine projects on iOS and Mac”
Studios big and small could be hit with this decision, not just Epic.
www.washingtonpost.com
_______________________
The EULA for the Mac OS X SDK forbids use on non-Apple branded hardware so we (Unreal 4) can't build for Mac from Windows as there's no alternative to the official SDK.
Not the case for iOS.
__________________________
What's the story with Apple still refusing to support Vulkan natively?
They don't need to. Also Metal isn't really 1:1 comparable to Vulkan in terms of abstraction level.
Same way MS didn't need to support common HTML/JavaScript since they had ActiveX?
If Apple had acted properly, instead of with their usual "eat our lock-in" mentality, they could have provided Vulkan support and then built higher Metal-style abstractions on top of it. But no, it's Apple for you. Eat Apple-only Metal or get lost.
_____________________
As of the 3.3 update, you can no longer access Elite Dangerous products through the Mac OS. However, you will still be able to log into your account and play on PC (or via Bootcamp.)
Despite our best efforts, we have been unable to bring Horizons content to Mac due to technical barriers. With the improvements arriving in our Chapter Four update of the Beyond season, we have felt it necessary to make this difficult decision in order to allow us to bring in content and features in the way that we felt was best for the overall Elite Dangerous experience.
We hope you will understand why we have taken this course of action and would encourage those of you who have questions or concerns to please contact our Customer Support team by using the button below.
_________________
In an advisory notice posted on its website last month, Foundry announced that the current release, Mari 4.7, would be the last to support macOS. The firm will continue to release maintenance updates to Mari 4.7 on macOS until the end of 2021.
_______________________________
The fact is, before you even get to the problem of Apple moving away from standards with no part-for-part equivalents, or of decrepit driver components left in place, all of this erratic brinkmanship between Apple and its corporate "partners" (the kind you don't see between other platform holders and their partners) serves to whittle down the developers willing to consider the Mac a stable deployment platform.
Even with a great deal of user interest in games on the Mac, the fact is it will cost developers more than developing on other, larger platforms that don't make them jump through a bunch of hoops. Even if you do jump through the hoops, Apple could just drop or radically change parts of the OS your app or game needs to work, leaving you, and hundreds, thousands, or millions of others, twisting in the wind. That kind of thing can happen in under a couple of years, not even 10+.
Meanwhile, over on Windows you can play a game like Dead Space that was released in 2008 on a Win 10 or 11 PC in 2022 with performance enhancements. This is the stability and reliability Macs used to have.
(did that earlier in here)

What low-level API tools in Swift?
As for mobile games, if I were just in it to make money, then the mobile iOS (and Android) market is precisely where I'd go. I'd release a new game every week or two, do some fancy in-game purchases, and have a good cash flow. If a game doesn't do so well, then I've wasted little time and money. The other option would be to put down $100M+ first and do years of development, only to find people don't like the game and make a loss on it.
A few limitations or missing components:
Unreal "running" on Metal:
____________________
There's nothing fancy about that test scene: no texture data sets, no real environment data sets, minimal lighting, no enemy, NPC, or secondary characters, no dynamic particle data sets or AI data sets. There's just minimal animation data for one character and a single piece of bog-standard platform with a few stairs, and without even getting into the newer tech that isn't available on macOS, the whole demo just chugs from the start.
I know a couple of people who have released iOS and Android apps and a few mobile games, and most of them haven't made any money from those. They tell me it's only the Candy Crushes, Angry Birds, and Wordles of the world, with insane marketing budgets, that can make any money. Looking at the revenue breakdown that spilled out during the Epic vs. Apple case, that was pretty well proven: 10% of ALL App Store users play games, and they provide 70% of App Store revenue.
Universities are more and more often adopting Unity and Unreal, originally developed as video game engines, to build decent UIs to pair with their software/assets, rather than going with the absolute bare minimum of UIs. The transition has been happening for years, so YES, Apple should adopt some level of conformity with the standards to make certain these apps, and the hardware/GPU-driven approaches they rely on, are actually performant on Macs going forward, or acquire whoever they need to acquire to get their own underlying approaches up to match their hardware innovations:
____________________________
Some other missing bits and bobs for Swift/Metal:
Buffer device address:
This feature allows the application to query a 64-bit buffer device address value for a buffer. It is very useful for D3D12 emulation and for compatibility with Vulkan, e.g. to implement ray tracing on MoltenVK.
DrawIndirectCount:
This feature allows an application to source the number of draws for indirect drawing calls from a buffer. Also very useful in many GPU-driven situations.
Only 500,000 resources per argument buffer:
Metal has a limit of 500,000 resources per argument buffer. To be equivalent to D3D12 Resource Binding Tier 2, you would need 1 million. This also matters because many DirectX 12 game engines could then be ported to Metal much more easily (see the sketch after this list).
And finally: an integrated, fully up-to-date Vulkan under Metal, with as much of Metal's hardware-specific efficiency as possible.
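For the argument-buffer point above, here's a hedged sketch in Swift of the binding pattern that limit applies to (the 1,024-slot table and 256-byte buffers are purely illustrative). A bindless D3D12 Tier 2 engine wants to scale this same table toward a million entries, which is exactly where Metal's cap bites:

import Metal

// Sketch: one argument buffer holding a table of buffer pointers.
// Metal caps how many resources a single argument buffer can reference;
// D3D12 Resource Binding Tier 2 ports expect roughly double that cap.
let device = MTLCreateSystemDefaultDevice()!

// Describe an array of read-only buffer pointers inside the argument buffer.
let slots = MTLArgumentDescriptor()
slots.dataType = .pointer
slots.index = 0
slots.arrayLength = 1024              // illustrative; far below the cap
slots.access = .readOnly

let argEncoder = device.makeArgumentEncoder(arguments: [slots])!
let argBuffer = device.makeBuffer(length: argEncoder.encodedLength, options: [])!
argEncoder.setArgumentBuffer(argBuffer, offset: 0)

// Point each slot at a resource; a bindless engine fills hundreds of
// thousands of these, which is where the 500k ceiling gets hit.
for slot in 0..<1024 {
    let resource = device.makeBuffer(length: 256, options: [])!
    argEncoder.setBuffer(resource, offset: 0, index: slot)
}
// (At encode time each resource also needs a useResource(_:usage:) call
// on the command encoder so Metal makes it resident on the GPU.)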