
JahBoolean

Suspended
Jul 14, 2021
552
425
What's the one single system preference or setting to set it to be hidden globally?
Get a wallpaper that makes the menu bar display as black; that typically happens with darker tones. The control over this setting is lackluster, but hey, you can still manipulate it!
 
  • Haha
Reactions: prefuse07

mattspace

macrumors 68040
Jun 5, 2013
3,185
2,879
Australia
Get a wallpaper that makes the menu bar display as black; that typically happens with darker tones. The control over this setting is lackluster, but hey, you can still manipulate it!
Wallpaper doesn't solve the notch taking up real estate in the menubar, if its presence is even recognised by an app at all. My question stands: where is the one central control to permanently shift the menubar below the notch, on a global basis?
 

JahBoolean

Suspended
Jul 14, 2021
552
425
Wallpaper doesn't solve the notch taking up real estate in the menubar, if its presence is even recognised by an app at all. My question stands: where is the one central control to permanently shift the menubar below the notch, on a global basis?

Are you still hung up on bugs from the first public build of Mojave? Do you own an M1 Pro device, or is this pure jest?

It is a non-issue in the real world.

[Screenshot attachment: Screenshot 2022-05-15 at 12.37.41.png]

And I am running unreasonable scaling to soothe my eyes.

Back to the topic at hand, alas.
 
Last edited:
  • Wow
Reactions: prefuse07

exoticSpice

Suspended
Original poster
Jan 9, 2022
1,242
1,951

But that doesn't answer the question: if the "solution" to the notch is that you can move the menubar below it and still get a 16:10 display, where is the control to do it on a universal basis? Because if it doesn't have a universal workaround, it doesn't have a workaround.
If you use this app, it's universal, and you can use any wallpaper you like; the menu bar will always stay black.
If the notch annoys you, it's not even a big deal with this app, as it works as intended.

No, the menu bar doesn't move below; it sits above the 16:10 display, alongside the notch. You also get more real space than on the 16" 2019 MacBook Pro because of this.
[Screenshot attachments]
 
Last edited:

mattspace

macrumors 68040
Jun 5, 2013
3,185
2,879
Australia
No, the menu bar doesn't move below; it sits above the 16:10 display, alongside the notch. You also get more real space than on the 16" 2019 MacBook Pro because of this.

10 pixels high is not worth losing an entire menu in width.

So back to the original issue - *notch* -> *nope*

A kludge that ruins the utility and consistency of the menubar, for the decorative goal of having top and side bezels be the same thickness.
 

exoticSpice

Suspended
Original poster
Jan 9, 2022
1,242
1,951
A kludge that ruins the utility and consistency of the menubar, for the decorative goal of having top and side bezels be the same thickness.
Does not ruin it at all for me. I get more space than I did before on my 16" 2019 MBP.

You are just complaining for the sake of complaining. The only thing that is annoying is that you cannot place items where the notch is.

I have proved you wrong. What happened to "Universal workaround"?

On the 16" 2019 MBP you don't even use the 16:10 if it's not in fullscreen mode. The menu bar takes up the some of the display space. I rather have the notch for now to give me proper 16:10 on the 2021 MBP.

In the future when Apple moves the camera underneath or makes the webcam small enough to fit in the small bezel, that will be the best solution. Usage of full 16:10 display and no obstruction in the menu bar.

10 pixels high is not worth losing an entire menu in width.
You don't lose a whole menu, just part of it, but you gain screen space, which is really useful on a laptop.
for the decorative goal of having top and side bezels be the same thickness.

Apple wanted it to stand out, and stand out it does. You would know it's the 2021 MBP from any angle: the notch, the black keyboard deck, the engraving at the bottom, and the flat lid.
 

mattspace

macrumors 68040
Jun 5, 2013
3,185
2,879
Australia
Does not ruin it at all for me. I get more space than I did before on my 16" 2019 MBP.

You are just complaining for the sake of complaining. The only thing that is annoying is that you cannot place items where the notch is.

I have proved you wrong. What happened to "Universal workaround"?

You have proved nothing - where is the universal workaround that gives you the full width of the menubar, i.e. that moves it below the notch, and doesn't have to be applied on an app-by-app basis?

The pro-notch argument was that there was an easy workaround to it. Where is it? How can I get a menubar that is the full width of the display, as a universal feature of the machine?

On the 16" 2019 MBP you don't even use the 16:10 if it's not in fullscreen mode. The menu bar takes up the some of the display space. I rather have the notch for now to give me proper 16:10 on the 2021 MBP.

The menubar doesn't "take up" display space; displaying it is one of the primary purposes of the display - it's a feature element of the tool. A little less content space (content can be moved around or scaled) for more menu is a *good* thing.

In the future, when Apple moves the camera underneath the display or makes the webcam small enough to fit in the small bezel, that will be the best solution: full use of the 16:10 display and no obstruction in the menu bar.

A slightly thicker top bezel and a slightly thinner bottom bezel would have worked just as well, and it is what everyone else in the industry does, because it is a better solution.

Apple wanted it to stand out, and stand out it does. You would know it's the 2021 MBP from any angle: the notch, the black keyboard deck, the engraving at the bottom, and the flat lid.

Standing out is not a utility goal. It's decorative, & yet another example of Apple saying "Design is how it Functions", but doing "Design is how it Looks".
 

mattspace

macrumors 68040
Jun 5, 2013
3,185
2,879
Australia
Why do you want the menu bar below the notch? That would risk cutting into the 16:10 display.
Because I have apps with lots of menus, menubar apps, etc. that I don't want to run out of space, and because a gap in the menubar would drive me spare. And because it's a fundamental principle of the Mac UI that the menubar stays in the same place, which makes enabling the menubar below the notch on an app-by-app basis a horrible kludge.

And most importantly, because the defence of the notch seems to be that "there are easy workarounds", which don't in fact exist, yet this is repeated purely as an article of faith.
 
  • Like
Reactions: th0masp

exoticSpice

Suspended
Original poster
Jan 9, 2022
1,242
1,951
Design is how it Looks
Last time Design was #1 at Apple, we ended up with almost no ports on a Mac and a trash can Mac Pro whose supposed purpose was expandability.

The menubar doesn't "take up" display space, displaying it is one of the primary purposes of the display - it's a feature element of the tool. A little less content (which can be moved around or scaled) space for more menu is a *good* thing.
It does take up space, and that is why I used to set the menu bar to auto-hide.



Slightly thicker top bezel, and slightly thinner bottom bezel would have worked just as well, and is what everyone else in the industry does, because it is a better solution.
The industry also puts bad webcams in the small bezel. Have you seen how ugly and blurry the new webcams look on the Dell XPS?
 

exoticSpice

Suspended
Original poster
Jan 9, 2022
1,242
1,951
And most importantly, because the defence of the notch seems to be that "there are easy workarounds", which don't in fact exist, yet this is repeated purely as an article of faith.
Most people want to hide the notch and that's it. If a tiny strip of black bar annoys you that much, then you must have been shocked by the 12" MacBook.

The option I have provided, turning the menu bar black, is what most people want anyway. If you want to move the menu bar below the notch, Apple provides that on a per-app basis.
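
(For what it's worth, that per-app switch corresponds to Apple's documented NSPrefersDisplaySafeAreaCompatibilityMode Info.plist key; there is no global equivalent. A minimal Swift sketch to check whether a given app opts in - the Safari path is just an example:)

Code:
import Foundation

// Hedged sketch: NSPrefersDisplaySafeAreaCompatibilityMode is the documented
// per-app Info.plist key behind scaling an app to fit below the built-in
// camera. macOS exposes no global version of this switch.
let appURL = URL(fileURLWithPath: "/Applications/Safari.app") // example path
if let bundle = Bundle(url: appURL),
   let prefers = bundle.object(forInfoDictionaryKey: "NSPrefersDisplaySafeAreaCompatibilityMode") as? Bool {
    print("Notch compatibility mode requested:", prefers)
} else {
    print("Key not set - the app draws to the full display, notch included.")
}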

Let's end this here and stick to the topic please.
 
  • Like
Reactions: AlphaCentauri

iPadified

macrumors 68000
Apr 25, 2017
1,915
2,114
What has the notch to do with the future GPU?

Back on topic: the Mac Pro will be the most interesting Mac in the transition, as SoCs are not a perfect fit for such workstations. Throwing multiple Ultras at the problem might solve it, but will people want to pay for CPU when it is the GPU they use (or vice versa)? Hopefully WWDC will give us a clue.
 
  • Like
Reactions: Slartibart

deconstruct60

macrumors G5
Mar 10, 2009
12,318
3,906
There were rumors for quite some time that Apple was developing its own discrete GPU alongside the all-in-one SoC packaging of the M-series chips.

Were the rumors really about a discrete GPU?



“… According to relevant sources, Apple's self-developed GPU is progressing smoothly. The research and development code is Lifuka. Like the upcoming A14X processor, it is produced using TSMC's 5nm process. Apple has designed a series of processors for Mac personal computers. The new GPU will provide better performance per watt and higher computing performance. It has tile-based deferred rendering technology that allows application developers to write more powerful professional application software and game software…..”

This was a GPU for an iMac. Apple didn't release a 27", large-screen iMac; they released the Studio with M1 Max and Ultra SoCs. Furthermore, Lifuka is an island that is part of (a subset of) Tonga. The code name is indicative of something that is part of a whole, not necessarily something completely independent (e.g. a complete, discrete, detached GPU).

Apple did the Ultra. There is another GPU on another die there (relative to the primary die in the package), but it is not "discrete" in the same sense that adjective is used in general GPU discussions.

Decent chance folks were mapping their understanding of the Intel iMac from the last 6-9 years onto this.

An Apple GPU using tile rendering … that is like saying water is wet; they have done that all along, since the Imagination Tech baseline design. TSMC N5 … the whole M1 lineup uses it. Performance per watt as a major feature … a key feature of the M1 Ultra reveal.

[At one time there was a pretty good chance of a large-screen iMac, but it is also extremely likely that it shared SoC packages with the Studio. It highly likely was not the GPU that suppressed that for now: screen costs (e.g. mini-LEDs that didn't scale "cheap"), Studio Display priority, supply chain logistics, and/or market segmentation.]

Running a GPU that presents as a single GPU split over two dies would have been a research project.

It wouldn't be impossible that Apple might release Mac Pro chips as a new series outside of the M family (one that doesn't have to be updated as regularly) - CPU/Neural Engine-focused chips that support modular GPUs and RAM.

Not impossible. Also not impossible that a large meteor will fall out of the sky and hit the Apple campus ring building like a bullseye.

Apple forks off an SoC package and an Apple GPU driver stack that only works in the Mac Pro? Probably not. The telling indicator is that Apple has put little visible effort into any discrete GPU driver stack development (for both their own GPUs and anyone else's) over the last 18 months. Skipping it in 2020 is excusable, as they were busy getting the new OS branch off the ground; skipping it again in 2021 starts to look more like a long-term plan.

The rumors also point to a quad-die M-series solution to scale to a relatively massive GPU. That entirely lines up with Apple's driver work so far: M1 "plain" -> Ultra. Apple spent tens of millions developing the UltraFusion connectivity … pretty good chance they are going to leverage that further in the GPU space going forward.


Designing and manufacturing these sorts of custom parts for a small portion of the Mac market would no doubt be expensive, but given the current price points of the Mac Pro it wouldn't be a far leap. Doing so would free Apple to ramp up core counts on both the CPU and GPU without having to make the chip die impossibly large.

Modern 2.5D and 3D chip packaging means Apple does not need to go past 500-650 mm² dies to scale. Apple may shrink the equivalent of the Ultra into a single die (and just do a double Ultra).

Decoupling the CPU and GPU would mean Apple would need a new software stack that they don't have now.

In most cases, the bulk of the Mac Pro's cost is not the CPU package.

8-core system: $5,999. W-3225: $1,199 -> 20%
16-core system: $7,999. W-3245: $1,999 -> 25%

Yes, the 24- and 28-core packages have both Intel's "> 1TB RAM" tax and Apple's "pile on top" tax on them, but they aren't the bulk of Mac Pro sales.

Furthermore, Apple is already charging $2,000 to go from a full M1 Max to a full Ultra.
If we're talking a die with 3-6x lower volume, then the Apple markup is going to be about as many multiples of that $2K.


Apple should enable 3rd-party "compute" GPGPUs, but the market for a mainstream GPU detached from the Studio SoC approach is going to be quite small.
 

mattspace

macrumors 68040
Jun 5, 2013
3,185
2,879
Australia
What has the notch to do with the future GPU?

Back on topic: the Mac Pro will be the most interesting Mac in the transition, as SoCs are not a perfect fit for such workstations. Throwing multiple Ultras at the problem might solve it, but will people want to pay for CPU when it is the GPU they use (or vice versa)? Hopefully WWDC will give us a clue.

The notch topic came up to refute an opinion, stated as an article of faith, that modern MacBooks have the best displays ever. The counterpoint is that they have trashfire displays, and Apple are too cowardly to include a universal switch to put the menubar below the notch on a global basis, because it might reveal that their decorative choice to prioritise equal bezels isn't actually beloved by users.

Just like they said people loved the butterfly keyboard, because so many people bought butterfly-keyboard MacBooks. Tolerating is not the same as liking, and buying because it's the only option is not the same as choosing.

But yes, back to the Mac Pro: ANY design that lacks a user-upgradable discrete (display) GPU, separate from the processor, is just a retread of the 2013. We know why it failed in the market as a workstation (not counting the majority of sales, which were for running them headless in server racks - that's not a *workstation* role, that's accidentally making a server): machines that exist on a ~3-4 year minimum turnaround were incapable of competing in a world where GPUs are on a 12-month turnaround. We know this to be the truth of the matter because Apple literally offered 12-month refreshes of GPUs as "upgrade your Mac" marketed products. The availability of major system component upgrades was made clear to Apple as table stakes for this market.

Apple's BEST AS processor is handily spanked, on Apple's own graphics stack, by midrange GPUs that cost a third of the delta for the mid-to-top-end Apple GPU upgrade - and they can do that spanking while running in a 12-year-old system.

It takes a very special delusion to look at that and conclude that Apple is going to produce an SoC with a GPU that will, in some magical way, completely rewrite the current reality, because the integration, the super fast communication, the shared memory - none of it counts for s^&t given how badly Apple GPUs perform on Apple's own graphics stack.

It makes no sense to assume that because Macs using evolved iPad processors have integrated-only graphics, workstations will follow the same path. Macs shipped for years with Intel processors that had integrated graphics; that didn't stop Apple also making machines with discrete GPUs.

The Mac Pro always used a different processor from the consumer Macs; there's no reason for that paradigm to change. The Mac Studio is a beefed-up Mac Mini, and the Mac Mini used integrated graphics, so it makes perfect sense from that perspective. But it also makes no sense to project from that that the Mac Pro will follow its architectural paradigm.

Yes, there are no AMD drivers for Apple Silicon. There are also (according to AMD's site) no AMD drivers for ANY ARM systems, so this perhaps has less to do with Apple's strategies and more to do with AMD not having gotten around to making the reference drivers yet - a far simpler and more likely explanation than the Kremlinology that because there's no eGPU support for AMD on AS, there'll therefore be no AMD ever.

More likely, Apple is just embarrassed as f^&k that AMD are giving them a low priority, which forced them to launch a product transition (a generation before it was intended, thanks to Intel shortages) with no drivers for discrete GPUs, and they are therefore trying to sour-grapes things by puffing up the marketing of their anaemic graphics capabilities, and trying to foster a belief in the developer community that "there'll never be discrete graphics, so you'd better go all-in on optimising for ASGPU".
 
Last edited:

goMac

Contributor
Apr 15, 2004
7,662
1,694
The telling indicator is that Apple has put little visible effort into any discrete GPU driver stack development (for both their own GPUs and anyone else's) over the last 18 months.

Hmmmm? It's all one driver stack. There is no discrete vs integrated driver stack. It's all the same.

There are a few issues. Apple family GPUs promise a single address space; however, they don't necessarily promise unified memory. Such a change would be solvable - AMD has been doing a lot of work with single-address-space discrete cards - and Metal does allow an Apple family GPU to say it does not support unified memory.

Unified memory in Metal also doesn't imply a single address space: Intel GPUs claim unified memory even though they have a separate address space.
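
A quick way to see that distinction from the API side - a minimal Swift sketch using real MTLDevice properties (which flags each GPU reports will of course vary by machine):

Code:
import Metal

// Enumerate every GPU macOS can see and print what Metal claims about it.
// Apple-family support, unified memory, and removability (eGPU) are all
// reported independently, which is the point above.
for device in MTLCopyAllDevices() {
    print(device.name,
          "| Apple7 family:", device.supportsFamily(.apple7),
          "| unified memory:", device.hasUnifiedMemory,
          "| removable (eGPU):", device.isRemovable)
}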
 

deconstruct60

macrumors G5
Mar 10, 2009
12,318
3,906
The notch topic came up to refute an opinion, stated as an article of faith, that modern MacBooks have the best displays ever.
....
Yes, there are no AMD drivers for Apple Silicon. There are also (according to AMD's site) no AMD drivers for ANY ARM systems,


More to unwind out of this general post later ... but for now...


RDNA2 is running on millions of Arm chips now inside of these:


Not particularly credible that AMD played no part at all in that driver development. AMD's driver download site isn't exactly proof of whether driver work has been done or not.
[The implementation Samsung has is likely semi-custom, but it isn't a completely independent architecture either - not off-the-shelf desktop/laptop drivers, but not completely different either.]


Or

https://www.reddit.com/r/Amd/comments/kly8kk

"... the HUAWEI Qingyun W510 computer, built on the Kunpeng 920-3211K platform with 24 cores compatible with Arm version 8.2, running at 2.6 GHz. ...
... The graphics accelerator of the new product is AMD Radeon 520. ..."


so this is perhaps less to do with Apple's strategies, and more to do with AMD having not gotten around to making the reference drivers yet, which is a far simpler and more likely solution than this kremliniology that because there's no eGPU support for AMD on AS, there'll therefore be no AMD ever.

Other folks are shipping (or have shipped; not sure if that Huawei system is still up for sale).

The foundation of eGPU support is a basic PCIe card driver. eGPU just means don't make assumptions that ignore the optional "plug and play" PCIe directives, and encode some "unplug" event handlers (which isn't really the card's issue to handle).
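
(Side note: the hot-plug half of that already has public API on the Metal side. A rough Swift sketch of the observer pattern - an app-level illustration, not a driver:)

Code:
import Metal

// Register for GPU add/remove events - the "plug and play" handling an
// eGPU-aware app needs. The handler fires when a device is attached,
// when an eject is requested, and after removal.
let (devices, observer) = MTLCopyAllDevicesWithObserver { device, notification in
    switch notification {
    case .wasAdded:
        print("GPU attached:", device.name)
    case .removalRequested:
        print("GPU eject requested:", device.name) // release its resources here
    case .wasRemoved:
        print("GPU detached:", device.name)
    default:
        break
    }
}
print("GPUs at launch:", devices.map(\.name))
_ = observer // keep the observer alive for the app's lifetime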


More likely, Apple is just embarrassed as f^&k that AMD are giving them a low priority, which forced them to launch a product transition (a generation before it was intended, thanks to Intel shortages) with no drivers for discrete GPUs, and they are therefore trying to sour-grapes things by puffing up the marketing of their anaemic graphics capabilities, and trying to foster a belief in the developer community that "there'll never be discrete graphics, so you'd better go all-in on optimising for ASGPU".

You really don't have to do much "fostering of belief" with software developers whose heads are not stuck in an ostrich hole.


iMac Intel dGPUs --> M-series iMac, no dGPUs.
High-end MBP 15/16 dGPUs --> M-series MBP, no dGPUs.

The mini (and you can toss the Studio into that camp, if not the iMac one) and the rest of the laptops never had a dGPU. Even before the switch, 55+% of Macs sold didn't have one. It is just way more skewed now.


In terms of product volume, that is millions of target systems gone. The Mac Pro is at least an order of magnitude (if not two) lower than that. That is not going to be a volume base for Mac software for any shop that sells anywhere close to volume.


I doubt Apple is embarrassed. This likely has more to do with the general Apple trend of kicking everyone out of kernel space. That is an openly stated direction by Apple; there is no doubt that is coming. It is the "when", not the "if", that is debatable.
Throw the "seamlessly run native iOS/iPadOS apps" goal on top, and the direction is even clearer.



P.S. RISC-V also has had working AMD GPUs for a while.


"... The open source Radeon drivers have worked fine on the HiFive Unleashed for 2.5 years already and there's no reason they would stop working just because of an updated board. ..."
https://www.reddit.com/r/RISCV/comments/kcu4k3

P.P.S.

In the FAQ here ...

"....
Are AMD GPUs supported ?
Yes, AMD GPUs are supported and patches for open source drivers are provided to make sure they can work with Ampere Altra and Altra Max systems. Currently the following AMD GPUs have been tested:

  • WX5100
  • WX5500
  • W6800
Please note that these AMD GPUs require assisted encoding to balance encoding capacity for streaming.
"
 
Last edited:

exoticSpice

Suspended
Original poster
Jan 9, 2022
1,242
1,951
If anyone needs proof of what @mattspace is saying about 12 year old machines, please check out this post
And? The CPU sucks now. That great GPU is bottlenecked by a piss-poor CPU by today's workstation standards.

If I wanted to do CPU work on a computer, I would not be using a 12-year-old system. I would want a current CPU and a current GPU.

That's why the 7,1 is best for now if you want decent CPU speed.
 

kvic

macrumors 6502a
Sep 10, 2015
516
459
People forgot the recent drama over the Radeon performance drop in Monterey 12.3 (?), which got fixed in 12.3.1 (?).

You have to pause a minute and think through why that happened. To me, apparently some people at the Apple spaceship were changing the Radeon code, and in a non-trivial way. Some bugs slipped through... and the rest is what the public saw and has forgotten by now (even though we are only at v12.4).

You have to wonder why Apple was/is still spending time on the Radeon code - not just RDNA2, but older models based on Polaris as well. That's a good indication to me that the Radeon drivers are still very much being actively worked on.

I feel like the RX 7900 XT is a sure thing on macOS. Perhaps we will see the RX 7600/XT too. Don't hold your breath, though.
 

exoticSpice

Suspended
Original poster
Jan 9, 2022
1,242
1,951
You have to wonder why Apple was/is still spending time on the Radeon code - not just RDNA2, but older models based on Polaris as well. That's a good indication to me that the Radeon drivers are still very much being actively worked on.
It is because Metal support for AMD cards in Blender was added in 12.3.
 

mattspace

macrumors 68040
Jun 5, 2013
3,185
2,879
Australia
RDNA2 is running on millions of Arm chips now inside of these:

When there's an AMD ARM driver for a 6x00, or any of their discrete GPUs, then AMD having an ARM driver while Apple chooses not to release drivers will suggest a strategy. Until that point, the lack of eGPU support is just meaningless chicken entrails.
 

iPadified

macrumors 68000
Apr 25, 2017
1,915
2,114
It takes a very special delusion to look at that and conclude that Apple is going to produce an SoC with a GPU that will, in some magical way, completely rewrite the current reality, because the integration, the super fast communication, the shared memory - none of it counts for s^&t given how badly Apple GPUs perform on Apple's own graphics stack.

It makes no sense to assume that because Macs using evolved iPad processors have integrated-only graphics, workstations will follow the same path. Macs shipped for years with Intel processors that had integrated graphics; that didn't stop Apple also making machines with discrete GPUs.
I fully agree. I am a bit pessimistic about Apple catering to third-party GPUs - they literally got burned by AMD with very hot-running GPUs in iMacs and laptops. However, the Mac Pro as a market segment is very small, and a dedicated Mac Pro chip with user choice of CPU and GPU core counts, independent of each other, seems like economic stupidity. This is a difficult equation to solve, and therefore the most interesting Mac to observe.
 
  • Like
Reactions: prefuse07