
Minga089

macrumors regular
Jun 26, 2020
122
99
München, Bayern
So, this doesn’t work anymore? https://github.com/xzhih/one-key-hidpi

What this does is make macOS render the UI at 2x the selected resolution and then scale it down to your display. This gives you sharper and more legible text at the expense of screen real estate, because text and UI elements are bigger.
I am still on Catalina but if that doesn't work anymore I won't be upgrading, since this is the only way to get a crisp picture with my 2K monitor in macOS.
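A quick way to check whether a HiDPI mode is actually active (this assumes the "UI Looks like" field that recent macOS versions print here; older versions may word it differently):

Code:
system_profiler SPDisplaysDataType | grep -iE "resolution|looks like"

If the "UI Looks like" value is half the rendered resolution, the mode is HiDPI.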
 

StellarVixen

macrumors 68040
Mar 1, 2018
3,180
5,653
Somewhere between 0 and 1
Just updated, 1080p native looks horrible.

Luckily, this works for those who want to run HiDPI: https://github.com/mlch911/one-key-hidpi

Result:

[attached screenshot: Screen Shot 2020-11-15 at 10.19.41 PM.png]
 
  • Like
Reactions: akke4323

fabdub

macrumors newbie
Nov 6, 2020
28
11
Code:
sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool true

This doesn't do anything on my new M1 Air... I'm stuck with an ugly work monitor right now...
Has anyone else found a way to enable HiDPI on their new M1?

and this outputs nothing:
Code:
ioreg -lw0 | grep -i "IODisplayEDID"

(Yes I tried with SIP disabled)
 

joevt

Contributor
Jun 21, 2012
6,701
4,089
Code:
sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool true

This doesn't do anything on my new M1 Air... I'm stuck with an ugly work monitor right now...
Has anyone else found a way to enable HiDPI on their new M1?

and this outputs nothing:
Code:
ioreg -lw0 | grep -i "IODisplayEDID"

(Yes I tried with SIP disabled)
M1 doesn't store EDID in ioreg (uses different hardware/software for graphics than Intel Macs).

What is the resolution of your work monitor?

Can you add a scaled resolution with SwitchResX? (There's a beta 11, but I don't know how well it works with M1 Macs; it should at least let you create custom resolutions and timings.)
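Another thing worth checking (just a suggestion, and I haven't verified it on an M1 Air): the third-party displayplacer CLI, installable via Homebrew, lists every mode macOS exposes per display, including whether a mode is a scaled/HiDPI one (if I remember its output right, those are marked "scaling:on"):

Code:
displayplacer list

If no scaled modes show up for the external display, the HiDPI modes aren't being generated at all, regardless of any defaults setting.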
 

fabdub

macrumors newbie
Nov 6, 2020
28
11
M1 doesn't store EDID in ioreg (uses different hardware/software for graphics than Intel Macs).

What is the resolution of your work monitor?

Can you add a scaled resolution with SwitchResX (there's a beta 11 but I don't know how well it works with M1 Macs - it should at least let you create custom resolutions and timings).
3440 x 1440 on 32".
I used to work sometimes with a pure 2x mode at 1720x720, but that was HUGE.
So I had a custom 1580x1080. Anyway, I can use those resolutions, but they are NOT HiDPI.
Nobody seems to have found a solution yet. Even the SwitchResX creator wrote to me:
"Yes this is broken in macOS on M1/ARM Macs. No I don't know of any other replacement."
 

joevt

Contributor
Jun 21, 2012
6,701
4,089
1580x1080
So you can't make a 3160x2160 scaled resolution? I believe people have created overrides on M1 Macs that can at least change the display name, which means the override is being read. Or am I misremembering that?
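For reference, the overrides those scripts generate are plist files under /Library/Displays/Contents/Resources/Overrides, keyed by the display's vendor and product ID. A rough sketch (the IDs below are placeholders for a hypothetical display, and the scale-resolutions entry encodes 2048x1152 as two big-endian 32-bit integers, if I remember the format correctly):

Code:
# placeholder vendor/product IDs - use the ones for your display
sudo mkdir -p /Library/Displays/Contents/Resources/Overrides/DisplayVendorID-10ac
sudo tee /Library/Displays/Contents/Resources/Overrides/DisplayVendorID-10ac/DisplayProductID-a0c4 <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>DisplayProductName</key>
    <string>Test Override (HiDPI)</string>
    <key>DisplayVendorID</key>
    <integer>4268</integer>
    <key>DisplayProductID</key>
    <integer>41156</integer>
    <key>scale-resolutions</key>
    <array>
        <data>AAAIAAAABIA=</data>
    </array>
</dict>
</plist>
EOF

If only the display name changes after a reboot, the override is being read and the problem is specifically with generating the HiDPI modes.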
 

fabdub

macrumors newbie
Nov 6, 2020
28
11
So you can't make a 3160x2160 scaled resolution? I believe people have created overrides on M1 Macs that can at least change the display name which means the override is being read, or am I mis-remembering that?
We used to be able to activate HiDPI resolutions using 2x pixels or even 1.5x; whatever we wanted, we could specify it as a HiDPI option. Now the command to activate those possibilities is useless on M1 Macs.

Code:
sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool true
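You can at least confirm the key is actually being written (it just seems to be ignored by the WindowServer on M1):

Code:
defaults read /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled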
 

landale

macrumors newbie
Apr 8, 2020
4
7
Ugh, I wish I had seen this thread before I went out and bought a Lenovo 27" QHD monitor. I had been test driving a Samsung 32" UHD monitor, but it was a VA panel, so everything looked a bit off since I am used to IPS displays. I figured the Lenovo was only QHD, but that was still a good resolution for me; now I realize I can't run it at QHD without horrible-looking text. So ridiculous, since all my other devices look great on it. I guess the short answer is that when it comes to external monitors and macOS, you need to buy something 4K or above or it looks like garbage?
 

Jassbag

macrumors newbie
Dec 27, 2017
23
6
Athens
For all those suggesting resolution overrides as a way to use "HiDPI", do I get this correctly?

For example, on a 1440p monitor, running at a HiDPI resolution of 1152p, you are suggesting that the GPU renders 60% more pixels for 20% less UI space, just for the UI and text "zoom". Is that correct?

Am I the only one that finds this crazy?

UI scaling is not supposed to work like that. On Retina MacBook Pros, for example, the default scaled mode (which is not native) offers:

12.5% more pixels for 12.5% more space

What you suggest is the complete opposite and a total waste of system resources.
 

joevt

Contributor
Jun 21, 2012
6,701
4,089
Where do your 60%, 20%, and 12.5% numbers come from?


pixels to render for 1152p HiDPI = 2048x1152 HiDPI = 2359296 HiDPI pixels, rendered at 2x = 4096x2304 = 9437184 pixels
UI pixels = 1440p = 2560x1440 = 3686400

% increase for pixels = (9437184 pixels to render - 3686400 UI pixels) / 3686400 UI pixels * 100% = 156%
% decrease for UI = (3686400 UI pixels - 2359296 HiDPI pixels) / 3686400 UI pixels * 100% = 36%

156% more pixels to render for 36% less UI (plus the additional step to scale down the image). But you get something that looks a little better than 2048x1152 scaled up to 2560x1440.

Maybe you meant UI pixels = 1280x720 HiDPI = 921600 pixels.
It's still 156% more pixels to render (plus the additional step to scale down the image) but now you also get 156% more UI:
% increase for UI = (2359296 UI pixels - 921600 UI pixels) / 921600 UI pixels * 100% = 156%

Retina MacBook Pros do the same thing but they have a 2560x1600 (13 inch) or 2880×1800 (15 inch) or 3072×1920 (16 inch) display.
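If you want to play with the numbers for other modes, the arithmetic is easy to script (a rough sketch in plain shell; the 2x factor and the 2560x1440 panel are just the example above):

Code:
ui_w=2048; ui_h=1152        # HiDPI resolution you pick
panel_w=2560; panel_h=1440  # native panel resolution
render=$(( ui_w * 2 * ui_h * 2 ))   # framebuffer rendered at 2x in each direction
panel=$(( panel_w * panel_h ))
ui=$(( ui_w * ui_h ))
echo "extra pixels rendered vs native: $(( (render - panel) * 100 / panel ))%"
echo "UI space lost vs native:         $(( (panel - ui) * 100 / panel ))%"

which prints 156% and 36% for the 2048x1152 example.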
 
  • Like
Reactions: Minga089

Jassbag

macrumors newbie
Dec 27, 2017
23
6
Athens
I was only counting vertical pixels. In total pixels it's even bigger, but the relationship stays the same.

Retina MacBook Pros with 2560x1600 (13 inch), for example, using your counting, default to a 1440x900 UI with a rendered resolution of 2880x1800 (which gets scaled down to 2560x1600 in an additional step).

In that case, you render ~26.5% more pixels (5184000 pixels) for ~26.5% more UI real estate.

Nowhere near what all these guides suggest, like setting 1152p HiDPI on a 1440p monitor:

In that case, you render ~156% more pixels (9437184 pixels) for ~36% LESS UI real estate.

It's insane to me that this is a suggestion.

All that just to get a bigger UI? With GPU working on overtime plus any downscaling errors? Is text even rendered better this way (downscaled) versus native resolution rendering with proper AA?
 

Minga089

macrumors regular
Jun 26, 2020
122
99
München, Bayern
I was only counting vertical pixels. In total pixels it's even bigger, but the relationship stays the same.

Retina MacBook Pros with 2560x1600 (13 inch), for example, using your counting, default to a 1440x900 UI with a rendered resolution of 2880x1800 (which gets scaled down to 2560x1600 in an additional step).

In that case, you render ~26.5% more pixels (5184000 pixels) for ~26.5% more UI real estate.

Nowhere near what all these guides suggest, like setting 1152p HiDPI on a 1440p monitor:

In that case, you render ~156% more pixels (9437184 pixels) for ~36% LESS UI real estate.

It's insane to me that this is a suggestion.

All that just to get a bigger UI? With GPU working on overtime plus any downscaling errors? Is text even rendered better this way (downscaled) versus native resolution rendering with proper AA?
The text definitely looks a lot better on my 24" monitor at 2048x1152 HiDPI (compared to native 2560x1440); otherwise I wouldn't do it.
 

DeadnAlive

macrumors newbie
May 28, 2020
3
0
I'm planning on buying a Dell 27" monitor with a resolution of 3840 x 2160. That makes it about 163 ppi.
Will this be enough for fonts to be rendered smoothly?
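A quick check of the 163 ppi figure (diagonal pixel count divided by the diagonal size in inches):

Code:
awk 'BEGIN { printf "%.1f ppi\n", sqrt(3840^2 + 2160^2) / 27 }'

which prints about 163.2.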
 

joevt

Contributor
Jun 21, 2012
6,701
4,089
I was only counting vertical pixels. In total pixels it's even bigger, but the relationship stays the same.

Retina MacBook Pros with 2560x1600 (13 inch), for example, using your counting, default to a 1440x900 UI with a rendered resolution of 2880x1800 (which gets scaled down to 2560x1600 in an additional step).

In that case, you render ~26.5% more pixels (5184000 pixels) for ~26.5% more UI real estate.

Nowhere near what all these guides suggest, like setting 1152p HiDPI on a 1440p monitor:

In that case, you render ~156% more pixels (9437184 pixels) for ~36% LESS UI real estate.

It's insane to me that this is a suggestion.

All that just to get a bigger UI? With GPU working on overtime plus any downscaling errors? Is text even rendered better this way (downscaled) versus native resolution rendering with proper AA?
macOS did have an arbitrary scaling factor between 100% and 300%, but Apple decided that doing 200% with an additional scaling step was the best way to go.
#945
 

Jassbag

macrumors newbie
Dec 27, 2017
23
6
Athens
but Apple decided that doing 200% with additional scaling step was the best way to go.
It's good they did that, for developers. That's why right now scaling is only available on a HiDPI screen.

Choosing a HiDPI resolution for a non-HiDPI screen, via hacks, just to get the UI scaling is insane.

Rendering ~156% more pixels to get a 25% bigger font and declaring "The font is now rendered nicely". No, it's just bigger.

That must be the most computationally intensive zoom in the history of history.
 

Minga089

macrumors regular
Jun 26, 2020
122
99
München, Bayern
It's good they did that, for developers. That's why right now scaling is only available on a HiDPI screen.

Choosing a HiDPI resolution for a non-HiDPI screen, via hacks, just to get the UI scaling is insane.

Rendering ~156% more pixels to get a 25% bigger font and declaring "The font is now rendered nicely". No, it's just bigger.

That must be the most computationally intensive zoom in the history of history.
What is your problem? For you it's insane; for other people it's the only way to use a 24" 2K monitor in macOS, because at native resolution it looks horrible.
 

Jassbag

macrumors newbie
Dec 27, 2017
23
6
Athens
Maybe you meant UI pixels = 1280x720 HiDPI = 921600 pixels.
It's still 156% more pixels to render (plus the additional step to scale down the image) but now you also get 156% more UI:
% increase for UI = (2359296 UI pixels - 921600 UI pixels) / 921600 UI pixels * 100% = 156%

By "more pixels", I mean wasted pixels. Pixels that are not mapped to your screen and are wasted.

In the case of 1280x720 HiDPI there are 0% wasted pixels on 1440p monitors. In the case of 1280x800 HiDPI on Retina MacBook Pros (13 inch), there are also 0% wasted pixels. The rest of the "wasted" cases are calculated above.

What is your problem?

My problem? Insufficient data on the Internet.
Your GPU/CPU problem? Generating heat and reducing your battery lifespan by rendering ~156% more (wasted) pixels just to get a 25% bigger (not cleaner) font.

I really wonder: have you correctly adjusted the "sharpness", or whatever your monitor calls its text-clarity setting?

I mean, OK, things are not subpixel-AA rendered anymore (good riddance, old LCDs), but grayscale antialiasing + the universal use of modern fonts + extra monitor-side techniques result in excellently rendered text.
 

Attachments

  • Font rendering 1.jpg
  • Haha
Reactions: Minga089

Minga089

macrumors regular
Jun 26, 2020
122
99
München, Bayern
By "more pixels", I mean wasted pixels. Pixels that are not mapped to your screen and are wasted.

In the case of 1280x720 HiDPI there are 0% wasted pixels in 1440p monitors. In the case of 1280x800 HiDPI on Retina MacBook Pros (13 inch), there are also 0% wasted pixels. The rest of the "wasted" pack is calculated above.



My problem? Insufficient data on the Internet.
Your GPU/CPU problem? Generating heat and reducing your battery lifespan by rendering ~156% more (wasted) pixels just to get a 25% bigger (not cleaner) font.

I really wonder: have you correctly adjusted the "sharpness", or whatever your monitor calls its text-clarity setting?

I mean, OK, things are not subpixel-AA rendered anymore (good riddance, old LCDs), but grayscale antialiasing + the universal use of modern fonts + extra monitor-side techniques result in excellently rendered text.
You clearly have never used a 24" 2K monitor in macOS, otherwise you wouldn't talk like that. The web is full of people having the exact same problem with sub-4K (and especially 24" 2K) monitors in macOS, and it has nothing to do with monitor settings.

What heat? My dGPU draws less power in clamshell mode when I run my monitor at 2048x1152 HiDPI than at native 2560x1440 (5 W compared to 17 W at native resolution). When I use my machine with the lid open it's about the same. The machine barely gets hotter than 70°C, and you talk about reduced battery lifespan? LOL!
 

joevt

Contributor
Jun 21, 2012
6,701
4,089
It's good they did that, for developers. That's why right now scaling is only available on a HiDPI screen.

Choosing a HiDPI resolution for a non-HiDPI screen, via hacks, just to get the UI scaling is insane.
There is no distinction between a HiDPI display and a non-HiDPI display except the dpi.

I have a 30" 2560x1600 screen. If I want to use a 1280x800 mode then I might want the HiDPI version because I have the pixels for it.

My 30" is not HiDPI but a 13" MacBook Pro with 1600p is HiDPI. Why should my 30" be stuck with 1280x800 low resolution while the 13" can have 1280x800 HiDPI? That would be insane.

Rendering ~156% more pixels to get a 25% bigger font and declare "The font is now rendered nicely". No, it's just bigger.
It's not bigger if the pixels are smaller.

That must be the most computationally intensive zoom in the history of history.
I suppose the more pixels you have to render the more computation it takes. In that case, rendering the exact number of pixels for your display is probably less expensive than going over the size of the display and scaling down. However, you can gain some antialiasing (not the subpixel kind) with the scaling down operation (bicubic interpolation or whatever).

By "more pixels", I mean wasted pixels. Pixels that are not mapped to your screen and are wasted.
They're not wasted if their color values are used in the scaling down operation (interpolation).

In the case of 1280x720 HiDPI there are 0% wasted pixels in 1440p monitors. In the case of 1280x800 HiDPI on Retina MacBook Pros (13 inch), there are also 0% wasted pixels. The rest of the "wasted" pack is calculated above.
Yup.
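You can get a rough feel for what the scale-down pass does by downsampling a full-size screenshot yourself (sips is not the same filter the WindowServer uses, so treat this only as an approximation):

Code:
screencapture -x /tmp/frame.png
sips --resampleWidth 2560 /tmp/frame.png --out /tmp/panel.png
open /tmp/panel.png

Single-pixel detail in the original gets averaged into neighbouring output pixels rather than dropped.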
 

Jassbag

macrumors newbie
Dec 27, 2017
23
6
Athens
I might want the HiDPI version because I have the pixels for it

To be precise, you just want the UI scaling that is a byproduct of the HiDPI version. To put it another way: if you could have a UI scaling option, you wouldn't bother with the "fake HiDPI" resolution. Correct?

My 30" is not HiDPI but a 13" MacBook Pro with 1600p is HiDPI. Why should my 30" be stuck with 1280x800 low resolution while the 13" can have 1280x800 HiDPI? That would be insane.

Well of course, if your target is a 1280x800 UI then you have a perfect match with the 1280x800 HiDPI resolution.

Seriously, running a 1280x800 UI on a 30" screen? Seriously??

It's not bigger if the pixels are smaller.

Of course, but are we really talking about shrinking pixels here? Do the pixels of your monitor shrink when you select a HiDPI resolution? :rolleyes::rolleyes: What kind of sorcery modifies the pixel size on my monitor? I want that. If we are really talking about that kind of magic, I rest my case.

They're not wasted if their color values are used in the scaling down operation (interpolation).

However, you can gain some antialiasing (not the subpixel kind) with the scaling down operation (bicubic interpolation or whatever).

They are wasted. There are no additional antialiasing benefits from doing this. In fact, it sometimes introduces errors in text (let's call it fuzziness). Antialiasing is correctly calculated on the native pixel grid of the screen. We are talking about text here, not real-time graphics (which would benefit from this kind of supersampling, at the cost of extra GPU sweat).

Then again, you have the case of bitmaps. To make it simpler to digest, imagine a 2560x1440 wallpaper bitmap. It would fit pixel-perfectly on your 1440p monitor. But in the "fake" HiDPI mode, the bitmap is first upsampled to whatever "fake" HiDPI resolution you choose and then downsampled in the last step to fit your native pixel grid. If you are OK with working with assets that are presented to you after a double pass of "sampling", no worries then. I just pity any "professional" who chooses to work that way.

To sum it up, what you get with these "HiDPI hacks" is simply a workaround for UI scaling that results in bigger text (with wasted rendered pixels + any downscaling errors)*. Is it worth it? Is it worth it more than investing in a better screen with proper "sharpness" (it doesn't have to be high resolution / high DPI)?

*The only "valid/correct" HiDPI hack is running at half the native resolution in HiDPI. But I'm seriously asking: can you run a 1280x800 UI on a 30" screen? Or a 1280x720 UI on a 27" screen?

I specifically asked @StellarVixen in a PM about it (do you really use your computer at 720p?) after seeing this. He refused to answer directly.
 

joevt

Contributor
Jun 21, 2012
6,701
4,089
To be precise, you just want the UI scaling that is a byproduct of the HiDPI version. To put it in other words, if you could have a UI scaling option you wouldn't bother with the "fake HiDPI" resolution. Correct?
If you mean UI scaling like Windows, I think it would be a nice feature to have. Similar to the experimental arbitrary scaling macOS had in Leopard and Snow Leopard.

Seriously, running a 1280x800 UI on a 30" screen? Seriously??
Not really. Someone else maybe. If I need a screenshot of HiDPI, I can choose 1280x800 HiDPI, or I can choose 2560x1600 HiDPI, which looks very much like 2560x1600 on my 2560x1600 display. I could go nuts and do 3272x2045 HiDPI (I think I have an 8K x 4K limit on my old Mac GTX 680).

Of course, but, are we really talking about shrinking pixels here? Pixels of your monitor are shrinking down when you select a HiDPI resolution? :rolleyes::rolleyes: What kind of sorcery modifies the pixel size on my monitor? I want that.. If we are really talking about this kind of magic, I rest my case.
I mean getting a display with more pixels. Or having the display further away.

They are wasted. There are no additional antialiasing benefits from doing this. In fact, it sometimes introduces errors in text (let's call it fuzziness). Antialiasing is correctly calculated on the native pixel grid of the screen. We are talking about text here, not real-time graphics (which would benefit from this kind of supersampling, at the cost of extra GPU sweat).
Antialiasing of text is calculated on the pixel grid of the framebuffer.

In addition, I've looked at some test patterns with alternating color patterns (no antialiasing on the pixel grid of the framebuffer - the patterns were checkerboard, vertical lines, horizontal lines, etc.). When the framebuffer is scaled for output to the display, pixels are averaged or interpolated, which means the information of the extra pixels is not completely thrown away. Black and white vertical lines appear gray after scaling down; this means the black and white pixels were both used in the final image - pixels are not skipped. You lose their exact position, contrast, and separation, but that's unavoidable when you reduce the number of output pixels.

I'm not sure what real-time graphics has to do with scaling - every frame output to the display has the scaling, whether the scaling is done once for each change to the framebuffer, or done for every frame of the refresh rate.

Then again, you have the case of bitmaps. To make it simpler to digest, you can imagine a 2560x1440 wallpaper bitmap. It would fit pixel-perfect on your 1440p monitor. But, in the "fake" HiDPI mode, the bitmap is first upsampled to whatever "fake" HiDPI resolution you choose and then downsampled in the last step to fit your native pixel grid. If you are ok with working with assets that are presented to you after a dual pass of "sampling", no worries then. I just feel pity for any "professional" that chooses to work that way.
I've done a test of this as well using GraphicConverter.app. The upsampling for the HiDPI framebuffer appears to be perfect for a 200%x200% scaling. It doesn't seem to be using a fuzzy scaling like when I use my GPU to scale 1280x800 (not HiDPI) to 2560x1600 (not HiDPI). My display is an Apple 30" Cinema Display - if it receives a 1280x800 signal, then it uses a non-fuzzy perfect scaling up to 2560x1600 so text is blocky instead of fuzzy. The Apple 30" Cinema Display doesn't have the smarts to scale any other input signal.

If you have arbitrary scaling instead of just 200%x200% then you can get upscaling problems since it's no longer perfect. Maybe that's one reason Apple chose 200%.

There is a problem with the alternating black and white patterns in downscaling though. Sometimes they are perfect, and sometimes they are gray. I think they turn gray when the image is drawn starting from an odd pixel. For example, if a display is set to 2560x1600 HiDPI on a 2560x1600 display, and you have a 640x480 image with some single-pixel horizontal lines and some single-pixel vertical lines, and the image is drawn using 1280x960 pixels to the HiDPI framebuffer, then everything is perfect if the image is drawn on an even pixel (0, 2, 4, ...). If x is odd, then the horizontal lines are gray. If y is odd, then the vertical lines are gray. If both x and y are odd, then both the horizontal and vertical lines are gray.

Moving a window in HiDPI mode moves it two pixels at a time so you don't see shimmering that way.
Resizing a window in HiDPI mode resizes it by two pixel increments. If the image is centred in the window, then that's when you see the shimmering (where the two pixel wide black and white lines alternate between being perfect and being gray as you continue changing the size).
Scrolling the window does not always use two-pixel increments (because the increments are a ratio of the image size and window size), so shimmering can occur with horizontal lines when scrolling vertically, or with vertical lines when scrolling horizontally.

With your wallpaper example, the image is always aligned at 0,0 and it's static, so you don't see the animated shimmering that you see with scrolling or resizing. Plus wallpapers usually don't have single-pixel details/objects.
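If anyone wants to reproduce the shimmering test, a one-pixel checkerboard is enough; with ImageMagick installed, pattern:gray50 should give you one (any tool that can draw single-pixel patterns will do):

Code:
convert -size 256x256 pattern:gray50 /tmp/checker.png
open /tmp/checker.png

View it at 100% in a HiDPI mode and resize or scroll the window; when the image lands on an odd framebuffer offset the pattern goes gray, as described above.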

To sum it up, what you get with these "HiDPI hacks" is simply a workaround to UI scaling that results in bigger text (with wasted rendered pixels + any downscaling errors)*. Is it worth it? Is it worth it more than investing in a better screen with proper "sharpness" (it doesn't have to be high resolution / high DPI)?

*The only "valid/correct" HiDPI hack is that of running half the native resolution in HiDPI. But I'm seriously asking, can you run a 1280x800 UI on a 30" screen? Or a 1280x720 UI on a 27" screen?
Depends on what you're doing. I like 1600p low res on my 1600p display. I use 3008p HiDPI on my 4K.

I specifically asked @StellarVixen in a pm about it (Do you really use your computer on 720?), after seeing this. He refused to answer directly.
He said it looks nicer. I'm not going to argue with that.
 
  • Like
Reactions: edubfromktown

Jassbag

macrumors newbie
Dec 27, 2017
23
6
Athens
If you have arbitrary scaling instead of just 200%x200% then you can get upscaling problems since it's no longer perfect. Maybe that's one reason Apple chose 200%.

This is what I've been trying to get across, but some people refuse to digest it.

There is a problem with the alternating black and white patterns in downscaling though. Sometimes they are perfect, and sometimes they are gray. I think they turn gray when the image is drawn starting from an odd pixel. For example, if a display is set to 2560x1600 HiDPI on a 2560x1600 display, and you have a 640x480 image with some single-pixel horizontal lines and some single-pixel vertical lines, and the image is drawn using 1280x960 pixels to the HiDPI framebuffer, then everything is perfect if the image is drawn on an even pixel (0, 2, 4, ...). If x is odd, then the horizontal lines are gray. If y is odd, then the vertical lines are gray. If both x and y are odd, then both the horizontal and vertical lines are gray.

Moving a window in HiDPI mode moves it two pixels at a time so you don't see shimmering that way.
Resizing a window in HiDPI mode resizes it by two pixel increments. If the image is centred in the window, then that's when you see the shimmering (where the two pixel wide black and white lines alternate between being perfect and being gray as you continue changing the size).
Scrolling the window does not always use two-pixel increments (because the increments are a ratio of the image size and window size), so shimmering can occur with horizontal lines when scrolling vertically, or with vertical lines when scrolling horizontally.

This is just one of the many real-world cases where such artifacts show up. A hint if you want to think about other use cases: the web and its rendering engines.

With your wall paper example, the image is always aligned at 0,0 and it's static so you don't see the animated shimmering that you see with scrolling or resizing. Plus wall papers usually don't have single-pixel details/objects.

I used the wallpaper as an example just to make it easier to digest due to its resolution. The keywords here are "usually" and pixel-perfect rendering.

:)
 