
>Windows is the only OS that does scaling properly. macOS just pretends non-'retina' displays don't exist

Not true. I use a high-DPI (~250 PPI) MacBook with a non-high-DPI (~100 PPI) external monitor [0], and the transition between the two is seamless. Application windows are identically sized when dragged from one screen to the other. The same holds true when I use the laptop with a mid-DPI (~150 PPI) monitor.

I could not say the same was true a few years ago when I tried a high-DPI Windows 10 laptop with a non-high-DPI external monitor; it looked something like this [1]. Perhaps this has since been fixed.

macOS is able to achieve consistent sizing across displays irrespective of pixel density because it uses a compositor to render the whole screen at a high resolution and, if necessary, downsamples it proportionally for each screen. (Wayland on Linux can do the same, though it's certainly a much bigger headache to get working consistently than on macOS.) When I tried using Windows 10 at two DPIs simultaneously, it just let me scale the font size and other UI elements on a per-screen basis, not the screen as a whole; I assume it does not composite and rescale the whole screen in this way.
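Roughly, as a toy Python sketch (made-up numbers and function names, not Apple's actual pipeline):

  # Hypothetical sketch: render the scene once at a high backing scale,
  # then resample per display so windows keep one apparent size.
  def output_scale(framebuffer_scale, display_scale):
      # Factor the compositor resamples by for a given display.
      return display_scale / framebuffer_scale

  print(output_scale(2.0, 2.0))  # built-in ~250 PPI panel: 1.0, no resampling
  print(output_scale(2.0, 1.0))  # ~100 PPI external: 0.5, downsampled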

[0] Not my setup, but here is someone doing just that with a 30" 2560x1600 (~100 PPI) display and a ~250 PPI MacBook: https://www.reddit.com/r/macsetups/comments/tfbpid/my_macboo...

[1] Again, not my setup, but the Windows UI is rendered at different sizes on displays of different resolutions: https://www.reddit.com/r/computers/comments/16y1dux/how_do_i...


I have a long blog post stewing here. I'll give you the gist.

Moving windows between monitors of different pixel densities is a rather difficult problem. Windows handles pixel density per-application, not globally, and it uses something called device-independent pixels (DIPs) for scaling. macOS and every desktop environment I've tried on Linux do scaling globally, or at least globally per-display.
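For concreteness, a Windows DIP is defined as 1/96 inch, so an element's size in physical pixels depends on the display's effective DPI:

  # A Windows device-independent pixel (DIP) is 1/96 inch.
  def dips_to_pixels(dips, effective_dpi):
      return dips * effective_dpi / 96.0

  print(dips_to_pixels(100, 96))   # 100.0 px at 100% scaling
  print(dips_to_pixels(100, 144))  # 150.0 px at 150% scaling
  print(dips_to_pixels(100, 192))  # 200.0 px at 200% scaling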

On Windows, when a window is moved across two displays with different scaling factors, a simple algorithm is used: the display containing the greater fraction of the window is chosen, the DIP scale of that display is selected, and the window is rendered, composed, and rasterised at that density; hence one part of the window may appear too small or too large on the other display.
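A toy sketch of that 'majority display' rule (hypothetical rectangles and DPIs; Windows' real heuristics are more involved):

  # Rasterise once, at the DPI of whichever display contains the
  # largest share of the window's area.
  def overlap_area(a, b):
      # a, b: (left, top, right, bottom) rectangles
      w = max(0, min(a[2], b[2]) - max(a[0], b[0]))
      h = max(0, min(a[3], b[3]) - max(a[1], b[1]))
      return w * h

  def rendering_dpi(window, displays):
      # displays: list of (rect, dpi) pairs
      rect, dpi = max(displays, key=lambda d: overlap_area(window, d[0]))
      return dpi

  displays = [((0, 0, 1920, 1080), 96), ((1920, 0, 3840, 1080), 192)]
  print(rendering_dpi((1800, 0, 2400, 400), displays))  # 192: most of the window is on the right display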

On the other hand, macOS, GNOME, and KDE take the easy (but IMO very lazy) way out: they rasterise the entire application window at the pixel density of whichever display contains the greater fraction of the window, copy that framebuffer to the viewport of the other displays, scale it with some filtering algorithm, and then compose, leading to blurring on at least one display. I am happy to bet that you're just not noticing the early rasterisation and filtered scaling going on. Having used all three OSes across a variety of monitors, I am extremely particular about blurry text; enough that I will stop using a setup if it doesn't satisfy me (it's why I stopped using Linux on my personal system).
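You can see why filtered scaling has to blur text with a toy 1-D example: resample a crisp one-pixel line with linear filtering and it smears.

  # Downscale a crisp 1px black line (0.0) on white (1.0) by 0.75
  # with linear filtering; the line smears into grey.
  def linear_resample(src, factor):
      out = []
      for i in range(round(len(src) * factor)):
          x = i / factor                     # sample position in the source
          j = min(int(x), len(src) - 2)
          t = x - j
          out.append(src[j] * (1 - t) + src[j + 1] * t)
      return out

  row = [1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0]
  print(linear_resample(row, 0.75))  # the single black pixel becomes two grey ones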

I'll concede neither is good enough. The real solution (sketched in code after the list) is:

  1. Render the application to as many viewports as there are displays that the application window is in, with the appropriate DIP for each display's scale factor
  2. Compose the application viewports into each display's viewport depending on the apparent window position
  3. The above will automatically clip away the fraction of the window that is outside each display
  4. Rasterise the composed viewport for each display
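A hypothetical sketch of steps 1-3 (step 4 would rasterise each result); all names and coordinates are made up:

  # Lay the window out once in DIPs, then render it separately for each
  # display it touches, at that display's scale; clipping falls out of
  # the viewport intersection.
  def viewports(win_dips, displays):
      # win_dips: (left, top, right, bottom) in device-independent pixels
      # displays: dicts with a DIP-space 'rect' and a 'scale' factor
      out = []
      for d in displays:
          l = max(win_dips[0], d["rect"][0]); t = max(win_dips[1], d["rect"][1])
          r = min(win_dips[2], d["rect"][2]); b = min(win_dips[3], d["rect"][3])
          if r > l and b > t:               # window overlaps this display
              s = d["scale"]
              out.append({"clip_dips": (l, t, r, b),
                          "render_px": ((r - l) * s, (b - t) * s)})
      return out

  two = [{"rect": (0, 0, 1920, 1080), "scale": 2.0},     # high-DPI display
         {"rect": (1920, 0, 3840, 1080), "scale": 1.0}]  # low-DPI display
  print(viewports((1820, 100, 2120, 400), two))
  # The straddling window is rendered twice: 200x600 px and 200x300 px.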
Another concession: I personally prefer pixel-perfect rendering over consistent visual size, and I hardly ever use windows spanning multiple displays (especially displays of different pixel density), so Windows' behaviour is less of a problem for me.

My bigger issue is other desktop environments not supporting subpixel anti-aliasing, not supporting 'fractional' scaling (macOS is by far the biggest offender), and producing edge artifacts from bad clipping. I have a few photos of KDE where random pixels light up at the bottom of my secondary display, with my laptop below it.


You may be right about the correct solution, but nobody actually uses any app long term with it straddling two displays, so the actual impact of this is not huge.

And DIPs have their own problems, which I first encountered with WPF: rendering an application at a DPI that isn't a neat multiple of what it was designed for means that lines and features don't necessarily align with the pixel grid.

Depending on how the app chooses to handle this, it either causes blurriness or uneven, shifting line widths as you move the window.
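A quick illustration with made-up numbers: at 125% scaling, hairlines spaced 1 DIP apart land on fractional pixel positions.

  # An app can snap fractional positions to the grid (uneven spacing)
  # or draw them fractionally (blur). Neither is pretty.
  scale = 1.25
  snapped = [int(y * scale + 0.5) for y in range(5)]    # conventional rounding
  print(snapped)                                        # [0, 1, 3, 4, 5]
  print([b - a for a, b in zip(snapped, snapped[1:])])  # gaps: [1, 2, 1, 1]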


>I am happy to bet that you're just not noticing the early rasterisation and filtered scaling going on

macOS renders content on my 100 PPI monitor at exactly 100 DPI: 1:1, no scaling, so everything looks crisp at the pixel level. The scaling only happens on high-DPI displays (I think the cutoff is around 150-200 PPI), and for me at least, ~250 PPI is more than dense enough that I can't make out individual pixels, so there are no visible aliasing artifacts. Since you like pixel-perfect rendering even at very high resolutions, perhaps you have superhuman vision. My eyes are decidedly average. :-)

>I hardly ever use windows spanning multiple displays

Me neither. My issue is that the windows are rendered at different sizes even when they're not spanning both displays: if I dragged the window in the example photo upwards to sit entirely on the top display, it would stay huge, whereas if I dragged it downwards to sit entirely on the bottom display, it would stay small.


> Since you like pixel-perfect rendering even at very high resolutions, perhaps you have superhuman vision.

I'm just annoyingly particular about this. It's why I accept a framerate hit in video games and don't use upscalers like DLSS, and why I intend to swap my 3840 × 2160, 600 × 340 mm monitor for a 5120 × 2880 one of the same physical size. Some really nice ones were demonstrated at CES a fortnight ago.

> if I dragged the window in the example photo upwards to sit entirely on the top display, it would stay huge, whereas if I dragged it downwards to sit entirely on the bottom display, it would stay small.

This is not the behaviour I see. The window, upon occupying the larger percentage of a display, 'snaps' to the DIP of that display.


You get that that’s worse though, right?

Windows renders the window once at a single DIP resolution. The other side of the window appears either too big or too small.

macOS renders the window once at a single DIP resolution. Then, the other half of the window is upscaled or downscaled for the other screen. It’s going out of its way to make it consistent; Windows doesn’t bother.

Your worries about blurry text go away when you use nearest neighbor upscaling (this is configurable in the macOS zoom settings). Nice crisp text, at the right size.


> Your worries about blurry text go away when you use nearest neighbor upscaling (this is configurable in the macOS zoom settings). Nice crisp text, at the right size.

macOS does not do nearest-neighbour when scaling for pixel density. It especially does not do nearest-neighbour when the 'looks like' resolution of a display is not a nice divisor of the physical resolution. As the grandparent commenter said, macOS renders to a fixed framebuffer. The size of this framebuffer depends on the pixel density of the physical display: at Apple-blessed densities of ≥ 79 px/cm (roughly 200 PPI), the framebuffer is four times the 'looks like' resolution (twice in each dimension); below this, it is the same resolution.

After this rendering, macOS applies filtered scaling to fit the framebuffer to the physical resolution. Upscaling leads to blurry text and UI; downscaling causes ringing artifacts [1].

[1]: https://www.reddit.com/r/mac/comments/12j14ud/macos_vs_windo...
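Numerically, the pipeline described above is roughly this hypothetical sketch (the 1512×982 and 1800×1169 'looks like' modes are the ones macOS offers on a 3024×1964 14-inch MacBook Pro):

  # Render at 2x the 'looks like' resolution on Retina-class panels,
  # then filter-scale the framebuffer to the physical resolution.
  RETINA_THRESHOLD = 79  # px/cm, roughly 200 PPI

  def pipeline(looks_like, physical, px_per_cm):
      backing = 2 if px_per_cm >= RETINA_THRESHOLD else 1
      fb = (looks_like[0] * backing, looks_like[1] * backing)
      return fb, physical[0] / fb[0]  # scale != 1.0 means a filtered resample

  # 14" MacBook Pro panel: 3024x1964 physical, ~100 px/cm (~254 PPI)
  print(pipeline((1512, 982), (3024, 1964), 100))   # ((3024, 1964), 1.0): crisp
  print(pipeline((1800, 1169), (3024, 1964), 100))  # ((3600, 2338), 0.84): resampled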

I concede that Windows' implementation is simpler, but I will argue that practically it doesn't matter because basically no one I know uses an application window across multiple displays.

My very strong opinion: text/vector UI should never be raster-scaled to fit varying pixel densities.


They might be referring to Apple removing subpixel antialiasing around 2018. It caused some consternation at the time because there were still plenty of non-retina MacBook Airs in service. While it technically works, macOS really is not meant for non-high-DPI displays.


>macOS is able to achieve consistent sizing across displays irrespective of pixel density because it . . . downsamples it proportionally for each screen

For me, the most important consideration is to completely avoid downsampling, because it makes everything blurry.

On both macOS and Linux, the way I do that is to scale the UI by an integral factor (usually 200% in my case) and then, since 200% makes things a little bigger than I prefer, fine-tune the apps in which I spend most of my time (namely Emacs and my browser).

Specifically, on Linux, Emacs relies on GTK to draw its window, which IIUC cannot do fractional scaling. If I set a fractional scaling factor in GNOME Settings, Emacs is blurry, whereas there is no blurriness when I set an integral scaling factor in GNOME and use something like (set-face-attribute 'default nil :height 90) to adjust the size of text in Emacs.



