macOS 10.15 vs Windows 10: Apple is losing ground
After a few discussions and my previous blog post, I've noticed that I've changed my tenor on macOS, by far my most loved Apple product. I used to profess, without question, that macOS was the superior OS* (aside from gaming). Unfortunately, for the sake of brevity, this post is going to feature some serious nerdage without a lot of explanation.
Now, that asterisk has expanded to include a lot more. Take the following: I cannot properly stream a 4K movie (1), in surround sound (2), in HDR (3), on an Nvidia GPU (4), or be sure it's using hardware decoding (5).
Let me break that down:
- macOS currently does not support the latest DRM that Netflix and most other DRMed streaming services use, even though macOS is perfectly capable of playing back 4K media. Windows can play back 4K Netflix in Edge and in the Netflix app, and Vudu and UltraFlix also support 4K playback on Windows. Windows users face the same issue as Mac users with Amazon Prime and Hulu.
- macOS still does not have any multichannel decoding for popular consumer codecs from Dolby or DTS. Even Apple's own touted AAC supports multichannel audio, yet macOS has no ability to decode it to multichannel outputs. macOS can route multichannel audio via CoreAudio using prosumer/professional hardware (see the CoreAudio sketch after this list), but to this day no software front end exists to route multichannel audio to analog outputs. It can, however, pass pre-existing bitstreams via S/PDIF and HDMI in a few applications, most notably VLC, and notably not in any Apple software.
- 10-bit support in macOS is completely bonkers. My MacBook Pro 15-inch 2017 has a Radeon Pro 555X and reports 10-bit on the internal display, but the internal display is not 10-bit; it's 8-bit. My Mac Pro 2010, with a Vega 56 connected to a BenQ PD3220U (a true 10-bit panel, not FRC), also reports 10-bit, but it's unclear when/if 10-bit is actually being piped to the display, as I can't seem to read the bitstream info (see the display-depth sketch after this list). macOS does not support any consumer HDR formats regardless. Windows 10 will automatically trigger the monitor's HDR color profile in games when it detects an HDR10 bitstream; macOS never even does this. By visual tests, it appears my MacBook will output 10-bit to the monitor. The Vega 56 will not, but reports otherwise.
- Apple, as of 10.14, has actively blocked Nvidia driver support for any non-Kepler GPU. Want an RTX 2080 for your Mac Pro 2019? It'll only work in Windows. The spat has drawn widespread media coverage outside the Mac universe; theories range from eGPU support to Nvidia and Apple stalemating over CUDA. Whatever the case, Mac users are severely handicapped in GPU choice.
- Apple no longer officially supports hardware decoding for certain codecs on computers without a T2 chipset. My MacBook Pro 2015 on Catalina now uses the T2 chipset to assist with H.264/H.265; my Mac Pro, despite having much better hardware without the power-consumption requirements, does not. It is unclear whether the Mac Pro 2019 can use its $5000 GPU options to assist with these codecs (see the VideoToolbox check after this list).
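On the audio point, here's a minimal sketch of my own (Swift, against CoreAudio's C API, not Apple sample code) that asks how many output channels the default output device exposes. It shows that the plumbing for multichannel output exists; it says nothing about whether any app will ever decode Dolby/DTS into those channels.

```swift
import CoreAudio

// Count the output channels CoreAudio exposes on the default output device.
// The plumbing for multichannel output is there; a decoder feeding it is not.
func defaultOutputChannelCount() -> Int {
    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultOutputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)
    guard AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                     &address, 0, nil, &size, &deviceID) == noErr else { return 0 }

    // The stream configuration is a variable-length AudioBufferList.
    address.mSelector = kAudioDevicePropertyStreamConfiguration
    address.mScope = kAudioDevicePropertyScopeOutput
    guard AudioObjectGetPropertyDataSize(deviceID, &address, 0, nil, &size) == noErr else { return 0 }

    let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(size),
                                               alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, raw) == noErr else { return 0 }

    let list = UnsafeMutableAudioBufferListPointer(raw.assumingMemoryBound(to: AudioBufferList.self))
    return list.reduce(0) { $0 + Int($1.mNumberChannels) }  // e.g. 8 on a 7.1 interface
}

print("Default output device exposes \(defaultOutputChannelCount()) channels")
```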
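For the display side, macOS will happily tell you what it *thinks* each screen is doing. This little sketch (again my own, AppKit-only, macOS 10.15+) prints the reported bits per sample and the potential EDR headroom per display; as described above, the reported value can disagree with what the panel actually receives, so treat it as a starting point, not ground truth.

```swift
import AppKit

// Print what macOS claims about each attached display. As noted above,
// this self-reporting can disagree with what is actually sent to the panel.
for screen in NSScreen.screens {
    let bitsPerSample = NSBitsPerSampleFromDepth(screen.depth)
    // A value > 1.0 means macOS thinks this display has EDR headroom.
    let edr = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    print("\(screen.localizedName): \(bitsPerSample) bits/sample reported, potential EDR \(edr)")
}
```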
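And on hardware decoding, VideoToolbox will at least report whether a hardware decoder exists for a given codec (macOS 10.13+). A minimal sketch; note it reports availability on the current machine, not whether a particular app, codec profile, or DRM path will actually use it.

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox whether a hardware decoder is available for each codec.
// Availability != usage: apps can still silently fall back to software.
let codecs: [(name: String, type: CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC",  kCMVideoCodecType_HEVC),
]
for codec in codecs {
    let supported = VTIsHardwareDecodeSupported(codec.type)
    print("\(codec.name) hardware decode: \(supported ? "yes" : "no")")
}
```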
I do not like Windows 10's UX (why does it still have two sets of control panels!?) or its built-in advertising, but you gotta hand it to MS: I can run some programs written for Windows 98 using compatibility mode, run any 32-bit app, and it ships with pretty good threat detection in Windows Defender.
To be fair, Apple did something MS never had to: clear two nearly impossible hurdles with the Macintosh platform. The first was switching from Mac OS 9 to *nix; the second was jumping from PPC to x86. This led to a few less-than-ideal solutions like Carbon, which eased the transition from Mac OS 9 to Mac OS X. That said, x86 Cocoa apps (Cocoa being Carbon's successor) from the 10.6 era often do not work in macOS 10.15, and Apple does not offer a compatibility mode. Apple also arbitrarily dropped 32-bit binary support in 10.15, meaning old apps are even less likely to work. This is not a confidence builder for a platform's robustness.
As far as threat detection goes, Apple's "security" almost entirely comes from aggressively sandboxing the user in 10.15 and over-relying on signed code, which ironically puts power users at greater risk, as they're the most likely to disable System Integrity Protection. It's evidenced by Malwarebytes' report that Mac malware detections outpaced Windows threats for the first time in 2019. The T2 chipset has some interesting features and eases things like disk encryption, but it is wantonly silly against the most common threat vectors and gives less-educated users a false sense of security. Apple has gone to excessive lengths recently to iOS-ify its security by deprecating services like kexts, but it still has yet to offer threat detection. The Mac Pro 2019 cannot use 3rd-party SSDs in its two factory SSD slots because of the T2, despite plenty of M.2-to-Apple-NGFF adapters existing. If anything, the T2 chipset feels like kneecapping macOS with a self-destruct button it can hit at any time, locking users out of their own hardware.
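For a sense of what "over-reliance on signed code" means in practice, here's a minimal sketch of my own using the Security framework to run the kind of static signature validation the system leans on. The path is just an example, and `isValidlySigned` is my name, not an Apple API.

```swift
import Foundation
import Security

// Statically validate a bundle's code signature, the check much of macOS's
// security model hangs on. A valid signature tells you who signed the code;
// it says nothing about whether the code is actually trustworthy.
func isValidlySigned(atPath path: String) -> Bool {
    var staticCode: SecStaticCode?
    let url = URL(fileURLWithPath: path) as CFURL
    guard SecStaticCodeCreateWithPath(url, [], &staticCode) == errSecSuccess,
          let code = staticCode else { return false }
    return SecStaticCodeCheckValidity(code, [], nil) == errSecSuccess
}

print(isValidlySigned(atPath: "/Applications/Safari.app"))  // example path
```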
Lastly comes the most controversial statement: macOS is not the fastest OS. You can take your pick of benchmarking site, but macOS on the exact same hardware is routinely bested by Windows (and, of course, Ubuntu). Comparing benchmarks is tough, as many tasks aren't fully cross-platform, and some are heavily slanted towards a single platform (for example, Java performs drastically better on Linux than on Windows or macOS). That said, all things considered, Windows generally performs better than macOS by a noticeable margin, whether through better-supported technologies like CUDA and DirectX, better ports, or, sadly, many native OS operations.
The last great push forward feels like 10.9 Mavericks, when Apple introduced timer coalescing, memory compression, App Nap, 4K/Retina support for all (10.9.3), a maximum RAM increase beyond 96 GB, IP over Thunderbolt (networking between two Thunderbolt-equipped Macs), the ability to place and receive FaceTime calls, and a notification system (finally putting Growl to rest). Compare that against Windows 8.1 (both OSes released in 2013), which was almost entirely damage control. I used to confidently say OS X was the best, but in the era of macOS my answer is far more pragmatic: "it depends".