If you've been using macOS for a while, you might remember a time when Apple offered GPU options from both ATI (later purchased by AMD) and Nvidia. In fact, the Macintosh was the first platform to sport the GeForce 3 in 2001. Nvidia even made a special chipset, found in the 2008 MacBooks, that delivered better GPU performance and skipped Intel's integrated chipsets.

Then, suddenly, Apple stopped using Nvidia hardware. The last Macs to feature an Nvidia GPU shipped in 2015.

The video version differs slightly, as it includes more personal anecdotes and asides.


AppleInsider isn't my favorite source for Apple news, as it's too evangelical, generally portraying Apple as the protagonist in its reporting. Still, I have to give them credit: they've followed the Apple/Nvidia saga better than any other publication. Their "Apple's management doesn't want Nvidia support in macOS, and that's a bad sign for the Mac Pro" is a great first stop, but it's a bit dated and self-referential. I've tried to piece together the narrative as told by many news reports over the years, much of which I read as it was happening. It's a topic that particularly interests me, as it dates back to when I bought my first Nvidia GPU in 2001, a VisionTek GeForce 3, and used DOS with nvflash.exe to load the Mac firmware onto the GPU. It was a crazy leap of faith: I read a post by some guy on XLR8yourmac.com (once a powerhouse of a website for power users) who claimed to have done it, then reported the steps I used to flash the card back to the community. Over the years I wrote a few popular guides on using Nvidia GPUs on the Mac, and wrote a lot about Mac GPUs as part of my monstrous The Definitive Classic Mac Pro (2006-2012) Upgrade Guide. I don't have any particular insider info, but what I do have is the power of hindsight.

The history of Nvidia and Apple

The first Mac to ship with an Nvidia GPU was the Power Mac G4 Digital Audio in 2001, which came with a GeForce2 MX; at the same time, Apple offered the GeForce 3 as an optional GPU on the same Power Macs.

In 2004, the release of Apple's 30-inch Cinema Display was held up by poor yields of Nvidia's GeForce 6800 Ultra DDL, the card required to drive it; Nvidia wasn't producing the cards in a timely enough fashion for Apple's liking. Still, Apple continued to offer plenty of Nvidia options. As important as Apple was during this time frame, it wasn't the goliath it is today.

The year 2008 is when the relationship with Nvidia changed, during a flurry of events. Apple was pulled into a legal battle that was primarily between Nvidia and Intel. To understand this, we have to jump back to 2004.

In 2004, Intel and Nvidia signed a patent licensing agreement that let Nvidia build chipsets for Intel CPUs; whether it covered Intel CPUs with integrated memory controllers would later become the point of contention. Then in 2008, Nvidia produced single-chip chipsets, the MCP79 and MCP89, that replaced Intel's Northbridge (memory controller) and Southbridge (I/O controller). Apple was the first PC maker to adopt Nvidia's new chipset. The advantage was that Apple could simplify its GPU strategy: it could drop the underwhelming Intel integrated GPUs from its laptops and unify its graphics lineup to mirror its desktops. At the time, Intel's integrated GPUs were pretty bad and could not support OpenCL, limiting how much work Apple could reliably offload to the GPU at the OS level.

Intel was a much more central business partner for Apple, and Intel enjoyed having Apple on its customer roster. Nvidia pulling a fast one on Intel put Apple at the center of Nvidia's controversial strategy.

Predictably, Intel then filed suit against Nvidia, throwing Apple's plans into disarray. Neither company endeared itself to Apple, and the squabble had many industry watchers speculating that Apple might look into AMD processors, even though AMD had very few competitive offerings in the laptop space. Nvidia tried to court Apple into its legal saga but ultimately failed, leaving Nvidia feeling spurned. Apple continued to use Nvidia GPUs, but sadly, its lower-end offerings were constrained to Intel's supremely mediocre integrated GPUs. This wasn't the only issue in Apple's relationship with Nvidia.

Meanwhile, in 2008 Nvidia was hit with a securities lawsuit over knowingly shipping faulty GPUs and trying to mitigate the problem through firmware, burning $196 million on replacements. HP said at the time that 24 of its laptop models were affected, and Dell had 15. Apple had two: the MacBook Pros using the GeForce 8600M GT.

GPUs were failing at a steady clip (not just for Apple), and Apple had to extend its warranties for consumers in 2009 (ending in 2012) and issue a software update in 2009 to try to mitigate the GPU issues. The problem came down to the solder holding the chip to its printed circuit board cracking under thermal stress. This still landed Apple in a class action lawsuit. Nvidia saw Apple as a smaller player and refused to cover support costs beyond an undisclosed amount of money (it handed out only $10,000,000 to Dell, and only after Dell threatened to drop Nvidia), putting another twist in the Apple relationship. By most accounts, this was the dividing moment.

By this point, multiple publications reported a frosty air between Nvidia and Apple, although the high-end MacBook Pros would continue to use Nvidia GPUs.

Apple tried AMD GPUs in the 2011 MacBook Pros and ended up in yet another class action lawsuit over faulty GPUs. Apple switched back to Nvidia for the 2012 MacBook Pros.

2013 marked a substantial shift away from Nvidia. Apple partnered with long-time Nvidia rival AMD to produce custom variants of the Radeon FirePro for the 2013 Mac Pro. The iMac moved to AMD in 2014 with the introduction of the 5K iMac.

If there was any hope of Nvidia and Apple reconciling, 2014 was the end of it. Nvidia went litigious against Samsung and Qualcomm, filing a lawsuit over mobile GPU patents. It went as far as trying to block shipments of Samsung's Galaxy S, Note, and Tab lines, with speculation that Nvidia wanted the iOS and Android GPU business. At the time, Apple was still relying on components from Qualcomm and Samsung for its mobile devices.

Things seemed quiet. Nvidia had ported CUDA to macOS and created its Web Drivers, and was still producing GPUs for Apple even as their relationship fizzled.

Apple had embraced OpenCL, the popular open framework for GPU-accelerated computing. Nvidia had its own closed alternative, CUDA, and used its marketing power to court software publishers to adopt it over OpenCL. CUDA did not work on AMD hardware, giving Nvidia a competitive advantage whenever a software maker chose CUDA. Adobe embraced CUDA even on macOS, earning CUDA a favored position among creative professionals, especially those using the Adobe suite; Adobe went as far as to build CUDA-specific features for Nvidia GPUs. In the background, Apple was poaching industry talent for its own GPU ambitions.

Nvidia quietly continued its macOS strategy, bringing support for its later GPUs to macOS via its web drivers and keeping CUDA updated. This meant classic Mac Pro owners, eGPU users, and Hackintosh users could enjoy the latest Nvidia hardware under macOS, which continued uninterrupted for nearly seven years. Many Mac professionals invested in Nvidia hardware, as AMD's offerings generally paled against Nvidia's at the high end, and CUDA offered a lot more performance in Adobe video applications like Premiere Pro and After Effects. Nvidia didn't overtly flaunt its web drivers, and it came as a surprise to many Mac users to learn they could buy Nvidia GPUs and use them in their Mac Pros. As a personal anecdote, I wrote two popular guides on using GeForce 700 series and GeForce 1000 series GPUs in a Mac Pro.

With the release of macOS 10.14 Mojave, everything changed. Outside of the people at Infinite Loop, no one knew for sure that Apple's grand ambition was to merge macOS and iOS hardware. Most users at the time feared the iOS-ification of Apple's software, not its hardware.

For years, Microsoft had a huge leg up in the graphics department by owning its own graphics API in the form of DirectX. OpenGL, Apple's preferred graphics API, had floundered in the late 2000s, whereas DirectX, for all its faults, leaped ahead of OpenGL in graphics capabilities and support.

Rather than wait for the next open graphics standard, Vulkan, to be finalized, Apple developed its own graphics API, Metal, first for iOS. Microsoft's DirectX almost certainly inspired Metal. Bringing Metal to macOS was all but a given; it was ported to macOS in 2015, set to replace both OpenGL and OpenCL while skipping Vulkan support entirely.
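
To make that shift concrete, here's a minimal sketch of what driving a GPU through Metal looks like from an app's side, in Swift. The kernel name "add_arrays" and the omitted buffer setup are placeholder assumptions for illustration, not code from any shipping app.

```swift
import Metal

// Minimal sketch: dispatch a compute kernel through Metal.
// Assumes a kernel named "add_arrays" exists in the app's default Metal library.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let function = library.makeFunction(name: "add_arrays"),
      let pipeline = try? device.makeComputePipelineState(function: function) else {
    fatalError("No Metal-compatible GPU (or missing kernel) on this machine")
}

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
// ... bind input/output buffers with encoder.setBuffer(_:offset:index:) here ...
encoder.dispatchThreads(MTLSize(width: 1024, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```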

macOS 10.14 Mojave required Metal-compatible GPUs, and to be Metal-compatible a GPU needed working drivers. At some point during the Mojave beta, Apple pulled Nvidia's ability to sign its code, which ended Nvidia's support for macOS in one spiteful, anti-competitive move; Nvidia was simply unable to release drivers.
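
For context, "Metal-compatible" isn't something a user toggles: apps simply ask Metal which GPUs it can see, and a card without working drivers never shows up. A rough sketch of that check in Swift:

```swift
import Metal

// List the GPUs macOS exposes through Metal.
// A GPU without working Metal drivers simply won't appear in this list.
let devices = MTLCopyAllDevices()
if devices.isEmpty {
    print("No Metal-compatible GPUs found")
} else {
    for device in devices {
        print("\(device.name) — low power: \(device.isLowPower), removable (eGPU): \(device.isRemovable)")
    }
}
```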

Nvidia publicly announced on its forums that it had working Metal drivers but that Apple had revoked its developer license, laying the blame squarely at Apple's feet. Nvidia even called out Apple on its support page, though it has since modified it.

My personal take is that it boiled down to CUDA, Metal, and the M1. CUDA represented a significant problem for Metal adoption. To get professional applications on board with Metal, Apple had to cut out CUDA, and my guess is that Nvidia was not willing to give up CUDA in its drivers. Yet again, this was the impasse between Apple's management and Nvidia.

For Apple to launch Apple Silicon smoothly, it needed everyone supporting Apple's current technologies, and CUDA was a roadblock to that success.

Apple also knew the aftermarket install base for Nvidia GPUs was quite small, limited to classic Mac Pro owners, adventurous eGPU users, and the Hackintosh community. The group this affected is one Apple has seemed vaguely resentful of over the past decade: users who like modular computing. Axing Nvidia was another blow against modularity and another win for Apple's tight-fisted control over when its products become obsolete.

The goalposts have now changed. The question isn't whether Nvidia and Apple will get along; it's whether Apple will allow external or dedicated GPUs at all. At the time of writing, MacRumors' buyer's guide page says it expects Apple to release GPUs that outstrip AMD's and Nvidia's current offerings.

Usually, MacRumors is pretty on point. Still, I'm hyper-skeptical: year-over-year gains in the GPU market have been not just consistent but accelerating. Nvidia and AMD are two of TSMC's biggest clients, so they, too, will have access to the same manufacturing processes as Apple. They've been designing GPUs much longer, and they're very good at it.

I have a very unusual take on this whole thing: in the future, we're going to see Macs that absolutely rock at laptop performance and low wattage.

We'll also probably see iMacs in a year or two that can edit 8K natively but can't ray trace and are pretty crap when it comes to things like TensorFlow.

To quote myself after I received my first Apple Silicon Mac in December of 2020: "for the portable class of computing, Apple silicon looks like it'll be unmatched, and expensive brute force versus efficiency will be the story of x86 versus Apple Silicon versus ARM, and I expect there will always be a clear winner. Welcome to the next decade of computing."