The Mac Pro Buyers Upgrade mini-guide

    I try not to recommend hardware based on my personal preferences in the Definitive Mac Pro Upgrade Guide, but here are my opinions for anyone looking for a cheat sheet on what to buy.

    CPU

    Mac Pro 4,1/5,1 configurations vary quite a bit, but there are really only two CPUs worth considering due to pricing. The X5680 is cheap: even for a dual-CPU Mac Pro, it's roughly $70 to purchase two of them, making the pair almost the same price as a single X5690. The X5690 is the best CPU a Mac Pro 4,1/5,1 can house.

    GPU

    GPUs on the Mac Pro are limited to the AMD sphere for 10.14 Mojave and 10.15 Catalina. RX 580s are often floating around for cheap, and they're a good entry-level card. The 560s and 570s are even cheaper, but the RX 580's price-to-performance is hard to beat. The Vega 56 is probably the best overall value: it's performant, and it can be flashed with a Vega 64 BIOS, landing just shy of a real Vega 64 in performance after flashing. The Radeon VII is the king and hard to come by. At $700 it's expensive, but it's much mightier in compute benchmarks than the 5700 XT, making it a far more well-rounded card for video editing and other GPU-accelerated non-gaming tasks.

    • Good: RX 580 — wait, no: RX 580, $100 (used) or $185 (new)
    • Better: Vega 56, $225 (used) or $300 (new)
    • Best: Radeon VII, $600 (used) or $699 (new) (Mojave and above only)

    Soon the Radeon 5700 XT will be supported on the Mac; it occupies a space between the Vega 56 and the Radeon VII. The 5700 XT is great for gaming, but its compute scores are low: it performs well below the Vega 64 in Blackmagic's DaVinci Resolve.

    Storage

    There are a lot of storage options for the Mac Pro. The Mac Pro doesn't support bifurcation, so inexpensive dual-slot NVMe PCIe cards are a no-go. The SanDisk Ultra 3D hits the price/performance intersection nicely, besting a lot of the really cheap SSDs like Kingston's, although the Samsung 860 is a better SATA drive. Moving a SATA drive to a SATA3 PCIe card doubles the maximum transfer speed over the Mac Pro's SATA2 bays. The HP EX950 is another middle-of-the-road pick between price and performance, within spitting distance of the Samsung 970 EVO. NVMe booting requires a firmware flash on 4,1/5,1s and much more legwork on 3,1s.

    Memory

    Not a lot to say: for the 4,1/5,1, buy 1333 MHz RAM, and go for 16 GB DIMMs if you can afford them.

    I/O

    There's no reason to mess around with the cheaper solutions: Sonnet USB cards are problem-free and do not require external power. The Allegro Pro and Allegro USB 3.1c are essentially the same card, featuring two USB 3.1 controllers at 10 Gbps each (2.5 GB/s combined) across four ports; the only difference is the connector type. The regular Allegro has one USB controller for 5 Gbps (625 MB/s) of total bandwidth.

    Wireless

    Wireless is slightly annoying, but there are three options: use a PCIe card, use a mini-PCIe card, or buy the parts online separately. Honestly, it's best to just read the very long upgrade guide.


    The disrupters will be "disrupted"

    This example is not a hypothetical. The meal-kit company Blue Apron revealed before its public offering that the company was spending about $460 to recruit each new member, despite making less than $400 per customer. From afar, the company looked like a powerhouse. But from a unit-economics standpoint—that is, by looking at the difference between customer value and customer cost—Blue Apron wasn’t a “company” so much as a dual-subsidy stream: first, sponsoring cooks by refusing to raise prices on ingredients to a break-even level; and second, by enriching podcast producers. Little surprise, then, that since Blue Apron went public, the firm’s valuation has crashed by more than 95 percent. - Derek Thompson, The Millennial Urban Lifestyle Is About to Get More Expensive, The Atlantic

    I don't usually do much news commentary, but I've been skeptical of the gig economy and mommy services for ages.

    The bigger question for me is: how did Silicon Valley start-ups manage to fleece investors for this long? The example that jumps to mind is MoviePass, which was hemorrhaging money so fast that it went as far as changing user passwords to keep customers from using its service. Many of these services existed as a "hack" via legal shenanigans, and labor laws are catching up now that Gavin Newsom has signed AB 5.

    When these services actually charge what it costs to use them, like the eScooters now do, or face the double whammy of having to raise prices to be profitable while also actually employing the people they're exploiting, it's going to be brutal. Many customers will be priced out of everything from food delivery to ride-hailing. These companies will be just like the services they "disrupted," only with a nicer app to show for it.


    Visual CSS Regression with Backstop JS

    In 2015, I wrote an article, Visual CSS Regression Testing 101 for Front End Developers, where I covered the two competing philosophies of visual regression testing: comparative vs. baseline. Since then, PhantomCSS has been sunsetted, as PhantomJS couldn't compete with headless Chrome, and while BBC's Wraith still works, it was never as useful as I'd have liked.

    What is Visual Regression Testing?

    There are other primers on the concept, but it's worth quickly covering visual regression testing. In the course of development, CSS/JS/templating changes can have unintended effects on your website or web app. Visual regression testing seeks to automate the laborious task of comparing visual elements to see if any unexpected changes have occurred. This is performed by scripting a headless web browser to render the webpage, capturing screenshots of its renderings, and using a diff tool to compare them against previous screenshots, flagging changed elements for review. Once reviewed, the latest changes are "approved" as the gold master and saved to compare against the next time you run the test.

    Now, four years later, BackstopJS has emerged, mixing (mostly) the best of both Wraith and PhantomCSS.

    Back when I first investigated visual regression, I spent time discussing baseline and comparative tools. Baseline tools, as described in the talk I attended, captured complete screen renders, whereas comparative tools could query individual DOM elements. In hindsight, the distinction is somewhat moot, as comparative tools can do baseline checks simply by querying the entire body. That said, tools like Wraith that only do full-page renders can't select individual elements and thus are far more limited. At this point, I doubt either term gets much play, nor is the distinction needed, as people have gravitated naturally toward tools that can query DOM elements.

    Backstop.js

    BackstopJS gets major points out of the gate for being easy to use. Just run the global npm installer, then navigate to your project directory and run backstop init. It'll create a boilerplate config ready for you to start writing tests. This is a serious upgrade, considering I once wrote a 12-step guide on how to install PhantomCSS.

    Running tests is also easy: run backstop test from the root directory and Backstop will take care of the rest. Approving a batch of changes is just as simple: punch in backstop approve.
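
    For reference, the entire loop is just the handful of commands already mentioned:

    npm install -g backstopjs    # one-time global install
    backstop init                # scaffold a boilerplate backstop.json in your project
    backstop test                # render each scenario and compare against the reference screenshots
    backstop approve             # bless the latest test run as the new gold master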

    Next up is formatting: all the tests are written in JSON, which is easy to read and familiar. I've never been super into YAML, and I like JSON. Everyone likes JSON.

    Where Backstop shines is how quickly I went from never having written a test to querying a roster of visual elements found on our company website. You start by declaring a set of screen sizes; I created my own mobile, tablet, desktop, and large desktop viewports.

    {
      "viewports": [
        {
          "label": "phone",
          "width": 320,
          "height": 480
        },
        {
          "label": "tablet",
          "width": 1024,
          "height": 768
        },
        {
          "label": "laptop",
          "width": 1280,
          "height": 800
        },
        {
          "label": "highdef",
          "width": 1920,
          "height": 1080
        }
      ]
    }

    My first tests were entire pages; then I quickly graduated to more advanced Backstop usage, testing our mobile menu. The mobile menu had a few considerations:

    • It must be clicked
    • It only makes sense to test it on a mobile resolution
    • There's a delay for the animation

    {
      "label": "Emerge Menu Open",
      "cookiePath": "backstop_data/engine_scripts/cookies.json",
      "url": "https://dev-site-url",
      "referenceUrl": "",
      "readyEvent": "",
      "readySelector": "",
      "delay": 20,
      "hideSelectors": [],
      "removeSelectors": [],
      "hoverSelector": "",
      "clickSelector": ".hamburger",
      "postInteractionWait": 1000,
      "selectors": ["header #site-navigation"],
      "selectorExpansion": true,
      "expect": 0,
      "misMatchThreshold" : 0.1,
      "requireSameDimensions": true,
      "viewports": [
        {
          "label": "phone",
          "width": 320,
          "height": 480
        }
      ]
    },

    And there you have it: my mobile navigation is being tested against JS breakage and CSS changes. I'm fairly impressed. There's even integration for running custom scripts. The only hiccups I've had are with AJAX content. I used removeSelectors to hack the AJAX-driven DOM elements out of the capture, which made the surrounding elements reliable to test, and for the AJAX content itself I used readySelector.
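
    To make that concrete, here's a rough sketch of both approaches as scenarios; the selectors are hypothetical, not from our actual site:

    {
      "label": "Sidebar without the AJAX feed",
      "url": "https://dev-site-url",
      "removeSelectors": [".ajax-feed"],
      "selectors": [".sidebar"],
      "misMatchThreshold": 0.1
    },
    {
      "label": "AJAX feed once it has rendered",
      "url": "https://dev-site-url",
      "readySelector": ".ajax-feed .feed-item",
      "selectors": [".ajax-feed"],
      "misMatchThreshold": 0.1
    },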

    Lastly, chaining events is a bit cumbersome, as you'll be coding up scenarios, but it's still much less overhead than the days of PhantomJS.

    Chaining Backstop to deploys

    The next step is to chain backstop test to deployments. The demo shows Backstop playing nicely with Jenkins deployments. At my office, we use Bitbucket Pipelines, so it's a matter of translation.
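
    I haven't finished that translation yet, but the shape should be CI-agnostic: expose Backstop through an npm script and have the pipeline call it after a deploy, since backstop test exits with a non-zero code when a scenario fails. A minimal sketch (the script names are my own):

    {
      "scripts": {
        "test:visual": "backstop test",
        "test:visual:approve": "backstop approve"
      }
    }

    From there, a Bitbucket Pipelines step (or a Jenkins stage) only needs to run npm run test:visual against the freshly deployed environment.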

    Git flow

    The git workflow is pretty straightforward with visual regression testing: ignore the test output folders and track the gold masters (the reference screenshots in backstop_data/bitmaps_reference). Backstop creates a new timestamped directory in backstop_data/bitmaps_test for each test run. Depending on the number of tests you run, it's easy to churn out hundreds of megabytes of images, so be prepared to have a garbage-collection strategy if your deployment method requires one.
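
    In .gitignore terms, and assuming Backstop's default paths, that works out to something like this (track bitmaps_reference, ignore the generated output):

    backstop_data/bitmaps_test/
    backstop_data/html_report/
    backstop_data/ci_report/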


    Foolproof way to update the 2010 - 2012 Mac Pro 5,1 to the 144.0.0.0.0 firmware

    Updating the firmware on a Mac Pro isn't difficult, but it is possible to "miss" firmware upgrades. This guide is for anyone looking to get to the latest (and most likely last) firmware released for the Mac Pro 5,1 without having to install Mojave 10.14.x, whether you've already installed Mojave or are planning to. On my first try, my firmware was stuck at 138.0.0.0.0 even when running Mojave 10.14.6. Updating the firmware adds key functionality to the Mac Pro 5,1, most notably native NVMe M.2 boot support. To learn more about firmware and the Mac Pro 5,1, see the Firmware Upgrades section of my Mac Pro Upgrade Guide.

    Step 0: Remove unsupported GPUs

    The biggest change for macOS Mojave is the deprecation of OpenGL and OpenCL. OpenGL has been a thorn in Apple's side for quite some time, as it's been nearly dead for years. Vulkan, the OpenGL successor, wasn't quite ready for primetime when Apple originally created Metal for iOS, so Apple decided to port Metal to macOS instead. Despite the annoyance of having to meet the requirements, it was a necessary evil: Mojave will not install if you have a GPU without Metal support.

    Note: some users are reporting they had to remove all PCIe cards sans their storage controller (SATA card) and GPU to install the firmware update. I did not. If you encounter issues, try removing additional PCIe cards.

    Step 1: Have a 10.13 drive

    Unfortunately, this is the biggest pain if you've already updated. You'll need a separate volume that boots into 10.13. Amazon and Newegg each have 120 GB SSDs for under $20 USD if you need a temporary drive to install macOS 10.13 on (the upside is you can buy a USB case and turn it into a very fast USB 3.0 drive afterward, or return it). If you can't access old versions of macOS from the App Store, you can grab them via DosDude1's patchers. If you have no intention of upgrading to Mojave, or already have it installed, don't worry: we won't be installing Mojave.

    Step 2: Boot 10.13

    The next step is pretty straightforward: boot into your install of macOS 10.13 if you haven't already.

    Step 3: Download 10.14.6 Combined installer

    Fortunately, firmware flashing does not require updating in a particular order; I went from 138.0.0.0.0 to 144.0.0.0.0 without any problems. The easiest way to obtain the final combined update for Mojave is to use DosDude1's patcher. Even though we're on supported hardware, we'll download the OS through DosDude1's patcher while booted into 10.13.

    Note: You do not need to use the DosDude1 installer; you can grab the update via the App Store or other sources, but I found this the easiest.

    1. Go to DOSdude1 Mojave patcher and download it
    2. Launch the patcher.

      Mojave Patcher Can't be opened because it is from an unidentified developer

      Depending on your security settings, your Mac may warn that it's from an unidentified developer. Go to System Preferences -> Security & Privacy (General) and allow the app to open.

      Mojave Patcher was blocked from opening because it is not from an identified developer

      You'll be bugged one last time.

      macOS Mojave Patcher is from an unidentified developer. Are you sure you want to open it?

    3. The patcher should warn you that you are on supported hardware.

      Mojave Patch Native Machine warning

      This is fine; ignore the message. Within the patcher, select Download macOS Mojave from the Tools menu.

      Tools -> Download macOS Mojave

    Step 4: Launch the installer and click shut down

    The installer should bring up a message about the firmware update and prompt you to shut down. This will not start the Mojave installation, only the firmware update.

    Step 5: Boot the Mac

    Using the instructions in the previous image, press and hold the power button until the power light blinks. If you do not have an EFI-enabled GPU (see more about EFI in my Mac Pro Upgrade Guide), you will not see any video output.

    I trimmed down the video, as it took about 15 seconds of holding before the power light flashed. After it flashed, the internal speaker emitted a long, lo-fi "boop" sound.

    Step 6: Verify

    Go to About This Mac and click System Report. On the first screen (the hardware overview), look for "Boot ROM Version." This should list your firmware version. From here, you can continue using 10.13.6, upgrade, or boot to your 10.14 volume.

    The 144.0.0.0.0 firmware works with any version of macOS your Mac Pro supports.

    Updated: November 4th, 2019, based on feedback from the Mac Pro Users group on Facebook.


    iOS needs better app organization

    A friend of mine switched to iOS after nearly a decade of Android usage. This spawned a lot of back and forth about iOS vs. Android. One criticism I cannot defend is iOS's icon organization and folders. In 2010, Apple introduced folders with iOS 4 (if you need a memory jog, here's what they looked like). The original design used a visual metaphor of sliding back to expose the contents, as we were in the midst of peak skeuomorphism. Touch interfaces were relatively new, and Apple had the monumental task of on-boarding droves of barely-digital-literate users, so the metaphor served its purpose. As a UX developer, I carry a lot of opinions about interfaces, so it shouldn't be surprising that I feel the need to vent from time to time. Here are several complaints I've harbored for years, combined with some slap-dash, non-pixel-perfect UI mockups.

    1) Waste of space on a broken Metaphor

    More than half the screen is burned on a blurred-out effect, presenting a minimal number of icons. Modern iPhones are massive compared to the era of the much smaller/more manageable/dare-I-say-superior iPhone 5 form factor. There's no point to a 3x3 grid; it's annoying and silly. With increased storage capacity comes more data: today's iPhones can clock in at 512 GB of storage, capable of holding amounts of data familiar to desktop users, yet iOS hasn't grown to take on desktop levels of data or applications.

    Folders are a waste of space on iOS13

    2) Custom App icons

    Visually, the mini-grid isn't a bad choice, but it's dated and loses its usefulness beyond nine apps. Plus, at a glance, it doesn't visually 'jump out' among a mess of similar icons. It'd be easy for Apple to make folders quickly recognizable with a slightly different icon treatment. Here's my 10-minute mockup of what it could look like with a custom icon.

    The focus shouldn't be on my graphic design choices, as I did this fast and dirty. Instead, the takeaway is that folder icons could vary visually from app icons to make them distinctive.

    iOS should have custom folder icons

    3) Folders in Folders

    Next up is another gripe: folders within folders. Apple has done quite a bit to avoid hierarchical navigation in iOS, but it exists in the Settings app and now within the Files app. Merely transposing the visual interface from Files gives a sane approach to folders. Combined with custom folder icons, users could see where they are via the breadcrumbs.

    Folders in folders

    4) Vertical scrolling in folders?

    Vertical scrolling on the home screen has existed before via jailbreak tweaks like Infiniboard and Springfinity. Vertical scrolling within folders would help express the old drawer metaphor and make moving apps around easier.

    5) Make search results meaningful

    Search on iOS never shows you where files are. See below.

    Folders in folders

    I can think of a few ways to alleviate this, such as list results showing the location off to the right for the top app matches. I didn't bother to mock them up, as what's really important is the lack of context.

    6) Better App movement

    If you've ever had to organize an iPhone, you know the task is so tedious it can take hours if you have a fair number of apps. I've seen various suggestions, and honestly, at this point I'd take any of them.

    7) It's time to loot macOS: Smart Folders

    iOS needs to grow up. The Files app is a nice start, although imperfect compared to iFiles from the jailbreak world. Apple already has a brilliant solution it can port to iOS: let the OS auto-organize with Smart Folders. Smart Folders, for the unfamiliar, work by using predetermined search queries. Apple could take it further and have Smart Folders on iOS organize based on application type. Upon app purchase or reinstall, the user could select "Smart Folder," "dock," or "custom folder" and stay ahead of organization. Brilliant, right?

    Bonus macOS -> iOS features

    • Loot macOS's columned view for files.
    • Allow for smaller grids and list within folders.
    • Tap-and-hold on an icon gets a "Get Info" screen so you can see how much data the application and its associated folders are using.
    • In a perfect world, a tabbed interface to make dragging between locations easier.
    • A font manager.

    Pagespeed Insights is useful again thanks to Lighthouse

    On November 17th, 2017, I called utter and complete stupid bullshit as Google PageSpeed was giving glowing scores to pages like Wired.com and Newsweek.com. Both garnered higher scores than my hyper-minimalist blog, which has a whopping 2.9k of CSS and about 40k of JS against their megabytes of JS and images. PageSpeed was so irrelevant that it caused me grief professionally, as clients would be unhappy with their scores despite their sites being fairly optimized.

    PageSpeed isn't perfect, but it is now what I'd consider fixed, and I've meant to write this article for some time. My homepage's score has gone from 70 up to 76, and individual articles, such as Google Page Speed lacks common sense, now score 90. Sanity has been restored, and I'm not just saying that because my numbers are better. Let's start with my original complaints.

    • PageSpeed did not care about JS bloat as long as it wasn't directly linked. If a library decided to append several megabytes of JavaScript, PageSpeed wouldn't even blink.
    • It would advise on how to optimize iframes even though the user has no control over such things
    • It did not care if you used post-JPG/PNG formats
    • It made no effort to measure total requests.
    • It made no effort to measure time-to-paint
    • It failed to recognize minified HTML due to a single line-break
    • It did not compare against any other real-world dataset, making it relative only to your website's previous scores

    Surprisingly, all my major gripes were resolved, to the point where I feel like a Google engineer took umbrage with my post (let me be clear: this almost certainly didn't happen). Lighthouse is oodles better (I've been using it since mid/late 2018), and it leverages the Chrome User Experience Report for higher-traffic pages. What's interesting is the philosophical change from the technology-bucketed approach, where CSS, JS, HTML, and server-side technologies existed in their own orbits, to one with clearly standardized goals. This makes cross-site comparisons saner. The benchmark metrics are as follows: First Contentful Paint, First Meaningful Paint, Speed Index, First CPU Idle, Time to Interactive, and Estimated Input Latency. This lends itself to a greater understanding of the stages of a web page's life cycle. It's the most significant change, and I approve of it. The feedback is more meaningful as well. Below is an unorganized list of my observations.

    • If you are using WordPress, it'll suggest plugins to assist with certain tasks. Another meaningful change that I hadn't thought to rant about is DOM tree depth: Google now recommends fewer than 32 levels.
    • It makes some executive calls, like suggesting font-display: swap;, which re-enables one of the banes of web dev, the Flash of Unstyled Text (the family of behaviors shortened to "FOUT, FOIT, FOFT"). Google prefers the FOUT.
    • Google has stepped away from suggesting minified HTML, likely because compressing HTML is far more important, as demonstrated here. You can still minify HTML to squeeze extra bytes away.
    • Lighthouse measures JS execution times, not just size.
    • It no longer suggests the wantonly silly declarative image sizes. Previously, PageSpeed wanted you to write out explicit pixel dimensions to speed up render times. This was good advice in 2000 but tragically out of step with the responsive web.
    • Audits can pass even if they are not met 100%; the minify-CSS or minify-JS audits, for example, pass if the vast majority of assets are minified.

    All in all, it's good to see Lighthouse powering PageSpeed Insights.


    Thoughts on Apple Arcade

    Apple Arcade is everything it should be, solving the biggest problem the App Store has had: the lack of a vector for premium, high-quality games to be delivered without leaning on In-App Purchases. Not all IAPs are bad; a few titles have done them correctly. Time Locker has only one $3 purchase that's remotely required, no consumables, and the only other IAPs are optional characters. Polytopia unlocks "races" for $1 purchases, nine in total. The most you can spend on either game is roughly $10-15, which seems right for a high-quality mobile game.

    Most IAPs though, as almost everyone knows, are the detested loot boxes or in-game currency, and thus we've seen a race to the bottom. For years I lamented that there wasn't a classification for full-fledged games without IAPs. I wanted a premium game store where developers could charge $10-$25 and get a fair shake, but never did I consider a subscription gaming service. I don't play many mobile games, but when I do, I don't want to spend hours looking for pay-once titles. At the gym, I tend to walk for 15 minutes as a warm-up, and during that time I play silly iPhone games. Tower defense games are a personal favorite; I play them before committing to running a 5k and heading off to other activities.

    Apple Arcade is a damn good value as it stands today. I don't think I've seen any console or gaming platform launch with so much content. It's good enough that I worry about the rest of the App Store, as there's enough content for me to work through for quite some time. As an added bonus, Apple Arcade isn't limited to iOS and iPadOS; it's coming to tvOS and, most importantly, macOS. In one swoop, Apple has a platform that spans every compartment of gaming: mobile, tablet, console/TV, and desktop (PC). All its competitors (Steam, Microsoft, Google, Sony) are missing at least one of these buckets, and none have titles that can be easily ported between all formats. Depending on Apple's commitment to funding titles, this could indeed be a very big deal. I may eat my words later, but Apple Arcade is probably the most significant product Apple has conceived since the Apple Watch. At $5 a month, it costs about as much per year as one PlayStation or Xbox game, or a year of PSN.

    There are still asterisks to be resolved: what does the future look like? How many games can we expect? Will we ever see ports or non-exclusive content on Apple Arcade? Apple Arcade isn't going to be the end-all be-all for gaming, but out of the gate, it's competition for Sony's very successful and very well-executed PSN.

    So far, I haven't mentioned one giant of gaming, arguably the most loved of them all. Apple Arcade probably won't do much damage to the Sony PlayStation, Microsoft Xbox, or Steam platforms, but its scope of more casual, family-friendly titles and its whimsical nature are certainly reminiscent of said company, and I wouldn't be surprised if it led parents to skip a Switch by giving kids hand-me-down devices or simply getting them an iPod Touch. Apple is already the fourth-largest gaming company this year, and previously it hasn't even tried.

    I doubt we'll see any AAA titles ported to Apple Arcade, be they popular sports franchises like Madden, NBA Live, NBA 2K, or FIFA, or other games based on professional sports leagues, due to licensing. I also wouldn't expect ports of classics like Sonic the Hedgehog to be folded in (though I wouldn't rule it out either, as the Sonic and Frogger franchises both made appearances).

    Lastly, the service isn't perfect. There isn't really a Steam/PSN/Xbox Live-style system for friends lists and the like. There's also an extreme lack of titles with much depth; many of the games I've tried are nugget-sized experiences. I personally love Cricket Through the Ages and really liked Assemble With Care, both criminally short, but some of the others less so, and there are only so many single-button games I want to play. Some of the more hyped titles like Sayonara Wild Hearts are beautifully shallow, while others feel like a demo (Red Scare) or cheap knockoffs (Punch Planet). As a fan of Oceanhorn, I look forward to diving into Oceanhorn 2. I'll chalk it up to launch titles; they're rarely the pinnacle of a console, save for a few outliers, mostly from Nintendo (Tetris, Super Mario World, Pilotwings, Super Mario 64). That said, with my relaxed interest in gaming, perhaps more nugget experiences are exactly what I'm after, if delivered right.

    I'll be interested to see if I agree with myself two years from now.


    Android phones now have 12 GB of RAM; Apple is still shipping laptops with 8

    Right now, there are multiple phones on the market with 12 GB of RAM. They run the gamut of prices and specs, but none are more expensive than a factory-configured 13-inch MacBook Pro, save perhaps the Galaxy Fold, a curiosity.

    • Samsung Galaxy Fold $???
    • Samsung Galaxy S10 Plus $1,600
    • OnePlus 7 Pro $1299
    • Asus ROG Phone 2 €899
    • Samsung Galaxy Note 10 Plus $929
    • Xiaomi Black Shark 2 or Black Shark 2 Pro $879
    • Lenovo Z6 Pro $849
    • Xiaomi Mi 9 Explorer Edition $800
    • Nubia Red Magic 3 $700
    • Vivo iQOO $650

    I didn't look up MSRPs but rather what seemed to be legitimate prices online, to give an honest representation. That's likely an incomplete list, but there are at minimum 10 Android models shipping with more RAM than a $1,299 MacBook Pro, and even the $1,999 13-inch MacBook Pro ships with 8 GB of RAM; on every 13-inch model, more must be custom ordered. There's been a bit of stagnation for laptops and RAM, partially due to chipsets, partially due to modern OSes using much more efficient RAM management via compression aided by SSD scratch space, and lastly due to increased power draw. The last feels increasingly irrelevant as phones have caught up to laptops (and Apple found the power budget for the foolish Touch Bar). It was only in July of 2018 that Apple addressed the lack of 32 GB RAM options in the MacBook Pro lineup.

    Just as a barometer of applications: Adobe Photoshop and Lightroom recommend 8 GB of RAM or more, Illustrator recommends 16 GB with 4 GB as the minimum, and After Effects recommends 16 GB with 8 GB as the minimum. Notably, the assumption is that you would not be running multiple professional applications at once, which in reality you often are; an After Effects workflow can routinely involve any number of 2D editing applications and even 3D apps juggling resources. Then there's development, where Docker, VMs/simulators, and horrid JS memory vacuums exist. I'm not even going to touch professional audio. GPUs crossed the 8 GB barrier some time ago, meaning you could connect a 16 GB AMD Radeon VII to a $1,999 MacBook Pro with 8 GB of RAM. Even for general web surfing, it's easy to saturate 8 GB of RAM with a browser with poor memory management (Chrome).

    None of Apple's Pro laptops should ship with less than 16 GB and the MacBook Air should have a factory model that ships with 16 GB of RAM. This would be moot if we had user-serviceable RAM upgrades. Laptops needn't be held to the modular standards of desktops, but they should be for basic specs.

    It's also worth noting that all iPhone 11s have 4 GB of RAM*. 4 GB is fine for right now but seems a bit shortsighted, until you consider the revelation that the iPhone 11 may have an additional 2 GB of RAM dedicated to the camera. iOS's memory management works mostly due to Apple's stringent background task management. My guess is the next iteration of the iPhone will move to 6 GB of application RAM plus 2 GB for the camera.


    Under-the-hood blog updates

    Over the break, I went on a binge of minor changes to this blog.

    • The privacy policy and contact info now exist on their own pages; Google supposedly prefers this. Previously, both items lived on the about page.
    • This blog now supports a JSON Feed. I looked at Apple News but screwed up the process for importing my RSS feed; I may revisit that later, but with the low traffic most of this blog sees, it's not really worth the effort.
    • Improved the JS. To reduce requests, I've concatenated four JS files into one, and I upgraded jQuery 1.12 to 3.x, as it's faster and smaller; it's no longer hosted on a CDN.
    • Fixed the canonical URL declaration in the head.
    • Removed a few errant CSS classes, so the site now serves an absurdly low 2.9k of CSS, down from roughly 3k.

    Sometimes I think the lack of visual flourish is mistaken for a lack of design, but I like minimalism.


    How to import Feedly feeds into NetNewsWire

    NetNewsWire 5.0 was just released as open source. Ever since Google Reader shut down, I've been using Feedly as an aggregator and Reeder on iOS. Reeder makes it very easy to import Feedly feeds, but it isn't as straightforward in NetNewsWire. Fortunately, Feedly and NetNewsWire both support OPML (Outline Processor Markup Language) for importing/exporting feeds. Feedly buries this, so here's a quick step-by-step to get up and running with NetNewsWire and Feedly.

    Steps

    1. Sign in to Feedly and click the gear.

      Feedly Gear Icon

    2. From the settings, next to the Import OPML option, click the Export button.

      Feedly Export OPML

    3. From NetNewsWire, select File -> Import Subscriptions and import your OPML file

      NetNewsWire import OPML

    That's it! Enjoy. I highly recommend Reeder for iOS.


    RSS feed is fixed

    I'm pretty sure that since the inception of this blog, the URLs in the RSS feed have been broken. I only just noticed while writing about NetNewsWire. Now they're fixed. You can add the RSS feed here to your favorite RSS reader.

    Expect a slow trickle of tiny improvements to this blog, like the printable blog posts.


    Stupid Scary or Scary Stupid

    This isn't a blog about politics or even my opinions, but I'm going to go on the record and say nuking hurricanes is a bad idea. This seems like a silly thing to have to say, but here we are, having a national discussion about it and why it is a bad idea. It's worth noting that, according to Axios, our sitting president suggested multiple times nuking hurricanes to stop them from hitting the U.S.

    A lot of people are clowning this idea (because it's patently stupid), but it's not entirely wrong, rather a problem of scale. We simply do not have enough nuclear weapons on the planet to end all hurricanes. If we truly want to nuke our way out of hurricanes, we'll have to invest a lot more in nuclear weapons. No more Earth = no more hurricanes. Checkmate, hurricanes.

    Jokes aside, Axios may not be a household name, nor even my go-to source for journalism, and Trump denies it, but it is believable. Consider that this is fresh on the heels of Trump asserting, "I could win that war in a week. I just don't want to kill 10 million people." Then there are the long-standing reports that Trump asked multiple times why we can't use nukes, and that he would use nuclear weapons in response to a terrorist attack by ISIS. Even if the latest story is "fake news," the other accusations are damning enough.

    "The biggest problem we have is nuclear ... having some maniac, having some madman go out and get a nuclear weapon." - Donald Trump, 2016

    I couldn't agree more. This is clearly a person who is unfit to be given responsibility for the most destructive weapon humankind has created. As the great political journal of our age, TeenVogue*, wrote, "Yes, Trump could instigate a nuclear war without anyone stopping him."

    We are all at the mercy of a tweet.

    *Not a satirical comment.


    Git Hotfix workflow for Pantheon.io

    We've all been there: you have changes that haven't been QAed, but there's a hotfix that needs to go out yesterday. Pantheon is a great host, but it has one major gotcha: you can't switch code branches. The way Pantheon works, the Pantheon remote git repository gets one, and only one, deployable branch, and there's no way to switch it. Code can only be promoted from dev -> test -> live. This is problematic, especially if you're coming from deployment management utilities that let you switch branches, or from a platform-as-a-service like Heroku.

    Pantheon's recommended workflow

    Image Credit: Pantheon, Use the Pantheon Workflow

    Pantheon does offer Multidev, which essentially creates a separate branch environment for testing (and which can be merged back into the main chain), but that still doesn't fix the hotfix issue.

    Pantheon Hotfix Flow

    1. Create a local branch and reset it to the last commit that went live (this is a pain, as Pantheon doesn't show the last commit's git hash)
    2. Make changes locally. Commit the changes to your new branch.
    3. git push -f Pantheon YOURBRANCHHERE:master
    4. Promote from Dev -> Test -> Live from the control panel
    5. Make sure your hotfix is merged into your local master (and your origin)
    6. Reset Pantheon's Dev environment back to the master branch: git push -f Pantheon master:master
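
    Spelled out on the command line, the whole dance looks roughly like this (the branch name and commit hash are hypothetical):

    # branch from the commit currently on Live
    git checkout -b hotfix-checkout-bug 1a2b3c4
    # ...make the fix, then commit it...
    git commit -am "Hotfix: checkout bug"
    # force the hotfix branch onto Pantheon's Dev environment
    git push -f Pantheon hotfix-checkout-bug:master
    # promote Dev -> Test -> Live from the Pantheon dashboard, then merge the fix locally
    git checkout master
    git merge hotfix-checkout-bug
    git push origin master
    # finally, reset Pantheon's Dev back to master
    git push -f Pantheon master:master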

    Despite the usual warnings that you should never force-push, this is the cleanest method. You can push up your desired hotfix and leave it on the live environment until your normal deploy chain overwrites it.


    Printable blog posts

    Using the magic of @media print, I've made a mild update to this blog to be more printer/PDF friendly, mostly for a single post, The Definitive Mac Pro Upgrade Guide, which drives the majority of my traffic.

    • Images are now capped to 75% of the page width as my images are only double density, optimized for screens, not printers.
    • The main content now expands to the full page width, so users can set the margins within their print prefs.
    • The main body copy has been reduced from 16px to 15px, and the line height from 2em to 1.75em.
    • I've included a link back to the original blog post at the top of each page, visible when printed so PDF users can easily return to the blog post in a web browser.

    Happy printing, you weird PDF-loving bastards!


    Time Machine: An Error Occurred Restoring from Backup + Fix

    My 2017 MacBook Pro stopped charging and refused to accept power from any power supply on any port. I was restoring my computer onto my previous laptop, a 2015 MacBook Pro, and encountered the error above. I tried using the previous day's backup, but that didn't work either.

    I received the following message when booting from a Time Machine drive:

    An Error Occurred Restoring from Backup

    An Error Occurred Restoring from Backup

    To Try Restoring from a different backup, click choose other Backup.

    To reinstall macOS, click Install macOS. During the install you can choose to restore your information from a Time Machine backup.

    To Boot from an existing macOS installation...

    I've seen some high-tier fixes, like harryfear.co.uk's, but there's an easier route, and the clue is in the error message.

    Reading the message more carefully, I booted off the Recovery partition and reinstalled macOS. Once that completed, in Migration Assistant I selected the option to transfer information over from my Time Machine drive. This isn't a true 1:1 restore; I noticed some things missing, such as my /etc Apache2 modifications, but most of the geek stuff, like Homebrew and its many CLI applications (Heroku's, for example), made the cut. Beyond renewing SSH keys and re-running Docker builds, my computer was good to go. Standard Mac applications had no issues.

    Summary

    If a restore fails, fear not. A direct restore is faster, but going this route you will not lose your important files.

    1. Boot off a recovery partition, reinstall macOS
    2. At the end of the installation, you will see Migration Assistant. Select the option to transfer files from another computer/device/Time Machine backup, then select your Time Machine drive

    I suspect for most users, myself included, the harryfear fix is overshooting the problem, and Apple's solution is "good enough."


    Chrome does not support media queries on video source tags + a workaround

    Sometimes you encounter something that surprises you, and yesterday was one of those days: Chrome does not support inline media queries on the source tag within a video tag (you can test it here). Worse, plain CSS media queries will not stop multiple videos from loading, which effectively doubles your data, so it requires a JS solution. CSS-Tricks has an article from 2012 using jQuery, but there's no follow-up and I wasn't that enthralled. I also saw thenewcode's Make HTML5 Video Adaptive With Inline Media Queries, but it fails to mention Chrome's refusal to support them.

    Javascript to the rescue

    First, I wanted to prevent any request from being made, so I created an empty video tag with my two video URLs as data attributes. Easy, right? Since all major browsers support MPEG-4, I could safely assume the only legacy holdouts are IE and Safari, as those browsers are tied to OS updates, whereas Chrome and Firefox are not; very few users would be on an outdated browser, and Safari and IE both support MPEG-4 anyway. There's not a good reason for me to support WebM.

    <video
      preload="auto" autoplay="" loop="" muted="" playsinline=""
      data-desktop-vid="https://iconaircraft.s3.amazonaws.com/ICON_Web+4.0_Loop_16x9_DRAFT190723_26sec+3700.mp4"
      data-mobile-vid="https://iconaircraft.s3.amazonaws.com/ICON_Web+4.0_Loop_1x1_DRAFT190723_26sec-mobile.mp4"
      >
    </video>
      

    I didn't want to rely on any framework: jQuery's document-ready meant the JS wouldn't fire until the rest of the page loaded, and ES6 meant leaving out old browsers. Thus, I limited myself to ES5.

    First, I needed to get all the videos on the page. This creates a variable containing an array-like NodeList of video elements, even if only one is found on the entire page.

    //get all vids
    var video = document.querySelectorAll('video');

    Next, I needed to create a source for the video tag. The source tag needs a src and a type; after that, we append the newly created DOM element to the video element passed in. This function doesn't need to know how many videos are on the page or what the screen size is. It just appends a source to whatever video tag it's given.

    //add source to video tag
    function addSourceToVideo(element, src) {
        var source = document.createElement('source');
        source.src = src;
        source.type = 'video/mp4';
        element.appendChild(source);
    }

    Next is where the logic happens: depending on whether the screen size is over a predetermined value, I load either the desktop or the mobile version. Since I have two data attributes to work from, the screen width determines which one gets used. If the screen is above a certain width, the function grabs the desktop version instead of the mobile one and feeds it to addSourceToVideo. Easy enough, right?

    //determine screen size and select mobile or desktop vid
    function whichSizeVideo(element, src) {
        var windowWidth = window.innerWidth || document.documentElement.clientWidth; // no jQuery fallback needed
        if (windowWidth > 800 ) {
            addSourceToVideo( element, src.dataset.desktopVid);
        } else {
            addSourceToVideo(element, src.dataset.mobileVid);
        }
    }

    Now that we've written code to write sources to empty video tags, it needs to initialize and be able to handle multiple videos. Remember our NodeList of video elements? It's time to use it. There's no point in running the code if there aren't any videos on the page, so we check whether the video variable found anything. If it did, we loop over the list and hand off each individual video, in case we have multiple videos on the page.

    //init only if page has videos
    function videoSize() {
      if (video.length) { // a NodeList is never undefined; check that it actually found videos
        video.forEach(function(element, index) {
                whichSizeVideo(
                    element, //element
                    element  //src locations
                );
        });
      }
    }
    videoSize();

    Notably, you could tie the above code to a resize event in case a user resizes the window, and have it re-trigger videoSize. I chose not to, for simplicity. You can see the working version of the above code on CodePen. I didn't embed it in this post so that those on a slower connection aren't hit with 30 MB of video data. Place this script inline or as a separate file below your videos, but before the rest of your JS payload, for maximum performance.
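
    If you do want resize handling, a rough sketch might look like the following (the debounce delay and the swap behavior are my own choices, not part of the original): it strips the previously injected source, re-runs the selection logic, and reloads the element.

    //optional: re-pick the source when the window is resized
    var resizeTimer;
    window.addEventListener('resize', function () {
        clearTimeout(resizeTimer);
        resizeTimer = setTimeout(function () {
            video.forEach(function (element) {
                var existing = element.querySelector('source');
                if (existing) {
                    element.removeChild(existing); //drop the previously injected source
                }
                whichSizeVideo(element, element); //pick again for the new width
                element.load(); //reload so the new source takes effect
            });
        }, 250); //debounce so this doesn't thrash during a drag-resize
    }, false);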


    On the realm of personal branding

    I'm always drawn in with complete morbid fascination to "influencer" culture, as I have largely avoided social media. I have exactly one account on the major social networks, Facebook (arguably the worst of them all), but I deleted the app off my phone about four years ago. I'm including YouTube in this, as it should be considered a social network, as well as Reddit, which I avoid. I'm convinced that the vast majority of social media is an anecdoche.

    So when I read about influencers asking for free meals at restaurants, I experience a clash of contradictory thoughts simultaneously: "Who has the gall to cold-call a restaurant for a free meal because they have 50,000 Instagram followers?", "I'm surprised this happens", "I am not surprised in the least bit", "this scam verges on genius", "major brands give famous people free shit, so why shouldn't small businesses give something to those with a tiny soapbox in exchange for cheap advertising?", and "everything about this is idiotic", followed by a general self-satisfied feeling of being above it all, despite my immediate desire to share and discuss it with my friends.

    The over-marketization of all facets of life has made even the most mundane activity a transactional exchange that can be sold, thanks to social media. It's all viewed through the nihilistic world view that everyone has a "personal brand." Any experience, even a wedding proposal, is a marketing opportunity, and people flock to toxic lakes for the photos. The irony is that follower counts are internet points that may or may not mean a damn thing: one influencer with 2 million followers couldn't sell 36 t-shirts.

    If there's one thing that is certainly true, it's further evidence of the enshittening.


    Basic Architecture of Designing Gutenberg Blocks in Wordpress 5.0

    There are plenty of tutorials on creating your own custom Gutenberg blocks, but I found the material between beginner and advanced lacking. I'm going to skip the basics in favor of a short list of things to understand in order to work with Gutenberg more effectively. From the trial-by-fire experience of building two WordPress 5.0 Gutenberg websites, here's what I've learned. The guiding principle is to re-use functionality when possible and try to replicate the WordPress UI.

    Rule 0: Understand React's role

    WordPress chose React to build the UX for its new Gutenberg block editor. However, instead of using React directly, WordPress uses Element, an abstraction layer over React. If you're wondering why someone would want to do this, WordPress has a very concise list:

    • In many applications, especially those extended by a rich plugin ecosystem as is the case with WordPress, it’s wise to create interfaces to underlying third-party code. The thinking is that if ever a need arises to change or even replace the underlying implementation, it can be done without catastrophic rippling effects to dependent code, so long as the interface stays the same.
    • It provides a mechanism to shield implementers by omitting features with uncertain futures (createClass, PropTypes).
    • It helps avoid incompatibilities between versions by ensuring that every plugin operates on a single centralized version of the code.

    This means you'll be writing code against and importing from wp.element, wp.components, and wp.blocks. React only exists on the admin side of things, and all the content found within a block is saved statically, meaning you won't be able to create React experiences on the front end simply by creating Gutenberg blocks.
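
    For orientation, a bare-bones block registered through those globals looks something like this (the block name and markup are hypothetical, not from either project):

    // pull helpers off the wp globals rather than importing React directly
    const { registerBlockType } = wp.blocks;
    const { createElement } = wp.element;

    registerBlockType( 'mycustomblocks/simple-notice', {
        title: 'Simple Notice',
        category: 'common',
        // what authors see while editing
        edit: () => createElement( 'p', { className: 'simple-notice' }, 'Notice text (editor view)' ),
        // what gets saved statically into post content
        save: () => createElement( 'p', { className: 'simple-notice' }, 'Notice text' ),
    } );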



    Rule 1: Bundle your custom blocks in one plugin

    It's pretty easy to bundle all your custom blocks into one plugin. Unless you're looking to distribute your custom blocks across many sites, it's saner for development and deployment to make one master plugin for all your blocks. On the dev side, this means a single webpack instance to spin up as opposed to one for each block. A good example of this in action is Zac Gordon's WPForJSCourse example; while his course isn't free, this plugin is. It includes everything you need: a sane structure, a webpack config, and a working setup, with all the custom blocks registered in one nifty index.js file.

    It's pretty easy to do, but it was a design pattern that I didn't realize would benefit me when I started in on Gutenberg.



    Rule 2: Learn Innerblocks and reuse core blocks

    Most tutorials seem to stop short of InnerBlocks, and InnerBlocks are probably one of the most important features of Gutenberg. InnerBlocks allow you to load blocks inside of blocks. Below is a super basic example for a slideshow, allowing the user to add as many images as they like via the core/image Gutenberg block while restricting them from inserting any other block type.

        edit: props => {
            const { attributes: {selectControl},
                className, setAttributes, isSelected, } = props;
                const ALLOWED_BLOCKS = [ 'core/image' ];
            return [
            <div className="slideshow-super-simple">
                <strong>Note: </strong> all slides are visible in editor<br />
              <InnerBlocks
                allowedBlocks={ ALLOWED_BLOCKS }
              />
              </div>
            ];
        },
        save: props => {
            const { attributes: { selectControl } } = props;
            return (
                <div className="icon-simple-slideshow" >
                  <div className={ selectControl} ><InnerBlocks.Content /></div></div>
            );
        },
        

    InnerBlocks aren't limited to allowing and restricting other blocks; they can also accept templates, which are sets of pre-defined blocks. This allows assembling a very complicated UI or layout widget out of any number of prebuilt or custom blocks. There's no reason to reinvent the wheel, as WordPress gives you quite a few blocks out of the box. Below I've included a list of the core blocks by category.

    Common blocks category
    • core/paragraph
    • core/image
    • core/heading
    • (Deprecated) core/subhead — Subheading
    • core/gallery
    • core/list
    • core/quote
    • core/audio
    • core/cover (previously core/cover-image)
    • core/file
    • core/video
    Formatting category
    • core/table
    • core/verse
    • core/code
    • core/freeform — Classic
    • core/html — Custom HTML
    • core/preformatted
    • core/pullquote
    Layout Elements category
    • core/button
    • core/text-columns — Columns
    • core/media-text — Media and Text
    • core/more
    • core/nextpage — Page break
    • core/separator
    • core/spacer
    Widgets category
    • core/shortcode
    • core/archives
    • core/categories
    • core/latest-comments
    • core/latest-posts
    • core/calendar
    • core/rss
    • core/search
    • core/tag-cloud
    Embeds category
    • core/embed
    • core-embed/twitter
    • core-embed/youtube
    • core-embed/facebook
    • core-embed/instagram
    • core-embed/wordpress
    • core-embed/soundcloud
    • core-embed/spotify
    • core-embed/flickr
    • core-embed/vimeo
    • core-embed/animoto
    • core-embed/cloudup
    • core-embed/collegehumor
    • core-embed/dailymotion
    • core-embed/funnyordie
    • core-embed/hulu
    • core-embed/imgur
    • core-embed/issuu
    • core-embed/kickstarter
    • core-embed/meetup-com
    • core-embed/mixcloud
    • core-embed/photobucket
    • core-embed/polldaddy
    • core-embed/reddit
    • core-embed/reverbnation
    • core-embed/screencast
    • core-embed/scribd
    • core-embed/slideshare
    • core-embed/smugmug
    • core-embed/speaker
    • core-embed/ted
    • core-embed/tumblr
    • core-embed/videopress
    • core-embed/wordpress-tv
    Dummy Image

    Pictured: Mock up of a hypothetical user page

    Let's break down the above design. It's two columns, consisting of:

    • Column 1 (core/column): Image (core/image)
    • Column 2 (core/column): Headline (custom) and Sub-Headline (custom)
    • Below the columns: Paragraph (custom)

    With Gutenberg, simple layouts like the above can potentially be assembled by content authors from core blocks directly, but that isn't desirable, as it requires a bit of WordPress mastery, with a high margin of error if custom CSS gets involved. We have a two-column design consisting of an image in the first column and two fields in the second, followed by text beneath the columns. So let's look at the template code.

         edit: props => {
                const { attributes: { paragraph },
                    className, setAttributes, isSelected } = props;
                    const TEMPLATE = [
                      [ 'core/columns', {columns: 2,className: "profile-outer-column"}, [
                          [ 'core/column', { className: "profile-inner-column" }, [
                            ['core/image', { className: "profileImage"}],
                          ], ],
                          [ 'core/column', {className: "profile-inner-column"}, [
                            ['mycustomblocks/profile-title', { className: "profileTitle"}],
                            ['mycustomblocks/profile-name', {  className: "profileName"}]
                          ],],
                      ],],
                      ['mycustomblocks/profile-bio', { className: "profileBio"}]
    
                    ];
                return [
                    <div className={ className + " my-profile-editor"}>
                      <InnerBlocks template={TEMPLATE} />
                    </div>
                ];
            },
            save: props => {
                  const { paragraph,className } = props.attributes;
                return (
                    <div className={className + " my-profile"}><InnerBlocks.Content  /></div>
                );
            },
       

    Using templates, I'm able to place a mixture of custom and factory Gutenberg blocks inside columns! InnerBlocks aren't infallible: you can template-lock blocks so users cannot add more, but occasionally this creates issues. Also, custom block styles do not work on any block that contains an InnerBlocks area (yet). Perhaps this will change, but as of this writing, it hasn't.
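
    For reference, template locking is just another prop on InnerBlocks (shown here with the TEMPLATE from the example above):

    // "all" prevents inserting, removing, and moving blocks inside the template;
    // "insert" would still allow reordering the existing blocks
    <InnerBlocks template={ TEMPLATE } templateLock="all" />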



    Rule 3: Restricting dependent blocks

    Often you'll create a block that shouldn't appear in the block picker in the Gutenberg GUI. Any custom block can easily be restricted so it's only accessible from within a certain block type. In my previous example I had two custom blocks, mycustomblocks/profile-title and mycustomblocks/profile-name. These are very simple blocks, but I do not want them polluting my list of Gutenberg blocks. This only requires declaring the parent attribute:

    export default registerBlockType(
        'mycustomblocks/profile-title',
        {
            title: __( 'Profile Title', 'mycustomblocks' ),
        description: __( "This field is for the user profile's job title.", 'mycustomblocks' ),
            category: 'common',
            keywords: [
                __( 'text', 'mycustomblocks' ),
                __( 'MediaUpload', 'mycustomblocks' ),
                __( 'Message', 'mycustomblocks' ),
            ],
            parent: ['mycustomblocks/slideshow-slide'],

    See the parent flag? It really is that easy.



    Rule 4: Learn to use the custom toolbar and form fields

    To truly make a plugin feel native, you'll need to tap into the UX that Gutenberg uses (the toolbar and the sidebar of form fields) via InspectorControls and BlockControls. Again, Zac Gordon's JSForWordpress tutorial repo has a great example of each.

    InspectorControls appear in the sidebar when a block is selected. Zac Gordon has a nice tutorial on them, as does Eudes' Medium post. Also, be sure to see the official documentation on InspectorControls.

              <InspectorControls>
                  <PanelBody
                      title={ __( 'High Contrast', 'jsforwpblocks' ) }
                  >
                      <PanelRow>
                          <label
                              htmlFor="high-contrast-form-toggle"
                          >
                              { __( 'High Contrast', 'jsforwpblocks' ) }
                          </label>
                          <FormToggle
                              id="high-contrast-form-toggle"
                              label={ __( 'High Contrast', 'jsforwpblocks' ) }
                              checked={ highContrast }
                              onChange={ toggleHighContrast }
                          />
                      </PanelRow>
                  </PanelBody>
              </InspectorControls>

    BlockControls appear in the editable area of a block as an inline toolbar. Also, be sure to see the official documentation on toolbars and the inspector.

    <BlockControls>
        <AlignmentToolbar
            value={ textAlignment }
            onChange={ ( textAlignment ) => props.setAttributes( { textAlignment } ) }
        />
        <Toolbar>
            <Tooltip text={ __( 'High Contrast', 'jsforwpblocks' )  }>
                <Button
                    className={ classnames(
                        'components-icon-button',
                        'components-toolbar__control',
                        { 'is-active': highContrast },
                    ) }
                    onClick={ toggleHighContrast }
                >
                    {icons.contrast}
                </Button>
            </Tooltip>
        </Toolbar>
    </BlockControls>

    It will feel a little strange, but edit's return value is an array: the controls above, alongside the markup for the main editable area.


    The Return: Mac Pro 2019

    As a minor (I stress minor) pundit on all things Mac Pro after my definitive Mac Pro upgrade guide, I figure I should weigh in on the ever-expanding sea of opinions. For the first time in a very long time, WWDC really hit the right notes: the iPad is growing up, the Photos app is beautiful and even more compelling, iTunes is no more and finally broken apart, the Watch has a de-tethered experience, and macOS will now natively mirror the screen to an iPad. But Apple seemed to sense that the most important announcement was the return to the professional with the almighty Mac Pro.

    Apple stands smug

    Pictured: Apple feeling itself with the new Mac Pro. Don't be fooled by the monitor, it's 39.7 lbs/18.0 kg

    Apple delivered but for $$$$

    The presentation was oh-so-Apple-like, and then it wasn't. Apple talked big numbers, lots of numbers, the kind of numbers that make average people glaze over in boredom: 8k, 6k, 16x PCIe 3.0, billions of pixels, 2000 audio tracks, GPUs, multiple GPUs. It's enough to make someone throw up their hands and rhetorically ask: Who cares? But we care. We always have. There was a surreal moment when Apple showed how the case opened. They invited you inside, and look, there are slots! So many slots. What-in-the-name-of-Ives was going on?

    The new Mac Pro is a monster, there's no other way to say it:

    • Up to 28 Cores
    • Up to 1.5TB of RAM
    • Eight PCIe slots
      • Two "MPX" slots with Thunderbolt 3 passthrough (optional 16x ports if the slots aren't obfuscated with large cards)
      • 3 full length slots (1 16x, two 8x)
      • 1 half length slot (8x) with I/O connectivity from factory
    • Two SSD slots (unclear if natively NVMe)
    • Two 10Gb Enet
    • Two USB-C / Thunderbolt 3 / two USB-A ports front facing
    • Two Thunderbolt 3 ports on I/O card
    • Headphone jack, internal speaker
    • 802.11ac/Bluetooth 5
    • Custom additional co-processor for video that allows for three 8k streams to be played back at once.

    It's an absolute beast of a computer. Dare I say, this may very well be the best-designed desktop Apple has done. Visually, it may be a little too avant-garde. It's alien-looking but with a very clear nod and a wink to the cheese grater. It's built to last, just like how my 3,1 Mac Pro still works a decade later. The hitch is the entry price of $5999... ouch. Oh, and that pretty monitor? $4999, with a very understandable groan from the audience when the $999 price of the stand was announced. The monitor is too much, but I guess it's time to start saving for the Mac Pro. It's been a very long time since I've had this level of interest in a computer.

    I've been following social media conversations and have some additional thoughts that I felt necessary to expand on.

    More thoughts on price

    The Mac Pro 2019 was what the people wanted. Xeons are more expensive than ever, with the CPU itself making up roughly a third of the price even in the entry model. The Mac Pro 2019 is also the most upgradable Mac we've seen on several counts. The PowerMac 9600, a monster in its own right, had six PCI slots, twelve RAM slots, three 5.25-inch drive bays, SCSI, Ethernet and a serial port, debuting in 1997 at $3,700 (roughly $5,800 adjusted for 2019). The 2019 Mac Pro may lack the drive bays (only two SSD slots), but it has eight PCIe slots plus an additional four Thunderbolt 3 ports. Let that sink in: Thunderbolt 3 adds the rough equivalent of four more x4 PCIe slots, so it's better to think of the Mac Pro 2019 as having twelve PCIe slots. It has effectively double the physical PCIe slots of the 2006-2012 Mac Pro, triple if you count Thunderbolt 3, and far more CPU configuration options despite having a single socket (which can scale to 28 cores, 56 virtual cores). You are effectively getting double the computer of the previous-generation Mac Pro. It's expensive. Really expensive. But go look at other Xeon workstations; it's price competitive. You wanted upgradable? Here it is.

    Lastly, part of the price-hike problem is wage stagnation for most workers. This isn't to say the problem isn't with Apple's price point, as even in 1997 a $3,700 computer was a bitter pill to swallow, but rather that it's exacerbated by the lack of meaningful raises that would make investments like this more attainable. Mac users also haven't kept up with current Xeon prices; they are damn expensive, and there isn't really an affordable middle ground between an Intel i9 and the Xeons, sadly. What a bulk of users wanted (self included) was a $3,000 Mac Pro. The i9 line certainly provides the CPU horsepower but is handicapped in the I/O department, with only about 40 lanes of PCIe and a maximum of 128 GB of memory. Even at its maximum of 68 lanes it's certainly beefy enough, but it's still locked to 128 GB of RAM, less than the 2010 Mac Pro.

    As for the monitor? Nope, I can't justify it. Charging $1,000 for the stand is not a good look. While a 6k true 10-bit display is impressive, I'm sure within two years we'll see equal displays for less money. Adding insult to wallet injury, Apple had the gall to suggest connecting six of these monitors to the Mac Pro, $36,000 worth of displays. Riiiight.

    Thoughts on GPUs

    If you hoped for NVidia support, well.... you placed your hopes on the wrong company. Apple and NVidia have not reconciled. There was absolutely no reason to expect them to resolve their issues. Expect to remain disappointed for the foreseeable future. I want NVidia as much as anyone, but it's not happening. Should Apple allow NVidia GPUs? Of course. Will they? Probably not. I've been wrong about a good many things though.

    Thoughts on chipset

    I'm not crazy about the Xeon price tags, and I've seen several people arguing that Apple should have gone with AMD's Threadripper series. So far the Threadrippers are limited to a max of 128 GB of RAM, 64 PCIe lanes and no Thunderbolt 3. While personally these specs would be fine for me on a desktop (in fact I'd like a more modest config for affordability), they aren't for the target market: too few PCIe lanes, not enough RAM, and no Thunderbolt 3 is a deal breaker.

    Also, the commitment to Xeons lends credence to a personal theory of mine: the Mac lineup might end up split across CPU architectures. Windows already is. The MacBook and MacBook Air line could end up on ARM, where ARM's more limited I/O matters less and the size constraints mean an Apple GPU would be more viable than Intel's built-in offerings. Is this true? Who-the-hell-knows outside of Infinite Loop, but Apple speculation is a pastime, and I figure I should put this in writing just to see how accurate I really am.

    Thoughts on I/O chipset

    Now here's one that threw me for a loop, and I'm surprised no one I've seen has talked about it: two of the Thunderbolt 3 ports live on the I/O card. This means (possibly) that more Thunderbolt 3 ports could be added to the computer, or better, that whatever I/O comes next (Thunderbolt 4?) could be added down the road. This certainly improves the shelf life.

    Thoughts on Power Draw

    It's an easy target for people to skewer Apple on, but GPUs need juice, and their power requirements aren't going down anytime soon with 8k and VR. Perhaps a focus on performance-per-watt will soon become necessary in the desktop arena, as watts in equals heat out, creating issues for everyday users; we haven't seen the worst of it yet. The real question is the idling power draw, which is bound to be much lower than the maximum.

    Thoughts on PCIe 3.0 / DDR4

    A few people are angry about the lack of PCIe 4.0 and the use of DDR4. This one is easy: PCIe 4.0 isn't a shipping spec and Intel may skip it altogether, and DDR5 is still more than a year out; waiting for either would have delayed the Mac Pro even further. The real test is whether Apple will update to PCIe 4.0 and DDR5 when the time comes. If you hoped for Thunderbolt 4, it isn't even a thing yet.

    Thoughts on looks

    I find it strange, not hideous, but beauty is in the eye of the beholder. I won't lie; it made me smile that they brought back the cheesegrater motif. As long as it's quiet and doesn't glow with neon LEDs, I'm happy. Maybe I'm the odd man out, but my Mac Pro sits on the floor next to my favorite piece of furniture, a drab grey McDowell & Craig 1940s all-American, all-steel desk that weighs roughly 200 pounds and that I inherited from my grandfather. It'll look just fine next to it, but to each their own.


    The Definitive Trashcan Mac Pro 6.1 (Late 2013) Upgrade Guide

    Mac Pro 2013 is Oscar's home

    Contents

    Introduction

    To mark the first anniversary of my wildly successful blog post (garnering tens of thousands of views), The Definitive Classic Mac Pro (2006-2012) Upgrade Guide, I'm proud to announce a sequel. The Definitive Trashcan Mac Pro 2013 Upgrade Guide started in jest on social media as the guide no one wanted, seeing as the Mac Pro 2013 is kind of a joke in itself: it over-promised, under-delivered, and is considerably less upgradeable than its predecessor. Is there a need or demand for such a guide? I don't know, but here we are, and while the origins are jocular, the rest of this guide is serious. While most users (and Apple engineers) probably prefer the moniker "cylinder," the trash can title stuck due to its obvious physical characteristics.

    The Mac Pro 2013 has the dubious honor of being the longest-produced Macintosh, besting the Macintosh Plus, which was produced from 1986 to 1990 without an upgrade. The 2013 Mac Pro was conceived as the successor to the original Mac Pro, eschewing modularity for a (debatably) stylish and certainly radical redesign. After a few positive reactions from publications for its foreign looks, it was quickly snubbed for its lack of upgradability and stability, and for Apple's complete and absolute antipathy (verging on enmity) towards it.

    The Mac Pro 2013 has been prone to high rates of failure due to heat, with an unnamed Apple exec quoted as saying, "think we designed ourselves into a bit of a thermal corner if you will". Apple also took steps to extend its repair program, but problems persist. Despite the naysayers, the Mac Pro 2013 isn't without its fans (no pun intended), as at the time of its unveiling it was a powerful, quirky computer in a diminutive form factor. Despite its limited upgradability, the computer is a modular design, and nearly every part of significance can be replaced. No Mac produced after it has allowed for the same range of upgrades (although the iMac 5k is a close second). It's the bridge to a bygone era, when CPUs and storage and even GPUs were removable. Perhaps the 2019 Mac Pro will be a return to PCIe, but more than likely, the 2013 will be the template. Edit: The Mac Pro 2019 marks an expensive return to PCIe.




    Know your Mac Pro Models

    The Mac Pro line debuted in 2006 and has had six major iterations by Apple's nomenclature: 1.1, 2.1, 3.1, 4.1, 5.1, and 6.1. These are also generally referred to by year: 2006 (1,1 and 2,1), less commonly 2007 (2,1), 2008 (3,1), 2009 (4,1), 2010-2012 (5,1) and 2013 (6,1). The other terms for these computers are divided between "Cheesegrater" (2006-2012) and "Trash can" or "Cylinder" (late 2013). For the purpose of this guide, I will refer to the Mac Pro "trash can" as the 2013 (as does much of the internet).

    Please note: this guide only covers the 2013 Mac Pro. For all other models, I've written a massive guide, The Definitive Classic Mac Pro (2006-2012) Upgrade Guide.

    Configurations

    Apple has only shipped a grand total of three base configurations, with a fourth build-to-order option for the 12-core CPU. Apple has made only one minor change to the Mac Pro 2013 in the past six years: removing the original base configuration and lowering the prices of the remaining models.

    • Apple Mac Pro "Quad Core" 3.7 GHz, 12 GB of RAM, 256 GB SSD, and dual FirePro D300 2 GB of GDDR5 (4 GB total). Discontinued April 4, 2017*
    • Apple Mac Pro "Six Core" 3.7 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D500 3 GB of GDDR5 (6 GB total). Discontinued April 4, 2017*
    • Apple Mac Pro "Eight Core" 3.0 GHz, 12 GB of RAM (16 GB after April 4th), 2256 GB SSD, and dual FirePro D500 6 GB of GDDR5 (12 GB total).
    • Apple Mac Pro "Twelve Core"* 2.7 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D500 6 GB of GDDR5 (12 GB total). This is a build to order option only.



    CPU Upgrades

    Apple has never acknowledged the upgradability of the Mac Pro's CPU, but the Mac Pro 2013's CPU is not soldered in, thus making it upgradeable. Only four CPU configurations were offered by Apple (E5-1620 v2, E5-1650 v2, E5-1680 v2, and E5-2697 v2), but users soon discovered that the rest of the E5 v2 family is compatible. Unlike the previous Mac Pros, the Mac Pro 2013 was only offered in a single-socket configuration.

    From personal observation, the E5-2697 can be found cheaper on eBay and local used markets (in the US), whereas the E5-2695 is considerably cheaper on AliExpress. This varies based on your local market, as European markets tend to be much more expensive than North America.

    Credit to the CPU list goes to Mac Rumors forum member ActionableMango.

    Architecture Cores CPU Model Base GHz Turbo GHz RAM MHz TDP
    Ivy-Bridge 12 core E5-2697 V2 2.7 3.5 1866 130W
    Ivy-Bridge 12 core E5-2696 V2 2.5 3.3 1866 130W
    Ivy-Bridge 12 core E5-2695 V2 2.4 3.2 1866 115W
    Ivy-Bridge 10 core E5-2690 V2 3.0 3.6 1866 130W
    Ivy-Bridge 10 core E5-2680 V2 2.8 3.6 1866 115W
    Ivy-Bridge 8 core E5-2687W V2 3.4 4.0 1866 150W
    Ivy-Bridge 8 core E5-2667 V2 3.3 4.0 1866 130W
    Ivy-Bridge 8 core E5-2673 V2 3.3 4.0 1866 110W
    Ivy-Bridge 8 core E5-1680 V2 3.0 3.9 1866 130W
    Ivy-Bridge 6 core E5-1660 V2 3.7 4.0 1866 130W
    Ivy-Bridge 6 core E5-1650 V2 3.5 3.9 1866 130W
    Ivy-Bridge 4 core E5-1620 V2 3.7 3.9 1866 130W

    Useful Links




    GPU Upgrades

    Yes, the Mac Pro 2013's GPUs can be swapped out, but only three different GPUs were ever produced for it: the AMD FirePro D300 2 GB, D500 3 GB, and D700 6 GB. Apple has kept tight control over these (any official repair requires the GPUs to be returned to Apple), thus few-to-none exist on the aftermarket, and the two higher-end GPUs are prone to failures thanks to a wattage ceiling. For most intents and purposes, it is cheaper to buy a Mac Pro 2013 than to track down two GPUs. Apple discontinued the entry-level Mac Pro 2013 that sported the D300; all new Mac Pros sold after April 4th, 2017, have either a D500 or D700.

    For other GPU options, see the eGPU section.

    Useful Links




    OS Upgrades

    Currently, the Mac Pro 2013 is still supported hardware (as it should be, given Apple only stopped selling it in 2019), but its relatively low sales likely mean it may be dropped in a future macOS update. It can run macOS 10.15 Catalina but does not support Sidecar (as of yet).

    Notably, 32-bit binaries are no longer executable under Catalina, meaning users of legacy software should really check compatibility before upgrading.




    Firmware upgrades

    The Mac Pro 2013 has had a few firmware upgrades. Unlike previous Mac Pros, where firmware upgrades enabled faster CPUs/RAM, APFS, and NVMe booting on certain models, the Mac Pro 2013's updates have been more meager. The MP61.0120.B00 boot ROM added support for NVMe booting (found in the High Sierra update). More recently, boot ROM version 128.0.0.0.0 was included in the 10.14.4 Developer Preview. With some firmware upgrades, some users found their 4k displays no longer running at 60 Hz, which requires an SMC reset and removing the offending PLists; see the useful links below. Previously the updates were distributed separately from the OS, but in 10.13+ they were folded into OS updates. The current boot ROM version is 131.0.0.0.0 and ships with 10.14.6.

    Notably, some users cannot update the boot ROM without the original Apple SSD installed. It's recommended to hang onto the original SSD with a copy of macOS on it to perform firmware updates.

    To check your firmware version, go to About This Mac -> System Report; it will be listed on the first screen under Boot ROM.

    Useful Links




    Storage Upgrades

    There's a large number of external storage upgrades for the Mac Pro 2013, from USB 2.0/3.0 to Thunderbolt 2, and listing them all would be an exercise in futility. What's important to understand is that there are many multi-drive enclosures, spanning everything from RAID to multiple SSDs. External SSDs perform well over Thunderbolt 2, able to achieve roughly 1.2 GB/s in various tests, depending on the storage solution.

    Internally, the Mac Pro does feature one SSD slot, using a custom Apple SSD running at PCIe 2.0 x4, capable of a maximum of 2 GB/s. Very few native third-party solutions exist, but they are out there, from makers like OWC and Transintl.

    That said... users have figured out how to shoe-horn NVMe drives into the Mac Pro, offering top-tier performance at much better prices. Unfortunately, no one has taken the time to compile a full list, so the drives known to work so far are: Samsung 960, Samsung 970 Pro, Toshiba XG3, and Crucial P1. Samsung has released a firmware fix for certain models as well, including the 970 Pro.

    The Mac Pro 2013 uses the same interface as the 2013-2015 MacBooks, and there's now a cottage economy of NVMe adapters floating around. The first adapters users tackled, such as the GFF M.2 PCIe SSD card, required a bit of filing and tape to successfully mount the card, which users on MacRumors were able to pull off (see the thread "NVMe with ST-NGFF2013-C; Vega Internal GPU; Mac Pro 2013 (6,1)"). Later adapters, like the Sintech NGFF M.2 NVMe SSD adapter, do not require modification. The quick summary: you'll need a Mac Pro running 10.13+, a Sintech adapter, and an NVMe SSD; if you for some reason choose the GFF adapter, you'll also need tape, a file and some free time.

    Currently, the only vector for multiple M.2 NVMe drives internally is the Amfeltec Angelshark Carrier Board. This keeps the original port intact and thus allows for three internal NVMe drives.

    Working SSD list

    This list is from MacRumors by the user maxthackray, so all credit goes to him.

    • Adata NVMe SSD : SX6000, SX7000, SX8200, SX8200 Pro etc.
    • Corsair NVMe SSD : MP500, MP510
    • Crucial NVMe SSD : P1
    • HP NVMe SSD : ex920, ex950
    • OCZ RD400 (and all Toshiba XG3-XG4-XG5-XG5p-XG6 line)
    • Intel NVMe SSD : 600p, 660p, 760p etc.
    • MyDigital NVMe SSDs : SBX - BPX
    • Kingston NVMe SSD : A1000, A2000, KC1000
    • Sabrent Rocket
    • Samsung Polaris NVMe SSD : 960 Evo, 960 Pro, 970 Evo, 970 Pro
    • WD Black NVMe SSD v1, v2 and v3

    Note: some drives (marked in red in the original list) ship with 4K sector sizes, which need to be changed before they will work.

    Incompatible NVMes

    • Samsung PM981
    • Samsung 950 Pro
    • Samsung 970 Evo Plus*

    *Firmware update fixes this particular SSD

    Useful Links




    RAM/Memory upgrades

    Officially, most sites list the maximum RAM for the 2013 as 128 GB. The Mac Pro 2013 uses PC3-15000 DDR3 ECC (1866 MHz) RAM, with four RAM slots; the maximum DIMM size is 32 GB. Maxing out the RAM can be a somewhat pricey endeavor, but sites like AliExpress and eBay mean this can be done for under $450 USD.




    ThunderBolt 2 to PCIe

    There's a fair number of options on the market today, like the Sonnet Technologies Echo Express SE1 with a single PCIe slot (roughly $200), and pricing scales up rather quickly from there.

    The biggest modifications to the Mac Pro 2013 aren't internal, but rather massive PCIe enclosures that generally cost in the $1500-4000 range, making them often as expensive as the computer itself. There are a few options on the market, like the Sonnet xMac Pro Server, which adds three full-length PCIe slots (you can see it on YouTube), and the absolutely absurd JMR Quad Slot Expander, adding four PCIe slots and eight drive bays, just to name a few. For the truly curious, you can see the JMR expansion system innards.

    Not all PCIe enclosures support eGPUs. I've included a list of enclosures that support GPUs in the eGPU section.

    Additional Notes on Thunderbolt 2

    There's a wide variety of Thunderbolt 2 products still on the market, chiefly storage systems (including RAID setups) and Thunderbolt 2 docks. Due to the sheer number I'm unable to list them all, but it's important to remember that a fair amount of the functionality missing from the 2013 can be recaptured over Thunderbolt 2: as previously mentioned, PCIe slots, eGPUs and the like.

    The Mac Pro 2013 includes six Thunderbolt ports, the most found on any Mac before or since. To obtain peak performance, it's recommended that displays be connected separately from other high-bandwidth devices like external storage.

    The Mac Pro 2013 can natively drive three 4k displays or six 2560 x 1600 displays, and with the June 16, 2015 firmware update, three 5k displays (using two Thunderbolt ports and the HDMI port).




    Thunderbolt 3 / USB 3.1c

    The Mac Pro 2013 can't be upgraded to Thunderbolt 3 bus speeds, but that doesn't mean it can't use Thunderbolt 3 / USB 3.1c devices (at the speed of Thunderbolt 2). Apple has a Thunderbolt 3 (USB-C) to Thunderbolt 2 Adapter, which is bi-directional, meaning the same adapter can also be used for Thunderbolt 3 Macs to use Thunderbolt 2 devices. Notably, not all Thunderbolt 3 devices are backward compatible, so you may want to check with the manufacturer for compatibility.




    eGPUs

    It's nearly impossible to talk about the Mac Pro 2013 without mentioning eGPUs. macOS now supports AMD eGPUs (almost) natively, and macOS 10.14.x does not allow for modern NVidia support, making AMD nearly the only path for eGPUs. NVidia support for later eGPUs is limited to a maximum of Mac OS 10.13.x, and that does not appear to be changing due to a disagreement between Apple and NVidia. Unless this changes, this guide will not list Mojave-incompatible NVidia eGPUs, despite the later GPUs being supported in Mac OS 10.12.x and 10.13.x. Currently, the RX line (580, 570) and the Vega line (Vega 48, 56, FE) by AMD are Mojave compatible, as is the Kepler line by NVidia. The eGPU.io community has a searchable database. If going for an eGPU, I highly recommend upgrading to Mac OS 10.13+ as it includes more native support and is thus much easier to set up, to the point of being (nearly) plug and play.

    Note: All Thunderbolt 2 Macs require disabling SIP and running Purge Wrangler to enable eGPU support.

    macOS Supported AMD eGPUs (* = 10.13 required)

    • Vega FE*
    • RX Vega 64 Liquid*
    • RX Vega 64*
    • Vega 56*
    • Pro WX 7100
    • Pro WX 5100
    • Pro WX 4100
    • RX 580
    • RX 570
    • RX 560
    • R9 Fury X
    • RX 480
    • RX 470
    • RX 460

    macOS 10.14 Mojave Supported NVidia eGPUs - Only Kepler series GPUs are supported

    • GTX 650
    • GTX 660
    • GTX 670
    • GTX 680
    • GTX Titan

    *eGPUs require Mac OS 10.12 or above.

    Confirmed working Enclosures with Mac Pro 2013

    • Akitio Thunder2
    • AKiTiO Node
    • Asus XG Station 2
    • Blackmagic eGPU
    • Mantiz Venus
    • Razer Core X
    • Sonnet Breakaway 350

    Useful Links




    Cooling

    Outside of the extreme JMR PCIe-slot rackmount cases, Mac Pro 2013 cooling solutions remain pretty slim. Most users elect to place various laptop cooling pads under their Mac Pros (which do seem to help). If anyone has any information about physical mods or Mac Pro 2013 specialty cases, I'm all ears; please reach out to me (see the bottom of this post).

    Useful Links




    Repairs

    The Mac Pro 2013 earns the distinction of sporting a modular design. There's not a lot to say here since iFixit gave it an 8 out of 10 for repairability and has pretty much every part in its Mac Pro Late 2013 Repair Guide. If you can do it, they probably have a beautiful step-by-step pictorial guide.




    Mac Pro 2013 won't sleep

    MacRumors members note that Handoff can affect a 2013's ability to sleep. Disabling it seems to be the fix.




    Communities & Blogs

    You're not alone. There are more people out there than you'd think who still love the Mac Pro 2013.

    • MacRumors Mac Pro Forum - The center of the Mac Pro universe.
    • MacProUpgrade - A private but very popular Facebook group, primarily classic "Cheesegrater" Mac Pro users with some 2013 users.
    • Mac Pro Users - Another major Facebook group for Mac Pro users; smaller, but still helpful, and it has the benefit of being public (no sign-up process, and it can be browsed without a Facebook account).
    • eGPU.io - The go-to place for eGPUs.



    Collected Articles




    Buying used Mac Pro 2013s

    The answer on most forums when this question is posed is: don't. The updated Mac Mini may have a soldered-on CPU and storage, but its Core i7-8700B is much faster than the 12-core Mac Pro in single-core performance and within spitting distance in multi-core Geekbench scores. It packs Thunderbolt 3, which means double the bandwidth for the inevitable eGPU, comes with USB 3.1c support out of the box, and doesn't have a history of frying itself. Plus, it's new, comes with a warranty and is even smaller. Then there's the iMac 5k, which has an upgradeable CPU and, when tricked out, is faster than the base iMac Pro. I personally would not buy a Mac Pro 2013 when much better and cheaper alternatives exist: the 2009-2012 Mac Pros, which pack oodles more upgrades and stupidly better GPU options, or the aforementioned Mac Mini, which even with an eGPU would cost roughly the same as a lower-end used 2013. Unless used market prices change drastically, the Mac Pro 2013's shortcomings are too significant for me to ever consider one.

    If you do decide to buy one, a few notes. If the computer is booting, the GPUs are fine (it may have had its GPUs replaced with working ones). Next, the lower the AMD GPU model, the better the chance it will remain problem-free; unfortunately, Apple stopped selling the D300 Mac Pros long ago, so you're better off tracking down a D500 model. Next up, many users have placed their Mac Pro 2013s on laptop coolers to help with the thermals; due to the exceptionally tiny case, there are no internal cooling hacks beyond turning the fan up using 3rd-party software. Lastly, have an exit strategy: you may live a full, problem-free existence with a 2013 Mac Pro, but you may also end up with its GPUs failing. Apple closed its free GPU replacement program as of April 2018, and internet prices list anywhere from $700-$1200 from Apple or authorized service centers to replace the GPUs. At that price, it is effectively cheaper to buy a replacement Mac Mini. Working GPUs in the 3rd-party sector are virtually impossible to find, and the rare ones that pop up fetch the price of Apple replacements. To be fair, this is the same problem laptop users face. While it is common sense, if you contract or freelance or otherwise provide your own hardware, always have a plan that minimizes downtime. Despite being a modular design, the most failure-prone component is the absolute hardest to replace due to the lack of any inventory, and Apple quotes 3-5 days for a Mac Pro 2013 GPU replacement. This isn't to say it will fail, but there are plenty of horror stories on the internet. These could be coming from a relatively small, vocal group, but the general consensus is that the Mac Pro 2013 is not the most stable design.




    Changelog

    Oscar over the Mac Pro 2013

    Due to the ever-evolving list of possible upgrades and hacks, this guide is a living document, and the information contained may change; I've included a robust log of recent changes to help repeat visitors discover new content. Making and maintaining this guide takes a fair amount of work, and feedback from users is greatly appreciated to make this the most accurate/best guide possible. If you have new information not included here, suggestions, corrections or edits, please feel free to contact me at: blog@greggant.com. I get a fair amount of questions, and I try to answer them as best I can. I'd recommend asking the MacRumors forum or MacProUpgrade group first, as I'm just one person vs. the collective intelligence of a community. Notably, I do not own nor have I ever owned a Mac Pro 2013 (not that I wouldn't take one, but it is cost prohibitive), so anyone who can provide more accurate information, please do!

    10/15/19 - Added note on Catalina and 32-bit + firmware versions. Badly needed copy editing.

    10/07/19 - It's catalina time. Added OS Section, fixed an error about max RAM, included RAM specs, included link to the Amfeltec NVMe M2 adapter. Added another two links to eGPU section.

    07/05/19 - Added notes on sleep issues, mild intro update.

    05/07/19 - A second update. Thanks to Brennan F and Daniel C for feedback on SSDs and eGPUs, and some copy editing to boot.

    05/07/19 - First release, on the one-year anniversary of my first Definitive Mac Pro Upgrade Guide. Fun fact: this guide is 2,300+ words, whereas my other guide is 13,000+ words. Part of that guide's length can be chalked up to having to discuss different models, five in total, spanning six years. This guide covers another six-year span and only one model. It goes to show how upgradeable the previous Mac Pros were and how much less Apple has cared about them since.