The Definitive Trashcan Mac Pro 6.1 (Late 2013) Upgrade Guide

    Mac Pro 2013 is Oscar's home



    To mark the first anniversary of my wildly successful blog post (garnering tens of thousands of views), The Definitive Classic Mac Pro (2006-2012) Upgrade Guide, I'm proud to announce a sequel. The Definitive Trash Can Mac Pro 2013 Upgrade Guide started in jest on social media as the guide no one wanted, seeing as the Mac Pro 2013 is itself something of a joke: it over-promised, under-delivered, and is considerably less upgradeable than its predecessor. Is there a need or demand for such a guide? I don't know, but here we are, and while the origins are jocular, the rest of this guide is serious. While most users (and Apple engineers) probably prefer the moniker "cylinder," the trash can title stuck due to the computer's obvious physical characteristics.

    The Mac Pro 2013 has the dubious honor of being the longest-produced Macintosh, besting the Macintosh Plus, which was sold from 1986 to 1990 without an upgrade. The 2013 Mac Pro was conceived as the successor to the original Mac Pro, eschewing modularity for a (debatably) stylish and certainly radical redesign. After a few positive reactions from publications for its alien looks, it was quickly snubbed for its lack of upgradability, its stability problems, and Apple's complete and absolute antipathy (verging on enmity) toward it.

    The Mac Pro 2013 has been prone to an abnormal rate of failures due to heat, with Apple's Craig Federighi famously admitting, "I think we designed ourselves into a bit of a thermal corner, if you will." Apple also took steps to extend its repair program, but problems persist. Despite the naysayers, the Mac Pro 2013 isn't without its fans (no pun intended), as at the time of its unveiling, it was a powerful, quirky computer in a diminutive form factor. Despite its limited upgradability, the computer is a modular design, and nearly every part of significance can be replaced. No Mac produced after it has allowed for the same range of upgrades (although the iMac 5k is a close second). It's the bridge to a bygone era, where CPUs, storage, and even GPUs were removable. Perhaps the 2019 Mac Pro will mark a return to PCIe, but more than likely, the 2013 will be the template. Edit: The Mac Pro 2019 indeed marks a (very expensive) return to PCIe.

    Know your Mac Pro Models

    The Mac Pro line debuted in 2006 and has had six major iterations by Apple's nomenclature: 1,1; 2,1; 3,1; 4,1; 5,1; and 6,1. These are also generally referred to by year: 2006 (1,1), less commonly 2007 (2,1), 2008 (3,1), 2009 (4,1), 2010-2012 (5,1), and 2013 (6,1). The other terms for these computers are divided between "Cheesegrater" (2006-2012) and "Trash can" (Late 2013) or "Cylinder". For the purpose of this guide, I will refer to the Mac Pro "trash can" as the 2013 (as does much of the internet).

    Please note: This guide only covers the 2013 Mac Pro. For all other models, I've written a massive guide, The Definitive Classic Mac Pro (2006-2012) Upgrade Guide.


    Apple has only shipped a grand total of three base configurations, with a fourth build-to-order option for the 12-core CPU. Apple has made only one minor change to the Mac Pro 2013 in the past six years: removing the original base configuration and lowering the prices of the remaining models.

    • Apple Mac Pro "Quad Core" 3.7 GHz, 12 GB of RAM, 256 GB SSD, and dual FirePro D300 2 GB of GDDR5 (4 GB total). Discontinued April 4, 2017.
    • Apple Mac Pro "Six Core" 3.5 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D500 3 GB of GDDR5 (6 GB total).
    • Apple Mac Pro "Eight Core" 3.0 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D700 6 GB of GDDR5 (12 GB total).
    • Apple Mac Pro "Twelve Core" 2.7 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D700 6 GB of GDDR5 (12 GB total). This is a build-to-order option only.

    CPU Upgrades

    Apple has never acknowledged the upgradability of the Mac Pro's CPU, but the Mac Pro 2013's CPU is not soldered in, thus making it upgradeable. Only four CPU configurations were offered by Apple (E5-1620v2, E5-1650v2, E5-1680v2, and E5-2697v2), but users soon discovered that much of the E5 v2 family is compatible. Unlike previous Mac Pros, the Mac Pro 2013 was only offered with a single CPU socket.

    From personal observation, the E5-2697 can be found more cheaply on eBay and in local used markets (in the US), whereas the E5-2695 is considerably cheaper on AliExpress. This varies based on your local market, as European markets tend to be much more expensive than North American ones.

    Credit to the CPU list goes to Mac Rumors forum member ActionableMango.

    Architecture Cores CPU-Model GHz Turbo RAM Watt
    Ivy-Bridge 12 core E5-2697 V2 2.7 3.5 1866 130W
    Ivy-Bridge 12 core E5-2696 V2 2.5 3.3 1866 130W
    Ivy-Bridge 12 core E5-2695 V2 2.4 3.2 1866 115W
    Ivy-Bridge 10 core E5-2690 V2 3.0 3.6 1866 130W
    Ivy-Bridge 10 core E5-2680 V2 2.8 3.6 1866 115W
    Ivy-Bridge 8 core E5-2687W V2 3.4 4.0 1866 150W
    Ivy-Bridge 8 core E5-2667 V2 3.3 4.0 1866 130W
    Ivy-Bridge 8 core E5-2673 V2 3.3 4.0 1866 110W
    Ivy-Bridge 8 core E5-1680 V2 3.0 3.9 1866 130W
    Ivy-Bridge 6 core E5-1660 V2 3.7 4.0 1866 130W
    Ivy-Bridge 6 core E5-1650 V2 3.5 3.9 1866 130W
    Ivy-Bridge 4 core E5-1620 V2 3.7 3.9 1866 130W

    Useful Links

    GPU Upgrades

    Yes, the Mac Pro's GPUs can be swapped out, but only three different GPUs were ever produced for it: the AMD FirePro D300 2 GB, D500 3 GB, and D700 6 GB. Apple has kept tight control on these (any official repair requires the GPUs to be returned to Apple), and thus few-to-none exist on the aftermarket, and the two higher-end GPUs are prone to failure thanks to a wattage ceiling. For all intents and purposes, it is cheaper to buy another Mac Pro 2013 than to track down two GPUs. Apple discontinued the entry-level Mac Pro 2013 that sported the D300; all new Mac Pros sold after April 4th, 2017, have either a D500 or D700.

    For other GPU options, see the eGPU section.

    Useful Links

    OS Upgrades

    Currently, the Mac Pro 2013 is still supported hardware (as it should be, given Apple only stopped selling it in 2019), but its relatively low sales mean it may be dropped in future macOS updates. It can run macOS 10.15 Catalina but does not support Sidecar (as of yet).

    Notably, under Catalina 32-bit binaries are no longer executable, meaning users of legacy software should really check compatibility before upgrading.

    Firmware upgrades

    The Mac Pro 2013 has had a few firmware upgrades. Unlike previous Mac Pros, where firmware upgrades enabled faster CPUs/RAM, APFS, and NVMe booting on certain models, the Mac Pro 2013's updates have been more meager. The MP61.0120.B00 boot ROM included support for NVMe booting (found in the High Sierra update). More recently, a newer boot ROM version was included in the 10.14.4 Developer Preview. With some firmware upgrades, some users found their 4k displays no longer supported 60 Hz, which requires an SMC reset and removing the offending plists; see the useful links below. Previously the updates were distributed separately from the OS, but in 10.13+ they were folded into OS updates. The current boot ROM version ships with 10.14.6.

    Notably, some users cannot update the boot ROM without the original Apple SSD installed. It's recommended to hang onto the original SSD with a copy of macOS on it to perform firmware updates.

    To check your firmware version, go to About This Mac -> System Report; it will be listed on the first screen under Boot ROM.
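
    If you prefer the Terminal, `system_profiler SPHardwareDataType` reports the same field. Below is a small Python sketch that extracts the version from that output; the sample text is illustrative, not captured from a real machine:

```python
import re

def boot_rom_version(report):
    """Pull the Boot ROM version out of `system_profiler SPHardwareDataType` text."""
    match = re.search(r"Boot ROM Version:\s*(\S+)", report)
    return match.group(1) if match else None

# Illustrative sample output (the version string here is just an example).
sample = """Hardware:
    Model Identifier: MacPro6,1
    Boot ROM Version: MP61.0120.B00
    SMC Version (system): 2.20f18
"""

print(boot_rom_version(sample))  # MP61.0120.B00
```

    On an actual Mac Pro you would feed the function the output of `subprocess.run(["system_profiler", "SPHardwareDataType"], capture_output=True, text=True).stdout`.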

    Useful Links

    Storage Upgrades

    There's a large number of external storage upgrades for the Mac Pro 2013, from USB 2.0/3.0 to Thunderbolt 2, and listing them all would be an exercise in futility. What's important to understand is that there are many multi-drive enclosures, spanning everything from RAID to multiple SSDs. External SSDs perform well over Thunderbolt 2, achieving roughly 1.2 GB/s in various tests depending on the storage solution.

    Internally, the Mac Pro does feature one SSD slot, using a custom Apple SSD running at PCIe 2.0 x4, capable of a maximum of 2 GB/s. Very few native third-party solutions exist, but they are out there, from makers like OWC and TransIntl.
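
    Both ceilings fall out of simple arithmetic: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding (so 80% of the raw bits carry data), and Thunderbolt 2 aggregates 20 Gbit/s per channel before protocol overhead, which is why real-world storage tests land around 1.2-1.4 GB/s rather than the theoretical figure. A quick back-of-the-envelope sketch:

```python
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> 80% of raw bits are data.
pcie2_lane_mb_s = 5e9 * (8 / 10) / 8 / 1e6   # 500 MB/s usable per lane
pcie2_x4_gb_s = pcie2_lane_mb_s * 4 / 1e3    # 2.0 GB/s for the x4 SSD slot

# Thunderbolt 2: 20 Gbit/s raw; protocol overhead plus the storage device
# itself drag real-world throughput down to roughly 1.2-1.4 GB/s.
tb2_raw_gb_s = 20e9 / 8 / 1e9                # 2.5 GB/s theoretical

print(pcie2_x4_gb_s, tb2_raw_gb_s)           # 2.0 2.5
```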

    That said... users have figured out how to shoehorn NVMe drives into the Mac Pro, offering top-tier performance at much better prices. Unfortunately, no one had taken the time to compile a list, so the known working drives so far are the Samsung 960, Samsung 970 Pro, Toshiba XG3, and Crucial P1. Samsung released a firmware fix for certain models as well, including the 970 Pro.

    The Mac Pro 2013 uses the same SSD interface as the 2013-2015 MacBooks. There's a cottage economy of NVMe adapters now floating around. The first adapters users tackled, such as the GFF M.2 PCIe SSD Card, required a bit of filing and tape to mount the card successfully, which users on MacRumors were able to pull off. Later adapters like the Sintech NGFF M.2 NVMe SSD adapter do not require modification. The quick summary: you'll need a Mac Pro running 10.13+, an NVMe SSD, and a Sintech adapter; if you for some reason choose the GFF adapter, you'll also need tape, a file, and some free time.

    Currently, the only route to multiple internal M.2 NVMe drives is the Amfeltec Angelshark Carrier Board. This keeps the original port intact and thus allows for three internal NVMe drives.

    Working SSD list

    This list comes from MacRumors user maxthackray, so all credit goes to him. Generally, it can be assumed that NVMe drives will work as long as they do not use 4K sectors by default.

    • Adata NVMe SSD : SX6000, SX7000, SX8200, SX8200 Pro etc.
    • Corsair NVMe SSD : MP500, MP510
    • Crucial NVMe SSD : P1
    • HP NVMe SSD : ex920, ex950
    • OCZ RD400 (and all Toshiba XG3-XG4-XG5-XG5p-XG6 line)
    • Intel NVMe SSD : 600p, 660p, 760p etc.
    • MyDigital NVMe SSDs : SBX - BPX
    • Kingston NVMe SSD : A1000, A2000, KC1000
    • Sabrent Rocket
    • Samsung's Polaris NVMe SSDs : 960 Evo, 960 Pro, 970 Evo, 970 Pro
    • WD Black NVMe SSD v1, v2 and v3

    Note: Some of the drives above (marked in red in the original MacRumors list) ship with 4K sector sizes, which must be changed before the drive will work.

    Incompatible NVMes

    • Samsung PM981
    • Samsung 950 Pro
    • Samsung 970 Evo Plus*

    *Firmware update fixes this particular SSD

    Useful Links

    RAM/Memory upgrades

    Officially, most sites list the maximum RAM for the 2013 as 128 GB. The Mac Pro 2013 uses PC3-14900 (often sold as PC3-15000) DDR3 ECC 1866 MHz RAM, with 4 RAM slots. The maximum DIMM size is 32 GB. Maxing out the RAM can be a somewhat pricey endeavor, but sites like AliExpress and eBay mean this can be done for under $450 USD.
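
    Neither number is magic: the 128 GB ceiling is just the slot count times the largest supported DIMM, and the PC3 label encodes the module's peak bandwidth (transfer rate in MT/s times 8 bytes per 64-bit transfer):

```python
slots, max_dimm_gb = 4, 32
print(slots * max_dimm_gb)  # 128 -> the 128 GB ceiling

# DDR3-1866 performs ~1866 million transfers/second at 8 bytes per transfer:
ddr3_1866_mb_s = 1866 * 8
print(ddr3_1866_mb_s)  # 14928 MB/s -> hence the "PC3-14900"/"PC3-15000" labels
```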

    ThunderBolt 2 to PCIe

    There's a fair number of options on the market today, like the Sonnet Technologies Echo Express SE1 with 1 PCIe slot (roughly $200), and pricing scales up rather quickly from there.

    The biggest modifications to the Mac Pro 2013 aren't internal, but rather massive PCIe enclosures that generally cost in the $1500-4000 range, making them often as expensive as the computer itself. There are a few options on the market, like the Sonnet xMac Pro Server, which adds 3 full-length PCIe slots (you can see it on YouTube), and the absolutely absurd JMR Quad Slot Expander, adding 4 PCIe slots and 8 drive bays, just to name a few. For the truly curious, you can see the JMR expansion system's innards.

    Not all PCIe enclosures support eGPUs. I've included a list of enclosures that support GPUs in the eGPU section.

    Additional Notes on Thunderbolt 2

    There's a wide variety of Thunderbolt 2 products still on the market, chiefly storage systems (including RAID setups) and Thunderbolt 2 docks. Due to the sheer number, I'm unable to list them all, but it's important to remember that a fair amount of the functionality missing from the 2013 can be recaptured with Thunderbolt 2: as previously mentioned, PCIe slots, eGPUs, and the like.

    The Mac Pro 2013 includes six Thunderbolt 2 ports (spread across three buses), the most found on any Mac before or since. To obtain peak performance, it's recommended that displays be connected separately from other high-bandwidth devices like external storage.

    The Mac Pro 2013 can drive three 4k displays or six 2560 x 1600 displays, and with the June 16, 2015 firmware update, three 5k displays (using two Thunderbolt ports and the HDMI port).

    Thunderbolt 3 / USB 3.1c

    The Mac Pro 2013 can't be upgraded to Thunderbolt 3 bus speeds, but that doesn't mean it can't use Thunderbolt 3 / USB 3.1c devices (at the speed of Thunderbolt 2). Apple has a Thunderbolt 3 (USB-C) to Thunderbolt 2 Adapter, which is bi-directional, meaning the same adapter can also be used for Thunderbolt 3 Macs to use Thunderbolt 2 devices. Notably, not all Thunderbolt 3 devices are backward compatible, so you may want to check with the manufacturer for compatibility.


    eGPUs

    It's nearly impossible to talk about the Mac Pro 2013 without mentioning eGPUs. macOS now supports AMD eGPUs (almost) natively, but macOS 10.14.x does not allow for modern NVidia support, making eGPU adoption nearly a one-way path. NVidia support for later eGPUs is limited to a maximum of macOS 10.13.x, and that does not appear to be changing due to a disagreement between Apple and NVidia. Unless this changes, this guide will not list Mojave-incompatible NVidia eGPUs, despite later GPUs being supported in macOS 10.12.x and 10.13.x. Currently, AMD's RX line (580x, 570x) and Vega line (Vega 48, 56, FE) are Mojave compatible, as is NVidia's Kepler line. The community has a searchable database. If going for an eGPU, I highly recommend upgrading to macOS 10.13+, as it includes more native support and is thus much easier to set up, to the point of being (nearly) plug and play.

    Note: All Thunderbolt 2 Macs require disabling SIP and running PurgeWrangler to enable eGPU support.

    Lastly, Catalina requires some changes for eGPUs, and I highly recommend reading State of eGPU for Macs - Catalina 10.15. The short answer: PurgeWrangler continues to be the most common vector for support.

    AMD GPUs

    Note: The minimum OS listed may not be correct; please contact me if you spot an error.

    AMD GPU Min OS Support Supports Metal
    R7 260X 10.12 - Curr Yes
    R9 270 10.12 - Curr Yes
    R9 280X 10.12 - Curr Yes
    R9 290X 10.12 - Curr Yes
    R9 380 10.12 - Curr Yes
    R9 380x 10.12 - Curr Yes
    R9 390 Requires hack Yes
    R9 Fury 10.12 - Curr Yes
    R9 Fury X 10.12 - Curr Yes
    Radeon 450 10.12 - Curr Yes
    Radeon 455 10.12 - Curr Yes
    Radeon 460 10.12 - Curr Yes
    Radeon 470 10.12.6 - Curr Yes
    Radeon 480 10.12.6 - Curr Yes
    Radeon 555 10.12.6 - Curr Yes
    Radeon 555x 10.12.6 - Curr Yes
    Radeon 560 10.12.6 - Curr Yes
    Radeon 560x 10.12.6 - Curr Yes
    Radeon 570 10.12.6 - Curr Yes
    Radeon 570x 10.12.6 - Curr Yes
    Radeon 580 10.12.6 - Curr Yes
    Radeon 580x 10.12.6 - Curr Yes
    Radeon Pro WX 2100 10.12 - Curr Yes
    Radeon Pro WX 3100 10.12 - Curr Yes
    Radeon Pro WX 4100 10.12 - Curr Yes
    Radeon Pro WX 4130 10.12 - Curr Yes
    Radeon Pro WX 4150 10.12 - Curr Yes
    Radeon Pro WX 4170 10.13? - Curr Yes
    Radeon Pro WX 5100 10.13? - Curr Yes
    Radeon Pro WX 7100 10.13? - Curr Yes
    Radeon Pro WX 8100 10.13? - Curr Yes
    Radeon Pro WX 9100 10.13? - Curr Yes
    Vega 56 10.12.6 - Curr Yes
    Vega 64 10.12.6 - Curr Yes
    Vega Frontier Edition 10.13 - Curr Yes
    Radeon VII 10.14.5 - Curr Yes
    Radeon 5500 XT 10.15.2 - Curr Yes
    Radeon 5600 XT 10.15.3 - Curr Yes
    Radeon 5700 10.15.2 - Curr Yes
    Radeon 5700 XT 10.15.2 - Curr Yes

    macOS 10.14 Mojave Supported NVidia eGPUs - Only Kepler series GPUs are supported

    • GTX 650
    • GTX 660
    • GTX 670
    • GTX 680
    • GTX Titan

    *eGPUs require Mac OS 10.12 or above.

    Confirmed working Enclosures with Mac Pro 2013

    • Akitio Thunder2
    • AKiTiO Node
    • Asus XG Station 2
    • Blackmagic eGPU
    • Mantiz Venus
    • Razer Core X
    • Sonnet Breakaway 350

    Useful Links


    Outside of the extreme JMR PCIe-slot rackmount cases, Mac Pro 2013 cooling solutions remain pretty slim. Most users elect to place various laptop cooling pads under their Mac Pros (which do seem to help). If anyone has any information about physical mods or Mac Pro 2013 specialty cases, I'm all ears; please reach out to me (see the bottom of this post).

    Useful Links


    The Mac Pro 2013 earns the distinction of sporting a modular design. There's not a lot to say here, since iFixit gave it an 8 out of 10 for repairability and covers pretty much every part in its Mac Pro Late 2013 Repair Guide. Whatever the repair, they probably have a beautiful step-by-step pictorial guide for it.

    Mac Pro 2013 won't sleep

    MacRumors members note that Handoff can affect a 2013's ability to sleep. Disabling it seems to be the fix.

    Communities & Blogs

    You're not alone. There are more people out there than you'd think who still love the Mac Pro 2013.

    • MacRumors Mac Pro Forum - The center of the Mac Pro universe.
    • MacProUpgrade - a private but very popular Facebook group, primarily classic "Cheesegrater" Mac Pro users with some 2013 users.
    • Mac Pro Users - another major Facebook group for Mac Pro users; smaller but still helpful, with the benefit of being public (no sign-up process, and it can be browsed without a Facebook account).
    • - The go-to place for eGPUs.

    Collected Articles

    Buying used Mac Pro 2013s

    Most forums, when this question is posed, answer: don't. The updated Mac Mini may have a soldered-on CPU and storage, but its Core i7-8700B is much faster than the 12-core Mac Pro in single-core performance and within spitting distance in multi-core Geekbench scores; it packs Thunderbolt 3, which offers double the bandwidth for the inevitable eGPU; it comes with USB 3.1c support out of the box; and it doesn't have a history of frying itself. Plus, it's new, comes with a warranty, and is even smaller. Then there's the iMac 5k, which has an upgradeable CPU, making it faster than the base iMac Pro when tricked out. I personally would not buy a Mac Pro 2013 when much better and cheaper alternatives exist. The 2009-2012 Mac Pros, which pack oodles more upgrades and wildly better GPU options, or the aforementioned Mac Mini, even with an eGPU, would cost roughly the same as a lower-end used 2013. Unless used-market prices drastically change, the Mac Pro 2013's shortcomings are too significant to make me ever consider one.

    If you must buy one, a few pointers. If the computer is booting, the GPUs are fine; it may have had its GPUs replaced with working ones. Next, the lower the AMD GPU model, the better the chance it will remain problem-free. Unfortunately, Apple stopped selling the D300 Mac Pros long ago, so you're better off tracking down a D500 model. Next up, many users have placed their Mac Pro 2013s on laptop coolers to help with thermals. Due to the exceptionally tiny case, there are no internal cooling hacks beyond turning the fans up using 3rd-party software.

    Lastly, have an exit strategy. You may live a full, problem-free existence with a 2013 Mac Pro, but you may also end up with its GPUs failing. Apple closed its free GPU replacement program in April 2018, and internet prices range anywhere from $700-$1200 from Apple or authorized service centers to replace the GPUs. At that price, it is effectively cheaper to buy a replacement Mac Mini. Working GPUs on the 3rd-party market are virtually impossible to find, and the rare ones that pop up fetch the price of Apple replacements. To be fair, this is the same problem laptop users face. While it is common sense: if you contract or freelance or otherwise provide your own hardware, always have a plan that minimizes downtime. Despite being a modular design, the most failure-prone component is the absolute hardest to replace due to the lack of any inventory. Also, Apple quotes 3-5 days for a Mac Pro 2013 GPU replacement. This isn't to say yours will fail, but there are plenty of horror stories on the internet. These could come from a relatively small, vocal group, but the general consensus is that the Mac Pro 2013 is not the most stable design.


    Oscar over the Mac Pro 2013

    Due to the ever-evolving list of possible upgrades and hacks, this guide is a living document; thus the information contained may change, and I've included a robust log of recent changes to help repeat visitors discover new content. Making and maintaining this guide takes a fair amount of work, and feedback from users is greatly appreciated to make this the most accurate guide possible. If you have new information not included here, suggestions, corrections, or edits, please feel free to contact me at: I get a fair amount of questions, and I try to answer them as best I can. I'd recommend asking the MacRumors forum or MacProUpgrade group first, as I'm just one person vs. the collective intelligence of a community. Notably, I do not own, nor have I ever owned, a Mac Pro 2013 (not that I wouldn't take one, but it is cost-prohibitive), so anyone who can provide more accurate information, please do!

    05/12/20 - Massive GPU list updated. SSD updated. Catalina notes on eGPU updated. Minor visual update.

    10/15/19 - Added note on Catalina and 32-bit + firmware versions. Badly needed copy editing.

    10/07/19 - It's Catalina time. Added OS section, fixed an error about max RAM, included RAM specs, included a link to the Amfeltec NVMe M.2 adapter. Added another two links to the eGPU section.

    07/05/19 - Added notes on sleep issues, mild intro update.

    05/07/19 - A second update. Thanks to Brennan F and Daniel C for feedback on SSDs and eGPUs, and some copy editing to boot.

    05/07/19 - First release, on the one-year anniversary of my first Definitive Mac Pro Upgrade Guide. Fun fact: this guide is 2,300+ words, whereas my other guide is 13,000+ words. Part of that writing can be chalked up to having to discuss five different models spanning 6 years; this guide covers another 6-year span and only one model. It goes to show how upgradeable the previous Mac Pros were and how much less Apple has cared about them since.

    12 weeks with Figma: A review from a developer

    Figma icon


    I think it's important to say what this review is and isn't, since it'll be a reflection of a certain perspective. Rather than tackle this as a full breakdown of all the features vs. Sketch, I'm approaching it from the perspective of a developer. As a front-end developer, I'm more hands-on with graphics utilities than most devs: capable of doing design myself, but electing to let people who are better at it do it. Early in my career, I'd have described myself as a designer who could code, but for the last six years it's been the coder who can design. Throughout my career, I've seen a lot of utilities for designing webpages, from PageMill, GoLive, FrontPage, and Dreamweaver to shoe-horned attempts with Photoshop, Illustrator, Affinity Designer, and even InDesign, Quark, and PageMaker. Name the asset, and I've probably been handed it. Name the design app, and more than likely I've toyed with it. Now that we've cleared that up...

    A bit of history

    Web design and UI design have followed a strange arc. Once upon a time, we had Photoshop and Fireworks, Fireworks being more a utility than a designing tool. Photoshop itself predates the internet as we know it today, debuting in 1990. For better or worse, Photoshop today still mostly feels like Photoshop 3.0, which introduced layers, although it wouldn't be until 5.0 that we had automation, layer effects, multiple undo, and editable (vector) type, joining 4.0 features like multicolor gradients, grids, PNG/PDF support, and actions... in 1996. To double down on this point: if you had stumbled into a time warp two decades ago and popped out in 2019, as a Photoshop user you'd have all the fundamentals. The internet, though, has changed wildly in scope and functionality during those same two decades, broadly replacing/supplanting entire industries (travel agents, music stores, print media, movie rental chains, and book stores, just to name a few). This isn't a knock against Photoshop. Its purpose is revealed in its name; it wasn't meant to be a UX utility. However, the early web's restriction to fixed-width designs and bitmaps made Photoshop the go-to tool for web designers.

    Macromedia (the more web-savvy of the two design giants), sensing the shortcomings of Photoshop (especially in the world of optimization and GIFs), filled in the gaps with Fireworks in 1998. Outside of the oddity of Flash websites, the playing field for designers didn't change much on the design-application side, despite the evolution and adoption of CSS 2.0 and CSS 2.1 from the late 90s through the 2000s. Some designers opted to use layout applications or Illustrator, with a desire to treat the webpage as a page, knowing the shortcomings of Photoshop for layout. As designers and developers, we entered a tacit agreement that 960px was the width of a webpage, which worked until we needed a mobile web.

    A New Challenger

    Like any large philosophical change, it is generally a reflection of adaptation to the state of the world rather than of attaining enlightenment first. Ideas do not exist in vacuums. It required a fundamental environmental change to create the necessity for a better way.

    In 2010, an unknown Dutch studio flying under the name Bohemian Coding released Sketch, the same year Apple introduced the world to high-density screens with the iPhone 4. The timing was impeccable. It was a vector app, but designed for presentation on pixels. While vector applications had for years acknowledged that not every illustration's endpoint was print, they were still print-forward. The simple act of snapping points to pixels, and focusing on vector features that reliably exported SVGs, meant that the (finally) widely supported vector format could generate assets that looked good anywhere or be prerendered to a PNG or JPEG. Values for common assets like font stacks and colors were also displayed as CSS, which made developers happy too. By 2013, high-density displays made the jump to desktops with Apple's "Retina" desktops. Websites didn't need to be just responsive; they also needed to be high-resolution*. Sketch made it easier to tackle both problems. Thus, it largely dethroned Photoshop for web design. Photoshop was now left to do what it did best: edit photos.

    * It shouldn't come as any surprise that backlash against skeuomorphic design came in the wake of widescale support for web fonts, SVGs and high-density displays.

    Since the rise of Sketch, there's been a cottage economy of ancillary utilities, from simple plugins to ambitious tools like Principle and Flinto for UX motion graphics/interaction design, all built around Sketch. The world of design and prototyping has exploded with tools like Adobe XD (and forgotten attempts like Muse and Proto), and the latest hip entry, Figma.

    All hail Figma?

    Figma Screenshot

    Figma seems like it might be as revolutionary as Sketch. Figma is a different beast: a web app built on some avant-garde tech, using WebAssembly to ship a compiled C++ core that makes it far more performant than the usual web app. Being a web app also opens it up to something Adobe never quite figured out: collaborative design. All designs are automatically stored in the cloud, which isn't a problem until it is. Instead of handing a client a PSD or Sketch file (or using a 3rd-party service like Zeplin or iDoc), you share a Figma URL. If they don't have Figma, no worries; the viewing mode works just fine and allows for exporting. There's even a free tier to get started on.

    Being web-stored means any designer on a team can invite other users to view and even edit the same file... at the same time. Any changes are immediate. One can even watch in real time as a designer corrects a design or quickly mocks up a new piece of content. There also isn't any worry about an individual having the latest version of Figma, as there are no separate versions. In this regard, Figma shines. Any time I'm viewing a design, I can rest assured that I'm viewing the current design (assuming the designer keeps to that flow).

    The app is surprisingly fast, in a web browser or even as an Electron app. In fact, the Electron version largely ignores deeper OS integrations, aside from system menus that correlate to the application. This means updates to Figma happen inside the Electron app, as opposed to requiring a new download. Updates aren't entirely silent, but when they arrive, they're quick. Despite the overhead that comes with requiring a browser runtime, the WebAssembly core makes the app feel zippier than other popular Electron apps like Slack and Atom, which can choke up when your computer is under stress. Some designers have even stated that Figma is faster than Sketch. I find this claim dubious, but as a web app, I can't name anything faster. It's damn impressive. It's also a memory hog, and it can seriously slow down if you have more than a few people viewing a document and several apps open. There's no way to turn off the "multi-player" mode.

    As a design utility, Figma makes a mild departure from other vector applications, chiefly in how it handles vector drawing. Figma allows multiple path segments from a single point, a feature called Vector Networks, which of all the innovative features is honestly the most singularly interesting from a pure design sense. It's so simple when you see it in action, yet powerful. I have a feeling this will be widely copied if it hasn't been already. I'd rank it as even more impressive than collaborative designing, as we've already seen collaborative design inadvertently via Google's G Suite and in online games (while I haven't ever played it, Minecraft's long appeal seems to be collaborative building).

    vector networks

    Pictured: Vector Networks allow multiple path segments from a single point

    That said, view-only mode does not allow a user to manipulate a design. When assets are handed off to me in Sketch, I often move things around and generally make a mess of the file: extracting, manipulating, measuring. Figma does not have a user like me in mind. Instead, all manipulations are meant to work toward a finalized design, not a deconstructed one. Sketch continually annoys me with its autosaving and lack of a real "save as" function, but I've learned to live with it by duplicating a file and denoting it as "final" vs. "edited" so I know which I've messed with. Figma, when I'm not granted ownership, doesn't let me duplicate to a file I can manage and do my Jack-the-Ripper disemboweling to get the assets I want.

    Figma's handling of images isn't nearly as exciting. Placing images means uploading them, which is a step backwards, but it's part of the process. Cropping is well done. For basic crops, unlike most vector apps, which require creating a vector object and placing it as a mask, Figma lets you do a quick-and-dirty crop on the image itself. The area outside the crop is displayed as transparent while adjusting; it's a nice touch, as you can easily see the remainder of the image and its contents. Also available are CSS-like abilities to cover, fit, and contain within the basic crop. I especially like this, as it's akin to how a web browser treats an image with object-fit. Opening large projects, though, can take a few seconds for images to appear, as they are being downloaded. It's tolerable.
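
    Those cover/fit/contain options follow the same math a browser applies for CSS object-fit: pick a single scale factor from the ratios of box size to image size. A quick sketch of that logic (the function name is mine, not Figma's):

```python
# "contain" scales so the whole image fits inside the box (letterboxing);
# "cover" scales so the image fills the box (cropping the overflow).
def object_fit_scale(img_w, img_h, box_w, box_h, mode="contain"):
    ratios = (box_w / img_w, box_h / img_h)
    return min(ratios) if mode == "contain" else max(ratios)

# A 400x300 image placed in a 200x200 box:
print(object_fit_scale(400, 300, 200, 200, "contain"))  # 0.5 -> 200x150, letterboxed
print(object_fit_scale(400, 300, 200, 200, "cover"))    # ~0.667 -> ~267x200, cropped
```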

    Figma also allows for basic prototyping; from what I've seen it's more akin to MarvelApp than to robust interaction-design tools like Flinto and Principle. It's functional, easy to use, and comes baked in, as opposed to Sketch's requirement of a separate utility. It's worth mentioning, but I haven't played with it much. It's reasonable that Figma could be used for static wireframing/prototyping.

    Then the rough edges start to show

    Figma Screenshot

    Currently, there isn't an export option for the original asset. You can export your cropped image, but not easily grab the original. This isn't so great if you're using an image as a background image or one that'll use object-fit. Plus, there's no "native" resolution displayed: if I export a 2x or 3x version of an image, will I get a blurry version? Who knows. I can't copy and paste out either, as Figma's non-native experience means the clipboard isn't truly native. Sketch allows you to copy as CSS, copy as SVG, copy styles, or copy the object itself. Figma does not.

    In fact, Figma's handling as an asset manager sucks. Unlike a Photoshop smart layer, or even Sketch, where you can grab the source contained within the file, Figma is a black box.

    Figma also has a weirdness when copying out text (the only asset you can extract from a document to other applications), often adding extra carriage returns to the beginning. It's hardly a deal breaker, but it adds an element of unpredictability and cleanup. This isn't something one normally faces in design applications, so it's worth noting.
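    If you're moving Figma text into code or content programmatically, the cleanup is trivial. This is a hypothetical helper of my own, not anything Figma provides:

```javascript
// Strip the stray carriage returns / newlines Figma sometimes prepends
// when text is copied out of a document.
function cleanFigmaText(text) {
  return text.replace(/^[\r\n]+/, '');
}

cleanFigmaText('\r\n\r\nCall to action'); // → "Call to action"
```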

    Probably the issue that draws the most ire from me is the stupidity of the onscreen measuring, which is obsessed with the document borders rather than relationships between objects. Figma loves to tell you how far an object is from the edge of the screen, but it can be a pain to get it to show relationships. Highlight one object and hover over another, and it'll show you the distance, but sometimes to the corner and not the edge. It's also really bad with text, only calculating to the text's container and not the text itself. This feels significantly behind Sketch.

    Then there's SVGs. Figma is annoying: select any two vector elements to export, and you'll get two separate SVG files. Oy. Exporting a group is a mystifying experience, and all strokes are converted to fills, making for not-the-smallest SVGs. Trying to get an icon exported led a designer on my team to re-draw the icon in Illustrator. That's... not good. Sketch it is not. The rulers seem surprisingly dumber too.

    It also doesn't win any awards for multipage layouts, which took Sketch a while to figure out; a multipage design exists basically as Pages, in an all-or-nothing design. It's there, but there's little beyond what one is used to from Illustrator or Sketch.

    Also, being a web app means a complete lack of plugins. Unlike, say, Atom, which is probably the pinnacle of Electron applications, Figma has no architecture for plugins... for better or for worse. This isn't surprising, but one of the big draws of Sketch is major services releasing plugins for it, like Sketch for Marvel and Craft by InVision. This may scale in importance depending on who you are, but Sketch has plugins; Figma does not.

    Figma also touts that it outputs CSS code, and more of it than Sketch. Output it does, but useful it is not. It gets the basics right: font-family, border, color, font-size, which is great. The rest of what it outputs is a head-scratcher: a bunch of absolutely positioned insanity that, I suppose, in some nightmarish world one could use to make a non-responsive website, but for just about any sane person it's useless. It's a slightly lesser version of Marketch.

    A higher tier

    Figma also offers a second tier, a very expensive one at $45 a month per user, with a suite of features I did not have access to and thus cannot comment on, beyond that they exist, they are expensive, and they are almost exclusively related to access controls for designs and ownership.

    Personal Take

    Figma's best features over Sketch are:

    • Ability to share projects easily, no software to download to view.
    • Cross Platform.
    • Collaborative design on the same file in real time. I don't know anyone who works this way but it's easy to see the value.
    • Vector Networks. It's so simple yet so powerful, why haven't other vector drawing utilities had this? Truly a leg up on other vector apps when it comes to pure illustration.
    • A free tier that's meaningful for people looking to learn (no private designs, version history limited to a month, a limited number of people per project)

    Sketch's best features over Figma are:

    • Predictable and clear exporting
    • Native OS functionality, making for sane importing and copy-and-pasting
    • Exceptionally strong community support via mature plugins and visual libs
    • More adherence to the SVG standard
    • Better text editing.
    • Better onscreen measuring.
    • Better asset handling.
    • No permissions need to be granted to gain full access when you're handed a file.
    • No required subscription for levels of access, and it's less expensive ($99 for a year of updates with no requirement to continue the subscription, vs. $144 a year with a required subscription for private designs, or $45 a month for organizational users), and it includes Sketch Cloud access for free.
    • Most of Figma's big features are matched by Sketch's plugins (sans real-time collaborative design and Vector Networks)

    There are plenty of Figma vs. Sketch head-to-heads, but as a developer, at the end of the day, I currently prefer Sketch, versioning issues and all. I can pull, edit, and pick apart assets, and the exporting is vastly superior. The auto-measuring in Sketch is also far more useful, measuring to edges, and the spacing seems more intelligent. Sketch also feels like a native app, whereas Figma comes as close as I've seen for a web app, in the same tier as Atom and Slack, but with those "this isn't right" moments, like when copying and pasting. Sketch's cloud feature suffers the same issue Creative Cloud does: everyone needs to buy in to fully appreciate it. The best analogy I can offer: Figma is Google Docs; Sketch is Apple's Pages. (Don't overthink it too much.)

    It's also fair to say I like Figma but I don't love it. It's not nearly as polished as I'd expect for an application that has the audacity to ask for either $144 or $540 a year for a seat. Figma, you're interesting but $144 is pricey and holy hell you are not worth $540 a year.

    Sketch isn't problem-free; I still hate its file saving. Manipulating text still isn't as powerful as I'd like either. Auto-save is great, but it also makes for strange moments when you want to "Save As" and leave your original intact. I imagine this is a holdover from the paradigm I grew up with in the 90s.

    The future lies somewhere between Sketch and Figma. Some features are a novelty: I can see the value of having multiple designers work on a single design, but I also wonder how many projects need this ability. That said, the killer feature is handing a URL to a client or internal team member and knowing it will remain current, and that anyone can view it with or without a license. That alone might tip the scale for a good number of users, and I don't blame them.

    After a decade, Firefox is winning me back.

    Something has happened that would have seemed unshakable even a year ago. To say I have a lot of opinions about web browsers is putting it mildly for someone who's spent the last decade of his life coding for them. Like most web devs, I went Chrome around 2009-2010 and stayed. Chrome started feeling less like a choice and more like a natural law. I already felt like too much of my life was encapsulated by Google, from Gmail, to Maps, to Chrome, not to mention my dev accounts for API keys and the usual dev-related nuggets of info Google has on me. I've tried to de-Google myself, but each active choice seems to end with me crawling back. I used Bing for years, and still sorta do, but it doesn't work for my daily workflow. DuckDuckGo lasted all of a week before I switched back to Google. Apple Maps has been mildly successful for me as an iPhone user; I still end up using Google Maps 75% of the time, which makes it my most successful de-Googlification attempt. I haven't even bothered with replacing Gmail. However, something happened: my primary home browser is now Firefox. Unlike previous attempts, I didn't wake up one day and eschew Chrome; gradually, over time, I've found myself using Firefox more. Why?

    Reason #1: Firefox Quantum is radically more performant

    The folks at Mozilla knew that performance was an issue, and it was even more so on macOS. In a previous generation, the best way to experience Mozilla's engine on a Mac was Chimera/Camino, a beautiful Objective-C wrapper around Gecko. Firefox has never felt Mac-like, and has always been laggier than other browsers. When Safari shook up the browser world, it was clear how much better an experience Safari was than Firefox on OS X. It was fast. Ever since Safari, Firefox has been a second-tier experience on macOS. The lack of multi-threaded rendering was a big kicker, as WebKit added it years ahead of Mozilla. Quantum brought this with its new Stylo CSS engine, and it shows. It's fast. Is it faster than Chrome or WebKit? Probably not, but it's damn close.

    Reason #2: Firefox has containers!

    I'm not much for social media. Name a service, and I probably don't use it (although I do like Goodreads and Untappd). I do, however, have a Facebook account dating back to when I first signed up in college, and Facebook is easily the most problematic of the social media companies. Firefox Quantum has a concept called containers, a way to isolate browsing experiences (cookies/caching) from other portions of the browser. The Facebook container effectively puts Facebook in jail, and I love it. Sorting sites into containers also means separating various experiences from each other to prevent cross-site tracking. It's effective and wonderful. As much as Firefox touts privacy, it had never really been able to differentiate itself from Apple on this front until now.

    Reason #3: Better UI

    It's a goofy complaint, but the UI of Firefox before Quantum was relatively ugly and had a lot of wasted space. I wouldn't call the new Firefox beautiful, but it's solid and minimal, and I actually don't mind it at all. It has a distinctly Windows 10 look, but it's a big step up.

    Reason #4: DNS over HTTPS

    Chrome and Firefox are both adding this, but Firefox has it right now, without flags. DNS has been overexposed for years, so it's about time we got DNS over HTTPS.

    Bonus reason: Firefox screenshots

    Honestly, I don't use this as much as I should, relying instead on macOS's wonderful screenshotting, but it is a nice feature.

    The limitations of integrating Pardot Forms into React or modern framework

    Here's a post that hopefully saves someone a few hours of googling. I have a client with a React-based kiosk application who wants to use Pardot Forms to capture leads. This seems like a relatively straightforward, even reasonable ask, and I assume you, dear reader, thought so too. So you've probably tried something like this: assuming, like some jackass who expects a modern framework to make this feasible, that you can just post to the form, you envisioned and/or wrote something like the following in beautiful ES6/ESnext syntax:

    submitSignup(e) {
      e.preventDefault();
      const data = new FormData(document.getElementById('signupForm'));
      // Post the form data straight to the Pardot form handler URL
      fetch('url-to-form-pardot-submission', {
        method: 'POST',
        body: data,
      });
    }

    This seems like it should work but, of course, it does not. Instead, you'll be greeted with a wonderful CORS error, and Pardot forms don't expose any CORS controls. There's little information out there, and the developer documentation is sparse, but I was able to unearth the following from the Salesforce docs:

    Submissions Using Ajax

    Pardot doesn't support submitting data to form handlers via Ajax requests. When attempting to submit data to a form handler using Ajax, you will likely see errors like:

    XMLHttpRequest cannot load {}. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin '{page from which form handler should be getting submitted on client's website}' is therefore not allowed access.

    This is what's known as CORS (Cross-Origin Resource Sharing). Pardot doesn't currently support CORS or JSONP for form handlers. It is possible to simulate a JSONP response by setting the Success and Error URLs for the form handler to be JavaScript URLs that execute Success and Error callbacks, respectively.

    So the answer is that it doesn't support AJAX, and the only way around it is to hack the form responses to trigger JS scripts. I managed to find a GitHub project where a user created a Pardot Form AJAX Handler, which shows an example with callbacks. Oy.
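    For the curious, here's a minimal sketch of that callback hack using a hidden-iframe submission. The iframe name, message strings, and javascript: URLs below are my own hypothetical illustrations, not Pardot-documented API, and whether javascript: redirect URLs still execute varies by browser, so treat this as a sketch, not a drop-in solution:

```javascript
// Sketch only: the names and message strings here are assumptions.
// In Pardot, the form handler's Success/Error URLs would be set to
// javascript: URLs that notify the embedding page, e.g. the string
// this helper builds:
function pardotCallbackUrl(result) {
  return "javascript:parent.postMessage('pardot-" + result + "', '*')";
}

// Submit the form into a hidden iframe so the cross-origin POST is a
// plain form submission: no fetch/XHR, so no CORS check to fail.
function submitToPardot(formEl, handlerUrl) {
  return new Promise((resolve, reject) => {
    const iframe = document.createElement('iframe');
    iframe.name = 'pardot-target';
    iframe.style.display = 'none';
    document.body.appendChild(iframe);

    // The javascript: Success/Error URLs run in the iframe and
    // postMessage the result back to the parent page.
    window.addEventListener('message', function onMessage(event) {
      if (event.data !== 'pardot-success' && event.data !== 'pardot-error') return;
      window.removeEventListener('message', onMessage);
      iframe.remove();
      if (event.data === 'pardot-success') resolve();
      else reject(new Error('Pardot form handler reported an error'));
    });

    formEl.action = handlerUrl;
    formEl.method = 'POST';
    formEl.target = iframe.name;
    formEl.submit();
  });
}
```

    Usage would look something like `submitToPardot(document.getElementById('signupForm'), 'url-to-form-pardot-submission').then(...)`. Ugly, but it sidesteps CORS entirely.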

    Bonus: As a total newbie to Pardot, I found these videos helpful for navigating the Pardot interface. They're out of date, and the UI has changed, but the core features can mostly be found in the same places.

    The Mac Pro should be the most boring Mac.

    At the event, Apple also plans to debut new software features for its devices, including a dark mode for easier nighttime viewing and new productivity tools for the iPad. The company has also internally weighed previewing a new version of the high-end Mac Pro, according to people familiar with the deliberations.
    - Apple Plans on Combining iPhone, iPad, Mac Apps by 2021, Mark Gurman, Bloomberg

    Buried in the bombshell is something of particular interest to me and this blog. I find this worrisome; Apple really doesn't seem to get it. The Mac Pro should be a boring box with the latest internals, a few drive bays, and PCIe slots with big numbers that people like myself care about. What it shouldn't be is worthy of a press event, or a paradigm shift, or some other Jony Ive goofball design. Want a beautiful, shiny, wow-your-clients pro computer? That's the iMac Pro, not the Mac Pro. What people want is user-upgradable storage, RAM, and PCIe: for NVMe storage, for GPUs, for esoteric I/O upgrades. How do I know this? There's certainly demand, as well over 40,000 different users have accessed my Mac Pro Upgrade Guide, and my blog isn't even a blip in the digital ocean.

    Testing accessibility with the CERN WorldWideWeb browser

    Here's a fun one for any web developers out there: you can play with a JS reconstruction of the WorldWideWeb browser (the first web browser) for the NeXT OS.

    It's fun in its own right, but the CERN browser is a good measuring stick of how accessible a website truly is without CSS or JavaScript, or even a full HTML 1.0 spec. Without any support for POST methods, WorldWideWeb can't even interact with forms, but information is often still visible.

    Take, for example, my blog. It's easy to consume, meaning anything can traverse and index it with minimal effort; anyone with an out-of-date browser arguably can access it. Incredibly, many news sites are usable, albeit awkwardly; the NYTimes fares well. ABCNews does not, nor will anything requiring a form.

    Information architects should take note; there's something to be said about how resilient markup languages are.

    Playing Dune 2 and Dune 2000 on Mac OS (and Command and Conquer Tiberian Dawn and Red Alert)

    Every now and again I get a hankering for retro gaming, and it ends up on this blog. I never played Dune II: The Building of a Dynasty on a PC, only the Sega Genesis port, Dune: The Battle for Arrakis, so it was news to me that you could play Dune II on macOS. I assume anyone reading this probably knows the place Dune holds in gaming history: it's largely considered the title that defined the real-time strategy (RTS) genre, or even the first RTS (even if that's not entirely correct).

    I can't say I have a special affinity for the genre, as pretty much the only other RTSes I've played are the original Command and Conquer and Warcraft 2, but I always liked Dune: The Battle for Arrakis, and I've revisited it via emulation a few times. I hoped Dune 2 or Dune 2000 would end up on a digital storefront, but sadly, neither has. Thanks to open source, both Dune II: The Building of a Dynasty and its sequel, Dune 2000, can be played on macOS, natively and with some modern improvements.

    Disclaimer: By the letter of the law, abandonware isn't 100% legal, but there's no real legal vector to obtain these games, each over two decades old. I don't see a moral quandary here, but you can always obtain the original game disks if you see fit.

    Dune II using Dune Legacy

    Dune Legacy on macOS 10.14

    Dune Legacy gives a nice modern twist to the original shortcomings of Dune II, including better AI, head-to-head, ability to group select units, more hotkeys, modern resolutions, HD graphics, and so on.

    1. Search "Dune II abandonware" in your favorite search engine; it'll come up on many sites. Download it.
    2. Download Dune Legacy.
    3. Open the DMG and drag the Dune Legacy app to your Applications folder. Also decompress the PC copy of the abandonware Dune II.
    4. Right-click the Dune Legacy app and click Show Package Contents. Within the app, open Contents -> Resources.
    5. Drag all the .PAK files from the decompressed Dune II into Dune Legacy -> Contents -> Resources.
    6. Double-click to start; you most likely will need to whitelist it in Gatekeeper (go to System Preferences -> Security & Privacy).
    Dune 2000 using Open RA (and Command and Conquer)

    OpenRA Dune 2000 on macOS 10.14

    OpenRA is built around the Red Alert series but also includes Dune 2000 support, with, much like Dune Legacy, modern screen resolutions and minor tweaks. Unlike some of the other OpenRA ports, OpenRA's Dune 2000 focuses on delivering a recreation rather than improvements.

    1. Optional: Nab the Dune 2000 ISO for Windows from a site like myabandonware or such.
    2. Download OpenRA.
    3. Unless you've previously installed Mono, you'll also need to download Mono, an open source implementation of Microsoft's .NET Framework.
    4. Install Mono and then OpenRA - Dune 2000. OpenRA will automatically download the necessary graphic and sound assets, but if you'd like the FMVs/movies you'll need the ISO; these can be installed at any time from the Management screen.
    5. Also, see d2kplus for mods; some are supported in OpenRA. You most likely will need to whitelist OpenRA in Gatekeeper (go to System Preferences -> Security & Privacy).

    For the Command and Conquer series, the install process is the same: find the ISOs for the music and movies and install Mono and OpenRA. Enjoy!

    The iMac hasn't been updated for 602 days

    As noted in the MacRumors Buyer's Guide and discussed in the MacRumors forums, it has now been 602 days since Apple last updated its iMac lineup, a new record for the longest span between iMac refreshes ever. The previous record was 601 days between October 2015 and June 2017 refreshes. - Joe Rossignol, MacRumors

    Ouch. Just as I was spit-balling the future of the Mac Pro, Apple has let another computer rot in the supply chain. Apple's major press opuses should be reserved for large-scope updates, with frequent refreshes delivered with much less fanfare, assuming press events are even the bottleneck. Really, though, we should see nearly yearly updates with CPU and/or GPU refreshes to keep in step with PC manufacturers. This isn't even a particularly astute or remotely original observation. I'm guessing this has more to do with maximizing profit versus the hassle of changing the magical JIT manufacturing. It's not a hopeful sign when even Apple's signature desktop has been given the cold shoulder. There's a feeling among the Mac faithful that Apple doesn't care about the Macintosh. With numbers like these, it's easy to see why.

    Year of the Mac Pro?

    Today the Macintosh is 35 years old. Rather than a retrospective, I'm more interested in the future of the Macintosh, as the most trafficked blog post I've written is an upgrade guide for the classic Mac Pros, which are now 7 to 13 years of age depending on the model. In mid-2018, Apple announced that the Mac Pro would be revamped in 2019, and yet many Mac loyalists were irked that Apple didn't announce a Mac Pro at WWDC. It was never going to be a one-more-thing; it's very unlike Apple to announce a schedule for a future product.

    Apple's desktop lineup is more crowded than it's been in some time, with the revamped Mac Mini, iMac, iMac Pro, and Mac Pro (which is half a decade without an update). Apple has mainly been a two or three computer company per form factor, squarely divided between desktop and laptop (and briefly servers and eMacs), since roughly 2005 for desktops (Mac Mini) and 2008 for laptops (MacBook Air). The formula has been (outside of the ill-fated G4 Cube):

    Classic Era (2000 - 2005)

    • Entry Level - (iMac/iBook)
    • Professional - (PowerBook / PowerMac)

    Intel Era Line Up (2006-2016ish)

    • Budget/Small form factor - (Mac Mini/MacBook Air)
    • Mid-level - (MacBook / iMac)
    • Professional - (MacBook Pro / Mac Pro)

    Current Era (2017ish - Current?)

    • Budget - (Mac Mini, MacBook Air, iMac 21 Standard Definition, MacBook?)
    • Mid-level - (MacBook? MacBook Air? / MacBook Pro 13 / iMac 4k/5k)
    • Professional - (MacBook Pro Touchbar / Mac Pro / iMac Pro)

    Any iteration of the Mac Pro is going to confuse the Mac lineup further. The MacBook Air, MacBook Pro, and MacBook base models are all within $100, 1.4 pounds, 1 hour of battery life, and 1 inch of screen size of each other. The Pro is clear-cut as the performer, the MacBook as the traveler, and the Air as the bridge... for some reason. It'd only really take adding a second port (perhaps Thunderbolt) to the MacBook to negate the underperforming Air.

    The iMac Pro is undoubtedly a powerful machine, but at a king's ransom, starting at $5,000. If a Mac Pro lands with anything resembling a dedicated PCIe slot and user-serviceable RAM and CPUs, I can't imagine anyone opting for an iMac Pro, especially if it starts at the still very expensive $3,000 entry point of previous Mac Pros. The iMac Pro is powerful but also, at its price point, with a non-upgradable GPU and terrible user serviceability, hardly a compelling buy. Then there's the Mac Mini: if the Mac Pro keeps its current G4 Cube-influenced, hostile-to-power-users design, then anyone who can survive on a more modest CPU and 64 GB of RAM is likely to eat the cost of a Thunderbolt PCIe case (or do the same with an iMac 5k).

    Based on my interactions with the Mac power users of this planet, we're all after the same thing: PCIe, user-serviceable RAM, and upgradable CPUs and storage. This really should be Apple's most straightforward release year over year; the form factor of the classic Mac Pro is perfectly fine. Dust it off, update the ports from FireWire/USB 2.0 to Thunderbolt and USB 3.x, slap in a modern motherboard with the latest specs, and call it a day. Ideally, Apple would sunset the iMac Pro as a nice experiment in industrial design. As much as Apple dislikes user control, the one segment where the users know better than Apple is professional work; see the fiascos of Final Cut Pro X and the Mac Pro 2013, which led to the deep pockets of Hollywood abandoning Apple for the likes of PCs and Avid. The iMac's DNA has never been to be a performance monster, although it evolved from entry-level to a nice mid-level computer sporting a beautiful integrated display. If the Mac Pro is modular, then the iMac Pro becomes the next G4 Cube.

    Recently, though, with the reintroduction of the MacBook Air, Apple has shown a willingness to confuse the Mac lineup with no clear price points. This should be a bad thing, but for the Mac Pro it isn't. So where does that leave us? I'm mildly hopeful. Just mildly.

    Edit 01/29/19: The NYTimes writes A Tiny Screw Shows Why iPhones Won’t Be ‘Assembled in U.S.A.’, which blames Mac Pro production woes on the lack of a certain screw. The idiocy here is the over-engineering and probably a healthy dose of hostility to user serviceability; to echo myself, Right To Repair Law Should Be The Rally Call for Every Mac / iPhone User.

    Marco Rubio leads the charge to ban states rights from protecting consumer privacy

    "Any national privacy law must provide clear, consistent protections that both consumers and companies can understand, and the FTC can enforce. That is why my bill leans heavily on the Privacy Act framework," Rubio wrote.

    Rubio's bill would have the FTC establish a process in which individuals can contact companies to request access to their personal data. Companies would have to either provide the data to consumers or delete the data. If a company lets an individual view the data, the company would have to correct any mistakes if the person demonstrates that the records are "not accurate, relevant, timely, or complete." Companies would only have to delete the data if they choose not to provide it to consumers upon consumers' requests.

    -- Jon Brodkin, Sen. Marco Rubio wants to ban states from protecting consumer privacy, Ars Technica,

    This isn't even a "protection," as it's often unclear who even has your personal information, and it places all the onerous labor on the consumer. Deleting the data just means the company has to re-collect it.

    This is on the heels of Tim Cook's call for comprehensive US privacy laws and Motherboard's bombshell, U.S. Carriers Are Selling Customers’ Real-Time Location Data. We've seen time and time again how callous and morally bankrupt companies can be when it comes to selling personal information. There shouldn't even be an argument; there's clearly a need for privacy protection... and yet here we are, in the midst of a kakistocracy. For those keeping score, Congress voted in 2017 to remove an FCC privacy rule, and The Verge compiled a list of the 265 members of Congress and how much it cost to buy them off. All were Republicans.

    On the other side of the aisle, Oregon's own Senator Ron Wyden has proposed a privacy law that would send company execs to prison for 20 years.

    Anyone who gives one royal damn about internet privacy would do well to review the sharp divide along party lines, such as the fight over municipal broadband. If there's one thing that's been apparent when it comes to digital rights advocacy, it's that you can count on the Republicans to oppose it.

    Lab Rats: How Silicon Valley Made Work Miserable for the Rest of Us by Dan Lyons - A book review

    Dan Lyons is one of the more unlikely critics of Silicon Valley culture despite being a longtime satirist, having made his splash with his Fake Steve Jobs (FSJ) blog (and mediocre novelization). His irreverent portrayal of a smack-talking, faux new-age Steve seems a bit short in retrospect. It was clever, candid, and most of all funny, but it never eclipsed the caricature of the on-the-spectrum, eccentric, once-hippie tech billionaire. In the end, in the canon of Steve, Lyons helped lionize (yeah, you had to see that coming) Jobs, with the endless speculation about FSJ's real identity. It was a simpler time.

    As a seasoned tech journalist watching tech giants cannibalize his own industry, Lyons ended up reciting the old soliloquy, "if you can't beat 'em, join 'em," and fully embraced the mantra when he took a tour of duty at HubSpot.

    What followed was his book Disrupted, a highly cynical view of the lauded unicorn companies of Silicon Valley, where ageism, sexism, and even old-fashioned systemic racism run amok. Lyons learned the brutal truth behind the smoke-and-mirrors act: HubSpot succeeded on the back of banal new-age corpo-speak and armies of call-center drones using the oldest sales techniques in the book. His experience struck a nerve: perhaps our so-called unicorns weren't special, other than in their ability to fib that they were anything more than a donkey with a paper cone, making an ass out of everyone who bought into such a shallow sham.

    This go-around, Lyons drops his satirical lens entirely, refocusing far more seriously and with more precision. He takes a deeply skeptical view of tech companies of all stripes and argues that they are accelerating the wealth-income gap (spoiler: they are), sowing the seeds of worker discontent, destabilizing the economy, and dehumanizing people by treating them as actual cogs in a machine, or lab rats in an experiment.

    There are interviews ranging from anonymous sources to people willing to go on the record about their personal stories in the churn of the new workplace. The cast is extensive and of many backgrounds, from newly minted college grads suffering depression after being fired for not being a culture fit, to workers feeling the burn of masked racism, to the truly dystopian: workers who camp in freezing weather in England to save money while working in Amazon warehouses.

    The most poignant chapter is the damning of Amazon, whose piddly raise to $15 is still insufficient, ending on the eerie Nick Hanauer interview about Jeff Bezos.

    “Hanauer, the billionaire-turned-activist, was at one time close to Bezos. I asked him if he had ever talked to his old friend about paying workers better and treating them more humanely. “I took a crack at getting him to care about it,” Hanauer said. Apparently, Bezos wasn’t persuaded. In recent years, “I have lost touch with Jeff,” Hanauer said. He was reluctant to say more.
    For years Hanauer has been trying to convince legislators to raise the minimum wage to $15 an hour, more than double the current minimum wage of $7.25. Even that $15 wage would not be enough to make things square, but it would at least be a start.
    “If the minimum wage had tracked the growth of productivity since 1968, it would now be $22,” Hanauer says. “If it tracked the top 1 percent, it would be $29.”
    “The reason to give back the money, he says, would be so that the one percent can save their own skins. As Hanauer sees it, the election of Donald Trump might be only the first step toward something much worse. “People were hurting, and they lashed out—by voting for the guy who was lashing out, too.”
    If we don’t shift wealth back toward workers and just keep carrying on the way we are now, Hanauer predicts we will end up in a real-life Mad Max movie: “If you don’t give it back, things are not going to get better. Oh, dude, we are in for a bumpy ride. This is going to get way worse before it gets better. I think the country is in trouble. The West is in trouble. We have institutionalized a set of dynamics which benefit the few and immiserate the many.
    “People are not going to get less pissed. People’s lives are going to get worse. People are going to be even more angry and more polarized. The talk will get even crazier. Plan on violence. Plan on it. People do stupid shit when they’re angry. It’s not going to be good. I think we’re going to have a lot of civil unrest. Hopefully we will avoid a civil war. The last time the country was in crisis like this was 1968. Remember that? We had hundreds of bombings. We had riots. Well, it’s been fifty years. We’re right on cycle.”

    It might sound alarmist or even silly out of context, but listening to NPR's Morning Edition with its person-on-the-street interviews, the idea of civil conflict is often echoed by the right wing and left wing alike. This isn't some lame "horseshoe theory" armchair analysis; both sides blame each other for our problems, divided on wedge issues instead of class issues.

    After so many books on my reading list in the past two months, from Chris Hedges' America: The Farewell Tour to Charlie LeDuff's Sh*tshow: The Country's Collapsing and the Ratings Are Great to Anand Giridharadas' Winners Take All: The Elite Charade of Changing the World, there's one very loud reverberating echo: the rise of populism can clearly be laid at the feet of the fear, or reality, of being left behind. James Carville, Bill Clinton's campaign strategist, once said, "it's the economy, stupid." He's correct, and yet wildly off base, as our neoliberal economic platform, ushered in by Reaganomics, was realized under the tech-happy hand of Clinton. The economy, stupid, is now the cross to bear, with Silicon Valley being the chief architect of its demise.

    The final chapters focus on companies bucking the trend and succeeding while doing so. There's also a fair amount of evidence cited that companies with actual diversity and well-compensated, fairly treated employees tend to perform better. There's a bit too much faith in companies with social agendas, but his end thesis supports this view. Lyons in the end doesn't suggest an end to capitalism (as the saying goes, capitalism is the worst system until you consider all the others) but an end to the Friedmanite religious zeal for shareholder capitalism. It's a mammoth ask, but it is a salient point: maximizing short-term gains for the quick buck isn't the way to build the future.

    Most of all, the book is easily digestible for a dreary topic and made me laugh out loud... which probably wasn’t intentional.

    I originally posted this on GoodReads. I like GoodReads, but the community features are of little interest to me beyond reviews. I've had very little interaction with the community. It's mostly a place for me to log the books I read and leave notes to myself.

    I noticed yesterday, though, that five months ago someone had commented on my review of Lab Rats. The comment in question was about a particular phrasing that I never recalled using, and I didn't see it in my original review (maybe I edited it out? I have a habit of obsessively re-editing things). So I replied, not expecting a response to a five-month-old comment, but what ensued was gold, better than I ever expected. I figured it was worth screen-capping the exchange.

    A quick blog update: book reviews coming soon

    Not that I have regular readers, but I have a habit of announcing format changes to my blog, such as when I decided to focus on long-form posts, my transition from Tumblr to Jekyll, and the times I added view-by-topic and HTTPS to my blog.

    On that note, I'll start porting the book reviews of tech-related reads I wrote on GoodReads to my blog, since I burn through a lot of them. From my experience, anything I do on this blog is bound to get far more exposure than anything I do on social media, and I'd rather contribute to the open internet instead of mega-properties. It seems silly that I've dedicated the time to write several long-winded, meaningful reviews on GoodReads only to get flak from a neckbeard for pointing out that a book series turned into a cringey harem fantasy.

    Mostly Complete List of HEIF / HEVC (.heic) Support on macOS in Q1 2019

    HEIF (High Efficiency Image Format), which on Apple platforms stores images encoded with HEVC (H.265 / MPEG-H Part 2), was introduced with iOS 11, and later came to macOS 10.13.4 on March 29, 2018. It's been less than a year and support has rolled out at a reasonable pace. I've elected not to list HEVC (High Efficiency Video Coding) itself, as it is housed in the .mov container format and most video applications using macOS's internal video engine will support it. Also, open source libraries like ffmpeg have added support for HEVC. The .heic (HEIF) file format is a much bigger grab bag from my experience. Many video applications such as Premiere, Final Cut Pro, Motion, and DaVinci Resolve now support HEIC, hence they are on this list.

    I've tried to compile a complete list of known applications that handle HEIC. Undoubtedly I'm missing a few, so if anyone has others I'm not listing, feel free to let me know. Moving forward, .heic support is likely to be assumed. Notably, Affinity Photo on the desktop doesn't support HEIC (yet). I'll try to maintain this list at least until the format's one-year anniversary, if not a bit longer.


    • Preview (macOS 10.13+)
    • Lightroom CC 1.4+, Lightroom Classic CC 7.4+ (macOS 10.13+)
    • ImageMagick (macOS 10.13+)
    • Graphics Converter 10.4.3+ (macOS 10.13+)
    • Pixelmator/Pixelmator Pro (macOS 10.13+)
    • Acorn 6+ (macOS 10.13+)
    • Omnigraffle (macOS 10.13+)
    • Sketch (macOS 10.13+)
    • Adobe Photoshop Elements 2019
    • Adobe Photoshop CC 2018+
    • Adobe Premiere Elements 2019
    • Adobe Premiere CC 2018+
    • Apple Pages (macOS 10.13+ warns about possible iPad support)
    • Apple Keynote (macOS 10.13+ warns about possible iPad support)
    • Apple Final Cut Pro (macOS 10.13+)
    • Apple Motion (macOS 10.13+)
    • Apple Compressor 4.4 (macOS 10.13+)
    • DaVinci Resolve 15+ (macOS 10.13+)


    This list is incomplete

    • Preview (macOS 10.13+)
    • ImageMagick (macOS 10.13+)
    • Graphics Converter 10.4.3+ (macOS 10.13+)
    • Pixelmator/Pixelmator Pro (macOS 10.13+)

    Browser Support


    Surprisingly, HEIC is not supported by Safari. Seeing as the HEVC/HEIF family comes out of the MPEG group, the patents will likely limit its adoption. H.264 wasn't widely supported by holdouts like Mozilla until Cisco open-sourced its OpenH264 implementation and agreed to cover the licensing fees.

    HEIF is still mostly treated as an intermediate format. Transferring images from iOS to a Mac running a compatible OS will keep them as HEIF. Support has expanded quite a bit since HEIF landed on macOS, but Windows users are left hanging by the Adobe suite outside of Lightroom. For avant-garde browser-supported formats, see Getting Started with WebP, JPEG 2000, and JPEG-XR.

    Welcome to the Enshittening: where everything is bullshit.

    Fake people with fake cookies and fake social-media accounts, fake-moving their fake cursors, fake-clicking on fake websites — the fraudsters had essentially created a simulacrum of the internet, where the only real things were the ads.

    How much of the internet is fake? Studies generally suggest that, year after year, less than 60 percent of web traffic is human; some years, according to some researchers, a healthy majority of it is bot. For a period of time in 2013, the Times reported this year, a full half of YouTube traffic was “bots masquerading as people,” a portion so high that employees feared an inflection point after which YouTube’s systems for detecting fraudulent traffic would begin to regard bot traffic as real and human traffic as fake. They called this hypothetical event “the Inversion.”

    Max Read, "How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually.", NY Mag

    This is a brilliant must-read, linking to some of my favorite recent articles like Rising Instagram Stars Are Posting Fake Sponsored Content (if that makes your soul hurt, you're not alone). The article doesn't even account for many of the fraudsters, like BuzzFeed's investigative report "Apps Installed On Millions Of Android Phones Tracked User Behavior To Execute A Multimillion-Dollar Ad Fraud Scheme" or the fake retail apps in the iOS App Store. But if NYMag were to attempt to report on all the online scams not listed in this article from 2018, the entire tenor of the publication would need to shift into a security blog.

    We're surfing on a river of bullshit while each of us contributes our own tiny tributary of turds. We are all to blame for what is happening. To borrow from Esquire magazine, it's the enshittening: the active participation in shittifying the web.

    Amazon is a terrible place to shop, filled with fake reviews, counterfeits, fake products, and stealth Amazon-owned brands that it tries to trick you into buying, and yet we still shop there.

    Facebook doesn't give two shits about its effect on politics, be it hate speech that led to murders, enabling Russian meddling, anyone's well-being, or the gazillion infractions against privacy, and we still use it, continuing to enshitten ourselves. Then there's our Enshitter-In-Chief, a one-man colon catastrophe when it comes to bullshit. He's so full of bullshit that last month he managed to publicly make 28 disprovable statements (lies) a day. Even his underlings are more than giddy to perpetuate bullshit.

    Too lazy to make up your own bullshit? Want to build an empire of bullshit? There's an entire bullshit industry you're probably not even familiar with, even if you've heard of low-rent bullshit like Fiverr. There are full-on bullshit mills to generate bullshit on your behalf!

    We don't need "Virtual Reality," we're already living in one. Everything is bullshit — Happy New Year.

    Edge and Internet Explorer are dead - retrospective

    I hate Internet Explorer. I say that in the present tense, as its zombie corpse still haunts the internet, holding back front-end web development. That's not a surprise, nor even controversial. It's quite banal, so much so that my contrarian tendency makes me want to point out that IE was better than Netscape (which it was). Even then, it certainly was not better than the many browsers that came after it.

    I used anything I could in the early days of Mac OS X to get away from IE 5.5: the Mozilla Suite, OmniWeb, Phoenix (later to become Firebird and finally Firefox), and Chimera (later renamed Camino, a wonderful native Cocoa/Objective-C browser built on Gecko).

    My ire for IE grew as I progressed as a developer. There were those painful moments when a simple console.log would stop IE from executing a JavaScript file dead in its tracks, or when I'd resort to conditional comments like [if lte IE 7]. Internet Explorer would often take a design I had slaved over in Safari and Firefox, only for a :last-child pseudo-selector to fail, leaving me wondering why selectivizr.js wasn't working. So many polyfills and hacks...
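    That console pitfall had a standard workaround at the time. Here's a minimal sketch (my own illustration, not from any of my old projects) of the classic console shim, needed because IE 8/9 only exposed window.console while the developer tools were open:

    ```javascript
    // Classic console shim (sketch): IE 8/9 only define window.console while
    // the developer tools are open, so a leftover console.log() would throw
    // and halt the entire script. Stubbing it out makes stray logging harmless.
    if (typeof console === 'undefined') {
      var console = { log: function () {}, error: function () {} };
    }
    console.log('safe to call everywhere');
    ```

    A stray debug statement then degrades to a no-op instead of killing every script that loads after it.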

    Edge was a different beast. Trident was revamped and renamed EdgeHTML, and fittingly, the rendering engine was modern (at least from a Microsoft perspective). Things... worked. My opinion suddenly became a nice lukewarm "it doesn't suck." I never thought I'd be sad to see Microsoft go.

    So I ended up writing a post fitting for Internet Explorer and Edge, titled How Microsoft Lost the Browser Wars, in the long-form article style I prefer. It covers a brief history of the browser wars and my personal take on how IE faltered and Edge failed.

    Making the cite tag sane: Announcing WebCites

    Last week I wrote about a CodePen I had created. I found that there are some great libraries for citations, but most are overkill for what I wanted: a simple, Wikipedia-like citation system.

    I've since created a much smarter, dependency-free version and you can find the project on Web<cite>.

    Project goals

    • Generate a list from all the citations found within the article.
    • Detect multiple instances of the same source.
    • Number each <cite> instance; if a source was used before, reuse the previous instance's number.
    • Use anchor tags to link each citation to the source list at the bottom of the page, in a specified div
    • No jQuery
    • Very easy to use

    Planned Features

    • Code fixes and some basic error checking
    • Arguments passed in as object for better configuration
    • Demo files
    • ES5 & ES6 variants
    • Source Title (optional)
    • Date retrieved (optional)
    • Author (optional)
    • CSS attributes for each property so lists can be easily customized
    • Date Retrieved vs Article Date published
    • Repository contains minified distribution ready code
    • Multiple citation lists on the same page (multi-article support)
    • Optional demo Scss/CSS stylings
    • Generate as an ordered list instead of spans


    See the Pen Simple Auto-generated citations using <cite> and javascript no jQuery by Greg Gant (@fuzzywalrus) on CodePen.

    Find this project at:

    How to use CSS media queries for dark mode!

    Dark mode is all the rage. Safari Technology Preview added support for the dark mode media query, and Chrome has already announced it will soon support dark mode. The requirements are (as of writing this):

    I'm personally not a fan of dark mode, but I'm no more correct than the person who prefers it; the point is that preference matters. Some users find it more pleasing, or perhaps it reduces eye strain. The best part, though, is that it's a fun and easy hack:

    p { color: black; }
    @media (prefers-color-scheme: dark) {
      p { color: white; }
    }

    Notably you can also detect light mode:

    @media (prefers-color-scheme: light) {
      /* do stuff */
    }

    It's that easy. If you have a simple site, it's quick to retrofit; more complex sites (especially those with image backgrounds or poorly written CSS) will probably require a lot more work. It took me about 10 minutes total to make a beta of dark mode. One of the more promising developments is that eventually we will have media queries for inverted-colors, prefers-reduced-motion, prefers-reduced-transparency, and prefers-contrast, which is a big win for both designers and accessibility. Apple's OSes already have the bulk of these features (macOS has invert, reduce motion, dark mode, and low contrast, whereas iOS has increase contrast, reduce motion, and invert). Allowing web apps and sites to support these features will probably become a small but important design trend, as users can exert greater control over their OS and the content they consume for the best visual experience based on preference and requirements.
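    The same preference can also be read from script. A minimal sketch (the function name is my own; only window.matchMedia itself is a standard API), guarded so it degrades gracefully where matchMedia doesn't exist:

    ```javascript
    // Sketch: detect the user's color-scheme preference from JavaScript via
    // the standard matchMedia API. Returns false in environments without
    // window.matchMedia (Node, very old browsers) instead of throwing.
    function prefersDark() {
      return typeof window !== 'undefined' &&
        typeof window.matchMedia === 'function' &&
        window.matchMedia('(prefers-color-scheme: dark)').matches;
    }

    // In a browser, you could then toggle a hypothetical theme class:
    // document.body.classList.toggle('dark-theme', prefersDark());
    ```

    That's handy when something beyond CSS needs to react, such as swapping an image or a syntax-highlighting theme.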

    If you visit this website now with dark mode enabled and the correct browser, you'll get to see dark mode in action! Now the real question: are we going to have a "dark mode first" movement? ;)

    Making the cite tag sane

    While writing a long-form retrospective on how Microsoft lost the browser wars, I realized managing a citation list is a royal pain in the ass. With 50+ sources, I wanted a very simple, lightweight way to manage a source list. The requirements for my auto-citation list generator were as follows:

    • Generate a list from all the citations found within the article.
    • Detect multiple instances of the same source.
    • Number each <cite> instance, if used before, use the correct number.
    • Use anchor tags to link to the source at the bottom of the page.

    I opted to use jQuery, although I might kill the dependency, as the biggest things I used it for were element queries, iteration, and appending and writing HTML. All are easier to write with jQuery, but in the era of wide support for querySelectorAll(), it's hardly necessary. If I go that far, I might just package it up into an ultra-lightweight JavaScript plugin with a few basic configuration options: a target for the list of citations, what information to collect, and maybe one or two citation options. For now, it's a very simple citation script meant for non-academic purposes.

    The process was pretty easy: the script creates an array of objects based off the jQuery object, iterating through the array each time a new item is added to make sure there isn't a duplicate URL; if there isn't, it adds that entry to the list. The assembled array is then iterated through so its data can be written to the DOM. Duplicates are detected using the source URL.
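    The dedupe-and-number step is small enough to sketch as a pure function (my own illustration, not the actual script): walk the cite URLs in document order, hand each new URL the next number, and reuse the original number on repeats:

    ```javascript
    // Sketch (not the actual script): assign citation numbers in document
    // order, reusing a URL's first-assigned number when it repeats.
    function numberCitations(urls) {
      var seen = {}; // url -> assigned citation number
      var next = 1;
      return urls.map(function (url) {
        if (!(url in seen)) {
          seen[url] = next++;
        }
        return seen[url];
      });
    }

    // Two cites of source A around one cite of source B:
    // numberCitations(['https://a.example', 'https://b.example', 'https://a.example'])
    // yields [1, 2, 1].
    ```

    The real script does the same thing against objects scraped from the <cite> tags, then writes the numbered list back to the DOM.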

    I styled it after Wikipedia and may add links back to each citation's instance in the article, like Wikipedia does. It'd be easy to add additional information to the citation if needed, such as "date retrieved," though in the case of my blog post, that'd be the date written.

    See the Pen Simple Auto-generated citations using <cite> and javascript by Greg Gant (@fuzzywalrus) on CodePen.

    Edge comes to the Mac... sorta

    Today we’re announcing that we intend to adopt the Chromium open source project in the development of Microsoft Edge on the desktop to create better web compatibility for our customers and less fragmentation of the web for all web developers.
    Microsoft Edge will now be delivered and updated for all supported versions of Windows and on a more frequent cadence. We also expect this work to enable us to bring Microsoft Edge to other platforms like macOS. Improving the web-platform experience for both end users and developers requires that the web platform and the browser be consistently available to as many devices as possible. To accomplish this, we will evolve the browser code more broadly, so that our distribution model offers an updated Microsoft Edge experience + platform across all supported versions of Windows, while still maintaining the benefits of the browser’s close integration with Windows. - Joe Belfiore, Microsoft Blog

    In a two-in-one announcement, Microsoft is finally abandoning Trident (forked as EdgeHTML), the once iron-fisted scourge of the web. Edge was too little, too late, and still too broken. Browser engines, though, come and go: names like KHTML (Konqueror) and Presto (Opera) have been replaced by WebKit, Blink (itself a fork of WebKit), and obscurities like Goanna (a fork of Gecko). Granddaddy Gecko still stands tall and gets the last laugh, as Netscape's ghost did the unthinkable: outlast both Internet Explorer and Edge.

    It's entirely unsurprising that Microsoft is bringing Edge to macOS, as it's a pretty low lift with Chromium, but it will doubtfully gain any market share as it joins the horde of WebKit/Chromium reskins: Opera, Vivaldi, Yandex, Brave, Comodo Dragon, Amazon Silk, Samsung Internet, Torch, Slimjet, Steam's internal browser, and so on.

    It makes my life easier but makes Edge effectively uninteresting and forgettable.

    Right to Repair Law Should Be the Rallying Call for Every Mac / iPhone User

    The CBC's report 'Complete control': Apple accused of overpricing, restricting device repairs is a must read/watch, although imperfect, as it's a sample size of one. There are a few possible readings by my estimation, and none of them are particularly good:

    • Apple purposely recommends fixes that users do not need, like a scandalous mechanic. I find this unlikely, but it isn't implausible.
    • Apple's Genius techs have degraded in quality due to shortcomings with Apple as an employer. I find this compelling; I emailed this story to Nick Heer of Pxlnv and he replied with this, so credit to him.
    • Apple keeps a tight leash on its repairs and will only perform certain operations due to the volume of repairs it makes, and is uninterested in low-hanging fixes. Apple has a paint-by-numbers repair shop that doesn't account for things like replacing a single cable, but rather an entire display, as these are "known" fixes that reliably resolve a host of problems, eliminating guesswork and downplaying the individual tech's required diagnosis. This achieves a few goals: problems get fixed reliably; techs are required to do little to no guesswork; techs can be trained to do several big tasks instead of potentially hundreds of small tasks; and Apple maintains a steep profit margin by selling the expensive-yet-effective service (or selling a new computer). This is personal theory (and probably correlates with the above).
    • Lastly, the CBC encountered an edge case/outlier, and the tech who proposed the fix was in error or substandard. That said, the CBC isn't the first to make this argument; plenty of bloggers, YouTubers, and techs have accused Apple of proposing ludicrous fixes.

    This is all compounded by the fact that Apple purposely makes its devices and computers non-user-serviceable, going as far as engineering non-standard screws, using glue to seat components, not providing third parties with any manuals, clamping down on authorized Apple service locations, and making user-hostile designs. While the ever-increasing desire to shrink designs has stressed component placement, Apple takes extra steps to discourage users from exerting control over their own devices. This is where the CBC's report picks up. Rather than a simple hit piece on Apple alone, the world's most profitable company and arguably the most popular electronics maker is used as the canary in the coal mine and a segue into the right to repair.

    If you haven't heard of right to repair, I suspect it's going to become a larger movement, as it extends far beyond just consumers and their gadgets; look no further than John Deere's war against farmers. In 2014, a minor victory occurred when automakers pledged to honor right to repair, although Tesla seems less inclined to do so.

    I feel I have a stake in this as someone who's written an 11,000+ word guide on upgrading/fixing/maintaining Mac Pros, and I have often lamented planned obsolescence and the death of modular computing.

    If you have a minute, I suggest taking up the cause. It's easy, the following three organizations all have comprehensive ways to take political action and links to legislation.

    Right to Repair Action