Under-the-hood blog updates

    Over the break, I went on a binge of minor changes to this blog.

    • The privacy policy and contact page now exist on their own pages. Google supposedly prefers this. Previously, both lived on the about page.
    • This blog now supports a JSON Feed. I looked at Apple News but botched the process of importing my RSS feed. I may revisit that later, but given the low traffic most of this blog sees, it's not really worth the effort.
    • Improved the JS. To reduce requests, I've concatenated four JS files into one. I also upgraded jQuery from 1.12 to 3.x, which is faster and smaller, and it's no longer hosted on a CDN.
    • Fixed the canonical URL declaration in the head.
    • Removed a few errant CSS classes; this site now serves an absurdly low 2.9k of CSS, down from roughly 3k.

    Sometimes I think the lack of visual flourish is mistaken for a lack of design, but I like minimalism.


    How to import Feedly feeds into NetNewsWire

    NetNewsWire 5.0 was just released as open source. Ever since Google Reader shut down, I've been using Feedly as an aggregator and Reeder for iOS. Reeder makes it very easy to import Feedly feeds, but it isn't as straightforward for NetNewsWire. Fortunately, Feedly and NetNewsWire both support OPML (Outline Processor Markup Language) for importing and exporting feeds. Feedly buries this, so here's a quick step-by-step to get up and running with NetNewsWire and Feedly.
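    For reference, the OPML file itself is just a small XML document listing your feeds, roughly like the sketch below (the feed title and URLs are placeholders, not real Feedly output):

    <?xml version="1.0" encoding="UTF-8"?>
    <opml version="1.0">
      <head><title>Feedly subscriptions</title></head>
      <body>
        <outline type="rss" text="Example Blog"
                 xmlUrl="https://example.com/feed.xml"
                 htmlUrl="https://example.com"/>
      </body>
    </opml>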

    Steps

    1. Sign in to Feedly and click the gear.

      Feedly Gear Icon

    2. From the settings, find the OPML section and click the export button.

      Feedly Export OPML

    3. From NetNewsWire, select File -> Import Subscriptions and import your OPML file.

      NetNewsWire import OPML

    That's it! Enjoy. I highly recommend Reeder for iOS.


    RSS feed is fixed

    I'm pretty sure the URLs in the RSS feed have been broken since the inception of this blog. I only noticed while writing about NetNewsWire. Now they are fixed. You can add the RSS feed here to your favorite RSS reader.

    Expect a slow trickle of tiny improvements to this blog, like the printable blog posts.


    Stupid Scary or Scary Stupid

    This isn't a blog about politics or even my opinions, but I'm going to go on the record and say nuking hurricanes is a bad idea. This seems like a silly thing to say, but here we are, having a national discussion about it and why it is a bad idea. It's worth noting that, according to Axios, our sitting president suggested multiple times that we nuke hurricanes to stop them from hitting the U.S.

    A lot of people are clowning this idea (because it's patently stupid), but it's not entirely wrong, rather a problem of scale. We simply do not have enough nuclear weapons on the planet to end all hurricanes. If we truly want to nuke our way out of hurricanes, we'll have to invest a lot more into nuclear weapons. No more earth = no more hurricanes. Checkmate, hurricanes.

    Jokes aside, Axios may not be a household name, nor even my go-to source for journalism, and Trump denies it, but it is believable. Consider that this is fresh on the heels of Trump asserting, "I could win that war in a week. I just don’t want to kill 10 million people". Then there are the long-standing reports that Trump repeatedly asked why we can't use nukes, and that he would use nuclear weapons in response to a terrorist attack by ISIS. Even if the latest is "fake news," the other accusations are damning enough.

    "The biggest problem we have is nuclear ... having some maniac, having some madman go out and get a nuclear weapon." - Donald Trump, 2016

    I couldn't agree more. This is clearly a person who is unfit to be given responsibility for the most destructive weapon humankind has created. As the great political journal of our age, TeenVogue*, wrote, "Yes, Trump could instigate a nuclear war without anyone stopping him."

    We are all at the mercy of a tweet.

    *Not a satirical comment.


    Git Hotfix workflow for Pantheon.io

    We've all been there: you have changes that haven't been QAed, but there's a hotfix that needs to go out yesterday. Pantheon is a great host, but it has one major gotcha: you can't switch code branches. The way Pantheon works, the Pantheon remote git repository gets one, and only one, branch. There's no way to switch branches. Code can only be promoted from dev -> test -> live. This is problematic, especially if you're coming from deployment management utilities that let you switch branches or a platform-as-a-service like Heroku.

    Pantheon's recommended workflow

    Image Credit: Pantheon, Use the Pantheon Workflow

    Pantheon does offer Multidev, which essentially creates a separate branch for testing (one that can be promoted into the main chain), but that still doesn't fix the hotfix issue.

    Pantheon Hotfix Flow

    1. Create a local branch and reset it to the last commit that was made live (this is a pain, as Pantheon doesn't show last-commit git hashes)
    2. Make changes locally. Commit the changes to your new branch.
    3. git push -f Pantheon YOURBRANCHHERE:master
    4. Promote from Dev -> Test -> Live from the control panel
    5. Make sure your hotfix is merged into your master local (and your origin)
    6. Reset Pantheon’s Dev to the master branch: git push -f Pantheon master:master (the whole flow is sketched below)
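    Putting the steps together as shell commands (a sketch, not gospel: abc1234 stands in for whatever commit is currently live, and Pantheon is assumed to be the name of your Pantheon git remote):

    # 1-2. Branch from the commit currently on Live, then commit the fix
    git checkout -b hotfix abc1234
    git commit -am "Emergency fix"
    # 3. Overwrite Pantheon's Dev with the hotfix branch
    git push -f Pantheon hotfix:master
    # 4. Promote Dev -> Test -> Live from the control panel
    # 5. Merge the hotfix into your local master and push it to origin
    git checkout master
    git merge hotfix
    git push origin master
    # 6. Restore Pantheon's Dev to your master branch
    git push -f Pantheon master:master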

    Despite the standard warning that you should never force-push, this is the cleanest method. You can push up your desired hotfix and leave it on the live environment until your normal deploy chain overwrites it.


    Printable blog posts

    Using the magic of @media print, I've included a mild update to make this blog more printer / PDF friendly, mostly for a single post, The Definitive Mac Pro Upgrade Guide, which drives the majority of my traffic. The gist of the changes is sketched after the list.

    • Images are now capped to 75% of the page width as my images are only double density, optimized for screens, not printers.
    • The main content now expands to the full page width, so users can set the margins within their print prefs.
    • The main body copy has been reduced from 16px to 15px, and the line height from 2em to 1.75em.
    • I've included a link back to the original blog post at the top of each page, visible when printed so PDF users can easily return to the blog post in a web browser.
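    In CSS terms, the changes boil down to something like this (a sketch; the class names are hypothetical stand-ins for this blog's actual selectors):

    @media print {
        /* screen-density images look soft on paper, so cap their width */
        img { max-width: 75%; }
        /* let the reader's own print margins control the measure */
        .content { width: 100%; max-width: none; }
        /* slightly smaller type and tighter leading for paper */
        body { font-size: 15px; line-height: 1.75em; }
        /* show the link back to the original post when printed */
        .print-source-link { display: block; }
    }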

    Happy printing, you weird PDF-loving bastards!


    Time Machine: An Error Occurred Restoring from Backup + Fix

    My 2017 MacBook Pro stopped charging and refused to accept power from any power supply on any port. I was restoring my computer to my previous laptop, a 2015 MacBook Pro, when I encountered the error below. I tried using the previous day's backup, but that didn't work either.

    I received the following message when booting from a Time Machine drive:

    An Error Occurred Restoring from Backup

    To try restoring from a different backup, click Choose Other Backup.

    To reinstall macOS, click Install macOS. During the install, you can choose to restore your information from a Time Machine backup.

    To boot from an existing macOS installation...

    I've seen some heavy-duty fixes, like harryfear.co.uk's fix, but there's an easier route, and the clue is in the error message.

    Reading the message more carefully, I booted off the Recovery partition and reinstalled macOS. Once that completed, in Migration Assistant I selected the option to transfer information over from my Time Machine drive. This isn't a true 1:1; I noticed some things missing, such as my /etc Apache2 modifications, but geekier stuff like Homebrew and its many CLI applications (Heroku) made the cut. Beyond renewing SSH keys and re-running Docker builds, my computer was good to go. Standard Mac applications had no issues.

    Summary

    If a restore fails, fear not. A direct restore is faster, but you will not lose your important files going this route.

    1. Boot off a recovery partition, reinstall macOS
    2. At the end of the installation, you will see Migration Assistant. Select the option to transfer files from another computer/device/Time Machine backup, then select your Time Machine drive

    I suspect for most users, myself included, the harryfear fix is overshooting the problem, and Apple's solution is "good enough".


    Chrome does not support media queries on video source tags + a workaround

    Sometimes you encounter something that surprises you, and yesterday was one of those days: Chrome does not support inline media queries on the source tag within a video tag (you can test it here). Worse, plain media queries will not stop multiple videos from loading, which effectively doubles your data, so it requires a JS solution. CSS-Tricks has an article from 2012 using jQuery, but there's no follow-up, and I wasn't that enthralled. I also saw thenewcode's Make HTML5 Video Adaptive With Inline Media Queries, but it fails to mention Chrome's refusal to support the technique.

    JavaScript to the rescue

    First, I wanted to prevent any requests from being made, so I created an empty video tag with my two videos as data attributes. Easy, right? Now that all major browsers support MPEG-4, I could safely assume the only legacy browsers are IE and Safari, as those are tied to OS updates whereas Chrome and Firefox are not, so very few users would be on an outdated browser. Safari and IE both support MPEG-4, so there's no good reason for me to support WebM.

    <video
      preload="auto" autoplay="" loop="" muted="" playsinline=""
      data-desktop-vid="https://iconaircraft.s3.amazonaws.com/ICON_Web+4.0_Loop_16x9_DRAFT190723_26sec+3700.mp4"
      data-mobile-vid="https://iconaircraft.s3.amazonaws.com/ICON_Web+4.0_Loop_1x1_DRAFT190723_26sec-mobile.mp4"
      >
    </video>
      

    I didn't want to rely on any framework; jQuery's document-ready meant the JS wouldn't fire until the rest of the page loaded, and ES6 meant leaving out old browsers. Thus, I'm limited to ES5.

    First, I needed to get all the videos on the page. This creates a variable containing a NodeList (an array-like collection of elements), even if only one video is found on the entire page.

    //get all vids on the page (returns a NodeList)
    var video = document.querySelectorAll('video');

    Next, I needed to create a source for the video tag. The source tag needs a src and a type. After that, we append the newly created source element to the video element. This function doesn't need to know how many videos are on the page or what the screen size is; it just adds a source to whichever video tag it's handed.

    //add source to video tag
    function addSourceToVideo(element, src) {
        var source = document.createElement('source');
        source.src = src;
        source.type = 'video/mp4';
        element.appendChild(source);
    }

    Next is where the logic happens. If the screen is over a predetermined width, I load the desktop version; otherwise, the mobile one. Since I have two data attributes to work from, the screen size determines which one gets fed to addSourceToVideo. Easy enough, right?

    //determine screen size and select mobile or desktop vid
    function whichSizeVideo(element, src) {
        // no jQuery here: fall back to clientWidth for older browsers
        var windowWidth = window.innerWidth ? window.innerWidth : document.documentElement.clientWidth;
        if (windowWidth > 800 ) {
            addSourceToVideo( element, src.dataset.desktopVid);
        } else {
            addSourceToVideo(element, src.dataset.mobileVid);
        }
    }

    Now that we've written code to add sources to empty video tags, it needs to be initialized and to handle multiple videos. Remember our NodeList? It's time to use it. There's no point in running the code if there aren't any videos on the page, so we need to check whether video actually contains anything. If it does, we loop over the list and hand each video element to whichSizeVideo, in case there are multiple videos on the page.

    //init only if page has videos
    function videoSize() {
      if (video.length) {
        // Array.prototype.forEach.call works on NodeLists in older browsers too
        Array.prototype.forEach.call(video, function(element) {
                whichSizeVideo(
                    element, //element
                    element  //src locations
                );
        });
      }
    }
    videoSize();

    Notably, you could tie the above code to a resize event in case a user resizes the window, and have it trigger videoSize; a sketch follows. I chose not to, for simplicity. You can see the working version of the above code on CodePen. I didn't embed it in this post so those using a slower connection aren't hit with 30 MB of video data. Place this script inline or as a separate file below your videos, but before the rest of your JS payload, for maximum performance.
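    If you did want to handle resizes, a minimal sketch would look like the following. Note that as written above, addSourceToVideo only appends sources, so to truly swap videos you'd also need to remove the old source element and call load() on the video.

    // re-run videoSize once the user stops resizing (150 ms is arbitrary)
    var resizeTimer;
    window.addEventListener('resize', function () {
        clearTimeout(resizeTimer); // debounce repeated resize events
        resizeTimer = setTimeout(videoSize, 150);
    });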


    On the realm of personal branding

    I'm always drawn in, with complete morbid fascination, to "influencer" culture, as I have largely avoided social media. I have exactly one account on the major social networks, Facebook (arguably the worst of them all), but I deleted the app off my phone about four years ago. I'm including YouTube in this, as it should be considered a social network, and Reddit, which I avoid. I'm convinced the vast majority of social media is an anecdoche.

    So when I read about influencers asking for free meals at restaurants, I experience a clash of contradictory thoughts simultaneously: "Who has the gall to cold-call a restaurant for a free meal because they have 50,000 Instagram followers?", "I'm surprised this happens", "I am not surprised in the least bit", "this scam verges on genius", "major brands give famous people free shit, why shouldn't small businesses give those with a tiny soap box cheap advertising?", and "everything about this is idiotic", followed by a general self-satisfied feeling of being above it all, despite my immediate desire to share and discuss it with my friends.

    The over-marketization of all facets of life has made even the most mundane activity a transactional exchange that can be sold, thanks to social media. It's all viewed through the nihilistic world view that everyone has a "personal brand". Any experience, even a wedding proposal, is a marketing opportunity, and people flock to toxic lakes. The irony is that followings are internet points that may or may not mean a damn thing; an influencer with 2 million followers couldn't sell 36 t-shirts.

    If there's one thing that is certainly true, it's further evidence of the enshittening.


    Basic Architecture of Designing Gutenberg Blocks in Wordpress 5.0

    There are plenty of tutorials on creating your own custom Gutenberg blocks, but I found the space between beginner and advanced lacking. I'm going to skip the basics in favor of a short list of things to understand to work more effectively with Gutenberg. So, from the trial-by-fire experience of working on two Wordpress 5.0 Gutenberg websites, here's what I've learned. The guiding principle is to reuse functionality when possible and to try to replicate the Wordpress UI.

    Rule 0: Understand React's role

    Wordpress chose React to create the UX for its new Gutenberg block editor. However, instead of using React directly, Wordpress uses Element, an abstraction layer over React. If you're wondering why someone would want to do this, Wordpress has a very concise list:

    • In many applications, especially those extended by a rich plugin ecosystem as is the case with WordPress, it’s wise to create interfaces to underlying third-party code. The thinking is that if ever a need arises to change or even replace the underlying implementation, it can be done without catastrophic rippling effects to dependent code, so long as the interface stays the same.
    • It provides a mechanism to shield implementers by omitting features with uncertain futures (createClass, PropTypes).
    • It helps avoid incompatibilities between versions by ensuring that every plugin operates on a single centralized version of the code.

    This means you'll be writing code and making imports against wp.element, wp.components, and wp.blocks. React only exists on the admin side of things, and all the content found within a block is saved statically, meaning you won't be able to create React experiences simply by creating Gutenberg blocks.
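    In practice, the abstraction looks like this: a minimal sketch of a do-nothing block registered through wp.element rather than React directly (the block name and markup are hypothetical).

    // wp.element mirrors React's API: el() stands in for React.createElement
    var el = wp.element.createElement;

    wp.blocks.registerBlockType('mycustomblocks/hello', {
        title: 'Hello Block',
        category: 'common',
        edit: function () {
            // what authors see in the editor
            return el('p', { className: 'hello-block' }, 'Hello from the editor');
        },
        save: function () {
            // what gets saved as static markup
            return el('p', { className: 'hello-block' }, 'Hello from the front end');
        }
    });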



    Rule 1: Bundle your custom blocks in one plugin

    It's pretty easy to bundle all your custom blocks into one plugin. Unless you're looking to distribute your custom blocks across many sites, it's saner for development and deployment to make one master plugin for all your blocks. On the dev side, this means a single webpack instance to spin up, as opposed to one for each block. A good example of this in action is Zac Gordon's WPForJSCourse example. While his course isn't free, this plugin is, and it includes everything you need: a sane structure, a webpack config, and a setup. All the custom blocks are registered in one nifty index.js file.

    It's pretty easy to do, but it's a design pattern I didn't realize would benefit me when I started in on Gutenberg.
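    The registration file ends up being little more than a list of imports, one per block, fed to a single webpack entry point. A sketch (the paths and block folders are hypothetical):

    // index.js: the lone webpack entry for the whole blocks plugin;
    // each imported file calls registerBlockType for its own block
    import './blocks/profile/index.js';
    import './blocks/slideshow/index.js';
    import './blocks/callout/index.js';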



    Rule 2: Learn Innerblocks and reuse core blocks

    Most tutorials seem to stop short of InnerBlocks, and InnerBlocks are probably one of the most important features of Gutenberg. InnerBlocks allow you to load blocks inside of blocks. Below is a super basic example for a slideshow, allowing the user to add as many images as they like via the core/image block while restricting them from adding any other block type.

        edit: props => {
            const { attributes: { selectControl },
                className, setAttributes, isSelected } = props;
            const ALLOWED_BLOCKS = [ 'core/image' ];
            return [
                <div className="slideshow-super-simple">
                    <strong>Note: </strong> all slides are visible in editor<br />
                    <InnerBlocks
                        allowedBlocks={ ALLOWED_BLOCKS }
                    />
                </div>
            ];
        },
        save: props => {
            const { attributes: { selectControl } } = props;
            return (
                <div className="icon-simple-slideshow" >
                  <div className={ selectControl} ><InnerBlocks.Content /></div></div>
            );
        },
        

    InnerBlocks aren't simply limited to allowing and restricting other blocks; they can also accept templates, which are sets of pre-defined blocks. This allows assembling a very complicated UI or layout widget out of any number of prebuilt or custom blocks. There's no reason to reinvent the wheel, as Wordpress gives you quite a few blocks out of the box. Below I've included a list of all the core blocks by category.

    Common blocks category
    • core/paragraph
    • core/image
    • core/heading
    • (Deprecated) core/subhead — Subheading
    • core/gallery
    • core/list
    • core/quote
    • core/audio
    • core/cover (previously core/cover-image)
    • core/file
    • core/video
    Formatting category
    • core/table
    • core/verse
    • core/code
    • core/freeform — Classic
    • core/html — Custom HTML
    • core/preformatted
    • core/pullquote
    Layout Elements category
    • core/button
    • core/text-columns — Columns
    • core/media-text — Media and Text
    • core/more
    • core/nextpage — Page break
    • core/separator
    • core/spacer
    Widgets category
    • core/shortcode
    • core/archives
    • core/categories
    • core/latest-comments
    • core/latest-posts
    • core/calendar
    • core/rss
    • core/search
    • core/tag-cloud
    Embeds category
    • core/embed
    • core-embed/twitter
    • core-embed/youtube
    • core-embed/facebook
    • core-embed/instagram
    • core-embed/wordpress
    • core-embed/soundcloud
    • core-embed/spotify
    • core-embed/flickr
    • core-embed/vimeo
    • core-embed/animoto
    • core-embed/cloudup
    • core-embed/collegehumor
    • core-embed/dailymotion
    • core-embed/funnyordie
    • core-embed/hulu
    • core-embed/imgur
    • core-embed/issuu
    • core-embed/kickstarter
    • core-embed/meetup-com
    • core-embed/mixcloud
    • core-embed/photobucket
    • core-embed/polldaddy
    • core-embed/reddit
    • core-embed/reverbnation
    • core-embed/screencast
    • core-embed/scribd
    • core-embed/slideshare
    • core-embed/smugmug
    • core-embed/speaker
    • core-embed/ted
    • core-embed/tumblr
    • core-embed/videopress
    • core-embed/wordpress-tv

    Pictured: Mock up of a hypothetical user page

    Let's break down the above design: It's two columns consisting of:

    • Column 1 (core/column): Image (core/image)
    • Column 2 (core/column): Headline (custom), Sub-Headline (custom)
    • Below both columns: Paragraph (custom)

    With Gutenberg, simple layouts like the above can potentially be assembled by hand from core blocks, but that isn't desirable, as it requires a bit of Wordpress mastery on the user's part, with a high margin of error if custom CSS is involved. We have a two-column design, consisting of an image in the first column and two fields in the second, followed by text beneath the columns. So let's look at the template code.

         edit: props => {
                const { attributes: { paragraph },
                    className, setAttributes, isSelected } = props;
                    const TEMPLATE = [
                      [ 'core/columns', {columns: 2,className: "profile-outer-column"}, [
                          [ 'core/column', { className: "profile-inner-column" }, [
                            ['core/image', { className: "profileImage"}],
                          ], ],
                          [ 'core/column', {className: "profile-inner-column"}, [
                            ['mycustomblocks/profile-title', { className: "profileTitle"}],
                            ['mycustomblocks/profile-name', {  className: "profileName"}]
                          ],],
                      ],],
                      ['mycustomblocks/profile-bio', { className: "profileBio"}]
    
                    ];
                return [
                    <div className={ className + " my-profile-editor"}>
                      <InnerBlocks template={TEMPLATE} />
                    </div>
                ];
            },
            save: props => {
                  const { paragraph,className } = props.attributes;
                return (
                    <div className={className + " my-profile"}><InnerBlocks.Content  /></div>
                );
            },
       

    Using templates, I'm able to place a mixture of custom and factory Gutenberg blocks inside columns! InnerBlocks aren't infallible: you can template-lock blocks so users cannot add more (see the sketch below), but occasionally this creates issues. Also, the custom styling for blocks does not work on any block that features an InnerBlock (yet). Perhaps this will change, but as of this writing, it hasn't.
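    For reference, locking is just a prop on InnerBlocks; a minimal sketch reusing the TEMPLATE from above:

    // templateLock="all" prevents adding, moving, or removing inner blocks;
    // "insert" would still allow moving, and false disables locking
    <InnerBlocks template={ TEMPLATE } templateLock="all" />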



    Rule 3: Restricting dependent blocks

    Often you'll create a block that shouldn't appear in the block list in the Wordpress Gutenberg GUI. Any custom block can easily be restricted to being accessible only from within a certain block type. In my previous example, I had two custom blocks, mycustomblocks/profile-title and mycustomblocks/profile-name. These are very simple blocks, but I do not want them polluting my list of Gutenberg blocks. This only requires declaring the parent.

    export default registerBlockType(
        'mycustomblocks/profile-title',
        {
            title: __( 'Profile Title', 'mycustomblocks' ),
            description: __( 'This field is for the user profile\'s job title.', 'mycustomblocks' ),
            category: 'common',
            keywords: [
                __( 'text', 'mycustomblocks' ),
                __( 'MediaUpload', 'mycustomblocks' ),
                __( 'Message', 'mycustomblocks' ),
            ],
            parent: [ 'mycustomblocks/slideshow-slide' ],
            // ...edit and save definitions omitted...
        }
    );

    See the parent flag? It really is that easy.



    Rule 4: Learn to use the custom toolbar and form fields

    To truly make a block feel native, you'll need to tap into the UX that Gutenberg uses: the toolbar, and the sidebar containing form fields, via InspectorControls and BlockControls. Again, Zac Gordon's JSForWordpress tutorial repo has a great example of each.

    InspectorControls appear in the sidebar of a block; zgordon has a nice tutorial on them, as does Eudes' Medium post. Also, be sure to see the official documentation on InspectorControls.

              <InspectorControls>
                  <PanelBody
                      title={ __( 'High Contrast', 'jsforwpblocks' ) }
                  >
                      <PanelRow>
                          <label
                              htmlFor="high-contrast-form-toggle"
                          >
                              { __( 'High Contrast', 'jsforwpblocks' ) }
                          </label>
                          <FormToggle
                              id="high-contrast-form-toggle"
                              label={ __( 'High Contrast', 'jsforwpblocks' ) }
                              checked={ highContrast }
                              onChange={ toggleHighContrast }
                          />
                      </PanelRow>
                  </PanelBody>
              </InspectorControls>

    BlockControls appear in the editable area of a block as inline controls. Also, be sure to see the official documentation on Toolbars and the Inspector.

    <BlockControls>
        <AlignmentToolbar
            value={ textAlignment }
            onChange={ ( textAlignment ) => props.setAttributes( { textAlignment } ) }
        />
        <Toolbar>
            <Tooltip text={ __( 'High Contrast', 'jsforwpblocks' )  }>
                <Button
                    className={ classnames(
                        'components-icon-button',
                        'components-toolbar__control',
                        { 'is-active': highContrast },
                    ) }
                    onClick={ toggleHighContrast }
                >
                    {icons.contrast}
                </Button>
            </Tooltip>
        </Toolbar>
    </BlockControls>

    It will feel a little strange, but edit returns an array: the toolbar and inspector controls, along with the markup for the main editable area.
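    Stitched together, the shape of that return looks roughly like this (a sketch; the markup is hypothetical, and the controls are the ones shown above):

    edit: props => {
        return [
            // inline toolbar controls
            <BlockControls key="controls">{ /* toolbar buttons */ }</BlockControls>,
            // sidebar form fields
            <InspectorControls key="inspector">{ /* panels and rows */ }</InspectorControls>,
            // the main editable area
            <div key="editable" className={ props.className }>
                { /* block markup */ }
            </div>,
        ];
    },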


    The Return: Mac Pro 2019

    As a minor (I stress minor) pundit on all things Mac Pro after my definitive Mac Pro Upgrade Guide, I figure I should weigh in on the ever-expanding sea of opinions. For the first time in a very long time, WWDC really hit the right notes: the iPad is growing up, the Photos app is beautiful and even more compelling, iTunes is no more and finally broken apart, the Watch has a de-tethered experience, macOS will now natively support screen mirroring to an iPad, but Apple seemed to sense that the most important announcement was the return to the professional with the almighty Mac Pro.

    Apple stands smug

    Pictured: Apple feeling itself with the new Mac Pro. Don't be fooled by the monitor; the tower is 39.7 lbs / 18.0 kg.

    Apple delivered but for $$$$

    The presentation was oh-so-Apple-like, and then it wasn't. Apple talked big numbers, lots of numbers, the kind of numbers that make average people glaze over in boredom: 8k, 6k, PCIe 3.0 16x, billions of pixels, 2000 audio tracks, GPUs, multiple GPUs. It's enough to make someone throw up their hands and rhetorically ask: who cares? But we care. We always have. There was a surreal moment when Apple showed how the case opened. They invited you inside, and look, there are slots! So many slots. What-in-the-name-of-Ives was going on?

    The new Mac Pro is a monster; there's no other way to say it:

    • Up to 28 Cores
    • Up to 1.5TB of RAM
    • Eight PCIe slots
      • Two "MPX" slots with Thunderbolt 3 passthrough (optional 16x ports if the slots aren't obfuscated with large cards)
      • 3 full length slots (1 16x, two 8x)
      • 1 half length slot (8x) with I/O connectivity from factory
    • Two SSD slots (unclear if natively NVMe)
    • Two 10Gb Enet
    • Two USB-C / Thunderbolt 3 ports and two USB-A ports, front facing
    • Two Thunderbolt 3 ports on I/O card
    • Headphone jack, internal speaker
    • 802.11ac/Bluetooth 5
    • Custom additional co-processor for video that allows for three 8k streams to be played back at once.

    It's an absolute monster of a computer. Dare I say, this may very well be the best-designed desktop Apple has ever done. Visually, it may be a little too avant-garde. It's alien-looking, but with a very clear nod and a wink to the cheese grater. It's built to last, just like how my 3,1 Mac Pro still works a decade later. The hitch is the entry price of $5999... ouch. Oh, and that pretty monitor? $4999, with a very understandable groan from the audience when the stand's $999 price tag was announced. The monitor is too much, but I guess it's time to start saving for the Mac Pro. It's been a very long time since I've had this level of interest in a computer.

    I've been following social media conversations and have some additional thoughts that I felt necessary to expand on.

    More thoughts on price

    The Mac Pro 2019 was what the people wanted. Xeons are more expensive than ever, with the CPU itself making up roughly a third of the price, even in the entry model. The Mac Pro 2019 is also the most upgradable Mac we've seen on several counts. The PowerMac 9600, a monster in its own right, had six PCI slots, twelve RAM slots, SCSI, ethernet, a serial port, and three 5.25-inch drive bays, debuting in 1997 at $3,700 (roughly $5,800 adjusted for 2019). The 2019 Mac Pro may lack the drive bays (only two SSD slots), but it has eight PCIe slots with an additional four Thunderbolt 3 ports. Let that sink in: Thunderbolt 3 adds the rough equivalent of four more 4x PCIe slots, so it's better to think of the Mac Pro 2019 as having twelve PCIe slots. It has effectively double the physical PCIe slots of the 2006-2012 Mac Pro, triple if you count Thunderbolt 3, and far more CPU configuration options despite having a single CPU socket (which can scale to 28 cores, 56 virtual cores). You are effectively getting double the computer of the previous-generation Mac Pro. It's expensive. Really expensive. But go look at other Xeon workstations; it's price competitive. You wanted upgradable? Here it is.

    Lastly, part of the price-hike problem is wage stagnation for most workers. This isn't to say the problem isn't Apple's price point, as even in 1997 a $3,700 computer was a bitter pill to swallow; rather, it's exacerbated by the lack of meaningful raises that would make investments like this more attainable. Also, Mac users haven't kept up with current Xeon prices; they are damn expensive. Sadly, there isn't really an affordable gap-filler between the Intel i9s and the Xeons. What the bulk of users (myself included) wanted was a $3,000 Mac Pro. The i9 line certainly provides the CPU horsepower but is handicapped in the I/O department: only 40 lanes of PCIe and a maximum of 128 GB of memory. Even counting the platform's 68-lane maximum, the i9 is certainly beefy enough, but it's still locked at 128 GB of RAM, less than the 2010 Mac Pro.

    As for the monitor? Nope, I can't justify it. Charging $1,000 for the stand is not a good look. While a true 10-bit 6k display is impressive, I'm sure within two years we'll see equal displays for less money. Adding to the wallet injury, Apple had the gall to suggest connecting six of these monitors to one Mac Pro: $36,000 worth of displays. Riiiight.

    Thoughts on GPUs

    If you hoped for NVidia support, well... you placed your hopes on the wrong company. Apple and NVidia have not reconciled, and there was absolutely no reason to expect them to resolve their issues. Expect to remain disappointed for the foreseeable future. I want NVidia support as much as anyone, but it's not happening. Should Apple allow NVidia GPUs? Of course. Will they? Probably not. I've been wrong about a good many things, though.

    Thoughts on chipset

    I'm not crazy about the Xeon price tags, and I've seen several people arguing that Apple should have gone AMD with something like the Threadripper series. So far, Threadrippers are limited to a maximum of 128 GB of RAM, 64 PCIe lanes, and no Thunderbolt 3. While these specs would personally be fine for me on a desktop (in fact, I'd like a more modest config for affordability), they aren't for the target market: too few PCIe lanes, not enough RAM, and no Thunderbolt 3 is a deal breaker.

    Also, the commitment to Xeons lends credence to a personal theory of mine: that the Mac lineup might end up split across CPU architectures. Windows already is. The MacBook and MacBook Air lines could end up on ARM, where I/O demands are much lower and the size constraints mean an Apple GPU would be more viable than Intel's built-in offerings. Is this true? Who-the-hell-knows outside of Infinite Loop, but Apple speculation is a pastime, and I figure I should put this in writing just to see how accurate I really am.

    Thoughts on I/O chipset

    Now here's one that threw me for a loop, and I'm surprised no one I've seen has talked about it: the I/O card carries two of the Thunderbolt 3 ports. This means (possibly) that more Thunderbolt 3 ports could be added to the computer, or better, that whatever I/O comes next (Thunderbolt 4?) could be added down the road. This certainly improves the shelf life.

    Thoughts on Power Draw

    It's an easy target for people to skewer Apple on, but GPUs need juice, and their power requirements aren't going down anytime soon with 8k and VR. Perhaps a focus on performance-per-watt will soon become necessary in the desktop arena, as watts in = heat out, creating issues for everyday users, but we haven't seen the worst of it yet. The real question is the idling power draw, which is bound to be much lower than the maximum.

    Thoughts on PCIe 3.0 / DDR4

    A few people are angry about the lack of PCIe 4.0 and DDR5. This one is easy: PCIe 4.0 isn't a shipping spec, and Intel may skip it altogether; DDR5 is still more than a year out, and the Mac Pro couldn't be delayed even further. The real test is whether Apple will update to PCIe 4.0 and DDR5 when the time comes. If you hoped for Thunderbolt 4, it isn't even a thing yet.

    Thoughts on looks

    I find it strange, not hideous, but beauty is in the eye of the beholder. I won't lie; it made me smile that they brought back the cheesegrater motif. As long as it's quiet and doesn't glow with neon LEDs, I'm happy. Maybe I'm an odd man out, but my Mac Pro sits on the floor next to my favorite piece of furniture: a drab grey McDowell & Craig 1940s all-American, all-steel desk that weighs roughly 200 pounds, which I inherited from my grandfather. The new one will look just fine next to it, but each to their own.


    The Definitive Trashcan Mac Pro 6.1 (Late 2013) Upgrade Guide

    Mac Pro 2013 is Oscar's home

    Contents

    Introduction

    To mark the first anniversary of my wildly successful blog post (garnering tens of thousands of views), The Definitive Classic Mac Pro (2006-2012) Upgrade Guide, I'm proud to announce a sequel. The Definitive Trash Can Mac Pro 2013 Upgrade Guide started in jest on social media as the guide no one wanted, seeing as the Mac Pro 2013 is kind of a joke in itself: it over-promised, under-delivered, and is considerably less upgradable than its predecessor. Is there a need or demand for such a guide? Probably not, but here we are, and while the origins are jocular, the rest of this guide is serious. While most users (and Apple engineers) probably prefer the moniker "cylinder," the trash can title stuck due to its obvious physical characteristics.

    The Mac Pro 2013 has the dubious honor of being the longest-produced Macintosh, besting the Macintosh Plus, which was produced from 1986 to 1990 without an upgrade. The 2013 Mac Pro was conceived as the successor to the original Mac Pro, eschewing modularity for a (debatably) stylish and certainly radical redesign. After a few positive reactions from publications for its foreign looks, it quickly became snubbed for its lack of upgradability, its stability issues, and Apple's complete and absolute antipathy (verging on enmity) towards it.

    The Mac Pro 2013 has been prone to high rates of heat-related failures, with a nameless Apple exec quoted as saying, "I think we designed ourselves into a bit of a thermal corner, if you will." Apple also took steps to extend its repair program, but problems persist. Despite the naysayers, the Mac Pro 2013 isn't without its fans (no pun intended); at the time of its unveiling, it was a powerful, quirky computer in a diminutive form factor. Despite its limited upgradability, the computer is a modular design, and nearly every part of significance can be replaced. No Mac produced after it has allowed for the same range of upgrades (although the iMac 5k is a close second). It's the bridge to a bygone era, where CPUs, storage, and even GPUs were removable. Perhaps the 2019 Mac Pro would return to PCIe, but more than likely, the 2013 would be the template. Edit: The Mac Pro 2019 marks an expensive return to PCIe.




    Know your Mac Pro Models

    The Mac Pro line debuted in 2006 and has had six major iterations by Apple's own nomenclature: 1,1; 2,1; 3,1; 4,1; 5,1; and 6,1. These are also generally referred to by year: 2006 (1,1), less commonly 2007 (2,1), 2008 (3,1), 2009 (4,1), 2010-2012 (5,1), and 2013 (6,1). The other terms for these computers are divided between "cheesegrater" (2006-2012) and "trash can" or "cylinder" (late 2013). For the purposes of this guide, I will refer to the trash can Mac Pro as the 2013 (as does much of the internet).
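    If you're unsure which model you have, macOS's built-in system_profiler will tell you from the Terminal (the output below is just an example):

    # prints a line like "Model Identifier: MacPro6,1"
    system_profiler SPHardwareDataType | grep "Model Identifier"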

    Please note This guide only covers the 2013 Mac Pro. For all other models, I've written a massive guide, The Definitive Classic Mac Pro (2006-2012) Upgrade Guide.

    Configurations

    Apple has shipped a grand total of three base configurations, with a fourth build-to-order option for the 12-core CPU. Apple has made only one minor change in the past six years to the Mac Pro 2013: removing the original base configuration and lowering the prices of the remaining models.

    • Apple Mac Pro "Quad Core" 3.7 GHz, 12 GB of RAM, 256 GB SSD, and dual FirePro D300 2 GB of GDDR5 (4 GB total). Discontinued April 4, 2017*
    • Apple Mac Pro "Six Core" 3.7 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D500 3 GB of GDDR5 (6 GB total). Discontinued April 4, 2017*
    • Apple Mac Pro "Eight Core" 3.0 GHz, 12 GB of RAM (16 GB after April 4th), 2256 GB SSD, and dual FirePro D500 6 GB of GDDR5 (12 GB total).
    • Apple Mac Pro "Twelve Core"* 2.7 GHz, 12 GB of RAM (16 GB after April 4th), 256 GB SSD, and dual FirePro D500 6 GB of GDDR5 (12 GB total). This is a build to order option only.



    CPU Upgrades

    Apple has never acknowledged the upgradability of the Mac Pro CPU, but the Mac Pro 2013's CPU is not soldered in, making it upgradable. Only four CPU configurations were offered by Apple (E5-1620 v2, E5-1650 v2, E5-1680 v2, and E5-2697 v2), but users soon discovered that the whole E5 v2 family is compatible. Unlike the previous Mac Pros, the Mac Pro 2013 was only offered in single-CPU configurations.

    Credit to the CPU list goes to Mac Rumors forum member ActionableMango.

    Architecture Cores CPU-Model GHz Turbo RAM-MHz TDP
    Ivy-Bridge 12 core E5-2697 V2 2.7 3.5 1866 130W
    Ivy-Bridge 12 core E5-2696 V2 2.5 3.3 1866 130W
    Ivy-Bridge 12 core E5-2695 V2 2.4 3.2 1866 115W
    Ivy-Bridge 10 core E5-2690 V2 3.0 3.6 1866 130W
    Ivy-Bridge 10 core E5-2680 V2 2.8 3.6 1866 115W
    Ivy-Bridge 8 core E5-2687W V2 3.4 4.0 1866 150W
    Ivy-Bridge 8 core E5-2667 V2 3.3 4.0 1866 130W
    Ivy-Bridge 8 core E5-2673 V2 3.3 4.0 1866 110W
    Ivy-Bridge 8 core E5-1680 V2 3.0 3.9 1866 130W
    Ivy-Bridge 6 core E5-1660 V2 3.7 4.0 1866 130W
    Ivy-Bridge 6 core E5-1650 V2 3.5 3.9 1866 130W
    Ivy-Bridge 4 core E5-1620 V2 3.7 3.9 1866 130W

    Useful Links




    GPU Upgrades

    Yes, the Mac Pro's GPUs can be swapped out, but only three different GPUs were ever produced for it: the AMD FirePro D300 2 GB, D500 3 GB, and D700 6 GB. Apple has kept tight control of these (any official repair requires the old GPUs to be returned to Apple), thus few-to-none exist on the aftermarket, and the two higher-end GPUs are prone to failure thanks to a wattage ceiling. For most intents and purposes, it is cheaper to buy a whole second Mac Pro 2013 than to track down two GPUs. Apple discontinued the entry-level Mac Pro 2013 that sported the D300; all new Mac Pros sold after April 4th, 2017 have either the D500 or D700.

    For other GPU options, see the eGPU section.

    Useful Links




    Firmware upgrades

    The Mac Pro 2013 has had a few firmware upgrades. Unlike previous Mac Pros, where firmware upgrades brought faster CPU/RAM support, APFS, and NVMe booting to certain models, the Mac Pro 2013's have been more meager. The MP61.0120.B00 boot ROM included support for NVMe booting (found in the High Sierra update). Most recently, boot ROM version 128.0.0.0.0 was included in the 10.14.4 Developer Preview. With some firmware upgrades, some users found 4k displays no longer supporting 60 Hz, which requires an SMC reset and removing the offending plists; see the useful links below. Previously, the updates were distributed separately from the OS, but as of 10.13+ they are folded into OS updates.

    Notably, some users cannot update the boot ROM without the original Apple SSD. It's recommended to hang onto the original SSD with a copy of macOS on it in order to perform firmware updates.

    Useful Links




    Storage Upgrades

    There's a large number of external storage upgrades for the Mac Pro 2013, from USB 2.0/3.0 to Thunderbolt 2, and listing them all would be an exercise in futility. What's important to understand is that there are many multi-drive enclosures, spanning everything from RAID setups to multiple SSDs. External SSDs perform well over Thunderbolt 2, able to achieve roughly 1.2 GB/s in various tests, depending on the storage solution.

    Internally, the Mac Pro does feature one SSD slot, using a custom Apple SSD running at PCIe 2.0 x4, capable of a maximum of 2 GB/s. Very few native third-party solutions exist, but they are out there, from makers like OWC and Transintl.

    That said... users have figured out how to shoe-horn NVMe drives into the Mac Pro, offering top-tier performance at much better prices. For a while, no one had taken the time to compile a list; the known working drives were the Samsung 960, Samsung 970 Pro, Toshiba XG3, and Crucial P1 (a fuller community list follows below). Samsung also released a firmware fix for certain models.

    The Mac Pro 2013 uses the same SSD interface as the 2013-2015 MacBooks, and there's a cottage economy of NVMe adapters now floating around. The first adapters users tackled, such as the GFF M.2 PCIe SSD Card, required a bit of filing and tape to successfully mount the card, which users on MacRumors were able to pull off: NVMe with ST-NGFF2013-C; Vega Internal GPU; Mac Pro 2013 (6,1). Later adapters, like the Sintech NGFF M.2 NVMe SSD adapter, do not require modification. The quick summary: you'll need a Mac Pro running 10.13+, an adapter, and an NVMe SSD. With a Sintech adapter, that's it; if you for some reason choose the GFF adapter, you'll also need tape, a file, and some free time.

    Working SSD list

    This list comes from MacRumors user maxthackray, so all credit goes to him.

    • Adata NVMe SSD : SX6000, SX7000, SX8200, SX8200 Pro etc.
    • Corsair NVMe SSD : MP500, MP510
    • Crucial NVMe SSD : P1
    • HP NVMe SSD : ex920, ex950
    • OCZ RD400 (and all Toshiba XG3-XG4-XG5-XG5p-XG6 line)
    • Intel NVMe SSD : 600p, 660p, 760p etc.
    • MyDigital NVMe SSDs : SBX - BPX
    • Kingston NVMe SSD : A1000, A2000, KC1000
    • Sabrent Rocket
    • Samsung Polaris NVMe SSD : 960 Evo, 960 Pro, 970 Evo, 970 Pro
    • WD Black NVMe SSD v1, v2 and v3

    Note: some of the drives above are NVMe drives with 4K sector sizes, which require changing the sector size before they will work.

    Incompatible NVMes

    • Samsung PM981
    • Samsung 950 Pro
    • Samsung 970 Evo Plus

    Useful Links




    RAM/Memory upgrades

    Officially, most sites list the maximum RAM for the 2013 as 128 GB, but enterprising users at Mac Rumors found the actual maximum (similar to the previous Mac Pro) to be 160 GB. The Mac Pro 2013 accepts 1866 MHz DDR3 ECC (PC3-14900), but Mac Rumors users report that it can use non-ECC RAM as well.




    ThunderBolt 2 to PCIe

    There's a fair number of options on the market today, like the Sonnet Technologies Echo Express SE1 with one PCIe slot (roughly $200), and pricing scales up rather quickly from there.

    The biggest modifications to the Mac Pro 2013 aren't internal, but rather massive PCIe enclosures that generally cost in the $1500-4000 range, making them often as expensive as the computer itself. There are a few options on the market, like the Sonnet xMac Pro Server, which adds three full-length PCIe slots (you can see it on YouTube), and the absolutely absurd JMR Quad Slot Expander, adding four PCIe slots and eight drive bays, just to name a few. The truly curious can see the JMR expansion system's innards.

    Not all PCIe enclosures support eGPUs; I've included a list of enclosures that support GPUs in the eGPU section.

    Additional Notes on Thunderbolt 2

    There's a wide variety of Thunderbolt 2 products, chiefly storage systems (including RAID setups) and Thunderbolt 2 docks, still on the market. Due to the sheer number, I'm unable to list them all, but it's important to remember that a fair amount of the functionality missing from the 2013 can be recaptured over Thunderbolt 2: as previously mentioned, PCIe slots, eGPUs, and the like.

    The Mac Pro 2013 includes six Thunderbolt ports, the most found on any Mac before or since. To obtain peak performance, it's recommended that displays be connected separately from other high-bandwidth devices like external storage.

    The Mac Pro 2013 can drive three 4k displays or six 2560 x 1600 displays, and with the June 16, 2015 firmware update, three 5k displays (using two Thunderbolt ports and the HDMI port).




    Thunderbolt 3 / USB 3.1c

    The Mac Pro 2013 can't be upgraded to Thunderbolt 3 bus speeds, but that doesn't mean it can't use Thunderbolt 3 / USB-C devices (at the speed of Thunderbolt 2). Apple sells a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, which is bi-directional, meaning the same adapter can also be used by Thunderbolt 3 Macs to drive Thunderbolt 2 devices. Notably, not all Thunderbolt 3 devices are backward compatible, so you may want to check with the manufacturer for compatibility.




    eGPUs

    It's nearly impossible to talk about the Mac Pro 2013 without mentioning eGPUs. macOS now supports AMD eGPUs (almost) natively, and macOS 10.14.x does not allow modern NVidia drivers, making eGPU on the Mac a nearly one-way path to AMD. NVidia support for later eGPUs is limited to a maximum of macOS 10.13.x, and that does not appear to be changing due to a disagreement between Apple and NVidia. Unless this changes, this guide will not list Mojave-incompatible NVidia eGPUs, despite the later GPUs being supported in macOS 10.12.x and 10.13.x. Currently, the RX line (580, 570) and the Vega line (Vega 56, 64, FE) by AMD are Mojave compatible, as is the Kepler line by NVidia. The eGPU.io community has a searchable database. If going the eGPU route, I highly recommend upgrading to macOS 10.13+, as it includes more native support and is thus much easier to set up, to the point of being (nearly) plug and play.

    Note: All Thunderbolt 2 Macs require disabling SIP and running the PurgeWrangler script to enable eGPU support.
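    For reference, disabling SIP has to happen from macOS Recovery, not a normal boot; a quick sketch of the terminal side (run the PurgeWrangler script itself per its README):

    # Boot into Recovery (hold Cmd-R), open Terminal, then:
    csrutil disable
    # Reboot normally and run the PurgeWrangler script per its README.
    # You can verify SIP status at any time with:
    csrutil status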

    macOS Supported AMD eGPUs, * 10.13 required

    • Vega FE*
    • RX Vega 64 Liquid*
    • RX Vega 64*
    • Vega 56*
    • Pro WX 7100
    • Pro WX 5100
    • Pro WX 4100
    • RX 580
    • RX 570
    • RX 560
    • R9 Fury X
    • RX 480
    • RX 470
    • RX 460

    macOS 10.14 Mojave Supported NVidia eGPUs - Only Kepler series GPUs are supported

    • GTX 650
    • GTX 660
    • GTX 670
    • GTX 680
    • GTX Titan

    *eGPUs require Mac OS 10.12 or above.

    Confirmed working Enclosures with Mac Pro 2013

    • Akitio Thunder2
    • AKiTiO Node
    • Asus XG Station 2
    • Blackmagic eGPU
    • Mantiz Venus
    • Razer Core X
    • Sonnet Breakaway 350

    Useful Links




    Cooling

    Outside of the extreme JMR PCIe-slot rackmount cases, Mac Pro 2013 cooling solutions remain pretty slim. Most users elect to place various laptop cooling pads under their Mac Pros (which do seem to help). If anyone has any information about physical mods or Mac Pro 2013 specialty cases, I'm all ears; please reach out to me (see the bottom of this post).

    Useful Links




    Repairs

    The Mac Pro 2013 earns the distinction of sporting a modular design. There's not a lot to say here, since iFixit gave it an 8 out of 10 for repairability and covers pretty much every part in its Mac Pro Late 2013 Repair Guide. If it can be done, they probably have a beautiful step-by-step pictorial guide for it.




    Mac Pro 2013 won't sleep

    MacRumors members note that Handoff can affect a 2013's ability to sleep. Disabling it seems to be the fix.




    Communities & Blogs

    You're not alone. There are more people out there than you'd think who still love the Mac Pro 2013.

    • MacRumors Mac Pro Forum - The center of the Mac Pro universe.
    • MacProUpgrade - a private but very popular Facebook group, primarily classic "cheesegrater" Mac Pro users, with some 2013 users.
    • Mac Pro Users - the other major Facebook group for Mac Pro users; smaller but still helpful, with the benefit of being public (no sign-up process, and it can be browsed without a Facebook account).
    • eGPU.io - The go-to place for eGPUs.



    Collected Articles




    Buying used Mac Pro 2013s

    Most forums, when this question is posed, say don't. The updated Mac Mini may have a soldered-on CPU and storage, but its Core i7-8700B is much faster than the 12-core Mac Pro in single-core performance and within spitting distance in multicore Geekbench scores. It also packs Thunderbolt 3 (double the bandwidth for the inevitable eGPU), comes with USB-C support out of the box, and doesn't have a history of frying itself. Plus, it's new, comes with a warranty, and is even smaller. Then there's the iMac 5k, which has an upgradable CPU, making it faster than the base iMac Pro when tricked out. I personally would not buy a Mac Pro 2013 when much better and cheaper alternatives exist, such as the 2009-2012 Mac Pros, which pack oodles more upgrades and stupidly better GPU options, or the aforementioned Mac Mini, which even with an eGPU would cost roughly the same as a lower-end used 2013. Unless used-market prices drastically change, the Mac Pro 2013's shortcomings are too great for me to ever consider one.

    However, this is a Mac Pro 2013 upgrade guide, and the Mac Pro 2013 is certainly a collector's/hobbyist machine with an extreme flair for quirkiness. So here are the big tips when shopping for one.

    First and foremost, confirm that the GPUs are working. Over the internet this may be tough, so I'd recommend sticking to a service with buyer protection, like eBay. Notably, many Mac Pro 2013 users experience random freezes, generally acknowledged by the community at large to be GPU related. Be extra sure to ask the seller whether the computer boots and the GPUs are fine; they may have had the GPUs replaced with working ones. Also, the lower the AMD GPU model, the better the chance it will remain problem free. Unfortunately, Apple stopped selling the D300 Mac Pros long ago, so you're better off tracking down a D500 model.

    Next up, many users have placed their Mac Pro 2013s on laptop coolers to help with thermals. Due to the exceptionally tiny case, there are no internal cooling hacks beyond turning the fan up with third-party software.

    Lastly, have an exit strategy. You may live a full, problem-free existence with a 2013 Mac Pro, but you may also end up with its GPUs failing. Apple closed its free GPU replacement program in April 2018, and internet prices list anywhere from $700-$1200 from Apple or authorized service centers to replace the GPUs. At that price, it's effectively cheaper to buy a replacement Mac Mini. Working GPUs on the third-party market are virtually impossible to find, and the rare ones that pop up fetch the price of Apple replacements. To be fair, this is the same problem laptop users face. While it's common sense: if you contract, freelance, or otherwise provide your own hardware, always have a plan that minimizes downtime. Despite the modular design, the most failure-prone component is the absolute hardest to replace due to the lack of inventory, and Apple quotes 3-5 days for a Mac Pro 2013 GPU replacement. This isn't to say it will fail, but there are plenty of horror stories on the internet. These could be from a relatively small, vocal group, but the general consensus is that the Mac Pro 2013 is not the most stable design.




    Changelog

    Oscar over the Mac Pro 2013

    Due to the ever-evolving list of possible upgrades and hacks, this guide is a living document, and the information contained here may change; I've included a robust log of recent changes to help repeat visitors discover new content. Making and maintaining this guide takes a fair amount of work, and feedback from users is greatly appreciated to make this the most accurate guide possible. If you have new information not included here, or suggestions, corrections, or edits, please feel free to contact me at blog@greggant.com. I get a fair amount of questions, and I try to answer them as best I can, though I'd recommend asking the MacRumors forum or the MacProUpgrade group first, as I'm just one person vs. the collective intelligence of a community. Notably, I do not own, nor have I ever owned, a Mac Pro 2013, so if anyone can provide more accurate information, please do!

    07/05/19 - Added notes on sleep issues, mild intro update.

    05/07/19 - A second update. Thanks to Brennan F and Daniel C for feedback on SSDs and eGPUs, and some copy editing to boot.

    05/07/19 - First release, on the one-year anniversary of my first Definitive Mac Pro Upgrade Guide. Fun fact: this guide is 2,300+ words, whereas my other guide is 13,000+. Part of that difference can be chalked up to having to cover five models spanning six years there; this guide covers another six-year span and only one model. It goes to show how upgradable the previous Mac Pros were and how much less Apple has cared about them since.


    12 weeks with Figma: A review from a developer

    Figma icon

    Introduction

    I think it's important to say what this review is and isn't, since it's a reflection of a certain perspective. Rather than tackling a full breakdown of all the features vs. Sketch, I'm approaching my opinions as a developer. As a front-end developer, I'm more hands-on with graphics utilities than most devs, capable of doing design myself but electing to let people who are better at it do it. Early in my career, I'd have described myself as a designer who could code, but for the last six years of my life, it's been the coder who can design. Throughout my career, I've seen a lot of utilities for designing webpages: PageMill, GoLive, FrontPage, and Dreamweaver, plus shoe-horned attempts with Photoshop, Illustrator, Affinity Designer, and even InDesign, Quark, and PageMaker. Name the asset, and I've probably been handed it. Name the design app, and more than likely I've toyed with it. Now that we've cleared that...

    A bit of history

    Web design and UI design have followed a strange arc. Once upon a time, we had Photoshop and Fireworks, Fireworks being more utility than design tool. Photoshop itself predates the internet as we know it today, debuting in 1990. For better or worse, Photoshop today still mostly feels like Photoshop 3.0, which introduced layers, although it wasn't until 5.0 that we had automation, layer effects, multiple undo, and editable (vector) type, joining 4.0 features like multicolor gradients, grids, PNG/PDF support, and actions... in 1996. To double down on this point: if you had stumbled into a time warp two decades ago and popped up in 2019, as a Photoshop user you'd have all the fundamentals. The internet, though, has changed wildly in scope and functionality during those same two decades, broadly replacing or supplanting entire industries (travel agents, music stores, print media, movie rental chains, and book stores, to name a few). This isn't a knock against Photoshop. Its purpose is revealed in its namesake; it wasn't meant to be a UX utility. However, the early web's restriction to fixed-width designs and bitmaps made Photoshop the go-to tool for web designers.

    Macromedia (the more web-savvy of the two design giants), sensing the shortcomings of Photoshop (especially in the world of optimization and GIFs), filled in the gaps with Fireworks in 1998. Outside of the oddity of Flash websites, the playing field for designers didn't change much on the design application side despite the evolution and adoption of CSS 2.0 and CSS 2.1 from the late 90s through the 2000s. Some designers opted for layout applications or Illustrator, with a desire to treat the webpage as a page, knowing the shortcomings of Photoshop for layout. As designers and developers, we entered a tacit agreement that 960px was the width of a webpage, which worked until we needed a mobile web.

    A New Challenger

    Like any large philosophical change, it is generally a reflection of adaptation to the state of the world rather than of attaining enlightenment first. Ideas do not exist in vacuums. It required a fundamental environmental change to create the necessity for a better way.

    In 2010, an unknown Dutch studio flying under the name Bohemian Coding released Sketch, the same year Apple introduced the world to high-density screens with the iPhone 4. The timing was impeccable. It was a vector app, but designed for presentation on pixels. While vector applications had long acknowledged that not every illustration's endpoint was print, they were still print-forward. The simple act of snapping points to pixels, and focusing on vector features that reliably exported SVGs, meant that the (finally) widely supported vector format could generate assets that looked good anywhere or be prerendered to a PNG or JPEG. Values for common properties like font stacks and colors were also displayed as CSS, which made developers happy too. By 2013, high-density displays made the jump to desktops with Apple's "Retina" display desktops. Websites didn't need to be just responsive; they also needed to be high resolution*. Sketch made it easier to tackle both problems. Thus, it largely dethroned Photoshop for web design. Photoshop was left to do what it does best: edit photos.

    * It shouldn't come as any surprise that the backlash against skeuomorphic design came in the wake of wide-scale support for web fonts, SVGs, and high-density displays.

    Since the rise of Sketch, there's been a cottage industry of ancillary utilities, from simple plugins to ambitious tools like Principle and Flinto for UX motion graphics/interaction design, all built around Sketch. The world of design and prototyping has exploded with tools like Adobe XD (and forgotten attempts like Muse and Proto) and the latest hip entry, Figma.

    All hail Figma?

    Figma Screenshot

    Figma seems like it might be as revolutionary as Sketch. Figma is a different beast as a web app, and it uses some avant-garde tech: WebAssembly to deliver a compiled binary that's far more performant than the usual web app. Being a web app also opens the door to something Adobe has never quite figured out: collaborative design. All designs are automatically stored in the cloud, which isn't a problem until it is. Instead of handing a client a PSD or Sketch file (or using a 3rd-party service like Zeplin or iDoc), you share a Figma URL. If they don't have Figma, no worries: the viewing mode works just fine and allows for exporting. There's even a free tier to get started on.

    Because files live on the web, any designer on a team can invite other users to view and even edit the same file... at the same time. Changes are immediate. One can even watch in real time as a designer corrects a design or quickly mocks up a new piece of content. There also isn't any worry about whether an individual has the latest version of Figma, as there are no separate versions. In this regard, Figma shines. Any time I'm viewing a design, I can rest assured that I'm viewing the current design (assuming the designer keeps to that flow).

    The app is surprisingly fast, in a web browser or even as an Electron app. In fact, Electron's deeper OS interactions are largely ignored, sans the system menus that correlate to the application. This means updates to Figma happen inside the Electron app, as opposed to requiring a new download. Updates aren't entirely silent, but when they arrive, they're quick. Even with the overhead that comes with requiring a browser to run, the WebAssembly core makes the app feel zippier than other popular Electron-framework apps like Slack and Atom, which can choke when your computer is under stress. Some designers have even stated that Figma is faster than Sketch. I find this claim dubious, but as a web app, I can't name anything faster. It's damn impressive. It's also a memory hog, and it can seriously slow down if you have more than a few people viewing a document and several apps open. There's no way to turn off the "multi-player" mode.

    As a design utility, Figma makes a mild departure from other vector applications, chiefly in how it handles vector drawing. Figma allows multiple paths to extend from a single point, a feature called Vector Networks, which, of all the innovative features, is honestly the most singularly interesting from a pure design sense. It's so simple when you see it in action, yet powerful. I have a feeling this will become widely copied if it hasn't been already. I'd rank it as even more impressive than the collaborative designing, as we've seen collaborative design before, inadvertently via Google's G Suite and in online games (while I haven't even played it, Minecraft's long appeal seems to be collaborative building).

    vector networks

    Pictured: Vector networks allow multiple paths from a single point

    That said, view-only mode does not allow a user to manipulate anything. When assets are handed off to me in Sketch, I often move things around and generally make a mess of the file: extracting, manipulating, measuring. Figma does not have a user like me in mind. Instead, all manipulations are meant to move toward a finalized design, not a deconstructed one. Sketch continually annoys me with its autosaving and lack of a real "Save As" function, but I've learned to live with it by duplicating a file and denoting it "final" vs. "edited" so I know which one I've messed with. Figma, when I'm not granted ownership, doesn't let me duplicate to a file I can manage and do my Jack-the-Ripper disemboweling to get the assets I want.

    Figma's handling of images isn't nearly as exciting. Placing images means uploading, which is a step backwards, but that's part of the process. Cropping is well done. For basic crops, unlike most vector apps that require creating a vector object and placing it as a mask, Figma lets you do a quick-and-dirty crop right on the image. The area outside the crop is displayed as transparent while adjusting; it's a nice touch, as you can easily see the remainder of the image and its contents. There are also CSS-like options to cover, fit, and contain within the basic crop. I especially like this, as it's akin to how a web browser treats an image with object-fit. Opening large projects, though, can take a few seconds for images to appear as they download. It's tolerable.

    Figma also allows for basic prototyping; from what I've seen, it's more akin to MarvelApp than anything as robust as interaction-design tools like Flinto and Principle. It's functional, easy to use, and comes baked in, as opposed to Sketch's requirement of a separate utility. It's worth mentioning, but I haven't played with it much. It's reasonable that Figma could be used for static wireframing/prototyping.

    Then the rough edges start to show

    Figma Screenshot

    Currently, there isn't an export option for the original asset. You can export your cropped image, but not easily grab the original. This isn't so great if you're using an image as a background image, or one that'll be sized with object-fit. Plus, there's no real "native" resolution displayed. If I export a 2x or 3x version of an image, will I get a blurry version? Who knows; there's no native export or native resolution listed for an image. I can't copy and paste out of it either, as Figma's non-native experience means the clipboard isn't truly native. Sketch allows you to copy as CSS, copy as SVG, copy styles, or copy the object itself. Figma does not.

    In fact, Figma sucks as an asset manager. Unlike a Photoshop smart layer, or at least Sketch's ability to grab the source contained within the Sketch file, Figma is a black box.

    Figma also has a weird habit when copying out text (the only asset you can extract from a document to other applications): it often adds extra carriage returns at the beginning. It's hardly a deal breaker, but it adds an element of unpredictability and cleanup. This isn't something one normally faces in design applications, so it's worth noting.
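
    If you're scripting around it (say, a web form that accepts text pasted from Figma), the cleanup is at least trivial. A hypothetical paste handler follows; the event wiring is illustrative, not anything Figma prescribes:

        // Hypothetical example: strip the stray leading newlines from pasted text.
        document.addEventListener('paste', (event) => {
          const raw = event.clipboardData.getData('text/plain');
          const cleaned = raw.replace(/^[\r\n]+/, ''); // drop leading carriage returns/newlines
          console.log(cleaned); // use the cleaned text in place of the raw paste
        });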

    Probably the issue that draws the most ire from me is the stupidity of the onscreen measuring, which is obsessed with the document borders rather than relationships between objects. Figma loves to tell you how far an object is from the edge of the screen but can be a pain when showing relationships. Highlight one object, hover over another, and it'll show you the distance, but sometimes to the corner and not the edge. It's also really bad with text, only calculating to the text's container and not the text itself. This feels significantly behind Sketch.

    Then there are SVGs. Figma is annoying: select any two vector elements, and you'll get two separate SVGs. Exporting a group is a mystifying experience, and all strokes are converted to fills, making for not-the-smallest SVGs. Trying to export an icon led a designer on my team to re-draw the icon in Illustrator. That's... not good. Exporting gets even dumber: select two objects? You'll get two files. Oy. Sketch it is not. The rulers seem surprisingly dumber too.

    It also doesn't win any awards for multipage layouts, which took Sketch a while to figure out; a multipage design exists as, basically, Pages, in an all-or-nothing design. It's there, but there's little beyond what one is used to from Illustrator or Sketch.

    Also, being a web app means a complete lack of plugins. Unlike, say, Atom, which is probably the pinnacle of Electron applications, Figma has no architecture for plugins... for better or for worse. This isn't surprising, but one of the big draws of Sketch is major services releasing plugins for it, like Sketch for Marvel and Craft by InVision. This may scale in importance depending on who you are, but Sketch has plugins; Figma does not.

    Figma also touts that it outputs CSS code, and more of it than Sketch. Output it does, but useful it is not. It gets the basics right: font-family, border, color, font-size, which is great. The rest of what it outputs is a head-scratcher: a bunch of absolutely positioned insanity that, I suppose, in some sort of nightmarish world, one could use to make a non-responsive website, but for just about any sane person it's useless. It's a slightly lesser version of Marketch.

    A higher tier

    Figma also offers a secondary tier... a very expensive one at $45 a month per user, offering a suite of features I did not have access to and thus cannot comment on, beyond that they exist, they are expensive, and they are almost exclusively related to access controls for designs and ownership.

    Personal Take

    Figma's best features over Sketch are:

    • Ability to share projects easily, with no software to download for viewing.
    • Cross-platform.
    • Collaborative design on the same file in real time. I don't know anyone who works this way, but it's easy to see the value.
    • Vector Networks. So simple yet so powerful; why haven't other vector drawing utilities had this? Truly a leg up on other vector apps when it comes to pure illustration.
    • A free tier that's meaningful for people looking to learn (no private designs, version history limited to a month, and a limited number of people per project).

    Sketch's best features over Figma are:

    • Predictable and clear exporting.
    • Native OS functionality, making for sane importing and copying and pasting.
    • Exceptionally strong community support via mature plugins and visual libraries.
    • More adherence to the SVG standard.
    • Better text editing.
    • Better onscreen measuring.
    • Better asset handling.
    • No permissions need to be granted to gain full access when you're handed a file.
    • No subscription required for levels of access, and less expensive ($99 for a year of updates with no obligation to keep subscribing, vs. $144 a year plus a required subscription for private designs, or $45 a month for organizational users), with Sketch Cloud access included for free.
    • Most of Figma's big features are matched by Sketch plugins (sans real-time collaborative design and Vector Networks).

    There are plenty of head-to-heads of Figma vs. Sketch, but as a developer, at the end of the day, I currently prefer Sketch, versioning issues and all. I can pull, edit, and pick apart assets, and the exporting is vastly superior. The auto-measuring in Sketch is also far more useful: it measures to edges, and the spacing seems more intelligent. Sketch also feels like a native app, whereas Figma comes as close as I've seen for a web app, in the same tier as Atom and Slack, but with those "this isn't right" moments, like when copying and pasting. Sketch's cloud feature suffers the same issue Creative Cloud does: everyone needs to buy in to fully appreciate it. The best analogy I can offer: Figma is Google Docs; Sketch is Apple's Pages. (Don't overthink it too much.)

    It's also fair to say I like Figma, but I don't love it. It's not nearly as polished as I'd expect from an application that has the audacity to ask for either $144 or $540 a year for a seat. Figma, you're interesting, but $144 is pricey, and holy hell, you are not worth $540 a year.

    Sketch isn't problem-free. I still hate its file saving, and manipulating text still isn't as powerful as I'd like. Auto-save is great, but it also makes for strange moments when you want to "Save As" and leave your original intact. I imagine this could be because of the paradigm I grew up with in the 90s.

    The future lies somewhere between Sketch and Figma. Some features are a novelty: I can see the value of having multiple designers work on a singular design, but I also wonder how many projects need this ability. That said, the killer app is handing a URL to a client or internal team member and knowing that it will remain current, and that anyone can view it natively, with or without a license. That alone might tip the scale for a good number of users, and I don't blame them.


    After a decade, Firefox is winning me back.

    Something happened that would have seemed unthinkable even a year ago. To say I have a lot of opinions about web browsers is putting it mildly, as someone who's spent the last decade of his life coding for them. Like most web devs, I went Chrome around 2009-2010 and stayed. Chrome started feeling less like a choice and more like a natural law. I already felt like too much of my life was encapsulated by Google, from Gmail to Maps to Chrome, not to mention my dev accounts for API keys and the usual dev-related nuggets of info Google has on me. I've tried to de-Google myself, but each active choice seems to end with me crawling back. I used Bing for years, and still sorta do, but it doesn't work for my daily workflow. DuckDuckGo lasted all of a week before I switched back to Google. Apple Maps has been mildly successful as an iPhone user, though I still end up using Google Maps 75% of the time, making it my most successful de-Googlification attempt. I haven't even bothered with Gmail. However, something happened: my primary home browser is now Firefox. Unlike previous attempts, I didn't wake up one day and eschew Chrome; gradually, over time, I've found myself using Firefox more. Why?

    Reason #1: Firefox Quantum is radically more performant

    The folks at Mozilla knew performance was an issue, even more so on macOS. In a previous generation, the best way to experience Firefox was Chimera/Camino, a beautiful port of Firefox's Gecko engine to Objective-C. Firefox has never felt Mac-like, and it's always been laggier than other browsers. When Safari shook up the browser world, it was clear how much better an experience Safari was than Firefox on OS X. It was fast. Ever since Safari, Firefox has been a second-tier experience on macOS. The lack of multi-threaded support was a big kicker, as WebKit added it years ahead of Mozilla. Quantum brought this with its new Stylo CSS engine, and it shows. It's fast. Is it faster than Chrome or WebKit? Probably not, but it's damn close.

    Reason #2: Firefox has containers!

    I'm not much for social media. Name a service, and I probably don't use it (although I do like Goodreads and Untappd). I do, however, have a Facebook account dating back to when I first signed up in college, and Facebook is easily the most problematic of the social media companies. Firefox Quantum has a concept called containers: a way to isolate browsing experiences (cookies/caching) from other portions of the browser. The Facebook container effectively puts Facebook in jail, and I love it. Sorting sites into containers also keeps various experiences separate from one another to prevent cross-site tracking. It's effective and wonderful. As much as Firefox touts privacy, it had never really differentiated itself from Apple on this front until containers.

    Reason #3: Better UI

    It's a goofy complaint, but the UI of Firefox before Quantum was relatively ugly and had a lot of wasted space. I wouldn't call the new Firefox beautiful, but it's solid and minimal, and I actually don't mind it at all. It has a distinctly Windows 10 look, but it's a big step up.

    Reason #4: DNS over HTTPS

    Chrome and Firefox are both adding this, but Firefox has it without flags right now. DNS has been overexposed for years, so it's about time we had DNS over HTTPS.
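
    For the curious, DoH can be switched on today from about:config. Here's a sketch of the two relevant prefs; the values reflect my understanding and are worth double-checking against Mozilla's documentation:

        network.trr.mode = 2    // 2 = prefer DNS over HTTPS, falling back to regular DNS; 3 = DoH only
        network.trr.uri = https://mozilla.cloudflare-dns.com/dns-query    // the resolver endpoint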

    Bonus reason: Firefox screenshots

    Honestly, I don't use this as much as I should, relying instead on macOS's wonderful built-in screenshotting, but it is a nice feature.


    The limitations of integrating Pardot Forms into React or a modern framework

    Here's a post that hopefully saves someone a few hours of googling. I have a client with a React-based kiosk application who wants to use Pardot Forms to capture leads. This seems like a relatively straightforward, even reasonable ask, and I assume you, dear reader, thought so too. So you've probably tried something like this, posting to the form handler like some jackass who expects a modern endpoint to behave, and you envisioned and/or wrote something like the following in beautiful ES6/ESNext syntax:

    submitSignup(e) {
      const data = new FormData(document.getElementById('signupForm'));
      // Note: fetch() has no 'dataType' option (that's jQuery's $.ajax),
      // and this cross-origin POST gets blocked with a CORS error anyway.
      fetch('url-to-pardot-form-handler', {
        method: 'POST',
        body: data,
      });
    }

    This seems like it should work but, of course, it does not. Instead, you'll be greeted with a wonderful CORS error, as Pardot form handlers do not have any CORS controls. There's little information out there, and the developer documentation is sparse, but I was able to unearth the following from the Salesforce docs:

    Submissions Using Ajax

    Pardot doesn't support submitting data to form handlers via Ajax requests. When attempting to submit data to a form handler using Ajax, you will likely see errors like:

    XMLHttpRequest cannot load {www.site.com/FormHandlerURL}. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin '{page from which form handler should be getting submitted on client's website}' is therefore not allowed access.

    This is what's known as CORS (Cross-Origin Resource Sharing). Pardot doesn't currently support CORS or JSONP for form handlers. It is possible to simulate a JSONP response by setting the Success and Error URLs for the form handler to be JavaScript URLs that execute Success and Error callbacks, respectively.

    So the answer is that Pardot doesn't support AJAX, and the only way around it is to hack the form responses to trigger JS scripts. I managed to find a GitHub project where a user created a Pardot Form AJAX Handler, which shows an example with callbacks. Oy.
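
    If you don't need to read the response (for example, you optimistically show a "thanks" state either way), one workaround is a fire-and-forget request using fetch's no-cors mode. To be clear, this is a hack I'm sketching, not anything Pardot documents: the POST still reaches the form handler, but the response is opaque, so you can't verify success from JavaScript.

        // A minimal fire-and-forget sketch; 'url-to-pardot-form-handler' is a placeholder.
        async function submitSignup(e) {
          e.preventDefault();
          const data = new FormData(document.getElementById('signupForm'));
          await fetch('url-to-pardot-form-handler', {
            method: 'POST',
            mode: 'no-cors', // opaque response: no CORS error, but no readable status either
            body: data,
          });
          // Pair this with the Success/Error URL callback hack if you need real confirmation.
        }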

    Bonus: As a total newbie to Pardot, I found these videos helpful for navigating the Pardot interface. They're out of date, as the UI has changed, but the core features can mostly be found in the same places.


    The Mac Pro should be the most boring Mac.

    At the event, Apple also plans to debut new software features for its devices, including a dark mode for easier nighttime viewing and new productivity tools for the iPad. The company has also internally weighed previewing a new version of the high-end Mac Pro, according to people familiar with the deliberations.
    - Apple Plans on Combining iPhone, iPad, Mac Apps by 2021, Mark Gurman, Bloomberg

    Buried in the bombshell is something of particular interest to me and this blog. I find this worrisome; Apple really doesn't seem to get it. The Mac Pro should be a boring box with the latest internals, a few drive bays, and PCIe slots, with big numbers that only people like myself care about. What it shouldn't be is worthy of a press event, a paradigm shift, or some other Jony Ive goofball design. Want a beautiful, shiny, "wow-your-clients" pro computer? That's the iMac Pro, not the Mac Pro. What people want is user-upgradable storage, RAM, and PCIe: for NVMe storage, for GPUs, for esoteric I/O upgrades. How do I know this? There's certainly demand, as well over 40,000 different users have accessed my Mac Pro Upgrade Guide, and my blog isn't even a blip in the digital ocean.


    Testing accessibility with the CERN WorldWideWeb browser

    Here's a fun one for any web developers out there: you can play with a JS reconstruction of the WorldWideWeb browser (the first web browser) for the NeXT OS. Go to worldwideweb.cern.ch.

    It's fun in its own right, but the CERN browser is also a good measuring stick of how accessible a website truly is without CSS or JavaScript, or even the full HTML 1.0 spec. Without any support for POST methods, WorldWideWeb can't even interact with forms, but information is often still visible.

    Take, for example, my blog. It's easy to consume, meaning anything can traverse and index it with minimal effort, and arguably anyone with an out-of-date browser can access it. Incredibly, many news sites are usable, albeit awkwardly: CNN.com and the NYTimes fare well. ABCNews does not, nor will anything requiring a form, like Google.com.

    Information architects should take note; there's something to be said about how resilient markup languages are.


    Playing Dune 2 and Dune 2000 on Mac OS (and Command and Conquer Tiberian Dawn and Red Alert)

    Every now and again, I get a hankering for retro gaming, and it ends up on this blog. I never played Dune II: The Building of a Dynasty on a PC, only the Sega Genesis port, Dune: The Battle for Arrakis, so it was news to me that you could play Dune II on macOS. I assume anyone reading this probably knows the place Dune holds in gaming history: it's largely considered the title that defined the real-time strategy (RTS) genre, or even the first real-time strategy game (even if that's not entirely correct).

    I can't say I have a special affinity for the genre, as pretty much the only other RTSes I've played are the original Command and Conquer and Warcraft 2, but I always liked Dune: The Battle for Arrakis and have revisited it via emulation a few times. I hoped Dune 2 or Dune 2000 would end up on a service like GOG.com, but sadly, neither has. Thanks to open source, both Dune II: The Building of a Dynasty and its sequel, Dune 2000, can be played on macOS, natively and with some modern improvements.

    Disclaimer: By the letter of the law, abandonware isn't 100% legal, but there's no real legal avenue to obtain these games, each over two decades old. I don't see a moral quandary here, but you can always obtain the original game disks if you see fit.

    Dune II using Dune Legacy

    Dune Legacy on macOS 10.14

    Dune Legacy gives a nice modern twist to the original Dune II, addressing its shortcomings with better AI, head-to-head play, the ability to group-select units, more hotkeys, modern resolutions, HD graphics, and so on.

    1. Search "Dune II Abandonware" in your favorite search engine, it'll come up on many sites. Download it.
    2. Download Dune Legacy
    3. Open the DMG, and drag the Dune Legacy app to your Applications folder. Also, decompress the PC copy of the abandonware Dune II
    4. Right Click the Dune Legacy App, and click Show Contents. Open within the app, Contents -> Resources
    5. Drag all the .PAK files from decompressed Dune II into the Dune Legacy -> Contents -> Resources
    6. Double click to start, you most likely will need to whitelist it in GameKeeper. (Go to system preferences -> Security and Privacy)
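
    If you'd rather do step 5 in the Terminal, here's a hypothetical one-liner, assuming Dune II was unzipped to ~/Downloads/DUNE2 and Dune Legacy sits in /Applications (adjust the paths to match your setup):

        # Copy the Dune II data files into the Dune Legacy app bundle (paths are assumptions).
        cp ~/Downloads/DUNE2/*.PAK "/Applications/Dune Legacy.app/Contents/Resources/"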

    Dune 2000 using OpenRA (and Command and Conquer)

    OpenRA Dune 2000 on macOS 10.14

    OpenRA is built for the Red Alert series but also includes Dune 2000 support, with modern screen resolutions and minor tweaks much like Dune Legacy. Unlike some of the other OpenRA ports, the Dune 2000 mod focuses on delivering a recreation rather than improvements.

    1. Optional: Nab the Dune 2000 ISO for Windows from a site like MyAbandonware.
    2. Download OpenRA.
    3. Unless you've previously installed Mono, you'll also need to download Mono, an open-source implementation of Microsoft's .NET Framework.
    4. Install Mono and then OpenRA. The Dune 2000 mod will automatically download the necessary graphics and sound assets, but if you'd like the FMV movies, you'll need the ISO; these can be installed at any time from the content management screen.
    5. Also, see d2kplus for mods, some of which are supported in OpenRA. You'll most likely need to whitelist OpenRA in Gatekeeper (go to System Preferences -> Security & Privacy).

    For the Command and Conquer series, the install process is the same: find the ISOs for the music and movies, and install Mono and OpenRA. Enjoy!


    The iMac hasn't been updated in 602 days

    As noted in the MacRumors Buyer's Guide and discussed in the MacRumors forums, it has now been 602 days since Apple last updated its iMac lineup, a new record for the longest span between iMac refreshes ever. The previous record was 601 days between October 2015 and June 2017 refreshes. - Joe Rossignol, MacRumors

    Ouch. Just as I was spit-balling the future of the Mac Pro, Apple has let another computer rot in the supply chain. Apple's major press opuses should be reserved for large-scope updates, with frequent refreshes receiving much less fanfare, assuming the events are even the bottleneck. Really, we should see nearly yearly CPU and/or GPU refreshes to keep in step with PC manufacturers. This isn't a particularly astute observation, or even remotely original. I'm guessing it has more to do with maximizing profit vs. the hassle of changing the magical JIT manufacturing. It's not a hopeful sign when even Apple's signature desktop is given the cold shoulder. There's a feeling among the Mac faithful that Apple doesn't care about the Macintosh. With numbers like these, it's easy to see why.


    Year of the Mac Pro?

    Today the Macintosh is 35 years old. Rather than a retrospective, I'm more interested in the future of the Macintosh, as the most trafficked blog post I've written is an upgrade guide for the classic Mac Pros, which are now 7 to 13 years of age depending on the model. In mid-2018, Apple announced that the Mac Pro would be revamped in 2019, and yet many Mac loyalists were irked that Apple didn't announce a Mac Pro at WWDC. It was never going to be a one-more-thing; it's very unlike Apple to announce a schedule for a future product.

    Apple's desktop lineup is more crowded than it's been in some time, with the revamped Mac Mini, iMac, iMac Pro, and the Mac Pro (now half a decade without an update). Apple has mainly been a two-or-three-computer company per form factor, squarely divided between desktop and laptop (and briefly servers and eMacs), since roughly 2005 for desktops (Mac Mini) and 2008 for laptops (MacBook Air). The formula has been (outside of the ill-fated G4 Cube):

    Classic Era (2000 - 2005)

    • Entry Level - (iMac/iBook)
    • Professional - (PowerBook / PowerMac)

    Intel Era Line Up (2006-2016ish)

    • Budget/Small form factor - (Mac Mini/MacBook Air)
    • Mid-level - (MacBook / iMac)
    • Professional - (MacBook Pro / Mac Pro)

    Current Era (2017ish - Current?)

    • Budget - (Mac Mini, MacBook Air, iMac 21 Standard Definition, MacBook?)
    • Mid-level - (MacBook? MacBook Air? / MacBook Pro 13 / iMac 4k/5k)
    • Professional - (MacBook Pro Touchbar / Mac Pro / iMac Pro)

    Any iteration of the Mac Pro is going to confuse the Mac lineup further. The MacBook Air, MacBook Pro, and MacBook are all within $100 of each other for base models, and within 1.4 pounds, 1 hour of battery life, and 1 inch of screen size. The Pro is clear-cut as the performer, the MacBook as the traveler, and the Air as the bridge... for some reason. It'd only really take adding a second port (perhaps Thunderbolt) to the MacBook to negate the underperforming Air.

    The iMac Pro is undoubtedly a powerful machine, but at a king's ransom, starting at $5,000. If a Mac Pro lands with anything resembling a dedicated PCIe slot and user-serviceable RAM and CPUs, I can't imagine anyone opting for an iMac Pro, especially if it starts at the still-very-expensive $3,000 entry point of previous Mac Pros. The iMac Pro is powerful, but at its price point, with a non-upgradable GPU and terrible user serviceability, it's hardly a compelling buy. Then there's the Mac Mini: if the Mac Pro keeps its current G4 Cube-influenced, hostile-to-power-users design, anyone who can survive on a more modest CPU and 64 GB of RAM is likely to eat the cost of a Thunderbolt PCIe case (or do the same with an iMac 5K). Based on my interactions with the Mac power users of this planet, we're all after the same thing: PCIe, user-serviceable RAM, and upgradable CPUs and storage. This really should be Apple's most straightforward release year over year; the form factor of the classic Mac Pro is perfectly fine. Dust it off, update the ports from FireWire/USB 2.0 to Thunderbolt and USB 3.x, slap in a modern motherboard with the latest specs, and call it a day. Ideally, Apple would sunset the iMac Pro as a nice experiment in industrial design. As much as Apple dislikes user control, the one segment where the users know better than Apple is professional work; see the fiascos of Final Cut Pro X and the Mac Pro 2013, which led the deep pockets of Hollywood to abandon Apple for the likes of PCs and Avid. The iMac's DNA has never been to be a performance monster, although it evolved from entry-level to a nice mid-level computer sporting a beautiful integrated display. If the Mac Pro is modular, then the iMac Pro becomes the next G4 Cube.

    Recently, though, with the reintroduction of the MacBook Air, Apple has shown a willingness to confuse the Mac lineup with no clear price points. This should be a bad thing, but it isn't for the Mac Pro. So where does that leave us? I'm mildly hopeful. Just mildly.

    Edit 01/29/19: The NYTimes writes A Tiny Screw Shows Why iPhones Won’t Be ‘Assembled in U.S.A.’, which blames Mac Pro production troubles on the lack of a certain screw. The idiocy here is the over-engineering, and probably a healthy dose of hostility toward user serviceability. To echo myself: Right To Repair Law Should Be The Rally Call of Every Mac / iPhone User.