DiskWarrior: 'Directory must be rebuilt from a recovery disk because of MacOS security restrictions' error and fix

    DiskWarrior, even two decades and change later, is still a lifesaver. I found my work Time Machine HDD had become corrupted, and Apple's Disk Utility couldn't fix it. However, I received a "Directory must be rebuilt from a recovery disk because of MacOS security restrictions" error from DiskWarrior. Fear not: you do not need to rebuild from a recovery disk. Under macOS 10.14+, you just need to grant DiskWarrior the proper permissions.

    Step 1: Open the Security and Privacy panel in System Preferences

    Go to the Security & Privacy preference pane in System Preferences and click the Privacy tab. Scroll down on the left-hand side to Full Disk Access.

    Step 2: Unlock and click the Plus button

    You'll need to click the Unlock button and enter your password to enable the greyed-out +/- buttons. Once you've entered your password, click the + button as pictured.

     Privacy tab in Security & Privacy Preference pane in the System Preferences

    Step 3: Locate Diskwarrior and add it

    By default, DiskWarrior installs into the Applications folder. Locate it there (or wherever you store your copy of DiskWarrior) and click Open.

    The + button action in Privacy tab in Security & Privacy Preference pane in the System Preferences

    Once you've completed this, you should see Diskwarrior in the Full Disk Access list.

    Diskwarrior in Full Disk Access in Privacy tab in Security & Privacy Preference pane in the System Preferences

    Congrats! Relaunch DiskWarrior, and you should be able to rebuild the directory. Fair warning: Time Machine repairs can take hours and a lot of RAM.


    Beats PowerBeats Pro Impressions

    My girlfriend won a pair of PowerBeats Pro at her company holiday party and gave them to me, as she already has AirPods. I've never personally owned a set of wireless earbuds; it's never been a huge ordeal for me to use wired headphones, and Bluetooth still kind of sucks. Wired headphones also have the benefit of not requiring a charge, meaning one less thing to charge repeatedly in my life. That said, the place where wireless headphones are attractive is working out, and the Beats are designed for exactly that activity. Here are my collected impressions as someone who's finally slipped into wireless.

    • The Beats aren't nearly as comfortable as other ear-wrap designs. The early Shure earbuds, the E2cs, used a cable ear-wrap design I found more comfortable.
    • The ear hooks are moderately goofy to put on; even after a week, I'm still a bit clumsy putting them on and taking them off.
    • They rest against the ear instead of in the ear. I bought foam tips hoping to change this. They only minimally block out the outside world, making them not the best for noisy environments like riding a bike to and from work or the gym.
    • Apple has completely screwed over the consumer by forcing wireless headphones. I still hate it.
    • Apple knows the Bluetooth experience is shitty. Hence it has developed its own proprietary system with its W1 and now H1 chipsets, as opposed to making it an open standard or at least something like MFi, where manufacturers can produce their own. The pairing experience is wonderful on the iPhone and the way wireless headphones should work, but Apple has purposely kept the H1 experience limited to the AirPods line and a few select Beats headphones.
    • The 3D presentation of the beats devices on iOS is garish and feels like an advertisement. If I'm at the point of pairing the device with my iOS device, it's pretty obvious I know what it looks like as it's in my hand.
    • I've only owned one pair of Beats headphones (the original earbuds), which I bought 15 years ago for $99 at an Apple Store. These aren't nearly as overblown in the soundstage but are certainly bass-heavy, yet slightly thin, with an airy/breathy feel. I'd say the audio quality is less than that of the 1More Triple Drivers, which MSRP for $99 but have tumbled down into the $65-70 range. The PowerBeats Pro's sound is simply just OK: an exaggerated soundstage, overly bright treble, and meaty bass at the expense of rich mids.
    • The charging case is big. The magnet functionality, though, is undeniably cool and handy.
    • The controls work just like a cord on a headphone, although I find myself hitting the play/stop button when inserting into my ears.
    • Auto-pausing when removing an earbud is nice.
    • Call quality is about as good as I've experienced from any type of headphone. It's honestly my preferred method for speaking on the phone over speaker now as there's no risk of cable noise. My girlfriend reported "clicking sounds" like I was typing when I was walking outdoors at the beginning of a call but then said it stopped shortly after. Otherwise, there's been little complaint from anyone I've talked to.
    • With corded earbuds with mic controls, sweat often triggers the play/stop button; it's nice that this doesn't happen when working out.
    • I can charge my iPhone and listen at the same time. Granted, I'd also be able to do this with wired headphones if Apple hadn't actively blocked case makers from producing cases with audio jacks.
    • The looks are in the eye of the beholder. I'm indifferent. The Beats logo doesn't mean much to me, and I do like that I'm not wearing the same white earbuds everyone else seems to have. That said, the AirPods Pro seem like a better product.
    • The lack of noise blocking is my biggest point of contention with these.
    • The foam tips are better than the rubberized tips they ship with as they reduce the "thud" effect from walking.
    • Battery life seems pretty good.
    • The range on the PowerBeats Pro is much better than expected; I've walked 50 feet away and still had a connection in my large office.
    • The PowerBeats Pro immediately showed up in macOS as already authorized, which is a nice touch, but there isn't a snappy "hand off," which would have been nice. I don't expect seamless, uninterrupted audio, but it'd be nice if I could set up auto-switching. I'd envision any non-system audio stream from the iOS or macOS device triggering the take-over (assuming the Beats aren't already receiving an audio stream): pause your audio on your iPhone and hit play on your Mac, and the Beats auto-pair to the Mac, then re-pair to the iPhone when the Mac is paused and the iOS device plays an audio stream.

    Ultimately, had I paid $200-$250, I would have experienced some buyer's remorse. These are not my daily-driver earbuds, as they're not as comfortable, nor do they block out sound (which is great when doing noisy chores like vacuuming or biking). They're (mostly) a pleasure to use, but the fit/comfort is subpar and the lack of noise blocking is surprisingly bad. I imagine it's AirPods Pro or bust for Apple headphones.


    Just a quote...

    “I never understood wind, I know windmills very much, I have studied it better than anybody. I know it is very expensive. They are made in China and Germany mostly, very few made here, almost none, but they are manufactured, tremendous — if you are into this — tremendous fumes and gases are spewing into the atmosphere. You know we have a world, right?”
    - Donald Trump, Dec. 21, 2019

    From the man who thinks you can nuke a hurricane comes this verbatim quote.


    Setting up Operator Mono in Webstorm (or other Jetbrains/IntelliJ IDEs)

    I noticed there aren't any clear instructions on setting up Operator Mono in WebStorm or any of the other JetBrains IntelliJ IDEs. Changing the font doesn't work quite as one would expect (notably the italics), but you can get it working with a little TLC. Out of the box, my copy wasn't supporting the alternate characters for italics and needed to use a fallback font for them. WebStorm, like other new IDEs, supports ligatures: characters not regularly found in many fonts that combine common glyph sequences, saving space and making them more distinguishable. For examples of ligatures, see Ligatures & Coding by Andreas Larsen on Medium.com. Adding the ligatures also seems to fix Operator Mono's italic character set, although it could be a difference between Operator Mono and Operator Mono SSm (ScreenSmart).

    1. Download the latest release from the operator-mono-lig project and decompress it
    2. Copy your Operator Mono fonts into the original directory located inside the newly created folder, operator-mono-lig
    3. Next, open a terminal window and install fonttools: pip3 install fonttools
    4. Navigate to your newly created operator-mono-lig folder in the terminal and run npm install
    5. Then run ./build.sh from the root of the project (the terminal steps are consolidated below)
    6. Your newly created fonts will be located in the build directory
    7. Next, install your fonts. The easiest way is to select all the fonts in Finder, double-click, then click Install. They'll appear as a new font family, Operator Mono SSm Lig, or something similar
    8. Go into WebStorm, open Preferences -> Editor -> Font, and select your newly created font
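
    Consolidated, the terminal portion looks something like this (a rough sketch, assuming the release decompressed into a folder named operator-mono-lig and that you've already copied your Operator Mono fonts into its original directory):

    pip3 install fonttools   # font tooling dependency
    cd operator-mono-lig     # the decompressed release folder
    npm install              # install the project's node dependencies
    ./build.sh               # the generated fonts land in the build directory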

    To get WebStorm to behave like Atom or VS Code, you'll need to manually edit the code styles to use the italics. It only takes a few minutes. I recommend viewing code within Atom, using Operator Mono, as a reference point, as Atom makes very good use of Operator Mono's italics. I also have guides for Setting up Operator Mono for Atom and Setting up Operator Mono in Coda.

    Troubleshooting: I noticed that the original Operator Mono font I had didn't work; I redownloaded Operator Mono SSm, and it worked. (SSm is the limited weight set that removes several weights, like Thin and Ultra.) File names cannot have any spaces. I'm also using the Material UI theme for WebStorm, which may or may not make a difference.


    The Mac Pro Buyer's Upgrade mini-guide

    I try not to recommend hardware based on my own preferences in the Definitive Mac Pro Upgrade Guide, but here are my opinions for anyone looking for a cheat sheet on what to buy.

    CPU

    Mac Pro 4,1/5,1 configurations vary quite a bit, but there are really only two CPUs to consider due to pricing. The X5680 is cheap: even for a dual-CPU Mac Pro, it's roughly $70 to purchase two of them, making the pair almost the same price as a single X5690. The X5690 is the best CPU a Mac Pro 4,1/5,1 can house.

    GPU

    GPUs on the Mac Pro are limited to the AMD sphere for 10.14 Mojave and 10.15 Catalina. The RX 580s are often floating around for cheap, and they're a good entry-level card. The RX 560s and 570s are even cheaper, but the RX 580's price-to-performance is hard to beat. The Vega 56 is probably the best overall value, as it's performant, can be flashed to a Vega 64, and lands just shy of the Vega 64 in performance after flashing. The Radeon VII is the king and hard to come by. At $700, it's expensive but much mightier in compute benchmarks than the 5700 XT, making it a much more well-rounded card for video editing and other GPU-accelerated non-gaming tasks.

    • Good: RX 580 $100 (used) or $185 (new)
    • Better: Vega 56 $225 (used) or $300 (new)
    • Best: Radeon VII $600 (used) or $699 (new) (Mojave and above only)

    Soon the Radeon 5700 XT, which occupies a space between the Vega 56 and the Radeon VII, will be supported on the Mac. The 5700 XT is great for gaming, but its compute scores are low; it performs much worse than the Vega 64 in Blackmagic's DaVinci Resolve.

    Storage

    There are a lot of storage options for the Mac Pro. The Mac Pro doesn't support bifurcation, so inexpensive dual-slot PCIe NVMe cards are a no-go. The SanDisk Ultra 3D hits the price/performance intersection nicely, besting a lot of the really cheap SSDs like Kingston's, although the Samsung 860 is a better SATA drive. Moving the drive to a SATA III card doubles the max transfer speed over the Mac Pro's internal SATA II ports. The HP EX950 is another drive splitting the middle between price and performance, within spitting distance of the Samsung 970 Evo. NVMe requires a firmware flash for 4,1/5,1s and much more legwork for 3,1s.

    Memory

    Not a lot to say: for the 4,1/5,1, buy 1333 MHz memory, and go for 16 GB DIMMs if you can afford them.

    I/O

    There's no reason to mess around with the cheaper solutions; Sonnet USB cards are problem-free and do not require external power. The Allegro Pro and Allegro USB 3.1c are the same, featuring two USB 3.1 controllers at 10 Gbps each (2.5 GB/s combined) across 4 ports; the only difference is the interconnect. The regular Allegro has a single USB controller, so all of its ports share 5 Gbps of total bandwidth.

    Wireless

    Wireless is slightly annoying, but there are three options: use a PCIe card, use a mini-PCIe card, or buy the parts online separately. Honestly, it's best to just read the very long upgrade guide.


    The disrupters will be "disrupted"

    This example is not a hypothetical. The meal-kit company Blue Apron revealed before its public offering that the company was spending about $460 to recruit each new member, despite making less than $400 per customer. From afar, the company looked like a powerhouse. But from a unit-economics standpoint—that is, by looking at the difference between customer value and customer cost—Blue Apron wasn’t a “company” so much as a dual-subsidy stream: first, sponsoring cooks by refusing to raise prices on ingredients to a break-even level; and second, by enriching podcast producers. Little surprise, then, that since Blue Apron went public, the firm’s valuation has crashed by more than 95 percent. - Derek Thompson, The Millennial Urban Lifestyle Is About to Get More Expensive, The Atlantic

    I don't usually do much news commentary but I've been skeptical of the gig-economy and mommy-services for ages.

    The bigger question for me is: how did Silicon Valley start-ups manage to fleece investors for this long? The example that jumps to my mind is MoviePass, which was hemorrhaging money so fast that it went as far as to change user passwords to keep users from using its service. Many of these services existed as a "hack" via legal shenanigans. Labor laws are catching up, as Gavin Newsom signed AB 5.

    When these services actually charge what it costs to provide them, like the e-scooters, or face the double whammy of having to raise prices to be profitable while contending with actually employing the people they're exploiting, it's going to be brutal. Many customers will be priced out of everything from food delivery to ride-hailing services. They'll be just like the services they "disrupted," only with a nicer app to show for it.


    Visual CSS Regression with Backstop JS

    In 2015, I wrote an article, Visual CSS Regression Testing 101 for Front End Developers, where I covered the two competing philosophies of visual regression testing: comparative vs. baseline. Since then, PhantomCSS has been sunsetted, as PhantomJS couldn't compete with headless Chrome, and the BBC's Wraith works but was never as useful as I'd have liked.

    What is Visual Regression Testing?

    There are other primers on the concept, but it's worth quickly covering visual regression testing. In the course of development, CSS/JS/templating changes can have unintended effects on your website or web app. Visual regression testing seeks to automate the laborious task of comparing visual elements to see if any unexpected changes have occurred. This is performed by running scripts in headless web browsers to render the webpage, capturing its renderings, and using an image-diff tool to compare the screenshots, flagging changed elements for review. Once reviewed, the latest screenshots are "approved" as the gold master and saved to compare against the next time you run the test.

    Now, four years later, Backstop.js has emerged, mixing (mostly) the best of both Wraith and PhantomCSS.

    Back when I first investigated visual regression, I spent time discussing baseline and comparative tools. Baseline visual regression tools, in the talk I attended, did complete screen renders, whereas comparative tools could query individual DOM elements. In hindsight, the distinction between baseline and comparative is somewhat moot, as comparative tools can do baseline checks simply by querying the entire body. That said, tools like Wraith that only do full-page renders can't select individual elements and thus are far more limited. At this point, I doubt either term gets much play, nor is the distinction needed, as people have naturally gravitated to tools that can query DOM elements.

    Backstop.js

    Backstop.js gets major points out of the gate for being easy to use. Just run the global npm installer, then navigate to your project directory and run backstop init. It'll create a boilerplate config ready for you to start writing tests. This is a serious upgrade, considering I once wrote a 12-step guide on how to install PhantomCSS.

    Running tests is also easy: run backstop test from the root directory, and Backstop will take care of the rest. Approving a batch of changes is just as easy: punch in backstop approve.
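
    For reference, the whole loop from a clean project looks roughly like this (a sketch, assuming a global install of the backstopjs package from npm):

    npm install -g backstopjs   # the global installer
    backstop init               # scaffold backstop.json and a backstop_data folder
    backstop test               # render and compare against the reference screenshots
    backstop approve            # bless the latest test run as the new reference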

    Next up is formatting: all the tests are written in JSON, which is easy to read and familiar. I've never been super into YAML, and I like JSON. Everyone likes JSON.

    Where Backstop shines is how quickly I went from never having written a test to querying a roster of visual elements found on our company website. Start by declaring a set of screen sizes; I created my own mobile, tablet, desktop, and large-desktop sizes.

    {
      "viewports": [
        {
          "label": "phone",
          "width": 320,
          "height": 480
        },
        {
          "label": "tablet",
          "width": 1024,
          "height": 768
        },
        {
          "label": "laptop",
          "width": 1280,
          "height": 800
        },
        {
          "label": "highdef",
          "width": 1920,
          "height": 1080
        }
      ]
    }

    My first tests were entire pages; then I quickly graduated to more advanced Backstop usage, testing our mobile menu. The mobile menu had a few considerations:

    • It must be clicked
    • It only makes sense to test it on a mobile resolution
    • There's a delay for the animation

    {
      "label": "Emerge Menu Open",
      "cookiePath": "backstop_data/engine_scripts/cookies.json",
      "url": "https://dev-site-url",
      "referenceUrl": "",
      "readyEvent": "",
      "readySelector": "",
      "delay": 20,
      "hideSelectors": [],
      "removeSelectors": [],
      "hoverSelector": "",
      "clickSelector": ".hamburger",
      "postInteractionWait": 1000,
      "selectors": ["header #site-navigation"],
      "selectorExpansion": true,
      "expect": 0,
      "misMatchThreshold" : 0.1,
      "requireSameDimensions": true,
      "viewports": [
        {
          "label": "phone",
          "width": 320,
          "height": 480
        }
      ]
    },

    And there you have it: my mobile navigation is being tested against JS breakage and CSS changes. I'm fairly impressed. There's even integration for running custom scripts. The only hiccups I've had are with AJAX content. I used removeSelectors to hack the unpredictable DOM elements out, which left reliable elements to test around the AJAX content, and for the AJAX content itself, I used the readySelector.
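
    For illustration, a scenario along these lines handles an AJAX-heavy page; the selectors here are hypothetical placeholders, not pulled from our actual site:

    {
      "label": "News Listing (AJAX)",
      "url": "https://dev-site-url",
      "readySelector": ".news-feed .card",
      "removeSelectors": [".ad-slot", ".related-posts"],
      "selectors": ["main .news-feed"],
      "misMatchThreshold": 0.1,
      "viewports": [
        {
          "label": "laptop",
          "width": 1280,
          "height": 800
        }
      ]
    },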

    Lastly, chaining events is a bit cumbersome, as you'll be coding up scenarios, but it's still much less overhead than the days of PhantomJS.

    Chaining Backstop to deploys

    The next step is to chain backstop test to deployments. The demo shows Backstop playing with Jenkins deployments. At my office, we use Bitbucket Pipelines, so it's a matter of translation.

    Git flow

    The git workflow is pretty straightforward with visual regression testing: ignore the test output folders, and track the gold masters. Backstop creates a new timestamped directory in backstop_data/bitmaps_test for each test run. Depending on the number of tests you run, it's easy to churn out hundreds of megabytes of images, so be prepared to have a garbage-collection method if you're running via a deployment setup that might require one.
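
    My .gitignore for this looks something like the sketch below (assuming Backstop's default directory names); the reference bitmaps stay tracked as the gold masters:

    # Backstop output: regenerated on every run, so don't commit it
    backstop_data/bitmaps_test/
    backstop_data/html_report/

    # backstop.json and backstop_data/bitmaps_reference stay in git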


    iOS needs better app organization

    A friend of mine switched to iOS after nearly a decade of Android usage. This spawned a lot of back and forth about iOS vs. Android. One criticism I cannot defend is iOS's icon organization and folders. In 2010, Apple introduced folders with iOS 4 (if you need a memory jog, here's what they looked like). The original design used a visual metaphor of a drawer sliding back to expose its contents, as we were in the midst of peak skeuomorphism. Touch interfaces were relatively new, and Apple had the monumental task of on-boarding droves of barely-digital-literate users, and the metaphor served that purpose. As a UX developer, I carry a lot of opinions about interfaces, so it shouldn't be surprising that I feel the need to vent from time to time. Here are several complaints I've harbored for years, combined with some slap-dash, non-pixel-perfect UI mockups.

    1) Waste of space on a broken Metaphor

    More than half the screen is burned on a blurred-out effect, presenting a minimal number of icons. Modern iPhones are massive compared to the era of the much smaller/more manageable/dare-I-say-superior iPhone 5 form factor, so there's no point to a 3x3 grid; it's annoying and silly. With increased storage capabilities comes more data: today's iPhones clock in at up to 512 GB of storage, capable of holding an amount of data familiar to desktop users, yet iOS hasn't grown to take on desktop levels of data or applications.

    Folders are a waste of space on iOS13

    2) Custom App icons

    Visually, the mini-grid isn't a bad choice, but it's dated and loses its poignancy beyond nine apps. Plus, at a glance, it doesn't visually 'jump out' among a mess of similar icons. It'd be easy for Apple to quickly denote a folder with a slightly different icon treatment. Here's my 10-minute mockup of what it could be like with a custom icon.

    The focus shouldn't be on my graphic design choices, as I did this fast and dirty. Instead, the takeaway is that folder icons could vary visually from the current app icons to make them distinctive.

    iOS should have custom folder icons

    3) Folders in Folders

    Next up is another gripe: folders within folders. Apple has done quite a bit to avoid hierarchical navigation in iOS, but it exists in the system preferences and now within the Files app. Merely transposing the visual interface from Files gives a sane approach to folders. Combined with custom folder icons, users could see where they are via the breadcrumbs.

    Folders in folders

    4) Vertical scrolling in folders?

    Vertical scrolling on the home screen has existed before with jailbreak tweaks like Infiniboard and Springfinity. Vertical scrolling within folders would help express the old drawer metaphor and ease app migration.

    5) Make search results meaningful

    Search on iOS never shows you where files are. See below.

    Folders in folders

    I can think of a few ways to alleviate this, such as list results showing the location to the right of the top app matches. I didn't bother to mock them up, as what's really important is the lack of context.

    6) Better App movement

    If you've ever had to organize an iPhone, you know the task is so tedious it can take hours if you have a fair number of apps. I've seen various suggestions, and honestly, at this point, I'd take any of them.

    7) It's time to loot macOS: Smart Folders

    iOS needs to grow up. The Files app is a nice start, although imperfect compared to iFile from the jailbreak world. Apple already has a brilliant solution it could port to iOS: let the OS auto-organize with Smart Folders. Smart Folders, for the unfamiliar, work by using predetermined search criteria. Apple could take it further and set Smart Folders on iOS to organize based on application type. Upon app purchase or reinstall, the user could select "Smart Folder," "Dock," or "custom folder" and stay ahead of organization. Brilliant, right?

    Bonus macOS -> iOS features

    • Loot macOS's columned view for files.
    • Allow for smaller grids and lists within folders.
    • Tap-and-hold on icons could offer a "Get Info" screen so you can see how much data an application and its associated folders are using.
    • In a perfect world, a tabbed interface to make dragging between locations easier.
    • A font manager.

    Pagespeed Insights is useful again thanks to Lighthouse

    On November 17th, 2017, I called utter and complete bullshit, as Google PageSpeed was giving glowing scores to pages like Wired.com and Newsweek.com. Both garnered higher scores than my hyper-minimalist blog, which has a whopping 2.9k of CSS and about 40k of JS against their megabytes of JS and images. It was so irrelevant that it caused me grief professionally, as clients would be unhappy with their scores despite their sites being fairly optimized.

    PageSpeed isn't perfect, but it is now what I'd consider fixed, and I've meant to write this article for some time. My homepage's score has gone up from 70 to 76. Individual articles, such as Google Page Speed lacks common sense, now score 90. Sanity has been restored, and I'm not just saying that because my numbers are better. Let's start with my complaints.

    • PageSpeed did not care about JS bloat, as long as it wasn't directly linked. If a library decided to append several megabytes of JavaScript, PageSpeed wouldn't even blink.
    • It would advise on how to optimize iframes even though the user has no control over such things
    • It did not care if you used post-JPG/PNG formats
    • It made no effort to measure total requests.
    • It made no effort to measure time-to-paint
    • It failed to recognize minified HTML due to a single line-break
    • It did not compare against any real-world dataset, making scores relative only to your website's previous scores

    Surprisingly, all my major gripes were resolved, to the point where I feel like a Google engineer took umbrage at my post (let me be clear: this almost certainly didn't happen). Lighthouse is oodles better (I've been using it since mid/late 2018), and it tries to leverage the Chrome User Experience Report for higher-tier pages. What's interesting is the philosophical change from the technology-bucketed approach, where CSS, JS, HTML, and server-side technologies existed in their own orbits, to one that clearly has standardized goals. This makes cross-site comparisons more sane. The benchmark metrics are as follows: First Contentful Paint, First Meaningful Paint, Speed Index, First CPU Idle, Time to Interactive, and Estimated Input Latency. This lends itself to a greater understanding of the stages of a web page's life-cycle. This is the most significant change, and I approve of it. The feedback is more meaningful as well. Below is an unorganized list of my observations.

    • If you are using WordPress, it'll suggest plugins to assist with certain tasks. Another meaningful change that I hadn't thought to rant about is DOM tree depth: Google now recommends less than 32 levels of depth.
    • It makes some executive calls, like recommending the font-display property (e.g., font-display: swap), which re-enables one of the banes of web dev, the Flash of Unstyled Text, part of the shorthand family "FOUT, FOIT, FOFT." Google prefers the FOUT.
    • Google has stepped away from suggesting minified HTML, likely because compressing HTML is far more important, as demonstrated here. You can still minify HTML to squeeze extra bytes away.
    • Lighthouse measures JS execution times, not just size.
    • It no longer suggests the wantonly silly declarative image sizes. Previously, PageSpeed wanted you to write out the pixel values to speed up render times. This was good advice in 2000, but tragically out of step with the responsive web.
    • Audits can pass even if they are not met 100%, such as minify CSS or JS, if the vast majority has been met.

    All in all, it's good to see Lighthouse.


    Thoughts on Apple Arcade

    Apple Arcade is everything it should be, solving the biggest problem the App Store has had: a vector for premium, high-quality games to be delivered without leaning on In-App Purchases. Not all IAPs are bad; a few titles have done them correctly. Time Locker has only a $3 purchase that's remotely required, and it does not have any consumables; the only other IAPs are optional characters. Polytopia unlocks "races" for $1 purchases, for a grand total of 9. The most you can spend on either game is roughly $10-15, which seems right for a high-quality mobile game.

    Most IAPs though, as almost everyone knows, are the detested loot boxes or in-game currency, and thus we've seen a race to the bottom. For years I lamented that there wasn't a classification for full-fledged games without IAPs. I wanted a premium game store where developers could charge $10-$25 and get a fair shake, but never did I consider a subscription gaming service. I don't play many mobile games, but when I do, I don't want to spend hours looking for titles that are pay-once models. At the gym, I tend to walk for 15 minutes as a warm-up, and during that time I play silly iPhone games. Tower defense games are a personal favorite, and I play them before committing to running a 5k and heading off to other activities.

    Apple Arcade is a damn good value as it stands today. I don't think I've seen any console or gaming platform launch with so much content. It's good enough that I worry about the rest of the App Store, as there's enough content for me to work through for quite some time. As an added bonus, Apple Arcade isn't limited to iOS or iPadOS; it's coming to tvOS and, most importantly, macOS. In one swoop, Apple has a platform that spans every compartment of gaming: mobile, tablet, console/TV, and desktop (PC). All its competitors (Steam, Microsoft, Google, Sony) are missing at least one of these buckets, and none have titles that can be easily ported between all formats. Depending on Apple's commitment to funding titles, this indeed could be a very big deal. I may eat my words later, but Apple Arcade is probably the most significant product Apple has conceived since the Apple Watch. At $5 a month, a year of it costs about as much as one PlayStation or Xbox game, or the same as a year of PSN.

    There are still asterisks to be resolved: what does the future look like? How many games can we expect? Will we ever see ports or non-exclusive content on Apple Arcade? Apple Arcade isn't going to be the end-all-be-all of gaming, but out of the gate, it's competition for Sony's very successful and very well-executed PSN.

    So far, I haven't mentioned one giant of gaming, arguably the most loved of them all. Apple Arcade probably won't be causing much damage to the Sony PlayStation, Microsoft Xbox, or Steam platforms, but its scope of more casual, family-friendly titles and whimsical nature is certainly reminiscent of said company, and I wouldn't be surprised if it cost them some Switch sales, with parents giving kids hand-me-down devices or simply getting them an iPod Touch. Apple is already the fourth-largest gaming company this year, and previously it hadn't even tried.

    I doubt we'll see any AAA titles ported to Apple Arcade, be it popular sports franchises like Madden, NBA Live, NBA 2K, or FIFA, or any games based on professional sports leagues, due to licensing. I also wouldn't expect ports of classics like Sonic the Hedgehog to be folded in. (I wouldn't rule it out either, as Sonic and Frogger have both made appearances.)

    Lastly, the service isn't perfect. There isn't really a Steam/PSN/Xbox Live-style system for friends lists and gaming. There's also an extreme lack of titles with much depth; many of the games I've tried are nugget-sized experiences. I personally love Cricket Through the Ages and really liked Assemble with Care, both criminally short, though some of the others are less so. There are only so many single-button games I want to play. Some of the more hyped titles, like Sayonara Wild Hearts, are beautifully shallow, feel like a demo (Red Scare), or are cheap knockoffs (Punch Planet). As a fan of Oceanhorn, I look forward to diving into Oceanhorn 2. I'll chalk it up to launch titles; they're rarely the pinnacle of a console, sans a few rare outliers, mostly from Nintendo (Tetris, Super Mario World, Pilot Wings, Super Mario 64). That said, with my relaxed interest in gaming, perhaps more nugget-sized experiences are exactly what I'm after if delivered right.

    I'll be interested to see if I agree with myself two years from now.


    Android phones now have 12 GB of RAM; Apple is still shipping laptops with 8

    Right now, there are multiple phones with 12 GB of RAM on the market. They run the gamut of prices and specs, but none are more expensive than a 13-inch MacBook Pro in a factory config, save perhaps the Galaxy Fold, a curiosity.

    • Samsung Galaxy Fold $???
    • Samsung Galaxy S10 Plus $1,600
    • OnePlus 7 Pro $1299
    • Asus ROG Phone 2 €899
    • Samsung Galaxy Note 10 Plus $929
    • Xiaomi Black Shark 2 or Black Shark 2 Pro $879
    • Lenovo Z6 Pro $849
    • Xiaomi Mi 9 Explorer Edition $800
    • Nubia Red Magic 3 $700
    • Vivo iQOO $650

    I didn't look up the MSRPs but rather what seemed to be legitimate prices online, to give an honest representation. That's likely an incomplete list, but there are at minimum 10 Android models shipping with more RAM than a $1299 MacBook Pro, and even the $1999 model of the 13-inch MacBook Pro ships with 8 GB of RAM; on all 13-inch models, more must be custom ordered. There's been a bit of stagnation for laptops and RAM, partially due to chipsets, partially due to modern OSes using much more efficient RAM management via compression aided by SSD scratch disks, and lastly due to increased power draw. The last feels increasingly irrelevant as phones have caught up to laptops (and Apple found the power budget for the foolish TouchBar). It was only in July of 2018 that Apple addressed the lack of 32 GB RAM options for the MacBook Pro lineup.

    Just as a barometer of applications: Adobe Photoshop and Lightroom recommend 8 GB of RAM or more, Illustrator recommends 16 GB with 4 GB as the minimum, and After Effects recommends 16 GB with 8 GB as the minimum. Notably, the assumption is that you would not be running multiple professional applications at once, when in reality a workflow with, say, After Effects can routinely involve any number of 2D editing applications, and even 3D apps, all juggling resources. Then there's development, where Docker, VMs/simulators, and horrid JS memory vacuums exist. I'm not even going to touch professional audio. GPUs crossed the 8 GB barrier some time ago, meaning you could connect a 16 GB AMD Radeon VII to a $1999 MacBook Pro with 8 GB of RAM. Even for general web surfing, it's easy to saturate 8 GB of RAM with a browser with poor memory management (Chrome).

    None of Apple's Pro laptops should ship with less than 16 GB and the MacBook Air should have a factory model that ships with 16 GB of RAM. This would be moot if we had user-serviceable RAM upgrades. Laptops needn't be held to the modular standards of desktops, but they should be for basic specs.

    Also worth noting: all iPhone 11s have 4 GB of RAM. 4 GB is good for right now but seems a bit stingy until you consider the revelation that the iPhone 11 may have 2 GB of RAM dedicated to the camera. iOS's memory management works mostly due to Apple's stringent background task management. My guess is the next iteration of the iPhone will probably move to 6 GB of application RAM and 2 GB for the camera.


    Under-the-hood blog updates

    Over the break, I went on a binge of minor changes to this blog.

    • The privacy policy and contact page now exist on their own pages; Google supposedly prefers this. Before, both items existed on the about page.
    • This blog now supports JSON Feed. I looked at Apple News but screwed up the process for importing my RSS feed; I may revisit that later, but with the low traffic most of this blog sees, it's not really worth the effort.
    • Improved the JS. To reduce requests, I've concatenated four JS files into one. I upgraded jQuery 1.12 to 3.x, as it is faster and smaller, and it's no longer hosted on a CDN.
    • Fixed the canonical URL declaration in the head.
    • Removed a few errant CSS classes; this site now serves an absurdly low 2.9k of CSS, down from roughly 3k.

    Sometimes I think the lack of visual flourish is mistaken for a lack of design but I like minimalism.


    How to import Feedly feeds into NetNewsWire

    NetNewsWire 5.0 was just released as open source. Ever since Google Reader shut down, I've been using Feedly as an aggregator and Reeder for iOS. Reeder makes it very easy to import Feedly feeds, but it isn't as straightforward in NetNewsWire. Fortunately, Feedly and NetNewsWire both support OPML (Outline Processor Markup Language) for importing/exporting feeds. Feedly buries this, so here's a quick step-by-step to get up and running with NetNewsWire and Feedly.

    Steps

    1. Sign in to Feedly and click the gear.

      Feedly Gear Icon

    2. From the settings, next to the import OPML option, click the export button.

      Feedly Export OPML

    3. From NetNewsWire, select File -> Import Subscriptions and import your OPML file

      NetNewsWire import OPML

    That's it! Enjoy. I highly recommend Reeder for iOS.


    RSS feed is fixed

    I'm pretty sure the URLs in the RSS feed have been broken since the inception of this blog. I just noticed while writing about NetNewsWire. Now they are fixed. You can add the RSS feed here to your favorite RSS reader.

    Expect a slow trickle of tiny improvements to this blog, like the printable blog posts.


    Stupid Scary or Scary Stupid

    This isn't a blog about politics or even my opinions, but I'm going to go on the record and say nuking hurricanes is a bad idea. This seems like a silly thing to say, but here we are, having a national discussion about it and why it is a bad idea. It's worth noting that our sitting president, according to Axios, suggested multiple times nuking hurricanes to stop them from hitting the U.S.

    A lot of people are clowning this idea (because it's patently stupid), but it's not entirely wrong, rather a problem of scale. We simply do not have enough nuclear weapons on the planet to end all hurricanes. If we truly want to nuke our way out of hurricanes, we'll have to invest a lot more in nuclear weapons. No more Earth = no more hurricanes. Checkmate, hurricanes.

    Jokes aside, Axios may not be a household name, nor even my go-to source for journalism, and Trump denies it, but it is believable. Consider that this is fresh on the heels of Trump asserting, "I could win that war in a week. I just don’t want to kill 10 million people". Then there are the long-standing reports that Trump asked multiple times why we can't use nukes, and that he would use nuclear weapons in response to a terrorist attack by ISIS. Even if the latest is "fake news," the other accusations are damning enough.

    "The biggest problem we have is nuclear ... having some maniac, having some madman go out and get a nuclear weapon." - Donald Trump, 2016

    I couldn't agree more. This is clearly a person who is unfit to be given responsibility for the most destructive weapon humankind has created. As the great political journal of our age, TeenVogue*, wrote, "Yes, Trump could instigate a nuclear war without anyone stopping him."

    We are all at the mercy of a tweet.

    *Not a satirical comment.


    Git Hotfix workflow for Pantheon.io

    We've all been there: you have changes that haven't been QAed, but there's a hotfix that needs to go out yesterday. Pantheon is a great host, but it has one major gotcha: you can't switch code branches. The way Pantheon works, the Pantheon remote git repository deploys one, and only one, branch. There's no way to switch branches, and code can only be promoted from dev -> test -> live. This is problematic, especially if you're coming from deployment management utilities that let you switch branches, or from a platform-as-a-service like Heroku.

    Pantheon's recommended workflow

    Image Credit: Pantheon, Use the Pantheon Workflow

    Pantheon does offer Multidev, which essentially creates a separate branch for testing (and which can be promoted to the main chain), but it still doesn't fix the hotfix issue.

    Pantheon Hotfix Flow

    1. Create a local branch and reset to the last commit that was made live (this is a pain as Pantheon doesn't show last commit git hashes)
    2. Make changes locally. Commit the changes to your new branch.
    3. git push -f Pantheon YOURBRANCHHERE:master
    4. Promote from Dev -> Test -> Live from the control panel
    5. Make sure your hotfix is merged into your local master (and your origin)
    6. Reset Pantheon's Dev environment to the master branch: git push -f Pantheon master:master

    Despite the warnings that you should never force-push, this is the cleanest method: you push up your desired hotfix and leave it on the Live environment until your normal deploy chain overwrites it.
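
    Put together, the sequence looks roughly like this. It's a sketch: LIVE_COMMIT_HASH and hotfix/payment-bug are placeholders, and "Pantheon" is assumed to be the name of your Pantheon git remote:

    # branch off the last commit that was deployed to Live
    git checkout -b hotfix/payment-bug LIVE_COMMIT_HASH

    # ...make your fix, then commit it...
    git commit -am "Hotfix: payment bug"

    # force the hotfix branch onto Pantheon's master (the Dev environment)
    git push -f Pantheon hotfix/payment-bug:master

    # promote Dev -> Test -> Live from the control panel, then merge the
    # hotfix into your real master and reset Pantheon's Dev back to it
    git checkout master
    git merge hotfix/payment-bug
    git push -f Pantheon master:master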


    Printable blog posts

    Using the magic of @media print, I've included a mild update to this blog to be more printer/PDF friendly (a rough sketch of the rules follows the list below), mostly for a single post, The Definitive Mac Pro Upgrade Guide, which drives the majority of my traffic.

    • Images are now capped to 75% of the page width as my images are only double density, optimized for screens, not printers.
    • The main content now expands to the full page width, so users can set the margins within their print prefs.
    • The main body copy has been reduced from 16px to 15px, and the line height from 2em to 1.75em.
    • I've included a link back to the original blog post at the top of each page, visible when printed so PDF users can easily return to the blog post in a web browser.
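
    For the curious, the print rules look something like the sketch below; the selectors are stand-ins for this blog's actual markup, not copied from it:

    @media print {
      /* cap images at 75% of the page width; they're optimized for screens */
      img { max-width: 75%; }

      /* let the main content span the full printable width */
      main { width: 100%; max-width: none; }

      /* slightly smaller body copy and tighter line height */
      body { font-size: 15px; line-height: 1.75em; }

      /* show the link back to the original post when printed */
      .print-permalink { display: block; }
    }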

    Happy printing, you weird PDF-loving bastards!


    Time Machine: An Error Occurred Restoring from Backup + Fix

    My 2017 MacBook Pro stopped charging and refused to accept power from any power supply on any port. I was restoring my computer's backup to my previous laptop, a 2015 MacBook Pro, and encountered the error above. I tried using the previous day's backup, but that didn't work either.

    I received the following message when booting from a Time Machine drive:

    An Error Occurred Restoring from Backup

    An Error Occurred Restoring from Backup

    To Try Restoring from a different backup, click choose other Backup.

    To reinstall macOS, click install macOS. During the install you can choose to restore your information from a Time Machine Backup.

    To Boot from an existing macOS installation...

    I've seen some high-tier fixes, like harryfear.co.uk's fix, but there's an easier route, and the clue is in the error message.

    Reading the message more carefully, I booted off the Recovery partition and then reinstalled macOS. Once that completed, in Migration Assistant I selected the option to transfer information over from my Time Machine drive. This isn't a true 1:1 restore; I noticed some things missing, such as my /etc Apache2 modifications, but some of the geek stuff, like Homebrew and its many CLI applications (Heroku), made the cut. Beyond renewing SSH keys and re-running Docker builds, my computer was good to go. Standard Mac applications had no issues.

    Summary

    If a restore fails, fear not. A direct restore is faster, but you will not lose your important files going this route.

    1. Boot off a recovery partition, reinstall macOS
    2. At the end of the installation, you will see the Migration Assistant. Select the option to transfer files from another computer/device/Time Machine backup, then select your Time Machine drive

    I suspect for most users, myself included, the harryfear fix is overshooting the problem and Apple's solution is "good enough".


    Chrome does not support media queries on video source tags + a workaround

    Sometimes you encounter something that'll surprise you, and yesterday was one of those days: Chrome does not support inline media queries on the source tag within a video tag (you can test it here). Worse, plain media queries will not stop multiple videos from loading, which effectively doubles your data, so it requires a JS solution. CSS-Tricks has an article from 2012 using jQuery, but there's no follow-up and I wasn't that enthralled. I saw thenewcode's Make HTML5 Video Adaptive With Inline Media Queries, but it fails to mention Chrome's refusal to support them.

    Javascript to the rescue

    First, I wanted to prevent any requests from being made, so I created an empty video tag with my two videos as data attributes. Easy, right? Now that all major browsers support MPEG-4, I could safely assume the only legacy concerns are IE and Safari, as those browsers are tied to OS updates whereas Chrome and Firefox are not, thus very few users would not be using a recent browser. Safari and IE both support MPEG-4, so there's not a good reason for me to support WebM.

    <video
      preload="auto" autoplay="" loop="" muted="" playsinline=""
      data-desktop-vid="https://iconaircraft.s3.amazonaws.com/ICON_Web+4.0_Loop_16x9_DRAFT190723_26sec+3700.mp4"
      data-mobile-vid="https://iconaircraft.s3.amazonaws.com/ICON_Web+4.0_Loop_1x1_DRAFT190723_26sec-mobile.mp4"
      >
    </video>
      

    I didn't want to rely on any framework: jQuery's document ready meant the JS wouldn't fire until the rest of the page loaded, and ES6 meant leaving out old browsers. Thus, I'm limited to ES5.

    First, I needed to get all the videos on the page. This creates a variable that contains an array-like NodeList of video elements, even if only one is found on the entire page.

    //get all vids
    var video = document.querySelectorAll('video');

    Next, I needed to create a source for the video tag. The source tag needs a src and a type. After that, we append the newly created DOM element to the video element. This function doesn't need to know how many videos are on the page or what the screen size is; it just adds a source to whatever video tag it's given.

    //add source to video tag
    function addSourceToVideo(element, src) {
        var source = document.createElement('source');
        source.src = src;
        source.type = 'video/mp4';
        element.appendChild(source);
    }

    Next is where the logic happens: depending on whether the screen size is over a predetermined value, I load either the desktop or the mobile version. Since I have two data attributes to work off of, the screen size determines which one gets used. If the screen is above a certain width, it grabs the desktop version instead of the mobile version and feeds it to addSourceToVideo. Easy enough, right?

    //determine screen size and select mobile or desktop vid
    function whichSizeVideo(element, src) {
    	var windowWidth = window.innerWidth ? window.innerWidth : document.documentElement.clientWidth; // fallback without jQuery
    	if (windowWidth > 800 ) {
    		addSourceToVideo( element, src.dataset.desktopVid);
    	} else {
    		addSourceToVideo(element, src.dataset.mobileVid);
    	}
    }

    Now that we've written code to add sources to empty video tags, it needs to init and be able to handle multiple videos. Remember our NodeList of videos? It's time to use it. There's no point in running the code if there aren't any videos on the page, so we check whether the video variable contains anything. If it does, we loop over the list and hand off each individual video, in case we have multiple videos on our page.

    //init only if page has videos
    function videoSize() {
      if (video.length) { //querySelectorAll returns an empty NodeList when nothing matches
      	video.forEach(function(element, index) {
      			whichSizeVideo(
      				element, //element
      				element  //src locations
      			);
      	});
      }
    }
    videoSize();

    Notably, you could tie the above code to a resize event, in case a user resizes the window, and have it trigger videoSize; I chose not to for simplicity. You can see the working version of the above code on CodePen. I didn't embed it in this post so those using a slower connection aren't hit with 30 MB of video data. Place this script inline or as a separate file below your videos, but before the rest of your JS payload, for maximum performance.
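
    If you did want the resize behavior, a minimal sketch (reusing the video variable and videoSize function from above) could debounce the event, clear out the old sources, and re-run the check; the 250 ms delay is an arbitrary choice:

    //optional: re-evaluate the source on window resize (debounced)
    var resizeTimer;
    window.addEventListener('resize', function () {
      clearTimeout(resizeTimer);
      resizeTimer = setTimeout(function () {
        video.forEach(function (element) {
          //remove the existing source tags so videoSize() can add the right one back
          while (element.firstChild) {
            element.removeChild(element.firstChild);
          }
          element.load();
        });
        videoSize();
      }, 250);
    });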


    On the realm of personal branding

    I'm always drawn in with complete morbid fascination by "influencer" culture, as I've largely avoided social media. I have exactly one account among the major social networks, Facebook (arguably the worst of them all), but I deleted the app off my phone about 4 years ago. I'm including YouTube in this, as it should be considered a social network, and Reddit, which I avoid. I'm convinced that the vast majority of social media is an anecdoche.

    So when I read about influencers asking for free meals at restaurants, I experience a clash of contradictory thoughts simultaneously: "Who has the gall to cold-call a restaurant for a free meal because they have 50,000 Instagram followers?", "I'm surprised this happens," "I am not surprised in the least bit," "this scam verges on genius," "major brands give famous people free shit, so why shouldn't small businesses give those with a tiny soapbox free meals for cheap advertising?", and "everything about this is idiotic," followed by a general self-satisfied feeling of being above it all, despite my immediate desire to share and discuss it with my friends.

    The over-marketization of all facets of life has made even the most mundane activity a transactional exchange that can be sold, thanks to social media. It's all viewed through the nihilistic worldview that everyone has a "personal brand." Any experience, even a wedding proposal, is a marketing opportunity, and people flock to toxic lakes for the photos. The irony is that influence is measured in internet points that may or may not mean a damn thing; an influencer with 2 million followers couldn't sell 36 t-shirts.

    If there's one thing that is certainly true, it's further evidence of the enshittening.