“I never understood wind, I know windmills very much, I have studied it better than anybody. I know it is very expensive. They are made in China and Germany mostly, very few made here, almost none, but they are manufactured, tremendous — if you are into this — tremendous fumes and gases are spewing into the atmosphere. You know we have a world, right?”
- Donald Trump, Dec. 21, 2019
I noticed there aren't any clear instructions on setting up Operator Mono in WebStorm or any of the JetBrains IntelliJ IDEs. Changing the font doesn't work quite like one would expect (notably the italics), but you can get it working with a little TLC. Out of the box, my copy wasn't displaying the alternate italic characters and needed a fallback font for some glyphs. WebStorm, like other modern IDEs, supports characters not regularly found in many fonts, called orthographic ligatures. These are less common glyphs that save space and are more distinguishable. For examples of ligatures, see Ligatures & Coding by Andreas Larsen on Medium.com. Adding the ligatures also seems to fix Operator Mono's italic character set, although it could be a difference between Operator Mono and Operator Mono SSm (ScreenSmart).
- Download the latest release from the operator-mono-lig project and decompress it
- You'll need to copy Operator Mono into the `original` directory located in the newly created folder, operator-mono-lig.
- Next, open a terminal window and install fonttools: `pip3 install fonttools`
- Navigate to your newly created operator-mono-lig folder in the terminal and run `./build.sh` from the root directory.
- Your newly created fonts will be generated inside the operator-mono-lig folder.
- Next, install your fonts. The easiest way is to select all the fonts in Finder, double-click, then click install. They will install correctly and appear as a new font family, Operator Mono Sm Lit or something similar.
- Go into WebStorm, open Preferences, then Editor, then Font, and select your newly created font.
To get Webstorm to behave like Atom or VScode, you'll need to manually edit the code styles to use the italics. It only takes a few minutes. I recommend viewing code within Atom as a reference point using Operator Mono as it makes very good use of Mono's italics. I also have guides for Setting up Operator Mono for Atom and Setting up Operator Mono in Coda.
Troubleshooting: I noticed that the original Operator Mono font I had didn't work, so I redownloaded Operator Mono SSm, and it worked. (SSm is the limited weight set that removes several weights like Thin and Ultra.) File names cannot have any spaces. I'm also using the Material UI theme for WebStorm, which may or may not make a difference.
I try not to recommend hardware based on my preferences in the Definitive Mac Pro upgrade guide, but here are my opinions for anyone looking for a cheat sheet on what to buy.
Mac Pro 4,1/5,1 configurations vary quite a bit, but there are really only two CPUs to consider due to pricing. The x5680 is cheap; even with a dual-CPU Mac Pro, it's roughly $70 to purchase two CPUs, making it almost the same price as a single x5690. The x5690 is the best CPU a Mac Pro 4,1/5,1 can house.
- Better: x5680 - 6-Core 12M Cache, 3.33 GHz $35 (used)
- Best: x5690 - 6-Core 12M Cache, 3.46 GHz $60 (used)
GPUs on the Mac Pro are limited to the AMD sphere for 10.14 Mojave and 10.15 Catalina. RX 580s often float around for cheap, and they're a good entry-level card. The 560s and 570s are even cheaper, but it's hard to beat the RX 580's price-to-performance. The Vega 56 is probably the best overall value, as it's performant, can be flashed to a Vega 64, and lands just shy of the Vega 64 in performance after flashing. The Radeon VII is the king and hard to come by. At $700, it's expensive but much mightier in compute benchmarks than the 5700 XT, making it a much more well-rounded card for video editing and other GPU-accelerated non-gaming tasks.
- Good: RX580 $100 (used) or $185 (new)
- Better: Vega 56 $225 (used) $300 (new)
- Best: Radeon VII $600 (used) $699 (New) (Mojave and above only)
Soon the Radeon 5700 XT will be supported on the Mac, occupying a space between the Vega 56 and Radeon VII. The Radeon 5700 XT is great for gaming, but its compute scores are low; it performs much lower than the Vega 64 in Blackmagic's DaVinci Resolve.
There are a lot of storage options for the Mac Pro. The Mac Pro doesn't support bifurcation, so inexpensive dual-NVMe PCIe cards are a no-go. The SanDisk Ultra 3D hits the price-to-performance intersection nicely, besting a lot of the really cheap SSDs like Kingston, although the Samsung 860 is a better SATA drive. Moving the drive to a SATA III PCIe card doubles the maximum transfer speed over the Mac Pro's internal SATA II bays. The HP EX950 is another drive splitting the middle between price and performance, within spitting distance of the Samsung 970 Evo. NVMe requires a firmware flash for 4,1/5,1s and much more legwork for 3,1s.
- Good: SanDisk Ultra 3D 500 GB $65 + optional SYBA SY-PEX40039 2 Port SATA III PCI-e 2.0 $15
- Better: HP EX950 1 TB - $130 + PCIe card $30
- Best: IO Crest IO-PCE2824 (dual NVMe) $200 and Samsung 970 EVO 1TB SSD 2x $170 or Samsung 970 Pro 512 GB $150 x 2
Not a lot to say: for the 4,1/5,1, buy 1333 MHz and go for 16 GB DIMMs if you can afford it.
- (Mac Pro 3.1) 800 MHz DDR2, FB-DIMMs
- Better: (Mac Pro 4.1/5.1) 1333 MHz DDR3 $17 per 8 GB DIMM
- Best: (Mac Pro 4.1/5.1) 1333 MHz DDR3 $28.50 per 16 GB DIMM
There's no reason to mess around with the cheaper solutions; Sonnet USB cards are problem-free and do not require external power. The Allegro Pro and Allegro 3.1c are the same card, featuring two USB 3.1 controllers at 10 Gbps (1.25 GB/s) each across 4 ports; the only difference is the connectors. The regular Allegro has one USB 3.0 controller for 5 Gbps (625 MB/s) of total bandwidth.
- Better: Sonnet Allegro USB 3.0 $50
- Best: Sonnet Allegro Pro USB 3.1 Type A $129 or Sonnet Allegro 4-Port USB 3.1c
Wireless is slightly annoying, but there are three options: use a PCIe card, use a mini-PCIe card, or buy the parts online separately. Honestly, it's best to just read the very long upgrade guide.
This example is not a hypothetical. The meal-kit company Blue Apron revealed before its public offering that the company was spending about $460 to recruit each new member, despite making less than $400 per customer. From afar, the company looked like a powerhouse. But from a unit-economics standpoint—that is, by looking at the difference between customer value and customer cost—Blue Apron wasn’t a “company” so much as a dual-subsidy stream: first, sponsoring cooks by refusing to raise prices on ingredients to a break-even level; and second, by enriching podcast producers. Little surprise, then, that since Blue Apron went public, the firm’s valuation has crashed by more than 95 percent. - Derek Thompson, The Millennial Urban Lifestyle Is About to Get More Expensive, The Atlantic
I don't usually do much news commentary but I've been skeptical of the gig-economy and mommy-services for ages.
The bigger question for me has been: how did Silicon Valley start-ups manage to fleece investors for this long? The example that jumps to mind is MoviePass, which was hemorrhaging money so fast that it went as far as changing user passwords to keep people from using its service. Many of these services existed as a "hack" via legal shenanigans, and labor laws are catching up now that Gavin Newsom has signed AB 5.
When these services actually charge what it costs to use them, like the eScooters, or face the double-whammy of having to raise prices and contend with actually employing the people they're exploiting in order to be profitable, it's going to be brutal. Many customers will be priced out of everything from food delivery to ride-hailing. They'll be just like the services they "disrupted," only with a nicer app to show for it.
In 2015, I wrote an article, Visual CSS Regression Testing 101 for Front End Developers, where I covered the two competing philosophies of visual regression testing: comparative vs. baseline. Since then, PhantomCSS has been sunsetted, as PhantomJS couldn't compete with headless Chrome, and BBC's Wraith works but was never as useful as I'd have liked.
What is Visual Regression Testing?
There are other primers on the concept, but it's worth quickly covering visual regression testing. In the course of development, CSS/JS/templating changes can have unintended effects on your website or web app. Visual regression testing seeks to automate the laborious task of comparing visual elements to see if any unexpected changes have occurred. This is performed by running scripts with headless web browsers to render the webpage, capturing its renderings, and using a visual diff tool to compare the screenshots, flagging changed elements for review. Once approved, the latest changes become the gold master and are saved to compare against the next time you run the test.
Now, four years later, Backstop.JS emerged, mixing (mostly) the best of both Wraith and PhantomCSS.
Back when I first investigated visual regression, I spent time discussing baseline and comparative tools. Baseline visual regression tools, in the talk I attended, were complete screen renders, whereas comparative tools could query individual DOM elements. In hindsight, the distinction between baseline and comparative is somewhat moot, as comparative tools can do baseline checks since they're able to query the screen, be it an individual element or the entire `body`. That said, tools like Wraith that only do full-page screen renders can't select individual elements and thus are far more limited. At this point, I doubt either term gets much play, nor does the distinction matter, as people have naturally gravitated to tools that can query DOM elements.
BackstopJS gets major points out of the gate for being easy to use. Just run the global npm installer, then navigate to your project directory and run `backstop init`. It'll create a boilerplate template ready for you to start writing tests. This is a serious upgrade, considering I once wrote a 12-step guide on how to install PhantomCSS.
Running tests is also easy: run `backstop test` from the root directory and Backstop will take care of the rest. Approving a batch of changes is easy too; just punch in `backstop approve`.
Next up is formatting: all the tests are written in JSON, which is easy to read and familiar. I've never been super into YAML, and I like JSON. Everyone likes JSON.
Where Backstop shines is how quickly I went from never having written a test to querying a roster of visual elements on our company website. Start by declaring a set of screen sizes; I created my own mobile, tablet, desktop, and large desktop sizes.
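Here's a rough sketch of what the viewports portion of a backstop.json could look like; the labels mirror the sizes mentioned above, but the exact pixel dimensions are my own placeholders, not the ones I actually used.

```json
{
  "viewports": [
    { "label": "mobile", "width": 375, "height": 667 },
    { "label": "tablet", "width": 768, "height": 1024 },
    { "label": "desktop", "width": 1280, "height": 800 },
    { "label": "large desktop", "width": 1920, "height": 1080 }
  ]
}
```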
My first tests were entire pages, then I quickly graduated to advanced Backstop, testing our mobile menu. The mobile menu had a few considerations (a sketch of the resulting scenario follows the list):
- It must be clicked
- It only makes sense to test it on a mobile resolution
- There's a delay for the animation
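A sketch of what that scenario might look like in backstop.json; the URL, selectors, and wait time are placeholders, but clickSelector, postInteractionWait, selectors, and per-scenario viewports are standard BackstopJS options.

```json
{
  "label": "Mobile menu open",
  "url": "https://example.com/",
  "clickSelector": ".menu-toggle",
  "postInteractionWait": 500,
  "selectors": [".mobile-menu"],
  "viewports": [{ "label": "mobile", "width": 375, "height": 667 }]
}
```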
And there you have it; my mobile navigation is being tested against JS breakage and CSS changes. I'm fairly impressed. There's even integration for running custom scripts. The only hiccups I've had are with AJAX content. I used remove element to hack out DOM elements, which created reliable elements to test around the AJAX content, and for the AJAX content itself, I used the readySelector.
Lastly, chaining events is a bit cumbersome as you'll be coding up scenarios, but it's still much less overhead than the days of PhantomJS.
Chaining Backstop to deploys
The next step is to chain `backstop test` to deployments. The demo shows Backstop working with Jenkins deployments. At my office, we use Bitbucket Pipelines, so it's a matter of translation.
The git workflow is pretty straightforward with visual regression testing: ignore the test folders and track the gold masters. Backstop creates a new timestamped directory in `backstop_data/bitmaps_test` for each test run. Depending on the number of tests you run, it's easy to churn out hundreds of megabytes of images, so be prepared to have a garbage-collection method if you're running via a deployment process that might require one.
Updating the firmware on a Mac Pro isn't difficult, but it is possible to "miss" firmware upgrades. This guide is for anyone looking to get to the latest (and most likely last) firmware released for the Mac Pro 5,1 without having to install Mojave 10.14.x, whether you've already installed Mojave or are planning to. On my first try, my firmware was stuck at an older revision even when running Mojave 10.14.6. Updating the firmware adds key functionality to the Mac Pro 5,1s, most notably native NVMe M.2 boot support. To learn more about firmware and the Mac Pro 5,1s, see the Firmware Upgrades section of my Mac Pro Upgrade Guide.
Step 0: Remove unsupported GPUs
The biggest change for macOS Mojave is the deprecation of OpenGL and OpenCL. OpenGL has been a thorn in Apple's side for quite some time, as it's been nearly dead for years. Vulkan, the OpenGL successor, wasn't quite ready for primetime when Apple originally created Metal for iOS, and Apple decided to port Metal to macOS instead. Despite the annoyance of having to meet the requirements, it was a necessary evil: Mojave will not install if you have a GPU without Metal support.
Note: some users are reporting they had to remove all PCIe cards sans their storage controller (SATA card) and GPU to install the firmware update. I did not. If you encounter issues, try removing additional PCIe cards.
Step 1: Have a 10.13 drive
Unfortunately, this is the biggest pain if you've already updated. You'll need a separate volume to boot into 10.13. Amazon and Newegg each have 120 GB SSDs for under $20 USD if you need a temporary drive to install macOS 10.13 on (the upside is you can buy a USB case and turn it into a very fast USB 3.0 drive afterward, or return it). You can grab old versions of macOS using DosDude1's installer if you can't access them otherwise. If you have no intention of upgrading to Mojave, or already have it installed, don't worry; we won't be installing Mojave.
Step 2: Boot 10.13
The next step is pretty straightforward: boot into your install of macOS 10.13 if you haven't already.
Step 3: Download 10.14.6 Combined installer
Fortunately, firmware flashing does not require updating in a particular order; I jumped from an older firmware straight to the latest without any problems. There are several avenues for this, including the Mac App Store, but when I used the Mac App Store route, I didn't get the combined OS installer (the Mojave installer plus all the updates to Mojave). The easiest way to obtain the final combined update for Mojave is to use DosDude1's installer: download the OS through DosDude1's patcher while booted into 10.13, even though we have supported hardware.
Note: You do not need to use the DosDude1 installer, as you can grab the update via the App Store or other sources, but I found this easy. Apparently this link was posted on MacRumors, and a few posters didn't read the full instructions and suggested that I was advocating using DosDude1's patcher on the OS. I am not. The Mac Pro 4,1/5,1 does not need DosDude1, so do not run the patcher on Mojave.
- Go to DOSdude1 Mojave patcher and download it
Launch the patcher.
Depending on your security settings, your Mac may warn that the app is from an unverified developer. Go to System Preferences, Security & Privacy (General), and allow the app to open.
You'll be bugged one last time.
The patcher should warn you that you are on supported hardware.
This is fine; ignore the message. Within the patcher, select Download Mojave from the Tools menu.
Step 4: Launch the installer and click shut down
The installer should bring up a message about firmware and a shutdown prompt. This will not start the Mojave install, only the firmware update.
Step 5: Boot the Mac
Using the instructions in the previous image, press and hold the power button until it blinks. If you do not have an EFI-enabled GPU (see more about EFI in my Mac Pro Upgrade Guide), you will not see any video output.
I trimmed down the video, as it took about 15 seconds of holding before the button flashed. After the button flashed, the internal speaker emitted a long lo-fi "boop" sound.
Step 6: Verify
Go to About This Mac and click System Report. On the first screen, look for the "Boot ROM" entry; this should list your firmware version. From here, you can continue using 10.13.6, upgrade, or boot to your 10.14 volume.
The latest firmware works with any version of macOS your Mac Pro supports.
Updated: November 13th, 2019 MacRumors feedback
Updated: November 4th, 2019 based on Feedback from Mac Pro Users user group on Facebook.
A friend of mine switched to iOS after nearly a decade of Android usage, which spawned a lot of back and forth about iOS vs. Android. One criticism I cannot defend is iOS's icon organization and folders. In 2010, Apple created folders with iOS 4 (if you need a memory jog, here's what they looked like). The original design used a visual metaphor of a drawer sliding back to expose its contents, as we were in the midst of peak skeuomorphism. Touch interfaces were relatively new, and Apple had the monumental task of on-boarding droves of barely-digital-literate users, so the design served that purpose. As a UX developer, I carry a lot of opinions about interfaces, so it shouldn't be surprising that I feel the need to vent from time to time. Here are several complaints I've harbored for years, combined with some slap-dash, non-pixel-perfect UI mockups.
1) Waste of space on a broken Metaphor
More than half the screen is burned on a blurred-out effect, presenting a minimal number of icons. Modern iPhones are massive compared to the era of the much smaller/manageable/dare-I-say-superior iPhone 5 form factor. There's no point to a 3x3 grid; it's annoying and silly. With increased storage capabilities comes more data: today's iPhones can clock in at 512 GB of storage, capable of holding amounts of data familiar to desktop users, yet iOS hasn't grown to take on desktop levels of data or applications.
2) Custom App icons
Visually, the mini-grid isn't a bad choice, but it's dated and loses its poignancy beyond nine apps. Plus, at a glance, it doesn't visually "jump out" among a mess of similar icons. It'd be easy for Apple to quickly denote a folder with a slightly different icon treatment. Here's my 10-minute mockup of what it could be like with a custom icon.
The focus shouldn't be on my graphic design choices, as I did this fast and dirty. Instead, the takeaway is that folder icons could vary visually from the current app icons to make them distinctive.
3) Folders in Folders
Next up is another gripe: folders within folders. Apple has done quite a bit to avoid hierarchical navigation in iOS, but it exists in the system preferences and now within the Files app. Merely transposing the Files app's visual interface gives a sane approach to nested folders, and combined with custom folder icons, users could see where they are via breadcrumbs.
4) Vertical scrolling in folders?
Vertical scrolling on the home screen has existed before with jailbreak tweaks like Infiniboard or Springfinity. Vertical scrolling within folders would help express the old drawer metaphor and ease app migration.
5) Make search results meaningful
Search on iOS never shows you where files are. See below.
I can think of a few ways to alleviate this, such as list results showing the location to the right of top app matches. I didn't bother to mock them up, as what's really important is the lack of context.
6) Better App movement
If you've ever had to organize an iPhone, the task is so tedious it can take hours if you have a fair number of apps. I've seen various suggestions, and honestly, at this point, I'd take any of them.
7) It's time to loot macOS: Smart Folders
iOS needs to grow up. The Files app is a nice start, although imperfect compared to iFiles from the jailbreak world. Apple already has a brilliant solution it could port to iOS: let the OS auto-organize with Smart Folders. Smart Folders, for the unfamiliar, work by using predetermined search queries. Apple could take it further and set Smart Folders on iOS to organize based on application types. Upon app purchase or reinstall, the user could select "Smart Folder," "dock," or "custom folder" and stay ahead of organization. Brilliant, right?
Bonus macOS -> iOS features
- Loot macOS's columned view for files.
- Allow for smaller grids and list within folders.
- Tap-and-hold on icons could offer a "Get Info" screen so you can see how much data an application is using and its associated folders.
- In a perfect world, a tabbed interface to make dragging between locations easier.
- A font manager.
On November 17th, 2017, I called utter and complete bullshit as Google PageSpeed was giving glowing scores to pages like Wired.com and Newsweek.com. Both garnered higher scores than my hyper-minimalist blog, which has a whopping 2.9k of CSS and about 40k of JS against their megabytes of JS and images. It was so irrelevant that it caused me grief professionally, as clients would be unhappy with their scores despite being fairly optimized.
PageSpeed isn't perfect, but it is now what I'd consider fixed, and I've been meaning to write this article for some time. My blog's score has gone up from 70 to 76 on my homepage. Individual articles, such as Google Page Speed Lacks Common Sense, now score 90. Sanity has been restored, and I'm not just saying that because my numbers are better. Let's start with my complaints.
- It would advise on how to optimize iframes even though the user has no control over such things
- It did not care if you used post-JPG/PNG formats
- It made no effort to measure total requests.
- It made no effort to measure time-to-paint
- It failed to recognize minified HTML due to a single line-break
- It did not compare against any other real-world dataset, making it relative only to your website's previous scores
Surprisingly, all my major gripes were resolved, to the point where I feel like a Google engineer took umbrage with my post (let me be clear: this almost certainly didn't happen). Lighthouse is oodles better (I've been using it since mid/late 2018), leveraging the Chrome User Experience Report for higher-tier pages. What's interesting is the philosophical change from the technology-bucketed approach, where CSS, JS, HTML, and server-side technologies existed in their own orbits, to one with clearly standardized goals. This makes cross-site comparisons saner. The benchmark metrics are as follows: First Contentful Paint, First Meaningful Paint, Speed Index, First CPU Idle, Time to Interactive, and Estimated Input Latency. This lends itself to a greater understanding of the stages of a web page's life-cycle. It's the most significant change, and I approve of it. The feedback is more meaningful as well. Below is an unorganized list of my observations.
- If you are using WordPress, it'll suggest plugins to assist with certain tasks.
- Another meaningful change I hadn't thought to rant about is DOM tree depth: Google now recommends fewer than 32 levels of depth.
- It makes some executive calls, like flagging `font-display: auto;`, which re-enables one of the banes of web dev, Flash of Unstyled Text, expressed in shorthand as "FOUT, FOIT, FOFT." Google prefers the FOUT.
- Google has stepped away from suggesting minified HTML, likely because compressing HTML is far more important, as demonstrated here. You can still minify HTML to squeeze extra bytes away.
- Lighthouse measures JS execution times, not just size.
- It no longer suggests the wantonly silly declarative image sizes. Previously, PageSpeed wanted you to write out pixel values to speed up render times, which was good advice in 2000 but tragically out of step with the responsive web.
- Audits can pass even if they are not met 100%, such as minifying CSS or JS, as long as the vast majority of the requirement has been met.
All in all, it's good to see Lighthouse.
Apple Arcade is everything it should be, solving the biggest problem the App Store has had: a vector for premium, high-quality games to be delivered without leaning on In-App Purchases. Not all IAPs are bad; a few titles have done them correctly. Time Locker has only a single $3 purchase that's remotely required, and it does not have any consumables; the only other IAPs are optional characters. Polytopia unlocks "races" for $1 purchases, for a grand total of 9. The most you can spend on either game is roughly $10-15, which seems right for a high-quality mobile game.
Most, though, as almost everyone knows, are the detested loot boxes or in-game currency, and thus we've seen a race to the bottom. For years I lamented that there wasn't a classification for full-fledged games without IAPs. I wanted a premium game store where developers could charge $10-$25 and get their fair shake, but never did I consider a subscription gaming service. I don't play many mobile games, but when I do, I don't want to spend hours looking for titles that are pay-once models. At the gym, I tend to walk for 15 minutes as a warm-up, and in that time, I play silly iPhone games. Tower defense games are a personal favorite, and I play them before committing to running a 5k and heading off to other activities.
Apple Arcade is a damn good value as it stands today. I don't think I've seen any console or gaming platform launch with so much content. It's good enough that I worry about the rest of the App Store, though, as there's enough content to keep me busy for quite some time. As an added bonus, Apple Arcade isn't limited to iOS or iPadOS; it's coming to tvOS and, most importantly, macOS. In one swoop, Apple has a platform that spans every compartment of gaming: mobile, tablet, console/TV, and desktop (PC). All its competitors (Steam, Microsoft, Google, Sony) are missing one of these buckets, and none have titles that can be easily ported between all formats. Depending on Apple's commitment to funding titles, this could indeed be a very big deal. I may eat my words later, but Apple Arcade is probably the most significant product Apple has conceived since the Apple Watch. At $5 a month, it's the cost of one PlayStation or Xbox game a year, or the same price as a year of PSN.
There are still asterisks to be resolved: what does the future look like? How many games can we expect? Will we ever see ports or non-exclusive content on Apple Arcade? Apple Arcade isn't going to be the end-all-be-all for gaming, but out of the gate, it's competition for Sony's very successful and very well-executed PSN.
So far, I haven't mentioned one giant of gaming, arguably the most loved of them all. Apple Arcade probably won't do much damage to the Sony PlayStation, Microsoft Xbox, or Steam platforms, but its scope of more casual, family-friendly titles and whimsical nature is certainly reminiscent of said company, and I wouldn't be surprised if it caused parents to skip a Switch by giving kids hand-me-down devices or simply getting them an iPod Touch. Apple is already the fourth-largest gaming company this year, and previously it hasn't even tried.
I doubt we'll see any AAA titles ported to Apple Arcade, be it popular sports franchises like Madden, NBA Live, NBA 2K, or FIFA, or any games based on professional sports leagues, due to licensing. I also wouldn't expect ports of classics like Sonic the Hedgehog to be folded in (though I wouldn't rule it out either, as Sonic and Frogger have both made appearances).
Lastly, the service isn't perfect. There isn't really a Steam/PSN/Xbox Live-style system for friends lists and gaming. There's also an extreme lack of titles with much depth; many of the games I've tried are nugget-sized experiences. I personally love Cricket Through the Ages and really liked Assemble With Care, both criminally short, but some of the others less so. There are only so many single-button games I want to play. Some of the more hyped titles, like Sayonara Wild Hearts, are beautifully shallow, feel like a demo (Red Scare), or are cheap knockoffs (Punch Planet). As a fan of Oceanhorn, I look forward to diving into Oceanhorn 2. I'll chalk it up to launch titles; rarely are they the pinnacle of a console, sans a few rare outliers, mostly from Nintendo (Tetris, Super Mario World, Pilot Wings, Super Mario 64). That said, with my relaxed interest in gaming, perhaps more nugget experiences are exactly what I'm after, if delivered right.
I'll be interested to see if I agree with myself two years from now.
Right now, there are multiple phones on the market with 12 GB of RAM. They run the gamut of prices and specs, but none are more expensive than a 13-inch MacBook Pro in a factory config, save perhaps the Galaxy Fold, a curiosity.
- Samsung Galaxy Fold $???
- Samsung Galaxy S10 Plus $1,600
- OnePlus 7 Pro $1299
- Asus ROG Phone 2 €899
- Samsung Galaxy Note 10 Plus $929
- Xiaomi Black Shark 2 or Black Shark 2 Pro $879
- Lenovo Z6 Pro $849
- Xiaomi Mi 9 Explorer Edition $800
- Nubia Red Magic 3 $700
- Vivo iQOO $650
I didn't look up the MSRPs but rather what seemed to be legitimate prices online to give an honest representation. That's likely a partial list, but there are at minimum 10 Android models shipping with more RAM than a $1,299 MacBook Pro, and even the $1,999 model of the 13-inch MacBook Pro ships with 8 GB of RAM; in all 13-inch models, more RAM must be custom ordered. There's been a bit of stagnation for laptops and RAM, partially due to chipsets, partially due to modern OSes using much more efficient RAM management via compression aided by SSD scratch disks, and lastly due to increased power draw. The last feels increasingly irrelevant as phones have caught up to laptops, and yet we got the foolish Touch Bar. It was only in July of 2018 that Apple addressed the lack of 32 GB RAM options for the MacBook Pro lineup.
Just as a barometer of applications: Adobe Photoshop and Lightroom recommend 8 GB of RAM or more, Illustrator recommends 16 GB with 4 GB as the minimum, and After Effects recommends 16 GB with 8 GB as the minimum. Notably, the assumption is that you would not be running multiple professional applications at once, which isn't realistic: an After Effects workflow can routinely involve any number of 2D editing applications and even 3D apps juggling resources. Then there's development, where Docker, VMs/simulators, and horrid JS memory vacuums exist. I'm not even going to touch professional audio. GPUs crossed the 8 GB barrier some time ago, meaning you could connect a 16 GB AMD Radeon VII to a $1,999 MacBook Pro with 8 GB of RAM. Even for general web surfing, it's easy to saturate 8 GB of RAM with a browser with poor memory management (Chrome).
None of Apple's Pro laptops should ship with less than 16 GB and the MacBook Air should have a factory model that ships with 16 GB of RAM. This would be moot if we had user-serviceable RAM upgrades. Laptops needn't be held to the modular standards of desktops, but they should be for basic specs.
Also worth noting: all iPhone 11s have 4 GB of RAM*. 4 GB is good for right now but seems a bit counterproductive until you consider the revelation that the iPhone 11 may have 2 GB of RAM dedicated to the camera. iOS's memory management works mostly due to Apple's stringent background task management. My guess is the next iteration of the iPhone will move to 6 GB of application RAM and 2 GB for the camera.
Over the break, I went on a binge of minor changes to this blog.
- This blog now supports a JSON Feed. I looked at Apple News but screwed up the process for importing my RSS feed; I may revisit that later, but with the low traffic most of this blog sees, it's not really worth the effort.
- Improved the JS. To reduce requests, I've concatenated four JS files into one. I upgraded jQuery 1.12 to 3.x as it's faster and smaller, and it's no longer hosted on a CDN.
- Fixed the canonical URL declaration in the head.
- Removed a few errant CSS classes; this now serves an absurdly low 2.9k of CSS, down from roughly 3k.
Sometimes I think the lack of visual flourish is mistaken for a lack of design, but I like minimalism.
NetNewsWire 5.0 was just released as open source. Ever since Google Reader shut down, I've been using Feedly as an aggregator and Reeder on iOS. Reeder makes it very easy to import Feedly feeds, but it isn't as straightforward for NetNewsWire. Fortunately, Feedly and NetNewsWire both support OPML (Outline Processor Markup Language) for importing and exporting feeds. Feedly buries this, so here's a quick step-by-step to get up and running with NetNewsWire and Feedly.
Sign in to Feedly and click the gear.
From the settings, find the OPML section and click the export button.
From NetNewsWire, select File -> Import Subscriptions and import your OPML file
That's it! Enjoy. I highly recommend Reeder for iOS.
I'm pretty sure that since the inception of this blog, the URLs in the RSS feed have been broken; I just noticed while writing about NetNewsWire. Now they are fixed. You can add the RSS feed here to your favorite RSS reader.
Expect a slow trickle of tiny improvements to this blog, like the printable blog posts.
This isn't a blog about politics or even my opinions, but I'm going to go on the record and say nuking hurricanes is a bad idea. This seems like a silly thing to say, but here we are, having a national discussion about it and why it is a bad idea. It's worth noting that, according to Axios, our sitting president suggested multiple times nuking hurricanes to stop them from hitting the U.S.
A lot of people are clowning this idea (because it's patently stupid), but it's not entirely wrong, rather a problem of scale. We simply do not have enough nuclear weapons on the planet to end all hurricanes. If we truly want to nuke our way out of hurricanes, we'll have to invest a lot more in nuclear weapons. No more Earth = no more hurricanes. Checkmate, hurricanes.
Jokes aside, Axios may not be a household name, nor even my go-to source for journalism, and Trump denies it, but it is believable. Consider that this is fresh on the heels of Trump asserting, "I could win that war in a week. I just don't want to kill 10 million people." Then there are the long-standing reports that Trump asked multiple times why we can't use nukes, and that he would use nuclear weapons in response to a terrorist attack by ISIS. Even if the latest is "fake news," the other accusations are damning enough.
"The biggest problem we have is nuclear ... having some maniac, having some madman go out and get a nuclear weapon." - Donald Trump, 2016
I couldn't agree more. This is clearly a person who is unfit to be given responsibility for the most destructive weapon humankind has created. As the great political journal of our age, TeenVogue*, wrote, "Yes, Trump could instigate a nuclear war without anyone stopping him."
We are all at the mercy of a tweet.
*Not a satirical comment.
We've all been there: you have changes that haven't been QAed, but there's a hotfix that needs to go out yesterday. Pantheon is a great host, but it has one major gotcha: you can't switch code branches. The way Pantheon works, the Pantheon remote git repository gets one, and only one, branch; there's no way to switch branches, and code can only be promoted from dev -> test -> live. This is problematic, especially if you're coming from deployment management utilities that let you switch branches or a platform-as-a-service like Heroku.
Image Credit: Pantheon, Use the Pantheon Workflow
Pantheon does offer Multidev, which essentially creates a separate branch for testing (and which can be promoted to the main chain), but that still doesn't fix the hotfix issue.
Pantheon Hotfix Flow
- Create a local branch and reset to the last commit that was made live (this is a pain as Pantheon doesn't show last commit git hashes)
- Make changes locally. Commit the changes to your new branch.
git push -f Pantheon YOURBRANCHHERE:master
- Promote from Dev -> Test -> Live from the control panel
- Make sure your hotfix is merged into your local master (and your origin)
- Reset Pantheon’s Dev to the master branch
git push -f Pantheon master:master
Despite the warnings that you should never force-push, this is the cleanest method. You can push up your desired hotfix and leave it on the live environment until your normal deploy chain overwrites it.
Using the magic of `@media print`, I've included a mild update to this blog to be more printer/PDF friendly, mostly for a singular post, The Definitive Mac Pro Upgrade Guide, which drives the majority of my traffic.
- Images are now capped to 75% of the page width as my images are only double density, optimized for screens, not printers.
- The main content now expands to the full page width, so users can set the margins within their print prefs.
- The main body copy has been reduced from 16px to 15px, and the line height from 2em to 1.75em.
- I've included a link back to the original blog post at the top of each page, visible when printed so PDF users can easily return to the blog post in a web browser.
Happy printing, you weird PDF-loving bastards!
My 2017 MacBook Pro stopped charging and refused to accept power from any power supply on any port. I was restoring my computer to my previous laptop, a 2015 MacBook Pro, and encountered the error below. I tried using the previous day's backup, but that didn't work.
I received the following message when booting from a Time Machine drive:
An Error Occurred Restoring from Backup
To try restoring from a different backup, click Choose Other Backup.
To reinstall macOS, click Install macOS. During the install you can choose to restore your information from a Time Machine backup.
To Boot from an existing macOS installation...
I've seen some high-tier fixes, like harryfear.co.uk's fix, but there's an easier route, and the clue is in the error message.
Reading the message more carefully, I booted off the Recovery partition and reinstalled macOS. Once completed, in Migration Assistant I selected the option to transfer information over from my Time Machine drive. This isn't a true 1:1; I noticed some things missing, such as my Apache2 modifications in /etc, but the geek stuff like Homebrew and its many CLI applications (Heroku) made the cut. Beyond renewing SSH keys and re-running Docker builds, my computer was good to go. Standard Mac applications had no issues.
If a restore fails, fear not. A direct restore is faster, but you will not lose your important files going this route.
- Boot off a recovery partition, reinstall macOS
- At the end of the installation, you will see the Migration Assistant. Select the option to transfer files from another computer/device/Time Machine backup, then select your Time Machine drive
I suspect for most users, myself included, the harryfear fix is overshooting the problem and Apple's solution is "good enough."
Sometimes you encounter something that'll surprise you, and yesterday was one of those days: Chrome does not support inline media queries on the `video` tag (you can test it here). Worse, plain media queries will not stop multiple videos from loading, which effectively doubles your data, so it requires a JS solution. CSS-Tricks has an article from 2012 using jQuery, but there's no follow-up and I wasn't that enthralled. I saw thenewcode's Make HTML5 Video Adaptive With Inline Media Queries, but it fails to mention Chrome's refusal to support it.
First, I wanted to prevent any requests from being made, so I created an empty video tag with my two video sources as data attributes. Easy, right? Now that all major browsers support MPEG-4, I could safely assume the only legacy browsers are IE and Safari, as they're tied to OS updates, whereas Chrome and Firefox are not, so very few users would not be on a recent browser. Safari and IE both support MPEG-4, so there's not a good reason for me to support WebM.
I didn't want to rely on any framework: jQuery's document ready meant the JS wouldn't fire until the rest of the page loaded, and ES6 meant leaving out old browsers. Thus, I'm limited to ES5.
First, I needed to get all the videos on the page. This creates a variable that contains an array of objects, even if only one video is found on the entire page.
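Something like this (a sketch; I'm assuming the placeholder videos are marked with a data-src attribute, which isn't shown in this post):

```javascript
// Collect every placeholder video tag on the page; returns an array-like NodeList
var videos = document.querySelectorAll('video[data-src]');
```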
Next, I needed to create a source for the video tag. The source tag needs an src and a type. After that, we need to append the newly created DOM element back to its video element. This function doesn't need to know how many videos are on the page or what the screen size is; it just appends a source to whatever video it's given.
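A minimal sketch of that helper; the function name matches the one referenced below, everything else is illustrative:

```javascript
// Append a <source> element with an src and type to the given video tag
function addSourceToVideo(video, src) {
  var source = document.createElement('source');
  source.src = src;
  source.type = 'video/mp4';
  video.appendChild(source);
}
```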
Next is where the logic happens: if the screen size is over a predetermined value, I load the desktop version; otherwise, the mobile version. Since I have two data attributes to work off of, the screen size determines which one gets fed to addSourceToVideo. Easy enough, right?
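Roughly, building on the helper above (the data-src / data-src-mobile attribute names and the 768px breakpoint are assumptions, not the values from my original markup):

```javascript
// Pick the desktop or mobile source based on the current window width
function videoSize(video) {
  var src = window.innerWidth >= 768
    ? video.getAttribute('data-src')
    : video.getAttribute('data-src-mobile');
  addSourceToVideo(video, src);
}
```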
Now that we've written code to write sources to empty video tags, it needs to init and be able to handle multiple videos. Remember our array of objects? It's time to use it. There's no point in running the code if there aren't any videos on the page, so we need to check whether the videos variable contains any data. If it does, we loop over the array and handle each video individually in case there are multiple videos on the page.
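Tying the sketches above together, the init might look like this:

```javascript
// Only run if placeholder videos were found, then wire up each one
if (videos.length) {
  for (var i = 0; i < videos.length; i++) {
    videoSize(videos[i]);
  }
}
```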
Notably, you could tie the above code to a resize event in case a user resizes the window, and have it re-trigger videoSize; I chose not to for simplicity. You can see the working version of the above code on CodePen. I didn't embed it in this post so those on a slower connection aren't hit with 30 MB of video data. Place this script inline or as a separate file below your videos, but before the rest of your JS payload, for maximum performance.
I'm always drawn in with complete morbid fascination by "influencer" culture, as I've largely avoided social media. I have exactly one account among the major social networks, Facebook (arguably the worst of them all), but deleted the app off my phone about four years ago. I'm including YouTube in this, as it should be considered a social network, and Reddit, which I avoid. I'm convinced that the vast majority of social media is an anecdoche.
So when I read about influencers asking for free meals at restaurants, I experience a clash of contradictory thoughts simultaneously: "Who has the gall to cold-call a restaurant for a free meal because they have 50,000 Instagram followers?", "I'm surprised this happens", "I am not surprised in the least bit", "this scam verges on genius", "major brands give famous people free shit, why shouldn't small businesses give those with a tiny soap box cheap advertising?", and "everything about this is idiotic," followed by a general self-satisfied feeling of being above it all, despite my immediate desire to share and discuss it with my friends.
The over-marketization of all facets of life has made even the most mundane activity a transactional exchange that can be sold, thanks to social media. It's all viewed through the nihilistic world view that everyone has a "personal brand." Any experience, even a wedding proposal, is a marketing opportunity, and people flock to toxic lakes. The irony is that influence amounts to internet points that may or may not mean a damn thing; an influencer with 2 million followers couldn't sell 36 t-shirts.
If there's one thing that is certainly true, it's further evidence of the enshittening.
There are plenty of tutorials on creating your own custom Gutenberg blocks, but I found the space between beginner and advanced was lacking. I'm going to skip the basics in favor of a short list of things to understand to work with Gutenberg more effectively. From the trial-by-fire experience of working on two WordPress 5.0 Gutenberg websites, here's what I've learned. The guiding principle is to reuse functionality when possible and try to replicate the WordPress UI.
Rule 0: Understand React's role
WordPress chose React to create the UX for its new Gutenberg block editor. However, instead of using React directly, WordPress uses Element, an abstraction layer over React. If you're wondering why someone would want to do this, WordPress has a very concise list:
- In many applications, especially those extended by a rich plugin ecosystem as is the case with WordPress, it’s wise to create interfaces to underlying third-party code. The thinking is that if ever a need arises to change or even replace the underlying implementation, it can be done without catastrophic rippling effects to dependent code, so long as the interface stays the same.
- It provides a mechanism to shield implementers by omitting features with uncertain futures.
- It helps avoid incompatibilities between versions by ensuring that every plugin operates on a single centralized version of the code.
This means you'll be writing code and making imports from wp.element, wp.components, and wp.blocks. React only exists on the admin side of things, and all the content within a block is saved statically, meaning you won't be able to create React experiences simply by creating Gutenberg blocks.
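As a quick illustration (a sketch, not from the original post), the globals you'll typically pull from look like this:

```javascript
// Gutenberg code pulls from the wp.* globals (or @wordpress/* packages)
// rather than importing React directly
const { Fragment } = wp.element;             // the React abstraction layer
const { Button, PanelBody } = wp.components; // reusable admin UI components
const { registerBlockType } = wp.blocks;     // block registration API
```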
Rule 1: Bundle your custom blocks in one plugin
It's pretty easy to bundle all your custom blocks into one plugin. Unless you're looking to distribute your custom blocks across many sites, it's saner for development and deployment to make one master plugin for all your blocks. On the dev side, this means a single webpack instance to spin up as opposed to one for each block. A good example of this in action is Zac Gordon's WPForJSCourse example; while his course isn't free, this plugin is. It includes everything you need: a sane structure, a webpack config, and a setup where all the custom blocks are registered in one nifty index.js file.
It's pretty easy to do, but it was a design pattern that I didn't realize would benefit me when I started in on Gutenberg.
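A sketch of what that single entry point could look like; the block names and file layout here are placeholders, not Zac Gordon's actual structure:

```javascript
// src/index.js — one entry point, one webpack build, every block registered here
const { registerBlockType } = wp.blocks;

import * as slideshow from './blocks/slideshow';
import * as profile from './blocks/profile';

// Each block module exports its name and settings object
[slideshow, profile].forEach((block) => {
  registerBlockType(block.name, block.settings);
});
```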
Rule 2: Learn InnerBlocks and reuse core blocks
Most tutorials seem to stop short of InnerBlocks, yet InnerBlocks are probably one of the most important features of Gutenberg. InnerBlocks allow you to load blocks inside of blocks. Below is a super basic sketch of a slideshow block, allowing the user to enter as many images as they like using the core/image Gutenberg block while restricting them from adding anything else.
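A sketch of such a block, assuming the JSX/webpack setup from Rule 1; the block name and settings are placeholders:

```jsx
const { registerBlockType } = wp.blocks;
const { InnerBlocks } = wp.editor;

registerBlockType('mycustomblocks/slideshow', {
  title: 'Slideshow',
  icon: 'images-alt2',
  category: 'common',

  edit: () => (
    // Only core/image blocks may be inserted inside this block
    <InnerBlocks allowedBlocks={['core/image']} />
  ),

  // Persist whatever images the editor contains
  save: () => <InnerBlocks.Content />,
});
```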
InnerBlocks aren't simply limited to allowing and restricting other blocks; they can also accept templates, which are sets of pre-defined blocks. This allows assembling a very complicated UI or layout widget out of any number of prebuilt or custom blocks. There's no reason to reinvent the wheel, as WordPress gives you quite a few blocks out of the box. Below I've included a list of core blocks by category.
Common blocks category
- core/html - Custom HTML
Layout Elements category
- core/media-text - Media and Text
- core/nextpage - Page break
Pictured: Mock up of a hypothetical user page
Let's break down the above design: It's two columns consisting of:
Column 1 (core/column) Column 2 (core/column) Image (core/image) Headline (custom)
With Gutenberg, simple layouts like the above can potentially be built by hand with the columns block, but it isn't desirable, as it requires a bit of mastery of WordPress, with a high margin of error if it taps into custom CSS. We have a two-column design consisting of an image in the first column and two fields in the second, followed by text beneath the columns. So let's look at the template code.
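Here's a sketch of what that template could look like; the custom block names (mycustomblocks/profile-name, etc.) and attributes are placeholders loosely based on the example block mentioned in the next rule:

```jsx
const { registerBlockType } = wp.blocks;
const { InnerBlocks } = wp.editor;

// Pre-defined layout: two columns (image | custom field), then a paragraph
const TEMPLATE = [
  ['core/columns', {}, [
    ['core/column', {}, [
      ['core/image', {}],
    ]],
    ['core/column', {}, [
      ['mycustomblocks/profile-name', {}],
    ]],
  ]],
  ['core/paragraph', { placeholder: 'Biography…' }],
];

registerBlockType('mycustomblocks/profile', {
  title: 'Profile',
  icon: 'admin-users',
  category: 'common',

  edit: () => <InnerBlocks template={TEMPLATE} />,
  save: () => <InnerBlocks.Content />,
});
```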
Using templates, I'm able to place a mixture of custom and factory Gutenberg blocks inside columns! InnerBlocks aren't infallible: you can template-lock blocks so users cannot add more, but occasionally this creates issues. Also, custom styling for blocks does not work on any block that features an InnerBlock (yet). Perhaps this will change, but as of writing, it hasn't.
Rule 3: Restricting dependent blocks
Often you'll create a block that shouldn't appear in the list of blocks in the WordPress Gutenberg GUI. Any custom block can easily be restricted to being accessible only from a certain block type. In my previous example I had two custom child blocks, such as mycustomblocks/profile-name. These are very simple blocks, but I do not want them polluting my list of Gutenberg blocks. This only requires declaring the parent when registering the block.
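A sketch of the registration for such a child block; everything except the parent field is boilerplate placeholder:

```jsx
const { registerBlockType } = wp.blocks;
const { RichText } = wp.editor;

registerBlockType('mycustomblocks/profile-name', {
  title: 'Profile Name',
  icon: 'id',
  category: 'common',

  // Only offered inside mycustomblocks/profile — it never shows up
  // in the main block inserter
  parent: ['mycustomblocks/profile'],

  attributes: {
    name: { type: 'string', source: 'html', selector: 'h3' },
  },

  edit: (props) => (
    <RichText
      tagName="h3"
      placeholder="Name…"
      value={props.attributes.name}
      onChange={(name) => props.setAttributes({ name })}
    />
  ),

  save: (props) => <RichText.Content tagName="h3" value={props.attributes.name} />,
});
```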
See the parent flag? It really is that easy.
Rule 4: Learn to use the custom toolbar and form fields
To truly make a plugin feel native, you'll need to tap into the UX that Gutenberg uses: the toolbar, and the sidebar containing form fields, via InspectorControls and BlockControls. Again, Zac Gordon's JSForWordPress tutorial repo has a great example of each.
BlockControls appear in the editable area of a block as inline controls, whereas InspectorControls appear in the sidebar. Also, be sure to see the official documentation on Toolbars and the Inspector.
It will feel a little strange at first, but the edit function's return can be an array: the toolbar and inspector controls are returned alongside the markup for the main editable area.
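A sketch of that pattern; the block itself and the specific controls (AlignmentToolbar, ToggleControl) are illustrative choices, not from this post:

```jsx
const { registerBlockType } = wp.blocks;
const { RichText, BlockControls, InspectorControls, AlignmentToolbar } = wp.editor;
const { PanelBody, ToggleControl } = wp.components;

registerBlockType('mycustomblocks/callout', {
  title: 'Callout',
  icon: 'megaphone',
  category: 'common',

  attributes: {
    content: { type: 'string', source: 'html', selector: 'p' },
    alignment: { type: 'string', default: 'left' },
    emphasized: { type: 'boolean', default: false },
  },

  edit: (props) => {
    const { content, alignment, emphasized } = props.attributes;

    // Returning an array lets the toolbar, the sidebar panel, and the
    // editable area all come out of a single edit function
    return [
      <BlockControls key="toolbar">
        <AlignmentToolbar
          value={alignment}
          onChange={(alignment) => props.setAttributes({ alignment })}
        />
      </BlockControls>,
      <InspectorControls key="inspector">
        <PanelBody title="Callout settings">
          <ToggleControl
            label="Emphasized"
            checked={emphasized}
            onChange={(emphasized) => props.setAttributes({ emphasized })}
          />
        </PanelBody>
      </InspectorControls>,
      <RichText
        key="content"
        tagName="p"
        style={{ textAlign: alignment }}
        value={content}
        onChange={(content) => props.setAttributes({ content })}
      />,
    ];
  },

  save: (props) => {
    const { content, alignment, emphasized } = props.attributes;
    return (
      <RichText.Content
        tagName="p"
        className={emphasized ? 'is-emphasized' : undefined}
        style={{ textAlign: alignment }}
        value={content}
      />
    );
  },
});
```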