Integrating Node KSS with Gulp

    First off, I highly recommend reading CSS-Tricks' Build a Style Guide Straight from Sass; it's a game changer for automatic style guide generation. That said, I assume if you're on this page you're already a convert.

    I'm going to assume the following:

    • node-kss is installed in the same directory as your gulpfile
    • node-kss has been set up and is generating a style guide.
    • you have at least a very rudimentary understanding of gulp

    If either of the first two is untrue, please go to the CSS-Tricks link, as it's a wonderful guide and will get you to a working spot. Node-KSS has a gulp repository, but it's woefully out of date. I recommend not using it. Fortunately, chaining it yourself is pretty easy. First, we need to install gulp-shell in our gulp project.

        npm install --save-dev gulp-shell

    Next, we need to require gulp-shell in our gulpfile. This can vary based on your setup: it may be var or const depending on whether you're running ES6, or it may be part of a larger declaration:


        const shell = require('gulp-shell')


        var shell = require('gulp-shell')

    Next, we're going to create a task in our gulpfile to execute the command that runs node-kss. (Note: you can run variations of this command if your configuration is different; kss is not required to be installed in the same place as gulp.)

    gulp.task('kss', shell.task(['./node_modules/.bin/kss --config kss-config.json']));

    Lastly, we need to reference this task in another task. Below is an example of how I'm using it: I created a watch task called "styleguide", a slightly modified version of my default task. Your task will differ from mine.

    gulp.task('styleguide', ['serve'], function() {
      // Watch .scss files; rebuild Sass and the style guide on change
      gulp.watch('**/*.scss', function(event) {
        console.log('File ' + event.path + ' was ' + event.type + ', running tasks...');
        gulp.start('sass');
        gulp.start('kss');
      });
      // Reload BrowserSync when the generated style guide HTML changes
      // (adjust the glob to wherever your style guide is output)
      gulp.watch('styleguide/**/*.html').on('change', browserSync.reload);
    });

    Note that I run the 'kss' task after my Sass task has run; this generates the style guide. Since the style guide generates new HTML on every save, my browserSync.reload on 'change' is triggered because of my project's directory structure. This is why I created a separate task named "styleguide": I do not always need my kss task to run, and I don't want to interfere with live CSS injection via BrowserSync. Your needs will vary.

    Gulp Boilerplate

    Every now and again, I remember I have a GitHub account and throw something simple up there. I made a Grunt Boilerplate years ago and finally got around to making one for Gulp. There are a few features I still need to stick in, but I like to have a starting point rather than re-inventing my tasks every project.


    Features all the greatest hits:

    • Sass processing
    • CSS Browser auto-prefixing
    • CSS minification
    • JS Uglify (minification)
    • BrowserSync (Inject CSS changes + follow, reload on JS change)

    This is mostly for my own benefit, but if anyone finds it useful, I'm glad. You can nab it here Gulp-Sass-JS-BrowserSync-Boilerplate

    When Node-Sass Fails to Install

    So you're here because bash is outputting some big mess like the following when you tried to install gulp-sass or node-sass via NPM. You've probably updated Node and NPM, switched versions with NVM or Homebrew, and are beating your head against the wall while node-sass still won't install. The issue is likely not the Node or npm version, but your package.json.

      > node-sass@0.8.6 install /Users/<path-to-project>/_gulp/node_modules/gulp-sass/node_modules/node-sass
    > node build.js
    (node:43004) [DEP0006] DeprecationWarning: child_process: options.customFds option is deprecated. Use options.stdio instead.
      CXX(target) Release/
    In file included from ../binding.cpp:1:
    ../../nan/nan.h:339:13: error: no member named 'New' in 'v8::String'
        return  _NAN_ERROR(v8::Exception::Error, errmsg);
    ../../nan/nan.h:319:50: note: expanded from macro '_NAN_ERROR'
    # define _NAN_ERROR(fun, errmsg) fun(v8::String::New(errmsg))
    ../../nan/nan.h:343:5: error: no member named 'ThrowException' in namespace 'v8'
        _NAN_THROW_ERROR(v8::Exception::Error, errmsg);
    ../../nan/nan.h:324:11: note: expanded from macro '_NAN_THROW_ERROR'
          v8::ThrowException(_NAN_ERROR(fun, errmsg));                             \
    ../../nan/nan.h:343:5: error: no member named 'New' in 'v8::String'
        _NAN_THROW_ERROR(v8::Exception::Error, errmsg);
    ../../nan/nan.h:324:26: note: expanded from macro '_NAN_THROW_ERROR'
          v8::ThrowException(_NAN_ERROR(fun, errmsg));                             \
    ../../nan/nan.h:319:50: note: expanded from macro '_NAN_ERROR'
    # define _NAN_ERROR(fun, errmsg) fun(v8::String::New(errmsg))
    ../../nan/nan.h:348:9: error: no type named 'ThrowException' in namespace 'v8'
    ../../nan/nan.h:355:65: error: no member named 'New' in 'v8::String'
        v8::Local<v8::Value> err = v8::Exception::Error(v8::String::New(msg));
    ../../nan/nan.h:356:50: error: expected '(' for function-style cast or type construction
        v8::Local<v8::Object> obj = err.As<v8::Object>();
    ../../nan/nan.h:356:52: error: expected expression
        v8::Local<v8::Object> obj = err.As<v8::Object>();
    ../../nan/nan.h:357:65: error: too few arguments to function call, expected 2, have 1
        obj->Set(v8::String::New("code"), v8::Int32::New(errorNumber));
                                          ~~~~~~~~~~~~~~            ^
    /Users/<user>/.node-gyp/8.1.2/include/node/v8.h:2764:3: note: 'New' declared here
      static Local<Integer> New(Isolate* isolate, int32_t value);
    In file included from ../binding.cpp:1:
    ../../nan/nan.h:357:26: error: no member named 'New' in 'v8::String'
        obj->Set(v8::String::New("code"), v8::Int32::New(errorNumber));
    ../../nan/nan.h:369:12: error: no member named 'New' in 'v8::String'
        return _NAN_ERROR(v8::Exception::TypeError, errmsg);
    ../../nan/nan.h:319:50: note: expanded from macro '_NAN_ERROR'
    # define _NAN_ERROR(fun, errmsg) fun(v8::String::New(errmsg))
    ../../nan/nan.h:373:5: error: no member named 'ThrowException' in namespace 'v8'
        _NAN_THROW_ERROR(v8::Exception::TypeError, errmsg);
    ../../nan/nan.h:324:11: note: expanded from macro '_NAN_THROW_ERROR'
          v8::ThrowException(_NAN_ERROR(fun, errmsg));                             \
    ../../nan/nan.h:373:5: error: no member named 'New' in 'v8::String'
        _NAN_THROW_ERROR(v8::Exception::TypeError, errmsg);
    ../../nan/nan.h:324:26: note: expanded from macro '_NAN_THROW_ERROR'
          v8::ThrowException(_NAN_ERROR(fun, errmsg));                             \
    ../../nan/nan.h:319:50: note: expanded from macro '_NAN_ERROR'
    # define _NAN_ERROR(fun, errmsg) fun(v8::String::New(errmsg))
    ../../nan/nan.h:377:12: error: no member named 'New' in 'v8::String'
        return _NAN_ERROR(v8::Exception::RangeError, errmsg);
    ../../nan/nan.h:319:50: note: expanded from macro '_NAN_ERROR'
    # define _NAN_ERROR(fun, errmsg) fun(v8::String::New(errmsg))
    ../../nan/nan.h:381:5: error: no member named 'ThrowException' in namespace 'v8'
        _NAN_THROW_ERROR(v8::Exception::RangeError, errmsg);
    ../../nan/nan.h:324:11: note: expanded from macro '_NAN_THROW_ERROR'
          v8::ThrowException(_NAN_ERROR(fun, errmsg));                             \
    ../../nan/nan.h:381:5: error: no member named 'New' in 'v8::String'
        _NAN_THROW_ERROR(v8::Exception::RangeError, errmsg);
    ../../nan/nan.h:324:26: note: expanded from macro '_NAN_THROW_ERROR'
          v8::ThrowException(_NAN_ERROR(fun, errmsg));                             \
    ../../nan/nan.h:319:50: note: expanded from macro '_NAN_ERROR'
    # define _NAN_ERROR(fun, errmsg) fun(v8::String::New(errmsg))
    ../../nan/nan.h:406:13: error: no member named 'smalloc' in namespace 'node'
        , node::smalloc::FreeCallback callback
    ../../nan/nan.h:141:71: note: expanded from macro 'NAN_INLINE'
    # define NAN_INLINE(declarator) inline __attribute__((always_inline)) declarator
    ../../nan/nan.h:416:12: error: no matching function for call to 'New'
        return node::Buffer::New(data, size);
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:52:40: note: candidate function not viable: no known conversion from 'char *' to 'v8::Isolate *' for 1st argument
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate, size_t length);
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:55:40: note: candidate function not viable: no known conversion from 'char *' to 'v8::Isolate *' for 1st argument
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate,
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:67:40: note: candidate function not viable: requires 3 arguments, but 2 were provided
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate,
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:60:40: note: candidate function not viable: requires 5 arguments, but 2 were provided
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate,
    In file included from ../binding.cpp:1:
    ../../nan/nan.h:420:12: error: no matching function for call to 'New'
        return node::Buffer::New(size);
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:52:40: note: candidate function not viable: requires 2 arguments, but 1 was provided
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate, size_t length);
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:55:40: note: candidate function not viable: requires at least 2 arguments, but 1 was provided
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate,
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:67:40: note: candidate function not viable: requires 3 arguments, but 1 was provided
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate,
    /Users/<user>/.node-gyp/8.1.2/include/node/node_buffer.h:60:40: note: candidate function not viable: requires 5 arguments, but 1 was provided
    NODE_EXTERN v8::MaybeLocal<v8::Object> New(v8::Isolate* isolate,
    In file included from ../binding.cpp:1:
    ../../nan/nan.h:427:26: error: no member named 'Use' in namespace 'node::Buffer'
        return node::Buffer::Use(data, size);
    fatal error: too many errors emitted, stopping now [-ferror-limit=]
    20 errors generated.
    make: *** [Release/] Error 1
    gyp ERR! build error
    gyp ERR! stack Error: `make` failed with exit code: 2
    gyp ERR! stack     at ChildProcess.onExit (/Users/<user>/.nvm/versions/node/v8.1.2/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:258:23)
    gyp ERR! stack     at emitTwo (events.js:125:13)
    gyp ERR! stack     at ChildProcess.emit (events.js:213:7)
    gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:197:12)
    gyp ERR! System Darwin 16.7.0
    gyp ERR! command "/Users/<user>/.nvm/versions/node/v8.1.2/bin/node" "/Users/<user>/.nvm/versions/node/v8.1.2/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
    gyp ERR! cwd /Users/<path-to-project>/assets/_gulp/node_modules/gulp-sass/node_modules/node-sass
    gyp ERR! node -v v8.1.2
    gyp ERR! node-gyp -v v3.6.2
    gyp ERR! not ok

    Go to package.json and look at the versions. Most likely the version is locked to a very old version of node-sass or gulp-sass in your project (or the project you're using). Switch its version to something recent (as of this writing, "gulp-sass": "^3.0.0" or "node-sass": "^4.7.2"). Congrats, it'll now install!
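
    As a sketch, the relevant slice of package.json would look something like this (version numbers current as of this writing; check npm for the latest):

```json
{
  "devDependencies": {
    "gulp-sass": "^3.0.0",
    "node-sass": "^4.7.2"
  }
}
```

    After changing the versions, delete node_modules and run npm install again so the old locked build isn't reused.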

    Safari's Autofill needs to be redesigned

    All major browsers have built-in login managers that save and automatically fill in username and password data to make the login experience more seamless. The set of heuristics used to determine which login forms will be autofilled varies by browser, but the basic requirement is that a username and password field be available.

    Login form autofilling in general doesn’t require user interaction; all of the major browsers will autofill the username (often an email address) immediately, regardless of the visibility of the form. Chrome doesn’t autofill the password field until the user clicks or touches anywhere on the page. Other browsers we tested [2] don’t require user interaction to autofill password fields.

    Thus, third-party javascript can retrieve the saved credentials by creating a form with the username and password fields, which will then be autofilled by the login manager.


    Ironically, before the holidays I had to deal with this from the opposite end, as auto-form filling from Safari was filling out hidden fields.

    Consider the following

    • Safari's autofill can fill out more than just username/password.
    • Safari's autofill does not give you the ability to view the stored information in its local database other than site entries.
    • Safari's autofill will fill out fields hidden with visibility: hidden and display: none
    • Safari's autofill does not trigger a DOM event when filling fields hidden with visibility: hidden or display: none. Safari does let you query for input:-webkit-autofill, but testing for this means resorting to super-hacky setTimeout and setInterval polling.
    • Safari does (mostly) respect the HTML5 autocomplete convention, but will ignore autocomplete="off" on username or password fields
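
    A rough sketch of that polling hack (the 100 ms interval and the change-event dispatch are my own assumptions, not a canonical recipe):

```javascript
// Poll for WebKit's autofill pseudo-class, since Safari fires no DOM
// event when it silently fills a field. The lookup is factored out so
// the logic is testable outside a browser.
function checkAutofill(doc) {
  // Returns an array of inputs Safari has autofilled (empty if none).
  return Array.prototype.slice.call(
    doc.querySelectorAll('input:-webkit-autofill')
  );
}

if (typeof document !== 'undefined') {
  var poll = setInterval(function () {
    var filled = checkAutofill(document);
    if (filled.length) {
      clearInterval(poll);
      filled.forEach(function (input) {
        // React to the silent autofill here, e.g. re-run validation.
        input.dispatchEvent(new Event('change', { bubbles: true }));
      });
    }
  }, 100);
}
```

    The selector only exists in WebKit browsers, so feature-detect before relying on it anywhere else.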

    This leads to a bizarre world where Safari is egregiously handing out info that can't be vetted.

    Safari Autofill Manager

    Pictured: Safari's autofill manager for non-username/passwords (other) doesn't allow you to see what information it's autofilling or edit the values. I found some surprising entries in my Safari autofill manager.

    I had a problem where a donation form was failing our API validation because Safari's autofill was completing hidden form elements without firing change events, creating scenarios we hadn't previously considered. It took error logging to figure out Safari was the culprit, and a heavy dose of intuition to figure out that it was autofill.

    The solution was to add autocomplete="off" and disabled to the hidden fields, but it led me to wonder about the potential abuses of autofill. Apparently, I wasn't the only one.

    ImageOptim vs Squash 2 - Comparing PNG optimization - A Squash 2 review

    For years I've leaned on ImageOptim as my go-to for image optimization. I tend to be a little obsessive, using modern formats (WebP, JPEG 2000) and testing out avant-garde projects like Guetzli by Google. I recently decided to finally try out Squash by Realmac Software.

    Over the years, codecs have improved remarkably, especially in the realm of video. For example: H.261 (1984-1988) -> MPEG-1 (1988-1991) -> MPEG-2 aka H.262 (1996-2015) -> MPEG-4 AVC aka H.264 (1999-current) -> High Efficiency Video Coding (HEVC) aka H.265 (2015-current). Each iteration has the ultimate goal of improving video quality at lower bit rates. This doesn't even cover the other formats: VP8, VP9, Ogg Theora, DivX, 3ivx, Sorenson, RealMedia and the many others of the past 30 years, all of which have had varying degrees of mainstream success. Audio has had a similar vector: MPEG-1 audio, MP2, MP3, AAC, Ogg Vorbis, AC3 and DTS, to name a few.

    However, static images haven't had the same wide range of codecs (most formats are lossless proprietary files used by various image editors), and distribution has been almost entirely relegated to five formats: SVG, BMP, PNG, JPEG and GIF. You may occasionally see PSDs or EPS files, or photography formats like DNG or standards-free RAW, but those fall into the same category as video codecs like ProRes, DNxHD and CineForm: intermediate formats that require specialized software to view/edit and are converted when distributed beyond professional circles (sans EPS).

    We're starting to see future image formats: Google with WebP, Apple with JPEG 2000 and HEIC, and Safari allowing inline MP4s to be treated as images. But for the past 10 years, much of the action in image compression has been trying to squeeze every last byte out of the existing formats, almost entirely JPEG and PNG (and SVG, but that's a different story). A lot of the slow movement of web formats has to do with the W3C. It took Cisco buying out and distributing the MP4 patent licensing for free to make MP4 the accepted video format for Microsoft, Apple, Google, and Mozilla. It may take a similar act of corporate benevolence to bring about a successor to JPEG.

    Interestingly, though, there's been a concerted effort to squeeze every bit of optimization out of the existing formats: JPEG has MozJPEG, Guetzli, JPEGOptim, and jpegtran. PNG has Zopfli, PNGOUT, OptiPNG, AdvPNG, and PNGCrush. These all differ, as some are encoders and some are strictly optimizers, but the end game is to extract the most out of the formats, which often involves trickery to exploit the compression. Both ImageOptim and Squash are GUI front ends that use these optimizations to create the best JPEG or PNG per kilobyte possible. These libraries do not come without a penalty, that being CPU cycles: they can all take minutes to execute on larger images, the longest being Guetzli; an 8 MP image can take around 40 minutes to encode even on a 5th-generation Core i7. We're probably quickly approaching the point of diminishing returns. If you're using Guetzli, I'd argue it's easier to provide alternative image formats (WebP / JPEG 2000) as opposed to burning hours encoding a handful of images, as you'll get better results for the people who can see them (Safari and Chrome users). The rest, however, are still viable.

    PNG Compression tests

    Settings used: ImageOptim (default)

    • Zopfli
    • PNGOUT
    • OptiPNG
    • AdvPNG
    • Strip PNG meta data
    • Lossless
    • Optimization Level: Insane

    Squash 2

    • More Compressed (slower)

    Test 1: Complex Webpage screenshot

    Kaleidoscope Show Differences results: No differences

    Winner: Squash
    Squash Savings over ImageOptim: 21,939 bytes (21.9K), 1.3%

    Test 2: Simple Webpage screenshot

    Winner: Tie

    Kaleidoscope Show differences results: No differences

    Neither of these results is terribly surprising: Squash uses libpng and Zopfli, which are open-source PNG optimizers. I'm a little surprised that Squash shaved off a few more K. To make sure this wasn't a fluke, I tested another screenshot, 2.9 MB (2,880,886 bytes); again Squash 2 won, 1,116,796 bytes (1.1 MB) to 1,140,793 bytes (1.1 MB), for a savings of 23,997 bytes (24K). On very large PNGs, Squash 2 has the advantage. I also checked PNGCrush, which brought it down to 1,126,420 bytes.

    Test 3: Large Photograph

    Kaleidoscope Show differences results: No Differences

    Winner: Squash

    This last test weighs most heavily in favor of Squash; 330,665 bytes is significant, even if only a 6% difference.

    The Results...

    While hardly the epitome of comprehensive testing, Squash does provide slightly better PNG compression. That said, ImageOptim is quite good for the sticker price of free. Squash 2 is part of the Setapp collection or $15 standalone. Squash isn't as accomplished at JPEG optimization as ImageOptim but seems to be the best PNG GUI utility for OS X. It's surprising too, as ImageOptim offers more options for optimization and uses the same optimization libraries. You can't really go wrong with either utility.

    Mini Review of Squash

    Squash is essentially a drag-and-drop no-brainer utility: drag images in and Squash does the rest. If you've used ImageOptim, then you're familiar with it. The big differences between ImageOptim and Squash are mostly cosmetic, as both do the same operation. Squash appears to be no faster than ImageOptim, nor does it have as many options. The UI does provide a goofy animation and an annoying sound (I killed the sound effects immediately).

    Where Squash won at PNGs, it lost out on lossless JPEG compression. Tests routinely showed that ImageOptim shaved off, on average, about 5% more off JPEGs, although individual tests differed wildly.

    Squash 2 is a minimalist utility through and through: drag images in and it outputs compressed ones. Quite possibly the best thing Squash offers over ImageOptim is one of the simplest: it lets you create new versions of the file appended with a suffix. ImageOptim overwrites images, which can be undesirable.

    Detecting Content Blockers is a losing battle, but you can be smart and ethical when doing so...

    There's been a bit of a cat and mouse game between ad blockers/content blockers and advertisers/analytics/trackers. The short answer is you aren't going to defeat them single-handedly. Many of the libraries designed to detect them will fail, as they're inevitably blocked once a content blocker is updated to detect them. As someone who once ran a website that hit 150,000 unique visitors a month funded by advertising, I'm sympathetic to the publisher's plight. As a content writer, I value analytics; I use Google Analytics on this site, as it helps me understand what content resonates, what channels people use to find my content and how they consume it. As a developer with a touch of UX, logging and error tracking are extremely helpful. A service like Loggly can help me find errors, design better to catch edge cases that aren't on the "happy path" and make data-driven decisions about a product. However, the advertising industry has perniciously proven it is not to be trusted. There's a reason why, as a user, I surf with Ghostery/1Blocker, block cross-origin cookies (on my desktop, I kill all cookies), use a VPN, and disabled Flash long before most people did to dodge the dreaded forever Flash cookie. Privacy matters.

    This is my attempt to create an ethical framework around content blocking from the perspective of a developer/content creator/publisher.

    A quick list of observations

    I've assembled a list of facts/observations about content blockers.

    • Adblock/Adblock Plus focus on advertising but not analytics. This could change in the future.
    • 1Blocker and Ghostery are particularly good content blockers. Both will block <script> tags from loading, without firing any onerror handler at the src level.
    • Content blockers are not fooled by appending <script> tags to the DOM via JavaScript.
    • Blocked <script> tags are not removed from the DOM by 1Blocker and Ghostery, so any check for the tag's existence will return true.
    • 1Blocker and Ghostery can detect popular anti-blocker scripts and prevent them.
    • Browsers are pushing privacy settings more aggressively, with Firefox leading the charge and Safari not far behind.
    • If your website fails to work with one of the popular content blockers enabled, you are cutting out 20% of your audience.

    But I'm a special snowflake!
    Using powers for good

    So as a developer/UX designer you're suddenly faced with a problem. Your website or web app has features that break when content blockers are enabled. You've already made sure that your core functionality isn't tied to anything that will be blocked by content blockers.

    Likely your client or manager will ask "can't you just go around the content blocker?".

    The short answer is "No". You will not forcibly defeat content blockers, and if you try, you're signing up for an unwinnable, all-consuming cat and mouse game. However, you can potentially detect content blockers rather than defeat them. With a service like Loggly, you can easily check whether the _LTracker var has loaded.

      if (typeof _LTracker === 'undefined' || _LTracker === null) {
        //execute code
      }

    Suddenly we're at the ethical precipice as we can do a number of things with this information. I've assembled a list of the ethical paths.

    Ethics of content blocking code

    Most Ethical:

    The website/web app's core features work without any warnings until the user reaches an ancillary feature that may be broken. The user is able to complete core functions (consume content, use navigation, submit forms).

    Example: Videos still work. The user is able to place orders, but 3rd-party chat tech support may be broken. The user is informed.

      if (typeof _LTracker === 'undefined' || _LTracker === null) {
        //If and only if a function on the page requires the service,
        //inform the user.
      }

    Fairly Ethical:

    The user receives warnings on every page, encouraging them to whitelist the site regardless of whether functionality is affected.

    Example: The user is pestered with a whitelist-this-site message but is still able to perform operations. Videos still work. The user is able to place orders. 3rd-party live chat tech support may be broken. The user is informed.

      if (typeof _LTracker === 'undefined' || _LTracker === null) {
        //display a global message.
        //Inform the user that analytics are helpful for improving the service.
      }

    Least Ethical:

    The user is blocked from consuming content until the site is whitelisted, regardless of whether functionality is affected.

      if (typeof _LTracker === 'undefined' || _LTracker === null) {
        //display a global message.
        //obfuscate content/block content/disable features when the error is present.
      }

    No Ethical Stance: The site does not attempt to detect any blocked content; it either functions or it does not. This is the majority of websites.

    This model isn't free of problems; it's almost entirely from the lens of a non-advertisement-supported website, like a campaign site / company site / ecomm / SaaS. While these sites may contain advertising and tracking, all of the aforementioned have revenue generated by sales (SaaS/ecomm) or lead generation (campaign/company). Websites that are dependent on ad revenue adhere to a different set of ethics and variables.

    Other methods for checking whether a script has loaded

    Checking for variable existence is the most fail-safe method to see whether a script has loaded. While onerror will not fire on an individual blocked <script> tag, you can write scripts into the head dynamically and attach an onerror handler. This comes at a mild expense of code execution and may not work in all scenarios.
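
    A minimal sketch of that dynamic-injection approach (the script URL and handler are placeholders, not any service's actual embed code):

```javascript
// Inject a script tag with onerror attached before it is appended,
// so a failed network load can be caught. Note: some content blockers
// suppress the load silently, so pair this with a variable check.
function loadScript(src, onError) {
  if (typeof document === 'undefined') return null; // no DOM available
  var script = document.createElement('script');
  script.src = src;
  script.onerror = onError; // fires on a failed load
  document.head.appendChild(script);
  return script;
}

// Usage:
// loadScript('https://example.com/tracker.js', function () {
//   console.log('Script failed to load - possibly blocked');
// });
```

    Because blockers may not trigger onerror at all, the variable-existence check above remains the more reliable signal.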

    Google PageSpeed Insight lacks commonsense and is becoming irrelevant

    This has been something that has irked me for some time now, and I haven't unloaded a good rant on development in a while. Yesterday I wrote about image bloat and decided to add a few negligible optimizations I've been meaning to make for a year or two, resulting in about an 8-10K reduction per page. After I enabled HTML and CSS minification on my blog, I skated over to PageSpeed, plugged my URL in and frowned. My newly optimized blog post scored a whopping 70/100. My page is 84.5K (or 68.5K without Google Analytics).

    Google Pagespeed bein' hyper-judgemental about a 90k page

    For reference, scores 73 out of 100 on mobile with the total page loading 5.1 megabytes, and scores a goddamned 84 out of 100 while loading 7.1 MB! This is utter and complete bullshit.

    Herein lies the rub: Google PageSpeed has always had a "reach for the stars" mentality, but it is woefully out of touch when judging a page's real-world performance. A 300K page, even a poorly optimized one, is going to beat a 3 MB page (the average page size of major websites) in load times. In the era of smartphone data plans, a customer could load 30 poorly optimized pages for one bloated, highly optimized 3 MB beast of a page. It's telling that Google stopped developing its "PageSpeed" tool in Chrome and has since relegated it to its annoying web-only interface. It's become a tool that would-be SEO gurus/experts/snake-oil salespersons, when hired by clients, use to hold over developers, providing "recommendations" for CMS websites that offer no easy vectors for the more avant-garde optimizations like HTML minification (which incidentally tends to save less data than CSS/JS optimization, or than using HTTP compression).

    PageSpeed says nothing about image formats beyond image scaling (and seems mostly tone-deaf to responsive images within reasonable margins of error). You can plug in a 500K PNG that could be served as a 40K JPEG, only to have the PageSpeed score not even budge. It won't even blink if you're making an effort to support avant-garde image formats like WebP and JPEG 2000 to provide more bang per kilobyte.

    PageSpeed is also frighteningly JavaScript-unaware. "Oh, you have a Bitcoin-mining JavaScript file? Is it minified? Is it uglified? Is it GZ compressed? Yes? THUMBS UP, BUDDY! Also, good job on the 'your Flash is out of date' malware JavaScript pop-up." If you're tricky, write an obfuscated JavaScript append script for, say, the 460K uncompressed D3 library, and Google PageSpeed won't even bother to check.

    Other poor detections revolve around iframes to popular services like YouTube / Vimeo / SoundCloud / CodePen: it suggests optimizations based on the iframe content, anathema to the entire principle of CORS.

    There's also zero comment on total requests on a page other than suggesting you concatenate files and create image sprites. It'll ding you hard for having multiple CSS imports for Google Fonts, but doesn't give a royal damn if you're making several hundred HTTP requests. (Note: most browsers are limited to 6 requests at a time per domain, and usually cap out at around 17 simultaneous. Each request must be fulfilled or 403/404ed to open another request. This says nothing about the limitations of the server either: more requests = more server stress.)

    Want to measure rendering performance? Forget it. There's no discernible metric about time to paint or continuous painting. Feel free to go nuts with CSS filters and bring a lesser device to its knees; PageSpeed doesn't care as long as your CSS is minified.

    Lastly, it can be wildly inaccurate. My page is minified HTML, and yet PageSpeed's wonderful insight is that I should minify my HTML. Wat. View source on any page on this blog if you don't believe me...

    There's probably a reason why I didn't notice that PageSpeed Insights had been removed from Chrome, as it's mostly useless to a savvy front-end dev beyond a sanity check. You could take it that Google PageSpeed isn't a metric of your site vs. other websites, but rather you vs. yourself. Even that rationale falls apart, as it doesn't weigh its recommendations against the many other factors at play, nor does it pass any judgment on data use. Google clearly cares about data use, as its questionable Accelerated Mobile Pages (AMP) project exists. PageSpeed Insights was a tool of genius, but now it feels like it's past its prime and/or in need of some TLC. Really, what I'm asking for is perspective, and Google PageSpeed Insights doesn't have it.

    This article does not contain any images

    At some point in the past several years, the millions of different possibilities of turning individual pixels into a website coalesced around a singularly recognizable and repeatable form: logo and menu, massive image, and page text distractingly split across columns or separated by even more images, subscription forms, or prompts to read more articles. The web has rapidly become a wholly unpleasant place to read. It isn’t the fault of any singular website, but a sort of collective failing to prioritize readers.

    I don’t know about you, but I’ve become numb to the web’s noise. I know that I need to wait for every article I read to load fully before I click anywhere, lest anything move around as ads are pulled in through very slow scripts from ten different networks. I know that I need to wait a few seconds to cancel the autoplaying video at the top of the page, and a few more seconds to close the request for me to enter my email and receive spam. And I know that I’ll need to scroll down past that gigantic header image to read anything, especially on my phone, where that image probably cost me more to download than anything else on the page.

    - Nick Heer

    This blog post is a bit of a meta-reaction, seeing as it's a response to Not Every Article Needs A Picture, but it's pretty rare to see any blog or news source post an article without an image, and the blame lies squarely on the cult of the "hero" image. The hero image was a late web 2.0 design, a celebration of bandwidth and the exploding opportunity in web design, and it now feels trite and stale, only exacerbated by the Kinjas and every news site imaginable.

    Even the print guys fail this test; newspapers like the NY Times do not follow their own print standard and wedge photos into all their articles. As Wired famously wrote, "The Average Webpage Is Now the Size of the Original Doom" (ironically on a page that surpasses the 2.3 MB mark at 3 MB*), do we really need to tax users more? I feel bad cheating my favorite publishers out of ad revenue, but even whitelisting sites has me running back to Ghostery as I watch my Mid 2015 MacBook slow down and go into leaf-blower mode simply to surf the web. On my phone, I have 1Blocker but find myself mostly using RSS to this day, as it's fast, quick, and cuts through the unnecessary pictures. Admittedly, my blog index pages fail the Doom test, but they're also loading 20 articles at a time (this article viewed by itself is 103k), so perhaps I may yet sneak in another feature.

    *With Ghostery enabled, the Wired article is a much more palatable 937K.
    *With Ghostery enabled, this article is 97k instead of 102k.

    Installing Composer, Drush 8 and Drupal Console globally via composer on macOS (OS X)

    Install Composer

    Before we install Drush, we need to install Composer globally. Composer is a PHP package manager akin to NPM or Bower.

    curl -sS | php
    mv composer.phar /usr/local/bin/composer

    Next, we want to edit our .bash_profile. Go to your home folder:

    cd ~/

    Create a new .bash_profile (don't worry, if you already have one, this won't overwrite it). We need to add a global entry for Composer.

    touch .bash_profile
    nano .bash_profile

    Add the following to your .bash_profile

    export PATH="$HOME/.composer/vendor/bin:$PATH"
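    If you'd rather skip the touch/nano steps, the same edit can be done non-interactively; a minimal sketch (it appends to .bash_profile, creating the file if it's missing):

```shell
# Append the Composer bin export to .bash_profile in one line.
# The >> redirect creates the file if it does not exist yet.
echo 'export PATH="$HOME/.composer/vendor/bin:$PATH"' >> "$HOME/.bash_profile"
```

    Either way, open a new terminal session (or `source ~/.bash_profile`) so the change takes effect.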

    Install Drush

    Now that we have Composer installed globally, we can install Drush via Composer.

    composer global require drush/drush:dev-master

    Alternatively, we can pin a specific version. For Drupal 8, we want Drush 8.

    composer global require drush/drush:8.*
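    A quick sanity check, sketched below, confirms the shell can now find both tools; if either comes back missing, recheck the PATH export in your .bash_profile:

```shell
# Check whether composer and drush are resolvable on the PATH,
# printing where each was found (or a hint if it wasn't)
for tool in composer drush; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $(command -v "$tool")"
  else
    echo "$tool: not on PATH"
  fi
done
```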

    Setting up Jekyll Admin

    I've finally gotten around to looking into Jekyll a bit more, and one of the more exciting projects is Jekyll Admin. The documentation is a bit loose (though the developer documentation is quite good). I'm writing this under the assumption that you're using macOS/Linux or similar with Ruby preinstalled (macOS ships with Ruby).

    Open up the terminal.

    Step 1: Install Jekyll-admin

    gem install jekyll-admin

    I had a bit of trouble with the install on both my MacBook Pro and my Mac Pro. If it hangs, hit Command-Period and run the command again. It should work the second go-around.

    Step 2: Configure Jekyll

    Open up your _config.yml in an editor.

    Locate either gems or plugins in your config (depending on your version) and add jekyll-admin. Jekyll Admin should run now, but before you get too far ahead of yourself, you will want to add front matter defaults to your yml file.

    Step 3: Add front matter defaults.

    You may already have configured front matter defaults, depending on your setup. If you have not, every meta-data field you use will have to be added by hand to every post. Almost 99% of my blog's content exists in posts, so I only needed to add a configuration for _posts.

    Make sure you have front matter defaults set up for posts. My blog does not make heavy use of front matter, so I added the following so every post would come pre-filled with categories, tags, and a layout.

      defaults:
        - scope:
            path: ""
            type: posts
          values:
            layout: post
            categories: ""
            tags: ""

    Keep in mind YAML requires spaces, not tabs; tab indentation will not parse.
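    A quick way to catch a stray tab before Jekyll complains is to grep for one; a sketch (assumes your config file is named _config.yml):

```shell
# Search _config.yml for any literal tab character.
# printf '\t' produces a real tab so this works with plain grep.
if grep -n "$(printf '\t')" _config.yml >/dev/null 2>&1; then
  echo "tabs found - replace them with spaces"
else
  echo "no tabs"
fi
```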

    Step 4: Run Jekyll.

    Start up Jekyll as you normally would, then navigate to the admin URL once Jekyll has spun up. Congrats, that's it.

    A mild blog update

    I try to stay away from spending too much time under the hood of my blog. As a developer and designer, I'm always prone to over-tweaking. The point of my blog is to write about development as opposed to developing. So against my own better judgment, I decided to finally unveil a new feature to my hyper-minimalist stylings that I've debated adding for a year now. All posts can now be viewed by topics.

    Nuking SoundFlower.kext - Soundflower.kext can't be modified or deleted

    You may want to remove Soundflower by Rogue Amoeba for one reason or another (an upgrade, perhaps). For me, my FocusRite Scarlett 6i6 seems to have a driver incompatibility with some versions of Soundflower, as it would not show up in macOS. However, Soundflower (for some reason or another) is viewed as required by OS X/macOS, unlike many kext files. I found removal surprisingly more difficult than expected.

    soundflower cannot be deleted

    You're here because you've tried everything to remove SoundFlower:

    1. You tried the official installer DMG, and the removal AppleScript failed.
    2. You manually went to /System/Library/Extensions and received the error "Soundflower.kext" can't be modified or deleted because it's required by OS X.
    3. You tried sudo rm-ing the damned file, found out it's a directory, and sudo rm -r doesn't work either, returning Operation not permitted.
    4. You tried an app-zapping app.
    5. You tried disabling kext signing via boot args, and the first three things still didn't work...

    I do have a solution, though it's not as practical: boot your Mac from another volume, or boot your Mac into Target Disk Mode.

    Launch OS X from your other drive, or plug your Mac into your secondary computer.

    Locate Soundflower.kext in /System/Library/Extensions and drag it to the trash.

    Try deleting; if your Mac complains, do the following:

    1. Launch the terminal (it's located in Applications/Utilities)
    2. Type in the following:
      sudo rm -r 
      Note: the trailing space is important
    3. Drag the icon of the kext into the terminal window; it should fill out the path to the kext file.
    4. Hit return. Your Mac will prompt you for the admin password (this is the admin password for the drive/computer you are currently booted from, not the password for the drive you are connected to).
    5. Enter the password and hit return; the kext should now delete without any hitches.
    6. Reboot your Mac as normal.
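    The same pattern, sketched on a throwaway directory so nothing real is at risk, shows why plain rm fails: a .kext is a directory, hence the -r flag. On the real volume you'd prefix with sudo and use the dragged-in path instead of this stand-in:

```shell
# Stand-in path for demonstration only - not the real kext
KEXT="/tmp/Soundflower-demo.kext"
mkdir -p "$KEXT/Contents"   # kexts are bundles (directories)
rm -r "$KEXT"               # -r is required; plain rm refuses directories
[ -e "$KEXT" ] || echo "removed"
```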

    Setting up a Bootstrap subtheme for Drupal 8

    There are a few guides for creating sub-themes for Bootstrap, but none quite covered all the steps. For simplicity's sake, I'm going to use the name customtheme for my theme's name and title. Feel free to use whatever makes sense for your site instead of customtheme.

    Step 1: Install the Bootstrap theme

    Download the latest version of the Bootstrap theme. Decompress the contents and drag the entire folder into core/themes/. Check that it installed properly under admin appearance; it should be listed under uninstalled themes. Click the install button.

    Step 2: Pick a starter kit

    Navigate to the newly created bootstrap folder in core/themes/, and go into the starterkit directory. You should see three folders: CDN, LESS, and SASS. Each is a variant based on Bootstrap 3. Personally, I use Sass, but for this example, it doesn't matter.

    Step 3: Copy your preferred setup into /themes in the root of your site.

    Copy the selected folder into /themes. Rename the directory to something that satisfies Drupal's theme naming conventions (no spaces, etc.).

    Step 4: Change file names

    In your theme folder you should have:

    • /config
      • /install
        • THEMENAME.settings.yml
      • /schema
        • THEMENAME.schema.yml
    • /images
    • logo.svg
    • screenshot.png
    • /scss (this is dependent on theme you selected)
    • /templates
    • THEMENAME.libraries.yml
    • THEMENAME.starterkit.yml
    • THEMENAME.theme

    Change the names of the THEMENAME files to your theme name, and change starterkit in the file name to info. It should look something like this:

    • /config
      • /install
        • customtheme.settings.yml
      • /schema
        • customtheme.schema.yml
    • /images
    • logo.svg
    • screenshot.png
    • /scss (this is dependent on theme you selected)
    • /templates
    • customtheme.info.yml
    • customtheme.libraries.yml
    • customtheme.theme
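    The renames in step 4 can be sketched in the terminal, run from inside the copied theme folder (the theme name customtheme is assumed; swap in your own):

```shell
# Rename every THEMENAME.* file to customtheme.*
# ${f#THEMENAME} strips the THEMENAME prefix, keeping the extension
for f in THEMENAME.*; do
  mv "$f" "customtheme${f#THEMENAME}"
done

# The starterkit file becomes the theme's info file
mv customtheme.starterkit.yml customtheme.info.yml
```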

    Step 5: Edit the yml files.


    In the schema file, change the instances of THEMENAME and THEMETITLE.

        # Schema for the theme setting configuration file of the THEMETITLE theme.
        THEMENAME.settings:
          type: theme_settings
          label: 'THEMETITLE settings'


         # Schema for the theme setting configuration file of the customtheme theme.
         customtheme.settings:
           type: theme_settings
           label: 'customtheme settings'

    Next, open up the info yml file and change the THEMENAME and THEMETITLE:

    core: 8.x
    type: theme
    base theme: bootstrap
    name: 'THEMETITLE'
    description: 'Uses the Bootstrap framework Sass source files and must be compiled (not for beginners).'
    package: 'Bootstrap'
    regions:
      navigation: 'Navigation'
      navigation_collapsible: 'Navigation (Collapsible)'
      header: 'Top Bar'
      highlighted: 'Highlighted'
      help: 'Help'
      content: 'Content'
      sidebar_first: 'Primary'
      sidebar_second: 'Secondary'
      footer: 'Footer'
      page_top: 'Page top'
      page_bottom: 'Page bottom'
    libraries:
      - 'THEMENAME/global-styling'
      - 'THEMENAME/bootstrap-scripts'


       core: 8.x
       type: theme
       base theme: bootstrap
       name: 'customtheme'
       description: 'This is a custom theme.'
       package: 'Bootstrap'
       regions:
         navigation: 'Navigation'
         navigation_collapsible: 'Navigation (Collapsible)'
         header: 'Top Bar'
         highlighted: 'Highlighted'
         help: 'Help'
         content: 'Content'
         sidebar_first: 'Primary'
         sidebar_second: 'Secondary'
         footer: 'Footer'
         page_top: 'Page top'
         page_bottom: 'Page bottom'
       libraries:
         - 'customtheme/global-styling'
         - 'customtheme/bootstrap-scripts'

    Step 6: Go to appearance and install

    Your theme is now ready to go; it should appear under appearance.

    Step 6.5: Download the Bootstrap framework files

    If you're using the CSS (precompiled) version and want to use their implementation of Bootstrap, go to the Bootstrap (3.3) download page.

    If you're using the Sass version and want to use their implementation of Bootstrap, go to the Bootstrap (3.3) download page, download the Sass version, decompress the folder, rename it to bootstrap, and place it in the root of your theme.

    Additional notes

    From here, you'll most certainly want to turn off theme caching. I found this guide super helpful.

    I'd suggest following the tutorial to create a local services file to enable debugging and set the cache to false as well. I also recommend checking out the article Drupal 8 fundamentals.

    Goodbye FireBug

    Today, Mozilla announced it was retiring FireBug. Native browser development tools inevitably eclipsed FireBug, but I can't help but say "goodbye," as very few singular pieces of software have had such a bearing on my life. When a friend of mine introduced me to FireBug back in 2007 and I realized I could see the effects of CSS in real time, little did I know it'd lead me down the path of web development.

    Pure CSS (scss) Bootstrap compatible circular progress bars

    First off, credit where credit is due: I found a pretty good start to a circle progress bar by Alimul Al Razy via a random Google search.

    I rather liked the approach and made my own modifications, paring down the animation and step generation into two for loops and making use of data attributes. You can see it on CodePen. It's quick and easy to style up and does not require Bootstrap. $howManySteps controls how many levels of percentage are needed; if you need increments of 5%, enter 20; increments of 2% would be 50, etc.

    See the Pen Pure CSS (SCSS) Bootstrap compatible circular progress bars by Greg Gant (@fuzzywalrus) on CodePen.

    GitHub Gist

    NearStory Launch

    I don't make a habit of plugging products on my blog, but this one warrants it.

    Longtime office mate Giovanni Salimena is officially launching his iOS application, NearStory. It's already up in the App Store, and it's free to download and free to use. NearStory aggregates news/audio content related to your location so that you can learn the local history of the area around you.

    Download it on the app store.

    Installing Provenance (OpenEmu) on iOS 11 without a jailbreak

    Provenance Logo

    Following up yesterday's post on how to install PPSSPP, I decided to add instructions for installing Provenance on iOS 11 without jailbreaking. In 2015, I wrote a guide on how to install emulators via Cydia and sideloading services, which is worth checking out for more information on emulation on iOS. Provenance is THE go-to emulator for retro iOS gaming, as it's based on the wildly popular OpenEmu and boasts MFi gamepad support. Best of all, it is incredibly easy to install compared to RetroArch or PPSSPP. Only RetroArch provides a wider range of support. Provenance boasts support for:

    • Sega
      • SG-1000
      • Master System
      • Genesis / Mega Drive
      • MegaCD
      • Game Gear
      • 32X
    • Nintendo
      • NES (Nintendo Entertainment System)
      • Famicom Disk System
      • SNES (Super Nintendo)
      • Gameboy / Gameboy Color
      • Gameboy Advance
    • Atari
      • 2600
      • 7800


    Building iOS applications requires Xcode, so if you haven't installed Xcode or updated to Xcode 9.0, download it. Once installed, launch Xcode, then launch a terminal session and run the following to install the CLI utilities for Xcode.

    xcode-select --install

    Step 1: Download Provenance

    Either download the zip from the repository or clone it via the terminal.

    git clone

    Step 2: Provenance.xcworkspace

    Open the xcworkspace file (Not Provenance.xcodeproj)!

    Provenance setup in xcode

    Step 3: Set up the xcworkspace

    1. Set the active scheme to the Provenance app
    2. Set the target to your iOS device (be sure you have it connected)
    3. Change the bundle identifier to something unique
    4. Set up your developer profile

    Step 4: Build!

    Hit build and you're good to go. Provenance is incredibly easy to set up. Go to the official wiki for full details on how to use the emulator.

    If you encounter the error, Untrusted Developer: "Your device management settings do not allow using apps from developer... on this iPhone. You can allow using these apps in Settings."

    Go to Settings -> General -> Profiles & Device Management, and under Developer App, tap your profile to allow apps.

    Congrats, you're now ready to use Provenance.

    Step 5: Adding games to Provenance

    Provenance Uploads

    Make sure your iOS device and computer are connected to the same access point. Tap the Provenance icon on your iOS device, then tap the + icon to start the web server. Go to your computer and navigate to the URL shown on the screen of your iOS device.

    Leave your rom files in .zip files to save space. You can queue up as many games as you want using shift-click from the file menu. Subfolders do not appear to work within the Roms folder.

    Provenance is one of the better iOS apps. You'll probably want to pick up an MFi-enabled controller to get the most out of it.

    Installing PPSSPP on iOS 11 without a jailbreak

    Installing PPSSPP on iOS 11 isn't particularly hard, but it does require several steps, which can be mildly daunting for non-developers or non-iOS developers. I based this guide on the official guide but realized it provided scant details for troubleshooting any issues.

    Important support disclaimer

    PPSSPP, as of this writing, is not officially supported on iOS 11. iOS 11 was the reckoning for older apps, dropping 32-bit support. From a development standpoint, this often means replacing out-of-date libraries, requiring more than mere hours of work.

    Videos on YouTube and articles claiming iOS 11 PSP emulation are generally posted around 2015 or 2016 running iOS 9/iOS 10, well before even the iOS 11 beta, with updated titles to garner views. Some videos and articles posted more recently, again, show complex sideloading claiming iOS 11 support, but the project hasn't been updated. Websites like RedmondPie are distributing IPAs; they won't work. If you'd like to keep tabs on iOS 11 support, I'd suggest watching the official PPSSPP GitHub. The emulator Happy Chick, often featured in these YouTube videos and articles, also does not support iOS 11, as it too uses the PPSSPP core.

    PPSSPP will launch on iOS 11, but it will be unable to load a game; this is due to 32-bit-only support for dynamic recompilation, as the emulator must recompile code meant for the PSP on the fly. This guide should be accurate if/when PPSSPP gains 64-bit dynarec support, and I plan to update it when that day comes. If you are interested in a fully functional emulator for iOS 11, I have a guide for installing Provenance, which supports a host of 8-bit and 16-bit era consoles (SNES, Genesis, Sega CD, NES, Master System, Game Gear, GameBoy, etc.).

    These instructions should work for iOS 9 / iOS 10 devices as well.

    For more information about emulation on iOS, please see my very extensive guide iOS Emulation, gamepads, Cydia, Xcode - A Tutorial for iOS emus. It's geared as a primer for iOS emulation and the various ways emulators can be installed on iOS.

    PPSSPP icon


    Building iOS applications requires Xcode, so if you haven't installed Xcode or updated to Xcode 9.0, download it. Once installed, launch Xcode, then launch a terminal session and run the following to install the CLI utilities for Xcode.

    xcode-select --install

    If you do not have an Apple developer ID, you will need to create one.

    Step 1: Confirm MacPorts is installed

    port version

    If you get an error message about version matching or command not found, go to MacPorts Releases and download the version of MacPorts that matches your OS version.
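    If port isn't on the PATH at all, the bare version check just errors out; a guarded sketch of the same check:

```shell
# Only call port if the shell can actually find it;
# otherwise print a hint instead of an error
if command -v port >/dev/null 2>&1; then
  port version
else
  echo "MacPorts not found - install the build matching your macOS version"
fi
```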

    Step 2

    Navigate in your terminal to the directory you'd like to install the project to.

    Run the following in your terminal (this may take a bit depending on your internet connection):

    git clone

    After it's cloned, navigate into your newly cloned repository:

    cd ppsspp

    Next, from the ppsspp directory, run the following:

    git submodule update --init --recursive

    Step 3

    Next, we want to create the PPSSPP.xcodeproj and dependencies in a directory called build-ios. This will take a bit.

    mkdir build-ios
    cd build-ios
    cmake -DCMAKE_TOOLCHAIN_FILE=../cmake/Toolchains/ios.cmake -GXcode ..

    If you get an error:

    Error: Current platform "darwin 16" does not match expected platform "darwin 15"
    Error: If you upgraded your OS, please follow the migration instructions:
    OS platform mismatch
        while executing

    The above error means you have an incorrect version of MacPorts. Go to the link listed above, then download and install the correct one.

    Step 4

    Open up PPSSPP.xcodeproj in the build-ios folder in Xcode.

    Xcode project settings

    In the scheme menu, select the PPSSPP app icon instead of ALL_BUILD. If you do not do this, you will not successfully build the app. Next, under Product -> Build For, select Running.

    Plug your iPhone or iPad into your computer and select the target as your device instead of a simulator.

    If you hit build, most likely you will get a provisioning error. Assign it to your developer ID. If you haven't added your developer ID, go to Xcode -> Preferences -> Accounts and add your developer profile. In the General tab of the PPSSPP project, assign your developer profile in the Signing section.

    Hit build. If Xcode errors out about the bundle identifier, append a random string after the .org name.

    properly configured Xcode project for PPSSPP

    Image: A properly configured PPSSPP project requires these fields to be set.

    Step 5

    PPSSPP on phone

    You should now see PPSSPP on your iOS device.

    Untrusted Developer

    If you encounter the error, Untrusted Developer: "Your device management settings do not allow using apps from developer... on this iPhone. You can allow using these apps in Settings."

    Go to Settings -> General -> Profiles & Device Management, and under Developer App, tap your profile to allow apps.

    Congrats, you're now ready to use PPSSPP. Note: as of this writing, PPSSPP's iOS 11 support is incomplete. PPSSPP will launch but freezes when games attempt to load, due to the dynamic recompiler not being 64-bit.

    If you'd like to build an iOS 11 compatible emulator, check out installing Provenance

    2017-10-23: Added support disclaimer.

    2017-11-20: Added further support disclaimers. PPSSPP still isn't iOS 11 compatible, and it looks like YouTubers are looking to cash in on the desires of would-be gamers through questionable URLs. Why trust me? There aren't any ads here, and I'm not getting paid to write this, but there are ads on the YouTube vids and websites. Just sayin'.

    2017-11-21: Further disclaimer clarification: it looks like RedmondPie is linking a busted IPA file. Until the PPSSPP GitHub project is updated, there will be no iOS 11 PPSSPP support. It's that simple.

    Creating an SVG Fill animation

    Recently I was tasked with creating a fill animation on an SVG, a request that has come up a few times recently, even for my company's website. The animation as described would rise up to a certain predetermined point and stop, like a vertical progress bar. I didn't find any 100% useful guides, but I was able to piece together the basics from previous SVG work and a few good Stack Overflow finds.

    Svg animation

    Creating an SVG fill animation requires some knowledge of a graphics program like Sketch or Illustrator. For this example, I'll be outlining what I did in Sketch to treat the graphic, but this is not Sketch-specific. I'll do my best to make this novice-accessible, but some basic understanding of vector graphics will help.

    Step 1: Treating your graphic

    Creating a fill animation requires the right graphic. To pull off this animation, we need a polygon that's a solid color for the vertical progress bar effect. This particular animation will rise up to the 25% mark as outlined by the article.

    Originally this graphic's green fill was a separate layer. While this is a correct way to illustrate it, it's not easily animated. If we were to stretch the image, the effect would appear like the animation below.

    Lightbulb gif

    Instead, a much simpler solution is to use a gradient fill. Due to the trickiness of SVGs and gradients, make sure the gradient points extend the entire length of the fill; otherwise, the start and end points can create problems. Sketch is a little picky about gradient points, so don't worry if you can see the gradient transition; we will correct this in the XML of the SVG after exporting. Make sure you name your SVG polygons, as the names become the IDs for each polygon and will be very useful for CSS.

    Sketch gradient settings

    Step 2: Export and paste

    Note: A caveat of the SVG format is that it must be inline on the page for CSS to be able to target the SVG nodes. If it's linked via src, CSS is unable to target the XML in the SVG. We want CSS control, as we will be using it to set the gradient.

    Paste the SVG into your HTML (feel free to remove any XML comments in the header). There are two things to observe: all the SVG gradients are declared in the <defs> section of the SVG, and the gradient is linked within the polygon.

    Step 3: Creating more gradients

    To create our animation, we're going to need three gradients:
    1. Default gradient - this will be our default unfilled state
    2. Animation gradient - this gradient will contain the animate tags
    3. Finished gradient - the final animation state, a simple bobbing animation that loops infinitely after the main animation has completed

    In the defs, I'm going to do three things: first, name the gradient; second, set the last two stops to the same offset to create the illusion of a solid line. Lastly, I need to make the light bulb "empty," so I'll set the offsets of the last two gradient stops to 100%.

      <linearGradient x1="0%" y1="0%" x2="0%" y2="99.9334221%" id="bulbGradient-default">
          <stop id="stop1" stop-color="#FFC809" offset="0%"></stop>
          <stop id="stop2" stop-color="#FFCF06" offset="100%"></stop>
          <stop id="stop3" stop-color="#6CB31D" offset="100%"></stop>
      </linearGradient>

    Copy, paste, and rename the gradient to match this pattern. It'll take a bit of trial and error to set the final stop offset points.

      <linearGradient x1="0%" y1="0%" x2="0%" y2="99.9334221%" id="bulbGradient-animate">
         <stop id="stop1" stop-color="#FFC809" offset="0%"></stop>
         <stop id="stop2" stop-color="#FFCF06" offset="73.5%"></stop>
         <stop id="stop3" stop-color="#6CB31D" offset="73.5%"></stop>
      </linearGradient>
      <linearGradient x1="50%" y1="0%" x2="50%" y2="76.9334221%" id="bulbGradient-end">
         <stop id="stop1" stop-color="#FFC809" offset="0%"></stop>
         <stop id="stop2" stop-color="#FFCF06" offset="73.5%"></stop>
         <stop id="stop3" stop-color="#6CB31D" offset="73.5%"></stop>
      </linearGradient>

    Step 4: Animation

    We can't target the defs via CSS, but we do have another tool: SMIL animation. SMIL is deprecated, but it works for linear gradients, and SVGs can contain animations. SMIL is supported in all browsers sans IE/Edge (more on that later). For this example, we're going to use animate. Animate consists of the attributeName (the attribute we want to animate in our parent), duration, values, and repeat count. Normally we'd use CSS animations as they're better supported, but as of writing this, I've yet to find any way to animate gradients without complex JS. Within our stop tags, we'll add the animate values. Fortunately, for both animations, the last two stops will contain the same animation to continue our solid-line effect.

    If we do not declare a begin property, the animation will play automatically once the DOM is ready, regardless of whether we can see it; we may not even see our animation, or we'll see a strange jump. To prevent this, we need to set the begin time to indefinite. We also need to give each animate property a unique ID so we can target them.

         <linearGradient x1="0%" y1="0%" x2="0%" y2="99.9334221%" id="bulbGradient-animate">
            <stop id="stop1" stop-color="#FFC809" offset="0%"></stop>
            <stop id="stop2" stop-color="#FFCF06" offset="73.5%">
                <animate attributeName="offset" dur="2s" values="1; 0.735;" repeatCount="1" begin="indefinite" id="bulbGradient-animate-stop1"/>
            </stop>
            <stop id="stop3" stop-color="#6CB31D" offset="73.5%">
                <animate attributeName="offset" dur="2s" values="1; 0.735;" repeatCount="1" begin="indefinite" id="bulbGradient-animate-stop2"/>
            </stop>
        </linearGradient>
        <linearGradient x1="50%" y1="0%" x2="50%" y2="76.9334221%" id="bulbGradient-end">
            <stop id="stop1" stop-color="#FFC809" offset="0%"></stop>
            <stop id="stop2" stop-color="#FFCF06" offset="73.5%">
                <animate attributeName="offset" dur="5s" values="0.995; 0.95; 0.995; 0.95; 0.995;" repeatCount="indefinite" begin="indefinite" />
            </stop>
            <stop id="stop3" stop-color="#6CB31D" offset="73.5%">
                <animate attributeName="offset" dur="5s" values="0.995; 0.95; 0.995; 0.95; 0.995;" repeatCount="indefinite" begin="indefinite" />
            </stop>
        </linearGradient>

    Step 5: CSS

    Finally, we need to set up our CSS to control our linearGradient, with each gradient assigned to a CSS state. The following is written in SCSS.

        #bulb-icon {
            #Bulb {
               fill: url(#bulbGradient-default);
            }
        }
        #bulb-icon.animate {
            #Bulb {
               fill: url(#bulbGradient-animate);
            }
        }
        #bulb-icon.end {
            #Bulb {
               fill: url(#bulbGradient-end);
            }
        }

    What we have is a pre-animation-state gradient, the gradient that actually animates, and the final-state gradient used after the animation completes.

    Step 6: Javascript

    First, we need to create objects from our animate tags so we can access the methods available to them.

        var bulbstop1 = document.getElementById('bulbGradient-animate-stop1');
        var bulbstop2 = document.getElementById('bulbGradient-animate-stop2');

    After that, it is time to write our simple JS. We want to create a time-based animation, using a simple setTimeout to change the class after the animation is done. The animation is 5 seconds long, so I've made the setTimeout a few ms shorter than 5 seconds. To trigger the animation, I need to use the beginElement() method, which initializes the animate.

        function animate(){
          $("#bulb-icon").attr("class", "animate");
          bulbstop1.beginElement();
          bulbstop2.beginElement();
          setTimeout(function() {
            $("#bulb-icon").attr("class", "end");
          }, 4995);
        }

    Note: I learned about the beginElement() method from a great blog post, which I highly recommend reading.

    IE Support

    As mentioned previously, Internet Explorer and Edge do not support SMIL and have no plans to. However, we can add SMIL support with Fakesmile, an Internet Explorer shiv.

    Our final Product!

    See the Pen SVG Animation by Greg Gant (@fuzzywalrus) on CodePen.

    Update 10/20/17: Added more info about JS, added the restart animation to CodePen, and added info about beginElement().

    Total Eclipse Oregon

    "Welcome back to daylight Portland. So that's it, the last solar eclipse to be seen on this continent in this century..

    And as I said not until August 21st, 2017, will another eclipse be visible from North America. That's 38 years from now. May the shadow of the moon fall on a world at peace" - Frank Reynolds, ABC News

    Such unbridled optimism... :(

    Total Eclipse Hubbard Oregon

    Pictured: Hubbard, Oregon. Captured on my OM-D EM5

    A total eclipse is something to behold, as it touches more than the eyes. There are a few silly things I never considered going in that were all obvious in retrospect: the sudden temperature drop, the quiet as all traffic stopped and birds (mostly) stopped chirping, the distant cheering, and the 360-degree sunset. As fortunate as I was to witness it, friends in Salem and north of Corvallis reported being able to see the stars. I feel no need to place any more special significance on the experience than the beauty of nature and astrophysics. That alone should be enough to inspire...

    The lead quote by Frank Reynolds can be found at 9:12.

    Also, as a bonus, watch at the eight-minute mark as the commentators speculate about what Oregon was like 360 years ago, the last time the path was nearly the same.

subscribe via RSS