• #front end development
    • #os x
    • #sketch

    Manual Install Invision Craft for Bohemiancoding Sketch

    Pictured: This installation failed...

    Manual installation

    Craft's help page is woefully unhelpful about installing the plugin manually for Sketch; it reads only "We don't recommend manually installing the Craft plugins for Sketch." with no additional steps. Fortunately, the good people behind Invision made the plugin URL available to reverse engineering :)

    1. Download craft-sketch.zip and unzip it (double-click)
    2. Go to ~/Library/Application Support/com.bohemiancoding.sketch3/Plugins and drag in the Panels.sketchplugin
    3. Go to ~/Library/Application Support/com.bohemiancoding.sketch3/Panels and drag in the Data.sketchpanel, Duplicate.sketchpanel, Library.sketchpanel, and Sync.sketchpanel
    4. Launch Sketch and verify it has been installed under Sketch plugins.
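    If you'd rather script steps 2 and 3 than drag and drop, they can be sketched in a few lines of Python. This is only a sketch: `src_dir` is an assumption about where you unzipped the archive, and the real `.sketchplugin`/`.sketchpanel` bundles are folders on disk.

```python
# Sketch of steps 2-3: copy the Craft bundles into Sketch's support
# folders. src_dir is wherever craft-sketch.zip was unzipped (an
# assumption); support_dir is normally
# ~/Library/Application Support/com.bohemiancoding.sketch3
import shutil
from pathlib import Path

BUNDLES = {
    "Plugins": ["Panels.sketchplugin"],
    "Panels": ["Data.sketchpanel", "Duplicate.sketchpanel",
               "Library.sketchpanel", "Sync.sketchpanel"],
}

def install_craft(src_dir: Path, support_dir: Path) -> list:
    """Copy each Craft bundle into the matching Sketch support subfolder."""
    installed = []
    for subdir, names in BUNDLES.items():
        dest = support_dir / subdir
        dest.mkdir(parents=True, exist_ok=True)  # create Plugins/Panels if missing
        for name in names:
            src = src_dir / name
            if src.is_dir():  # .sketchplugin/.sketchpanel bundles are folders
                shutil.copytree(src, dest / name, dirs_exist_ok=True)
            else:
                shutil.copy(src, dest / name)
            installed.append(dest / name)
    return installed
```

    On a real machine you'd call it with `install_craft(Path.home() / "Downloads/craft-sketch", Path.home() / "Library/Application Support/com.bohemiancoding.sketch3")`.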

    Enjoy your new plugin!

    • #os x
    • #front end development

    Three months of BackBlaze - A Review

    Review in progress

    I finally bit the bullet and plunked down $50 for a year's subscription. I'm a bit data-paranoid: back in early 2001, I lost an IBM 75GXP Deskstar (a 45 GB HD). It's a funny detail that I can remember something as esoteric as a hard drive model, especially considering how many I've owned over the years, but it speaks to the gravity of the loss.

    I had a PowerMac G4 at the time, with a set of two 18 GB Western Digital HDs, a 40 GB Maxtor, and my new (fast for the time) 45 GB IBM Deskstar. Back then, I was near the absolute fringe with so much storage. All it took was self-encoding a sizable collection of a few hundred CDs to 320 Kbps MP3s to fill nearly the entire 40 GB. It all came down to a realization: data storage would become so abundant that there was no reason to store my music at anything other than the highest bitrate. Thus, I decided I'd eat the cost upfront to save myself regret in the future.

    I wasn't the only person to be burned by IBM; a class action lawsuit soon followed, but the damage was done. My 45 GB Deskstar (affectionately dubbed the "Deathstar" by legions of scorned customers) was my boot drive, storing all my most important documents. While I didn't lose my music collection, the data I did lose was irreplaceable: art projects, websites, school work, among other things. My lesson was learned, and data backup became part of my life. My first attempts were CD-ROMs, later followed by DVDs. Eventually I started using other HDs as manual backups, even Carbon Copy Cloner and a RAID 1+0 setup.

    While it might sound like paranoia, I had good reason to fear: my income throughout college and after was always tethered to my web projects. Even my art major, digital arts, depended on a working computer. When Apple debuted Time Machine in 2007, all of my previous habits were abandoned, and it fundamentally changed the way I approached my computer for the better.


    So as I write this, I've been using Time Machine for 9 years. Time Machine ranks as absolutely one of the best features Apple has ever added to OS X. If you're not using it, you should be. Time Machine provides a backup repository of your entire HD, including revision histories.

    Time Machine is damn near magic, but it has a major flaw: it's a local-only backup solution. Unless you have a friend with a beefy internet connection, a VPN, a little OS X know-how, and a willingness to leave a NAS (network attached storage) drive on 24/7, you're limited to backing up only when you're physically at your Time Machine drive's location. It doesn't take much imagination to see how this could be problematic: a catastrophic power surge could ruin all your electronic devices, frying your computer and Time Machine drives alike, or perhaps your house is burglarized, computer, hard drives and all. For these and many more reasons, offsite backup is the holy grail.

    So what is Backblaze? It'd be easy to simply call Backblaze a "cloud" Time Machine, but that'd be inaccurate: Backblaze doesn't do version history, and it isn't particularly designed for single-file downloads (although it can be done).


    The pros:

    • Inexpensive!
    • Offsite backups
    • Time Machine level of simplicity
    • Backups can be downloaded or shipped to you on a USB drive at no cost (as long as the drive is returned within the grace period)
    • BackBlaze has a "find my computer" feature for stolen computers (assuming your drive isn't wiped or replaced)


    The cons:

    • Not a full backup: the OS and applications aren't backed up
    • Backups are entirely dependent on internet speed; expect weeks of uploading for drives larger than 500 GB
    • No backup prioritization
    • No file versioning
    • Deleted files are stored for only 30 days

    The 30-day policy for BackBlaze is a bit tricky, but basically: if a file has been deleted, BackBlaze will stop storing it after 30 days, unlike, say, Time Machine, which keeps the file until it's forced to delete old backups to reclaim space. Clearly, for Backblaze this is an overhead check; storing every file indefinitely for each user is likely a very tall order. However, it's also important to understand the implications. If you use Backblaze to back up external drives, they will need to be connected to the computer in question at least once a month while Backblaze is running to reauthorize the index of the files. So don't count on Backblaze to back up data that you only access once in a great while, and don't plan to be away from your data for more than 30 days: if you're banking on hiking the Pacific Crest Trail and stowing your laptop in a friend's garage for 3-4 months, Backblaze may not be for you.

    The Setup

    Setting up BackBlaze requires two things: signing up for a trial or paid account, and downloading and installing its application. The initial setup takes a fair amount of time; with 6 HDs to sift through, it probably took longer for me than for the average user, about half an hour.

    The installer, nearly complete

    Once installed, Backblaze lives as a control panel in your System Preferences. For those familiar with Time Machine, the options are similar: you can pick the drives/folders you'd like to exclude, but unlike Time Machine you can also specify backup frequency, the max file size to upload, and the upload speed.

    Get used to this moving at a snail's pace

    The updates

    Currently I'm only two days in on BackBlaze, uploading around 10 GB a day by leaving my Mac Pro on roughly 16 hours a day with a medium upload cap; over the weekends I intend to remove the cap. BackBlaze is a passive experience. My intent is to update this review as my impressions develop.

    Currently it appears that the service shoots for small files first; the first 1.1 million files appear to have constituted roughly 20 GB of space. My best guess is that roughly 10,000 files make up 90% of the space. I have a feeling I'm on the extreme end of who this service is geared towards. Most users are on laptops, most laptops are on SSDs, and very few users probably have drives larger than 1 TB. I'm an outlier: my Mac Pro's boot drive is a 750 GB SSD, and the backup boot drive is a 2 TB hard drive. BackBlaze auto-ignored my two Time Machine drives and my Boot Camp HD. I chose to exclude my 3 TB external drive and my other 2 TB external HD. So in short, I'm backing up two drives, since those store what I'd consider my valuable data. Between more than a decade of shooting photos (RAW and several iPhones) and digital music as my hobby, I probably have more irreplaceable data than most users (sans the hardcore videographers). Will I manage to get my first backup done within three months? I'm unsure.

    A week later

    After a week of roughly 16-hour runs on my 50 Mbps/20 Mbps connection, I've uploaded 200 GB of the 1.9 TB in my initial backup, with roughly 300,000 of the 1,500,000 files uploaded. I noticed the Downloads folder isn't excluded by default; excluding it reduced my upload by about 60 GB, a drop in the bucket, but in all likelihood 3 fewer days of uploading. So far my biggest gripe is that there isn't any prioritization of which data is targeted first. Smallest files to largest seems like a logical strategy, but I'd also like to mark some data as more valuable so it supersedes the base priority, especially after the initial upload. I still have some questions about how daily backups are handled, and what happens if a file changes before a backup is complete. I'm guessing that if it's already been uploaded, it won't be backed up again until the next batch update.

    Is Backblaze worth it?

    Considering that Backblaze is cheaper than Amazon's Glacier, Backblaze already looks a bit more sane. However, there are competitors, like iDrive, which is cheaper at $35 a year but limited to 1 TB, and offers multi-computer backups/accounts. There's also SpiderOak, Tresorit, CrashPlan, Carbonite, and SugarSync. Over the coming months, I'll do a base comparison of feature sets and pricing.

    • #front end development

    Introducing CSSFilterGenerator.com

    After noticing none of the CSS filter generators offered layer reordering, I wrote my own from scratch. It's still a beta but it works and generates usable code! Check out CSSFilterGenerator.com. Hopefully you find it useful too!

    • #ios
    • #iphone

    Headphone jacks are NOT the new floppy drive

    Rumors aren't part of my blogging, but occasionally I've been prone to rant about Apple. The iPhone 7 rumor mill sparked an unusual amount of interest on my part, not for what it included but for what it didn't: the 3.5 mm headphone jack. I've always regarded Lightning-port audio as a senseless money grab, despite mostly preferring the form factor. Yesterday that rumor shifted to a slightly more sane position: the headphone jack stays! Rejoice...!?

    The silliness of it all is that Lightning-only audio creates the insanity that users must either buy Bluetooth headphones or, worse, a dongle for their existing headphones, forgoing the ability to charge the phone and listen simultaneously. The only advantages Lightning audio offers are bus power for noise-canceling headphones, which can already be attained without sacrificing the 3.5mm headphone jack, and a thinner phone that's even more prone to bending. It's the same asinine behavior that led to the new MacBook featuring only 1 USB port, requiring pretty much all users to purchase a $79 dongle for charging/video out, and the same insanity that led to the Mac Pro being mostly panned as a flop by actual Pro users, yours truly and many pundits included.

    Hostile design...

    Apple's opinion on ports is clear, and its disdain for modularity is frightening. While I normally agree with famed Apple pundit John Gruber, who argues that headphone jacks are the new floppy drive, his reasoning is flawed, very flawed.

    Why would Apple care about headphone compatibility with Android? If Apple gave two shits about port compatibility with Android, iPhones would have Micro-USB ports. In 1998 people used floppy drives extensively for sneaker-netting files between Macs and PCs. That didn’t stop Apple from dropping it.

    I remember 1998 too, and perhaps more vividly: everyone had Zip drives, and applications came on CD-ROM in big funky boxes. The floppy was already on its deathbed, as 1.4 MB was simply too small; the only things that came on floppies in 1998 for Mac users were drivers, which could easily be pressed onto optical media. Everyone was asking for something better by then, hence why the PowerMac G3 450 that I bought in 1999 had one internally. While the iMacs did kill the floppy, they had USB, ethernet, a built-in modem, and a CD drive, and it wasn't long before they had CD-RW/DVD drives.

    By the time the MacBook Air eschewed the optical drive, anyone looking to transfer files had the internet, USB drives, and networks, all faster and offering exponentially more storage than optical media. Much like the iMac, the MacBook Airs even added Thunderbolt ports, a much welcomed addition. Even by the mid 2000s, the internet had become the preferred way to distribute most software.

    If anything, the 3.5mm headphone jack has hit a renaissance. My computers all have them (my media PC and Mac Pro have several). My Lightning-port dock has two. My old iPod has one. My 2013 car has one. My PS4 controllers each have one. My iPad has one. My USB-connected speakers, Vanatoo Transparent Ones, still have one. My old Klipsch ProMedias have both an input and a nice front facing jack. My Numark NS7 has one (alongside its 1/4 inch), and I'm not even counting the test bed of devices I have at work for development. If we went back in time 10 years, my car did not have a 3.5mm jack, nor did my game consoles have them built in, and neither did my Motorola Razr, and that's not counting the items listed above that simply didn't exist in 2006.

    Outside of a few fanboys, no one is asking for a replacement for the 3.5mm jack.

    Apple switching to Lightning-port headphones makes them incompatible even with Apple's own Macs. I have a 2015 MacBook Retina on my desk as I type these very words, and I can most surely assert that it does not have a Lightning port but does have a 3.5mm jack. Same goes for my 2008 Mac Pro. On my desk at work I have two pairs of $200+ headphones that at best are going to require a dongle and at worst will not be usable with iOS.

    The craziest part is we're in a world where we can have our cake and eat it too: Lightning-port headphones, Bluetooth headphones, and 3.5mm can all continue to coexist. The Verge's Nilay Patel is 100% correct that taking the headphone jack off phones is user-hostile and stupid, and I will not be buying an iPhone 7 if it doesn't have a 3.5mm jack.

    * Edit

    Gruber followed up with the following quote:

    Removing the analog headphone jack is inevitable, and the transition is inevitably irritating. This is what makes Apple different. They will initiate a painful transition for a long-term gain. Other companies will avoid inducing pain at all costs — and you wind up using VGA until the mid-2010s.

    This analogy is clever only if you do not understand the inherent physics of the problem. Video ≠ audio. Unlike, say, HDMI, which offers an inherently better picture than VGA, Lightning cables offer no inherent audio quality gain.

    Video signal transfer isn't powering a transducer. No matter the signal chain, if you want to hear audio you're inevitably going to convert it to an analog waveform via a DAC, into an amplifier, and then into a transducer. All a Lightning cable does is delay that conversion. If you've ever wondered why the staying power of analog audio has been so strong, it's simply due to the physics. Using a Lightning cable solves nothing; it just places the amp/DAC outside something the phone already provides, which in turn equates to expensive dongles or expensive headphones. It's the same reason home theaters use a centralized receiver that takes digital inputs like HDMI, S/PDIF (Toslink or coax), Bluetooth/wifi and USB, and decodes/converts the signals to analog to be amplified and transmitted, rather than sticking a DAC/amplifier in each individual speaker.

    Taking this stance isn't standing in the way of progress; it's actually arguing for progress. The 3.5mm jack is an open standard that's virtually future-proof, almost universal until we do away with transducers as we know them. If it were truly about thinness, we'd have a simple plug adapter from 3.5mm to a thinner variant, as we do from 1/4 inch to 3.5mm. This isn't progress, this is shackling us to a closed standard that Apple can tax.

    • #audio
    • #review

    A week with the Focusrite Scarlett 6i6 - a review

    Pictured: Scarlett features a stylish brushed red aluminum finish.

    I have a bit of a history reviewing audio hardware, specifically audio I/O. Over time, the audio interface has moved from PCIe to USB, where it has rested as the de facto standard for nearly 15 years, ever since USB 2.0 became widespread. I've owned a few external boxes over the course of a decade: briefly, M-Audio's precursor to today's Fast Track (which I returned), a Yamaha GO46 FireWire, and a Native Instruments Audio Kontrol, and I recorded two albums using the latter two. I consider myself a bit of an audio geek, but without the audiophile trappings.

    Recently I hit a breaking point: my NI Audio Kontrol was unable to accept 1/4 inch unbalanced cables. Mystified, I decided it was time to retire the AudioKontrol and check out the offerings of 2016. Unsurprisingly, audio interfaces offer far more bang for the buck than they did even 5 years ago; at $180 I was able to score the Focusrite Scarlett 6i6, offering more high quality inputs and outputs than any of my previous devices at a lower price point. Even more impressive, for $240 the 18i8 offers a whopping 18 potential inputs and 8 output buses.

    The weak point of every USB capture device, in my experience, has been and probably always will be drivers (and USB itself). As an OS X (excuse me, macOS) user, my experience with CoreAudio has been mostly positive. For most ASIO/CoreAudio-compliant USB devices, drivers are barely needed for basic I/O. However, if the interface has custom buttons, internal routing, or other features, then drivers are required. In the case of my AudioKontrol, the drivers were actually a net negative, causing glitchy behavior, and the same went for my week with the M-Audio Fast Track. After dealing with years of prosumer-ish solutions, I decided to ante up to Focusrite, renowned for their preamps, skipping budget players like Presonus and M-Audio.

    Fair warning: this is as much an overview of digital audio as a review. Now onto the review.

    FocusRite Scarlett 6i6


    Pictured: The 6i6 makes for a good speaker rest

    The Scarlett 6i6 is 6 in and 6 out, but that doesn't quite accurately sum up the ports. A breakdown includes the following:

    • 2 front facing Microphone XLR/1/4inch Line Inputs with hardware knobs for gain control and level monitoring (supports 48V phantom power)
    • 2 1/4inch Line Inputs
    • 1 stereo S/PDIF input
    • Midi in
    • 2 1/4inch Headphone outputs with hardware volume knobs
    • 2 1/4inch Line (monitor) outputs with volume knob
    • 2 additional line outputs
    • 1 stereo S/PDIF output
    • Midi out

    If you notice, this doesn't add up to the 6 outputs in the device name, but instead a total of 6 inputs and 10 outputs. The reasoning is that the headphones/monitors are all on the same audio bus, bringing it back down to 6 output buses: one for the monitors (speakers/amp + two headphones), an additional set of 1/4 inch outputs, and S/PDIF. Each of the headphone jacks and the monitors have independent volume controls, but any audio routed to the monitor bus will be output to all three. Also notable: the Scarlett only accepts 4 analog channels in. Most users probably won't use the S/PDIF I/O (more on that later). The full tech specs can be found here.

    Setting up

    Focusrite surprisingly ships the Scarletts with a host of wall adapters for your country of choice, but being firmly rooted in North America, I had to swap in the North American standard prongs. Other than that, the Scarlett is pretty straightforward: USB cable to the computer, AC adapter to the wall, audio inputs into the device. For me, this meant plugging my Numark NS7 into the back ports, plus a single mic.

    Pictured: The mess of cabling...

    Performance: latency

    With digital audio there's always (as of writing this) buffering, which introduces latency. No matter the device, there will be latency, depending on the buffer size. The math for calculating minimum latency is quite simple: buffer size / sample rate (in kHz) = latency in milliseconds.


    512 samples/44.1 kHz = 11.6 ms

    384 samples/44.1 kHz = 8.7 ms

    512 samples/96 kHz = 5.3 ms

    384 samples/96 kHz = 4 ms

    However, this is only the absolute minimum for ONE direction, and lowering the buffer puts more stress on the CPU to ensure the buffer is never fully depleted. This becomes tougher to accomplish as the CPU is tasked with processing more information, such as more FX and more tracks. Total travel times for buffering look like the following:


    (in) 512 samples/44.1 kHz = 11.6 ms + (out) 512 samples/44.1 kHz = 11.6 ms = 23.2 ms minimum roundtrip travel time

    (in) 384 samples/44.1 kHz = 8.7 ms + (out) 384 samples/44.1 kHz = 8.7 ms = 17.4 ms minimum roundtrip travel time

    (in) 512 samples/96 kHz = 5.3 ms + (out) 512 samples/96 kHz = 5.3 ms = 10.6 ms minimum roundtrip travel time

    (in) 384 samples/96 kHz = 4 ms + (out) 384 samples/96 kHz = 4 ms = 8 ms minimum roundtrip travel time

    The math above represents the absolute minimum travel time for external audio to go from an input, through the computer, and back out to an output. As stated, this is the absolute minimum: the audio also travels over USB, whose clock timer fires at 1 ms intervals, so there's a latency buffer that has nothing to do with audio samples but rather with the continuous data flow imposed by USB. Lesser devices simply use a buffer of roughly 6 ms in each direction (I/O), which adds more travel time, whereas higher end devices finely tune the USB timing to minimize the delay. Someone using a low end USB device with 384-sample buffering can expect roughly a 29 ms delay. Higher end boxes such as the Scarlett have fine-tuned drivers to shave off crucial milliseconds of USB buffering, and also include onboard DSP to allow onboard mixing to lower the travel time delay. If this all sounds a bit confusing, it isn't as bad as it sounds.
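    The buffer arithmetic above is easy to script. Here's a minimal sketch in Python of just the buffer math, ignoring the USB clock and driver overhead discussed above:

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One-way latency contributed by a single buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

def min_roundtrip_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Minimum in + out travel time: one buffer in each direction."""
    return 2 * buffer_latency_ms(buffer_samples, sample_rate_hz)

print(round(min_roundtrip_ms(512, 44100), 1))  # 23.2
print(round(min_roundtrip_ms(384, 96000), 1))  # 8.0
```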


    For example: I'd like to route my mic input directly to my outputs so I can monitor my inputs without routing the audio to my computer, into the DAW, then back out over USB, all of which introduces a time delay, hence latency. Doing this skips the travel time through the ASIO buffer and USB clock. The benefit is that I effectively have zero latency on my input monitoring; the downside is that I cannot apply any effects from my DAW in realtime.

    Higher end audio interfaces include DSP effects that can be controlled via the software mixer, so basic compression/EQ/reverb/delay can be applied to live monitoring. Other buses help too: FireWire has slightly better clock timing, and Thunderbolt provides even lower latency thanks to the PCIe bus clock.

    All in all, the big step up from a prosumer audio interface to the Scarlett line boils down to slightly better drivers and internal mixing.

    Performance: The bits of it all

    24 bit is really an unrealistic thing; it's a nearly meaningless stat when it comes to audio gear. There are measures that more appropriately reflect dynamic range, but to fully understand them, we have to talk analog and math.

    While I may get flack for saying this, despite issues like latency, digital has had a massive leg up over its analog predecessors, not simply from an archiving/storage perspective but also in quality. The much loved vinyl format hits a signal-to-noise ratio of roughly 80 dB (the signal power is about 10^8 times the noise power), which isn't bad. Digital, however, doesn't have an analog noise floor; its sound pressure steps are expressed in bit depth, the number of discrete levels available to the digital-to-analog converter (DAC).

    To use an analogy I developed that works reasonably well when writing for an audio publication: bit depth is akin to bit depth in digital imagery, except instead of reflecting how many colors an image can have, it reflects how many steps in volume there are. Sample rate is the resolution at which the sound is captured. What becomes interesting is that there's a formula that explicitly tells you the maximum dynamic range in decibels for any given bit depth. Using the signal-to-quantization-noise ratio formula, 20*log10(2^BITDEPTH - 1), we can calculate the signal to noise ratio. 16 bit audio has a theoretical range of 96.33 dB, which is considerably better than vinyl and on par with the best studio reel-to-reel systems. It's important to understand that these values represent a theoretical maximum, as analog-to-digital converters (ADCs) and digital-to-analog converters (DACs) rarely achieve their maximums. 24 bit audio has a theoretical range of 144.49 dB, far beyond even the best hardware currently on the market. Below I made a simple calculator to play with.
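    The formula is a one-liner in Python, and it reproduces the 16- and 24-bit figures quoted above:

```python
import math

def dynamic_range_db(bit_depth: int) -> float:
    """Theoretical signal-to-quantization-noise ratio: 20*log10(2^bits - 1)."""
    return 20 * math.log10(2 ** bit_depth - 1)

print(round(dynamic_range_db(16), 2))  # 96.33
print(round(dynamic_range_db(24), 2))  # 144.49
```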

    The Focusrite features 109 dB of dynamic range on its inputs and outputs, which is a little more than 18 bits of depth. For the computer savvy, 18 bits = 2^18, which is effectively 262,144 sound pressure levels vs 16 bit's 65,536, or 4x more detail. Focusrite isn't being deceitful listing 24 bit, but rather dealing with the limitations of audio production. Also notable for a reference point: the theoretical maximum volume reproduction of 24 bit would span silence to a NASA rocket launch (140 dB), and arena rock concerts are known to be in excess of 120 dB. It's not realistic to use the entire dynamic range of 24 bit, and your neighbors would not approve if you could.


    If I haven't talked about sampling rates yet, there's a reason: by most accounts, bit depth matters more than sampling rate after a certain point. 44.1 kHz can reproduce 0 Hz-22.05 kHz. Capturing at 96 kHz may actually reduce sound quality if your target format is 44.1 kHz, through alias noise. The best way to imagine this is a photo: if you scale proportionally by half, the image will remain clear, whereas scaling to, say, 45.9% of the image size sacrifices some clarity. The reason this isn't a big problem in applications like Photoshop is resampling (scaling) algorithms. The same principle applies to audio, as the waveform must be recomputed and resampled, creating what is known as aliasing. Bit depth downconversion uses dithering instead, which is a lot more predictable, as it's a numeric reduction in values, where a range is compressed. Depending on your target format (movies = 48 kHz, music = usually 44.1 kHz), capturing at 2x the sampling rate of the target format is preferred. The Scarlett can capture at 88.2 kHz, but the advantages of higher sampling rates are less obvious, since DACs have become quite good over the years at filling in the gaps, so to speak. What high resolution can do is capture sounds above human hearing and articulate the effects of phasing more accurately. It's not night and day, and honestly, I'm mostly hard pressed to tell the difference, as are a lot of people; however, audio processing does better with denser data, and the real advantage exists almost entirely in the DAW.
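    Two trivial helpers capture the numbers behind this paragraph: the Nyquist limit, and whether a downconversion is a clean integer ratio. This is a sketch of the rule of thumb only, not of any real resampler:

```python
def nyquist_khz(sample_rate_khz: float) -> float:
    """Highest frequency a given sample rate can represent (Nyquist limit)."""
    return sample_rate_khz / 2

def clean_downsample(capture_khz: float, target_khz: float) -> bool:
    """True when the capture rate is an integer multiple of the target,
    the case where resampling introduces the least aliasing."""
    return (capture_khz / target_khz).is_integer()

print(nyquist_khz(44.1))             # 22.05
print(clean_downsample(88.2, 44.1))  # True: exact 2:1 ratio
print(clean_downsample(96, 44.1))    # False: awkward ratio
```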

    Since I touched on analog vs digital, I figure I'll put in a quip on the long standing debate. Most of analog's love has less to do with superior quality than with the characteristics left by various mediums' limitations. It should also be pointed out that analog effects, like harmonic distortion from tube amplification and over-saturation from tape, can be and are captured by digital when recording from analog sources. For audiophiles, much of the desire is to recreate how music "used to sound", hence the love of vintage hardware. There's nothing inherently wrong with this, except that it often shapes audio debate in non-quantifiable terms and often leads to absurd claims about analog vs digital. Adding to the mess is the shift in recording, mixing, and mastering techniques over time, which also drastically alters the sound of a recording.

    Lastly, digital, for recording/listening intents and purposes, exists in tandem with analog. In any digital audio path, the signal must be converted into analog electrical modulations to be fed into a transducer (speaker), or start as analog from a transducer (microphone) and be converted into digital, so the devices that perform this conversion are very important. In short, as it relates to this review: the Focusrite Scarlett delivers professional quality at an absurdly low price point, and it's a wonderful time to be a hobbyist, as digital solutions are cheap and extremely high quality. Focusrite isn't the only player making low-cost/high-quality computer audio interfaces, but it has one of the more attractive packages.

    The Real world

    At this price point, the Focusrite is well specced. The 2nd generation, due out this month, gives a modest bump: mostly more headroom on the analog ports, 192 kHz capture/playback, and analog protection circuitry for unexpected power surges, all welcome features but not game changing. The 1st gen can be had for $180, a nice $70 price reduction, making it a lot of bang for the buck.

    Scarlett Mixing Software

    Pictured: Scarlett Mixing Software

    The Scarlett drivers are straightforward. The device can be used without them, but you'll miss out on the internal mixing. After installing the drivers, I rebooted and launched the mixer, which immediately updated the firmware of the 6i6; it took mere seconds. No word on what the firmware update did, but googling revealed that it improved sample rate switching for OS X (macOS) users and enables standalone mode, so the device can continue routing audio even if the computer isn't turned on. Very cool.

    The software mixer is straightforward, with handy routing presets and input gain control, which is useful for the inputs that do not have hardware controls. Any configuration can be saved as a snapshot preset and instantly reloaded, likely more useful for the Scarletts featuring more inputs and outputs, but still welcome. Several of the mixer elements also control hardware switches on the device, like input gain or line level vs instrument for the front-facing inputs. It's a gift and a curse: the hardware is small and compact, but it means the device is entirely driver- and support-dependent for its settings, whereas with previous devices I've owned, line vs instrument gain control was on the hardware. Even with bad drivers, the AudioKontrol functioned as a simple USB input/output device regardless. With any luck, though, support will be long-lived.

    My setup

    Everyone's home studio will look a little different so to give users a chance to contrast and compare, my current setup is as follows:

    Computer: Mac Pro 2008 with oodles of upgrades

    Monitoring: Vanatoo Transparent Ones with a MartinLogan Dynamo 300 subwoofer, Beyerdynamic DT-990 headphones

    Inputs: Numark NS7 Numark Performance Controller (motorized turn table controller with audio output), various Microphones

    Midi: Native Instruments Maschine, Korg Padkontrol, Korg Microkey 37, Korg NanoPad, Korg nanoKontrol (all USB)

    DAW: Cubase, Logic, Maschine

    My mini studio is very hip hop centric, mostly focused around beat composition. It's not space intensive, uses only a modest amount of hardware, and I don't have any real plans of expanding much beyond it. The only real upgrade is probably replacing my Shure SM7b with something a little more forgiving for a wider range of vocals.

    Out of the gate, I was already happy simply to be able to listen through my headphones or speakers without having to change my audio settings in OS X, even if it's only an option-click away. Swapping between the two was as simple as turning the volume up. This may not seem like a big deal, but for all the love Vanatoo get, their speakers annoyingly do not have a front facing volume knob. Also, the headphone amp, while some audiophiles scoff at it, is without a doubt reasonably better than my Mac's internal headphone jack that I'd been reduced to using. At least the Mac Pro headphone jacks aren't pummeled with white noise like the MacBooks'. So out of the gate, if nothing else: more accessible volume knobs and better sound via headphones. I was previously debating a headphone amp for my power hungry DT-990s, but they sound better than before, and as good as I recall them sounding when I used a Denon mid range receiver as my main headphone amp.

    The big hiccup came when trying to get audio to work in Cubase. Part of it was user error: I could not get audio to output for the life of me in Cubase, and only Cubase. In a moment of inspiration I realized that my ports might not be labeled correctly in the VST panel, and noticed that two outputs had carried over as disabled. Cubase then started showing volume meters but still refused to actually output audio. At this point I resorted to a classic OS X audio hack: create an aggregate device of one in Audio MIDI Setup. For whatever reason, it worked. Annoying? Yes. All other applications functioned normally without this, meaning the issue lies somewhere between Cubase's VST engine and the Scarlett's drivers.

    Recording was as easy as ever; there isn't much to say. Identifying buses was a charm and recording worked great: noiseless, and as rich as it should be from the instrument inputs on the back. The mic preamps are notably a little nicer than the AudioKontrol's, if only for the fact they'll accept unbalanced cables. Tested with a Sennheiser e935 and no other preamp, the quality was clean and defined, and only required roughly half gain. Compared to the AudioKontrol, which wasn't terrible, it seemed just a hair "richer," to use a vague, imprecise term. Audio quality is certainly up to professional standards, at least in my book.

    The next plus: for the first time, I was able to use live monitoring. On my AudioKontrol, it never worked as it was supposed to, so I've always monitored via software, which means delay. Not ideal, but it worked. A quick trip to the mixer control panel and the Scarlett worked as expected: I could play my NS7 regardless of whether I had a track set to monitor. It's a real benefit over the lower-end devices I was using.

    After a week in Cubase, there are no noticeable glitches, which Cubase on OS X... macOS... is more prone to than many other audio apps. I'm pleasantly happy with the device.

    A slightly different take - S/PDIF

    The 6i6 is almost ideal, but the S/PDIF coax ports are nearly useless for most people. For anyone asking, S/PDIF (Sony/Philips Digital Interface Format) is a common format developed for transmitting PCM audio or compressed formats such as AC3 (Dolby Digital) or DTS over 75-ohm coax (RCA) cables or Toslink (optical). Toslink over time became the much preferred connector, partly for the "cool" factor and partly because optical cables require no shielding, as RF noise does not affect light, making the cables lightweight and small. S/PDIF can be found on many home theater receivers, some standalone CD players, most DVD players, some Blu-ray players and, in the professional world, DAT systems.

    S/PDIF is so ubiquitous that my Mac Pro has optical S/PDIF I/O, and most Macs (MacBook Pros, iMacs) can output optical S/PDIF with a specialized mini-Toslink cable. Digital coax is a fading format, limited to DAT and some CD/DVD players. Outside of DAT, most media that use S/PDIF (CDs/DVDs) can be read directly from an optical drive, so S/PDIF is mostly used to transmit audio out from a computer to a receiver or speaker system. I'm not sure about usage stats, but coax S/PDIF strikes me as not very useful. I'd much prefer another set of instrument inputs in place of the S/PDIF ports; a "6i4" (6 analog inputs) would be more useful, and I'm guessing most studio musicians would be in the same boat. At the very least, Toslink would be preferable, as there's a much greater chance someone has a speaker system or receiver that uses it.

    The other negative is that I still don't know why Cubase has a problem with the Scarlett. I've used three other boxes over the years and never required any workarounds. It works, but it strikes me as precarious: I'm not sure if I'm one DAW update or OS update away from it breaking in Cubase, though as of this writing it works under OS X 10.11.5. As a Mac Cubase user I'm in the minority, and Logic Pro X works fine with it.


    Pros

    • Value!
    • Build quality
    • Easy to use device mixer software
    • 6 outputs linked to the "Monitor" audio bus alone, meaning two separate headphone amps and external speakers with all independent volume controls
    • low latency for USB


    Cons

    • Mild driver issues with Cubase; works fine with the workaround.
    • Coax S/PDIF could be swapped for more useful ports. It's best to think of this as a 4-input device.

    • #audio

    Setting Up the Focusrite Scarlett on OS X - Cubase fix

    The Focusrite Scarlett functions as my headphone amp, alternate route to my speakers, and speaker stand.

    I recently purchased a Focusrite Scarlett 6i6 but quickly ran into problems with Cubase 8 on OS X 10.11.5. The device worked with all other audio apps, which made Cubase's VST system the culprit. For whatever reason, getting my Scarlett 6i6 to work required creating an aggregate device, which still mystifies me. Aggregate devices in OS X let you combine multiple audio I/O sources into one virtual device for use in applications, regardless of whether the application in question supports multiple audio I/Os or even something mundane like audio input from device A and playback on device B.

    Step 1: In Audio Midi Setup, create an aggregated device

    Step 2: Select the Focusrite, no other inputs/outputs are needed

    Step 3: In Cubase, under Devices, select Device Setup and set your device to the newly created aggregate device

    Step 4: Confirm all the ports are enabled and labeled in a sensible fashion

    Step 5: Under Devices, select VST Connections and set the output to the monitor output

    Step 6: Under inputs in VST Connections, create any necessary buses and/or assign the ports as needed.

    • #front end development
    • #accessibility

    A fix for UI elements that do not respond to return or enter key presses

    While I usually don't write much about client-specific work from my job, I was recently hit with a minor problem: slide shows are terrible for accessibility. After scratching my head for a few minutes, I hit on a dead simple solution: simulate mouse clicks with JS. Check out the CodePen below.

    See the Pen Tabindex return fix
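A minimal sketch of the click-simulation approach, assuming slide controls marked up as focusable elements (e.g. `tabindex="0"`); the function names here are my own, not from the pen:

```javascript
// Keys that should activate a focusable element, mirroring native buttons.
// 'Spacebar' covers older browsers that don't report ' ' for the space key.
function isActivationKey(key) {
  return key === 'Enter' || key === ' ' || key === 'Spacebar';
}

// Forward Enter/Space key presses on a focusable element to its click handler,
// so keyboard users trigger the same code path as mouse users.
function makeKeyActivatable(el) {
  el.addEventListener('keydown', function (event) {
    if (isActivationKey(event.key)) {
      event.preventDefault(); // keep Space from scrolling the page
      el.click();             // fire the element's existing click handler
    }
  });
}
```

Applied to each tabbable slide control, Return and Space then behave exactly like a mouse click.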

    • #front end development

    Resolving Grunt Error - libsass bindings not found in...

      Running "watch" task
      >> File "../scss/theme/_color-mixins.scss" changed.
      Loading "sass.js" tasks...ERROR
      >> Error: `libsass` bindings not found in /Users/~path/_build/node_modules/grunt-sass/node_modules/node-sass/vendor/darwin-x64-47/binding.node.
       Try reinstalling `node-sass`?
      Warning: Task "sass" not found. Use --force to continue.
    • Step 1: run npm update from the grunt directory with your package.json. This should fetch the latest dependencies that are specified in your package.json file.
    • Step 2: If you try to run grunt now, you'll likely see a binding error that reads as follows:
      Node Sass could not find a binding for your current environment: OS X 64-bit with Node.js 4.x. Usually this error is followed by the suggestion to rebuild node-sass.
      Run npm rebuild node-sass
    • Step 3: If you're still seeing Sass build issues, try updating grunt-cli:
      npm install -g grunt-cli and re-run the above. You may also need to update Node itself and check the versioning in your package.json.
    • #front end development

    Sitefinity updated logo for 2016

    Sitefinity is a mess

    Trying to track down what's causing jQuery Validate to glitch out with an "Uncaught TypeError: Cannot read property 'addMethod' of undefined" error in Sitefinity on a client's website. When signed into the CMS, there are one hundred and five instances of javascript tags. Coincidentally, I also redesigned the Sitefinity logo.

    Sitefinity Updated Logo for 2016

    Now for an official press release:

    For Immediate Release: 05/09/16
    Telerik Sitefinity

    Sitefinity Logo updated for 2016 under direction of disgruntled front end developer with no connection to Telerik

    Portland, Oregon: Today, unbeknownst to Telerik corp or anyone affiliated with Telerik or Sitefinity, Greg, a front end web developer with a surly disposition and contempt for french roast coffee, released an updated logo for Sitefinity.

    The logo shows a daring new approach in design, with virtually no treatment of the logo itself but instead an abstraction of the word "javascript" repeated 105 times in various states of legibility, set in Operator Mono in book weight from esteemed and respected font foundry Hoefler&Co. The design is a literal abstraction of the way Telerik has almost no regard for how javascript is treated and appears to randomly "shit out" javascript tags, according to Greg. The design was arrived at after initial sketches that included a large singular pixelated middle finger, and a dead muskrat.

    "This is the stuff that haunts my dreams. Who deemed this an acceptable or maintainable design pattern? Is there ever going to be a CMS that isn't some shade of terrible?" Greg ranted on his company's slack channel which elicited zero responses.

    The new logo should be used in any instance of the old logo and is free for use for all.

    About: This press release is 100% serious and real.

    • #audio

    Maschine hangs or crashes on Loading GUI fix

    It's one thing to post a fix, and it's another to explain how I arrived at the fix. As a web developer (and very, very long time Mac user), my life is debugging, so I have a few more tools to pull from. Hopefully advanced users will be able to follow my logic to arrive at this fix. You can skip past the following to get to the fix, but this explanation may help you troubleshoot not just Maschine but other apps. I'm not a magician and I don't pretend to be one; with some practice and time, you too can learn to troubleshoot computer programs.

    maschine loading

    So recently Maschine stopped working, seemingly at random. Having a fair amount of technical prowess, I wasn't too worried and went to Activity Monitor. For those not familiar, OS X bundles an application called "Activity Monitor" with every install, found in your Applications/Utilities folder. Activity Monitor is a GUI (Graphical User Interface) to several UNIX utilities that can also be accessed from OS X's terminal. It lets you monitor memory, CPU and network usage. As a debugging tool it's a must, as you can see if a process is slowing your computer down by over-utilizing CPU cycles, RAM or bandwidth, and you can force quit tasks that can't be reached from the normal force quit menu.

    I fired up Activity Monitor, located Maschine in the list and double-clicked it to get more info about the process and to see which files it had recently accessed. Even to me, most of what appears in log files is a garbled mess of esoteric computer speak, but I also know there's important info in these logs. Fortunately, open-files logs are straightforward. In the log, the last files accessed were plist files pertaining to its freeze state.

    maschine open files

    As an OS X veteran, I knew a few things offhand: plist files are preference files that sometimes get corrupted and can usually be deleted without repercussions, as the application in question will simply regenerate them (at worst, some settings may be lost); and more importantly, the .saved files have to do with OS X's ability to relaunch applications to their last known state. As a rule, freeze states are pretty much always OK to delete; in fact, occasionally you need to dump a bad freeze state. This is common practice on iOS, where users double-tap the home button and swipe up to close a frozen app, since iOS doesn't "quit" apps by default but simply places them into freeze states. Deleting a freeze state simply forces the application in question to fully relaunch.

    I started by dumping the saved application state, always an important first step in modern OS X debugging, but it didn't work. After talking to my buddy Justin, he mentioned that Maschine stopped working for him when he changed plugins. So I had a hunch and decided to take a memory sample in Activity Monitor. Memory samples let you peek at what your application is doing and what information it's accessing at that moment in time. Remember how I said even I don't understand much of what's in a log file? This is one of those times. We know the application is hanging, so something is causing the hang, and we can bet the problem can be "seen": the program will likely try to repeatedly access something, or we may see the last item it tried to read before stalling out.
    (the screen caps are clickable for legibility)

    maschine open files

    Note that during the hang, the memory sample is calling PSP Echo, which is a popular audio plugin. Maschine, like many audio applications, runs an initial plugin scan to make sure all the installed plugins are compatible; incompatible ones are blacklisted so they won't crash the host application. This scan usually runs only if the application detects a change, such as a new plugin install. It sometimes fails and causes the application to crash. Apple's Logic has the ability to detect failed launches and rescans its plugins accordingly (a somewhat recent innovation, arriving with Logic 8 or 9). Maschine, being a little more limited, doesn't have this ability, so it's up to the user to manually reset the approved plugin cache. While I couldn't find where Maschine 2 stores its plugin list, the following article, MASCHINE Crashes at Startup (OS X), pointed me in the right direction.

    How to fix

    Step 1: Go to the following location on your computer:

    Users/[your user name]/Library/Application Support/Maschine 2/

    Note: You may need to enable your user library folder visibility if you have not done so already.

    Step 2: Drag all the files into your trash.

    Step 3: Relaunch application.
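For those comfortable in Terminal, the same steps can be sketched as a few shell commands (paths as given above); this moves the files to the Trash rather than deleting them outright:

```shell
# Reveal the user Library if it's hidden (OS X hides it by default).
command -v chflags >/dev/null 2>&1 && chflags nohidden "$HOME/Library" || true

# Move the Maschine 2 support files (plugin cache, settings) to the Trash.
SUPPORT_DIR="$HOME/Library/Application Support/Maschine 2"
TRASH_DIR="$HOME/.Trash"
mkdir -p "$TRASH_DIR"
if [ -d "$SUPPORT_DIR" ]; then
  mv "$SUPPORT_DIR"/* "$TRASH_DIR"/
fi
```

Then relaunch Maschine as in Step 3.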

    maschine could not load plugin

    With any luck you should see something like the message above. Happy beat making and troubleshooting! Remember, Activity Monitor is one of the most important tools in a power user's bag of tricks. OS X is big and complex, but almost nothing is done behind closed doors, meaning there's almost always a way to get to the root of a problem.

    • #front end development

    Setting up Operator Mono in Coda

    I'm a big Operator Mono fan. A few months ago I wrote how to set up Operator Mono for Atom, which involves a bit of style sheet hacking. Coda is pretty straightforward, but I realized that after roughly 8 years of owning Coda, I'd never messed with the font formatting.

    Step 1: Set the font in preference under Editor

    Coda preferences

    Click the Editor Font and locate Operator, select the weight you're most comfortable with.

    Step 2: Setting up the italics character set

    One of the best features of Operator Mono is that all its italics are an alternate character set, useful for programming. Coda doesn't pack in a style sheet akin to Atom or Sublime, which is a mixed blessing. It's pretty easy to set up but requires a little more handiwork.

    • Click Colors in Preferences
    • Within the colors panel's scroll area, click on the various code examples and use the bold/italic checkboxes to change your code styles

    Coda  color preferences

    To mimic Atom's settings and the examples on the Operator Mono website, I recommend italicizing the following: all comments, tags, variables, attribute names, and leave CSS unitalicized. That's it!

    • #front end development

    PostCSS is really slow - PostCSS vs minification and autoprefixer

    A few months ago one of the back end developers ribbed me, "Front end developers can't decide if they want to pre-process or post-process their CSS," after the arrival of PostCSS and CSSnext.

    The worst part is he's right, but we've been doing both for some time, unbeknownst to him. Unlike CSSnext's hazier ambition of bringing future CSS to today (built on PostCSS), PostCSS itself is simply a library of tools for programmatically manipulating CSS with Javascript, which is a zealous under-sell of its potential. Like any good front end developer, I wanted to see if PostCSS actually made sense to use, since I don't (yet) have the interest to use CSSnext.

    While admittedly this isn't the most scientific test, I ran it using the current versions of Gulp and Grunt with their respective plugins on a rather large project built on a heavily modified version of Bootstrap 3. The end result is roughly 9,700 lines without minification, and a 160k CSS file minified. It's big, but it isn't massive either.

    Grunt Results

    Without PostCSS

    • Total 586ms

    With PostCSS

    • Total 5.3s

    Gulp Results

    Without PostCSS

    • Total 388ms

    With PostCSS

    • Total 4.75s

    The configuration looks as follows:
    Libsass -> autoprefixer -> minification.

    In the Grunt task, I have a watch task that triggers libsass, then grunt-autoprefixer, then grunt-contrib-cssmin. I replaced the prefixing/cssmin steps with grunt-postcss.

    For Gulp the task was nearly identical: libsass to gulp-autoprefixer to gulp-cssmin. I replaced the prefixing/cssmin steps with gulp-postcss. The end result is pretty much the same.
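For reference, a PostCSS-based Gulp task looks roughly like the following. Treat this as a sketch using the standard plugin names (gulp-sass, gulp-postcss, autoprefixer, cssnano) and placeholder paths, not my exact build file:

```javascript
// Sketch of a Sass -> PostCSS (prefixing + minification) Gulp task.
var gulp = require('gulp');
var sass = require('gulp-sass');         // libsass wrapper
var postcss = require('gulp-postcss');
var autoprefixer = require('autoprefixer'); // replaces gulp-autoprefixer
var cssnano = require('cssnano');           // replaces gulp-cssmin

gulp.task('css', function () {
  return gulp.src('scss/**/*.scss')
    .pipe(sass().on('error', sass.logError))
    .pipe(postcss([autoprefixer(), cssnano()])) // one PostCSS pass for both
    .pipe(gulp.dest('css'));
});
```

The appeal is a single PostCSS pass instead of chaining separate plugins; the timings above suggest that convenience currently comes at a speed cost.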

    What does this mean?

    The takeaway is that PostCSS is considerably slower than the dedicated modules it replicates, but (and I'll use bold to stress this) this does not mean you should not use PostCSS. PostCSS still has some serious potential if you're into eschewing the pre-processor for CSSnext or looking to use CSS Modules. However, unless you need PostCSS, you shouldn't feel obligated to replace currently working tasks with the PostCSS versions.

    • #off topic

    Kite - an Indie game

    Long time friend James Treneman is a one man studio making his own game, and it's in the process of being greenlit on Steam. Check it out.

    • #front end development
    • #browsers

    Exploring and Developing for the PS4 browser

    The Playstation 4 is quite the capable device, unsurprisingly able to run Linux. I recently bought a PS4 and, in true developer spirit, immediately began poking around the browser. To my knowledge there's next to zero developer documentation; the best I could find was a single PDF from Sony that appears to be dated. My goal is to document what's known about the PS4's browser.

    My PS4 test setup

    Version: CUH-1200

    OS: 3.50

    Gecko? Mozilla? Netfront?

    whatbrowser.org ps4 screenshot

    Google's whatbrowser gives an error.

    whatbrowseramiusing.co ps4 screenshot

    whatbrowseramiusing.co reads the Gecko-Like User-Agent string.

    whatismybrowser.com ps4 screenshot

    whatismybrowser.com likely matches UA string by closest match, and returns Mozilla.

    whatsmybrowserorg.org ps4 screenshot

    whatsmybrowser.org correctly identifies the PS4 as NetFront.

    NetFront is a proprietary web browser used on the Playstation 3, Playstation Vita, PSP, Nintendo 3DS, Wii U and Kindle e-readers. The original NetFront browser has since been replaced by the WebKit-powered NetFront NX, which appears to power the PS4.



    • HTML 4.01, XHTML 1.1, XHTML Basic 1.1, CE-HTML, XML 1.1, RSS feed (RSS 0.9/0.91/0.92/1.0/2.0, Atom 1.0)
    • HTML 5 Support: Canvas, Canvas Text, localStorage, sessionStorage, Web Workers, applicationCache, HTML5 Input types (partial) - Notable missing: Geolocation API, HTML5 Input input attributes, picture element, srcset, service workers, web components
    • CSS3 (Flexbox, full CSS3 selector support, Media Queries, Animations, 2D/3D Transforms, Transitions, etc.) - Notable missing: multiple background support
    • CSS1, CSS2.1
    • Javascript V 1.7+
    • DOM LEVEL 2

    The PS4 passes all of CSS3.info's selectors test

    The PS4 scores relatively well on the CSS3 test (Chrome v 49.0.2623.112 scores 52%, Safari 9.1 54% and FireFox 45 63%)


    • TLS1.2 *no compression
    • NSS
    • Configurable digital certificates
    • Extended Validation
    • Elliptic Curve Cryptography
    • No SSL 2/3 support

    Viewport: unsupported

    WebGL: unsupported

    PS4's webGL error

    • Image Formats: JPG / GIF / PNG / BMP (32 bit + compress supported)
    • Note: TIFF image format is unsupported (commonly supported in webkit).
    • Video:

      • Container: Mp4/HLS
      • Codec H.264
      • Profile: Baseline/Main/High
      • Level: 4.1 or lower
      • Resolution: 1920 x 1080 or lower
      • Framerate: 60 fps or lower
      • Bitrate: 20 Mbps or lower
      • Autoplay: supported
    • Audio:

      • Formats: AAC (LC or HE-AAC v1)
      • Channels: 1 channel, 2 channels, 6 channels (AAC-LC only), or 7.1 channels (AAC-LC only)
      • Sampling rate: 8000, 11025, 12000, 16000, 22050, 24000, 32000, 44100, or 48000 Hz
      • Bitrate: 48 to 3456kbps
      • MP3/WAV/AIFF/AU/MIDI unsupported
      • Audio playback using the audio element is not supported.
      • Direct links to audio are not supported.
    • PDF: unsupported
    • Downloads: unsupported
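Given the limits above, a clip can be prepared for the PS4 browser with an ffmpeg one-liner along these lines (filenames and bitrates are placeholders of my choosing, not from Sony's documentation):

```shell
# Transcode to the PS4 browser's documented limits: H.264 High profile,
# level 4.1, capped at 1920px wide, 60 fps, AAC-LC stereo in an MP4 container.
ffmpeg -i input.mov \
  -c:v libx264 -profile:v high -level:v 4.1 \
  -vf "scale='min(1920,iw)':-2" -r 60 \
  -c:a aac -ac 2 -ar 48000 -b:a 192k \
  -movflags +faststart \
  output.mp4
```

The `+faststart` flag moves the MP4 index to the front of the file so playback can begin before the download finishes, which matters for progressive streaming in a browser.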

    Browsers Tests


    Acid3 test

    Sunspider 1.0.2 test: Overall score: 3203 ms using remote login (and a game left open)


    The PS4 fails to load html5test.com...

    Notably, some previous users have completed the HTML5 test; you can see the scores here.

    The PS4 fails during the Octane test...

    Not pictured: Jetstream

    The PS4's weakest link appears to be modern JS support; unsurprisingly, the PS4 isn't a strong performer. In a very uncontrolled environment, with 3 concurrent browsers open, roughly 25 tabs and several apps, my MacBook Pro in Chrome scored 157.8 ms vs the PS4's 2929 ms (lower is better). An iPhone 6 scores roughly 326.6 ms. The big difference is that Chrome, Safari and Firefox use highly optimized JS engines: V8, Nitro and SpiderMonkey respectively. Other users report much better SunSpider benchmarks, clocking in around 1027 ms, which places the PS4 roughly at the JS performance of an iPhone 5. The PS4 certainly has room for improvement but is unlikely to see massive gains, as I highly doubt most people spend much time in the browser instead of gaming.

    More to come...

    Stay tuned, I plan to update this over time. Testing the PS4 is tedious, as remote play doesn't allow text input via keyboard.

    Planned tests: weinre remote debugging, FireBug Lite, BrowserSync

    Anyone with better documentation or more information please feel free to e-mail at: blog@greggant.com. Thanks!

    • #digital politics

    The Encryption debate wages on

    The Obama administration took a seat in the encryption debate, even in light of WhatsApp rolling out end-to-end encryption to its billion users. Worse yet, the leaked Senate bill is soft on security, high on fear. At least there are a few out there fighting the good fight.

    It's almost as if there are real world analogies to draw from.

    • #os x
    • #apple

    But He's Not Wrong

    Palmer continues to clarify what he meant by that blunt statement by saying “It just boils down to the fact that Apple doesn’t prioritize high-end GPUs. You can buy a $6,000 Mac Pro with the top of the line AMD FirePro D700, and it still doesn’t match our recommended specs. So if they prioritize higher-end GPUs like they used to for a while back in the day, we’d love to support Mac. But right now, there’s just not a single machine out there that supports it.”
    - ShackNews.com

    The only Apple-built Macs capable of handling the Oculus Rift are Mac Pros that went out of production 3 years ago... or Hackintoshes. Those of us with either still have dual booting, so there's always that. I've come to the conclusion that the 2013 Mac Pro's trash can looks are a metaphor for what Apple thinks of professionals.

    • #os x
    • #apple
    • #ios
    • #tutorial
    • #how to

    Messages not delivered error fix

    The "Not Delivered" message eluded me longer than it should have after retiring my MacBook Pro 2012 Retina for a 2015 Retina. I was able to message anyone on an iOS device through Messages but unable to send SMS to non-iOS phones. Here's the fix; it requires both your iPhone and your Mac to be handy.

    Step 1:
    Confirm Messages has been configured on your Mac in the Messages preferences

    Step 2:
    On your iPhone, go to Settings -> Messages and then finally to Text Message Forwarding

    Step 3:
    Add your computer:

    Step 4:
    A confirmation code will be sent to your Mac in Messages; it may take a few seconds. Be ready to punch in the PIN on your iPhone.

    That's it

    • #security
    • #apple
    • #digital rights

    The Cost of the San Bernardino iPhone

    There’s a small point to be made here, insofar as it suggests the FBI is being disingenuous. They’re saying that it’s not about precedent, it’s just about this one phone, this one investigation. But the real reason they’re making a big deal out of it is that it’s politically useful. The phone itself likely isn’t important but the situation surrounding the phone — “terrorism” and the tragedy of 14 innocent people being killed — lends sympathy to their desire for access to encrypted devices all the time.
    - John Gruber, daringfireball.com

    The iPhone debate reminds me of a hot button issue like abortion or the climate debate, where the two sides are speaking different languages. Strangely, the one argument the other side clearly understands has yet to be made: economics. Forget what you know about encryption and the absurdity of a security backdoor; these are purely intellectual abstractions to the other side.

    Want a sure-fire way to slowly sink potentially hundreds of thousands of American jobs? Set the precedent that the US government has backdoor access. What foreign government, foreign firm or even foreign citizen wants to use systems that the US government can trawl at will? We've shown a complete and utter disregard for intelligence gathering on our allies, even our closest, so why assume anything else, even when wrapped in the best of intentions?

    • #front end development
    • #jekyll

    Grabbing the latest blog post from Jekyll using PHP

    On my website, greggant.com, I have a small section of my homepage that always displays the title of my most recent blog post. Back when my blog was hosted on Tumblr, I used the API to grab the latest post via a URL. On Jekyll, the solution is even easier: create a dummy template that only includes the post title and use PHP to write out the blog post title.

    Step 1: Create a blank template in _layouts

    I created a template I named blank.html, the entire contents are one line:

      {{ content }}   

    Step 2: Create a url

    Since the script I'll be using is all for public access, I kept it really simple and decided to make a new page in the root of my blog. I named it currentpost.html.

    ---
    layout: blank
    title: Inaudible Discussion
    ---
    {% for post in site.posts limit:1 %}
    {{ post.title }}
    {% endfor %}

    It'd be possible to create a URL directly to the post itself, but I opted for the absolute minimum amount of work, seeing as I'd rather link from my portfolio to my blog's homepage instead of the actual post.

    Step 3: Creating the script in PHP

    Lastly, comes the PHP, and it looks as such:

    <a href="http://blog.greggant.com">
      <?php
        $url = "http://blog.greggant.com/currentpost.html";
        $str = file_get_contents($url);
        echo $str;
      ?>
    </a>

    • #emulation
    • #ios
    • #tutorial

    iOS Emulation, gamepads, Cydia, Xcode, builds.io - A Tutorial for iOS emus



    ios 9 8-bit logo

    Before we get started, I'll be using quite a bit of vernacular related to emulation. Anyone unfamiliar with emulation can skim the Mini Glossary.

    Originally posted: 2015-12-09

    Emulation isn't exactly legal. Emulators themselves remain legal, jailbreaking remains legal, but downloading ROMs is legally a grey area at best and piracy at worst. However, this isn't about morality but an exploration of feasibility. With the prevalence of sites like Emuparadise, emulation is as alive and kicking as it was during the golden age of emulators, the late '90s and early 2000s, when projects like NESticle, ZSNES, SNES9x, UltraHLE and Connectix VGS were being actively developed.

    Lastly, emulation is a bit tricky to get up and running. I'll do my best to distill this into a comprehensive, one-stop guide for iOS emulation. This post encapsulates quite a few burnt hours, starting with two cross-country flights where my sole entertainment was my iPhone with a Moga Rebel, and my MacBook Pro. While downloading and compiling emulators, I realized that the information on iOS emulation is scattered and often unclear, and I immediately started kicking around the idea of writing a guide. What follows is the longest blog post I've ever written in terms of hours spent plus words. In the end, I downloaded 20+ emulators, took countless screenshots and notes, created original art, dropped over $100 and accrued over 5,000 words, the most I've written on a single document since college (excluding code).

    Android has a huge advantage over iOS when it comes to emulation, as its apps can be downloaded through the Google Play store. Make no mistake: if emulation is very important to you, iOS is not the go-to platform, as there's plenty more work and headache that goes into getting emulators up and running. That isn't to say iOS is bad at emulation, but getting emulation working on iOS is labor intensive, and iOS has a bit of catching up to do on game controller support even when compared to tvOS. If you're looking for the ultimate portable retro gaming console, an Android phone with a microSD card is likely the better option. Hopefully this post will be useful to someone, somewhere...

    Mini Glossary of Terms


    Emulator - A program that simulates/emulates a specific hardware configuration. These programs range vastly in technique and scope (and aren't limited to games). In order for old games to run properly, programmers recreate the functionality of the chipsets found in the old game hardware, allowing games to be played on non-native hardware. Due to the complexity of simulating hardware, emulators always require hardware more powerful than the original chipset. Emulators became prevalent in the late 1990s due to the rapidly increasing power of personal computers and the internet, which enabled developers to collaborate and distribute their code. Even a lowly 1998 iMac (Rev B) with a 233 MHz PPC G3 and 6 MB of VRAM was orders of magnitude more powerful than the SNES's hardware, and thus was able to play SNES games. During the early 2000s, emulators were able to add features not found in the original hardware. Famously, two now-defunct commercial attempts at game emulation were made around 1999-2001: the Bleem emulator, which allowed the Sega Dreamcast to play a few select PSX games, and Connectix Virtual Game Station, which allowed anyone with a Macintosh to play any PSX game and take advantage of superior load times. After lawsuits, commercial emulators were deemed a legal risk even though they seemed on the cusp of legality; UltraHLE, the first N64 emulator, was hit with legal threats from Nintendo, and thus emulation today mostly lives on in open source (see Source Code). For further reading, emulators vary between high level emulation and low level emulation. Today, the average smartphone is orders of magnitude more powerful than retro game consoles.

    ROM - Read-Only Memory. Cartridge based game consoles like the NES, SNES and Genesis used read-only memory chips to store games. Games with the ability to save used specialized batteries and additional memory for saving purposes. Later game consoles like the Playstation and Xbox used less expensive optical media that was also read-only, hence the terms CD-ROM and DVD-ROM, and opted for memory cards or internal memory to save games. The term ROM is usually used interchangeably for any game file.

    ISO - Shorthand for the common ISO 9660 filesystem used on CD-ROMs. ISO files are byte-for-byte, self-contained data dumps of entire CD-ROMs and DVD-ROMs. These files can be transferred and later burned onto optical media. Some early game consoles, such as the Sega CD and Neo Geo CD, did not have any sort of copy protection, so games could be pirated easily by simply downloading and burning them onto CDs. Later consoles employed various methods of copy protection, such as the surprisingly simple but effective bad byte header that the Sony Playstation employed. Today the ISO file format can contain variants outside of the 9660 standard.
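    Since an ISO is nothing more than a byte-for-byte dump, a copy of one is indistinguishable from the original data. A minimal sketch (the input file here is an illustrative stand-in; a real dump would read from the optical drive's device node, e.g. /dev/disk2 on OS X):

```shell
# Stand-in file for a disc; a real dump would read from the drive device.
# The point: an ISO is a plain, unmodified byte-for-byte copy.
printf 'sector-data' > disc.bin
dd if=disc.bin of=disc.iso bs=512 2>/dev/null
cmp disc.bin disc.iso && echo "byte-for-byte identical"
```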

    Source Code - A program’s instructions written in their original language, before being compiled and translated to machine language. In the context of iOS emulators, emulator source code can be downloaded freely and compiled into native iOS apps using Apple’s Xcode. The resulting program can then be transferred to an iOS device. Open Source literally means the source code is open to viewing; thus anyone can view and modify the code, and submit bug fixes, features, optimizations, and changes to the curators of a project to be reviewed. Open source projects can be found in a variety of places like GitHub. Due to the licensing constraints of Apple’s App Store, pre-compiled apps cannot be installed on iOS outside of the App Store, but developers need a way to test their code on iOS devices, thus projects compiled through Xcode can be loaded onto an iOS device.

    Blitter - Blitter is originally a term for a coprocessor, but in the world of emulation it refers to graphics rendering techniques. Game consoles like the Genesis and SNES had a max resolution of 320px x 240px, paltry when compared to modern consoles supporting 1920px x 1080px (1080p), which has 27 times as many pixels. Computers support a range of resolutions, all many times greater than retro game consoles, and the iPad Air and Mini both sport a native resolution of 2048px x 1536px, even greater than 1080p TVs. Due to the resolution differences, the retro consoles appear pixelated when blown up to modern display resolutions. Programmers have figured out various techniques to upscale graphics and make them appear smoother for 2D game consoles; these were called blitter libraries back in the early 2000s and have steadily improved over the years. 3D game consoles like the Nintendo Gamecube & Wii and Sony Playstation 1 & 2 & PSP have emulators that render the polygons at modern resolutions and apply effects like full screen anti-aliasing, more detailed particles, and in some edge cases, tweaks that change draw distances and textures. Thus emulated consoles can attain much greater image quality than the original consoles could, especially for 3D games.

    Below is an example using OpenEmu’s blitter “filters”. 


    Blitter libraries can drastically improve image quality, although the most advanced, like SABR, can be quite CPU intensive.

    Getting Started with iOS emulation

    8 Bits of iPhone

    Unlike on Android, emulation isn’t something Apple condones. Occasionally an app slips through the App Store, such as iDOS (a well constructed DOS emulator). Some apps hide functionality in them, such as FloppyBird (a hidden NES emulator). These are unreliable and, assuming they somehow make it past Apple’s screening, often pulled instantly from the App Store. This likely has less to do with any beef Apple has with game emulation than with emulation in general.


    Thus this leaves two avenues, the first being the most obvious: jailbreaking. By now most users are pretty familiar with jailbreaking. Rather than re-explain it, I’ll quote my favorite resource for iOS, iDownloadblog.com:

    Jailbreaking is the process by which Apple’s mobile operating system, iOS, is modified to run unsigned code in order to gain access to files that Apple wouldn’t normally let you access.

    Jailbreaking adds unofficial application installers to your iOS device, such as Cydia, which let you download many 3rd-party applications, tweaks, and extensions that are unavailable through the App Store. These packages open up endless possibilities to do things on your iOS device that a non-jailbroken iOS device would never be able to do.

    - iDownloadblog.com, Jailbreak

    Rather than walk through jailbreaking here on this blog, I’ll note there are entire websites and forums dedicated to it. While I’ve written a few popular jailbreak-centric blog posts (including on iFile), I am not an expert. You can find all the necessary files and instructions on jailbreaking at iDownloadblog.com.

    Jailbreak free - Build your own

    The second avenue is rooted in the Enterprise Distribution certificate exploit, which installed apps via HTTP. Originally, a few clever authors took advantage of a flaw in iOS 8’s enterprise app deployment. The popular emulator GBA4iOS was distributed by this method. The advantage is that it didn’t require a jailbreak; the downside is that Apple quickly patched this glitch in subsequent iOS updates.


    These emulators quickly became open source; anyone with a copy of Xcode and a developer license can download and compile the open source projects.

    The advantage is it’s totally free and unlikely to be shut down by Apple. The downside is you need to jump through several hoops. Notably, a few things have changed since the previously linked guide was written. This is by no means a perfect guide, but these are the standard steps to get up and running.

    Step 1) Download Xcode from the Mac App Store

    Step 2) Install CocoaPods

    CocoaPods is a dependency manager for Xcode projects. Since many applications have various dependencies, you’ll need it in order to properly compile many open source projects. The CocoaPods install process depends on your OS version. OS X 10.10 and below only require one terminal command: sudo gem install cocoapods

    One of OS X 10.11 El Capitan’s changes is a new security provision for rootless access (System Integrity Protection). In reality there’s still a root account, but the OS locks off certain directories, meaning even a user with root access (full file system) cannot modify certain portions of the operating system. The default location for gem installs was in a locked area, so the single install command above is not enough on its own. Rather than disable the rootless security provisions, it’s safer to put the installed files elsewhere on your PATH. You’ll need to follow the instructions by running the following commands in your terminal:

    1. $ sudo gem install cocoapods

      This should execute and successfully install the gems:

      Fetching: nap-1.0.0.gem (100%)

      Successfully installed nap-1.0.0

      Fetching: thread_safe-0.3.5.gem (100%)

      Successfully installed thread_safe-0.3.5

      Fetching: minitest-5.8.2.gem (100%)

      Successfully installed minitest-5.8.2


    2. Once done, run:

      $ export PATH=$PATH:/Library/Ruby/bin
    3. Test to make sure CocoaPods has installed properly by running:

      $ pod --version
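    If you’d rather avoid sudo entirely, another common workaround (my own habit, not part of the official instructions) is to install gems into your home directory and put that directory on your PATH:

```shell
# Sketch: install gems under the user's home directory so nothing touches
# SIP-protected paths. The GEM_HOME location is an assumption; adjust to taste.
export GEM_HOME="$HOME/.gem"
export PATH="$GEM_HOME/bin:$PATH"
# gem install cocoapods   # uncomment to actually install (no sudo needed)
case ":$PATH:" in
  *":$HOME/.gem/bin:"*) echo "gem bin dir is on PATH" ;;
  *) echo "PATH not updated" ;;
esac
```

Add the two export lines to your shell profile to make the change stick between sessions.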

    Step 3) Launch Xcode

    Go to Xcode -> Preferences, click the Accounts tab, and add your Apple ID.


    Step 4)

    Download an open source project, and open the xcodeproj file.

    Step 5)

    This is where things start to get tricky: you’ll need to navigate to the General project settings tab and change the Bundle Identifier to a unique name, and the team to your Apple ID.

    Next, change the target device to your connected iOS device. Click Fix Issue, and Xcode should make some corrections.
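    For reference, those GUI changes map to xcodebuild build settings. The sketch below only assembles the command (the project name, device name, and bundle identifier are all placeholder assumptions); actually running it requires Xcode and a connected device:

```shell
# Placeholders throughout: substitute your own project, device, and identifier.
BUNDLE_ID="com.yourname.emulator"     # must be unique to you
CMD="xcodebuild -project Emulator.xcodeproj \
  -destination 'platform=iOS,name=My iPhone' \
  PRODUCT_BUNDLE_IDENTIFIER=$BUNDLE_ID build"
echo "$CMD"
# To actually build onto the device: eval "$CMD"
```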

    Step 6)


    Getting projects to work can involve many steps, so keep in mind that they may need additional corrections. Be prepared to visit GitHub projects, wikis, and forums to guide you. This is currently also the only way to get emulators on the newly released Apple TV.

    Sounds a bit complicated, right? That’s because it is. I’m a web developer by trade, and it took me a few hours to navigate the process as I’m not an iOS dev. However, there’s another way to go: have someone else do the work for you.

    Jailbreak free - Have someone else make the builds - Builds.io
    A Builds.io mini-review


    Since the famed certificate exploit, a few of the emulator authors for iOS combined forces to form Builds.io, a service to distribute HTTP-installable enterprise apps. In order for iOS to be viable in the corporate world, it needs the ability to install applications outside of the App Store, for internal apps that aren’t available there for any number of reasons: not waiting for Apple’s validation, privacy/security, quick distribution, and so forth. Builds.io uses Apple’s own built-in enterprise functionality. It works by adding your device ID to the Builds.io enterprise network, which authorizes its apps to be installed via HTTP. The catch? It costs $10 a year. Sounds fishy? I was wary but decided to take the plunge.

    To add to the questionability, Builds.io doesn’t have the full blessing of some of the emulation authors either. I found the following exchange on Twitter between a user of Builds.io and the author of LibRetro, which makes RetroArch (a very popular open source emulator for iOS) possible. As of this writing, RetroArch does not appear on Builds.io.


    Signing up for Builds.io takes almost no time, that is, if you don’t run into any PayPal problems. I had deleted my PayPal account and had to create a new one simply to pay for Builds.io. Signing up for the service takes only a few short minutes to create an ID for your device. Upon payment, your account is activated (which takes about 2-3 minutes) and then you’re free to install applications.

    Installing applications from Builds.io is as simple as the Apple App Store. Click the install button, click it again, and a message will alert you that you’re about to install an iOS app. Once the install button is clicked, the app will start installing in the background. Emulators are pretty small, and even on LTE it only takes seconds. Once installed, the apps behave like any other iOS application. You’ll find most of the emulators listed below at Builds.io. It works great, and there were zero hiccups. A $10 license only covers one device, however, so I did not test this on my iPhone 5 or 4.

    Would I recommend the service? The short answer: yes.

    I recommended this to my technically-inclined-but-not-a-developer brother, who spends all day navigating EDLs, codecs, and deliverables as a post-production manager; he’s quite the retro gamer but probably lacks the patience to compile his own emulators. Admittedly, I used Builds.io for many of my emulator installs. However, I’m still unsure as to who the money goes to, and while one may argue you’re paying for the service and not the open source software, it still feels like you’re doing exactly that: paying the wrong person for someone else’s hard work.

    Picking your path


    Ideally, anyone interested in iOS emulation would have both a jailbroken iOS device and the open source emulators (either by compiling the applications yourself or paying the $10 for Builds.io or a similar service). There are some incredible jailbreak-only features, like Bluetooth PS3 controller drivers and Controllers for All’s PS4 controller drivers.

    Now for the confusing part: there’s no magic bullet to unlock all the iOS emulators.

    • Some of the emulators are available through Cydia and open source like PPSSPP.
    • Some are available only through open source repositories and Builds.io like GBA4ios.
    • Some emulators are available only through Cydia, like SNES9x EX+ and several helpful tweaks/drivers.

    Having a jailbroken device also makes uploading ROMs to your iOS device easier. The ideal iOS emu-gamer would compile her/his own open source emulator AND jailbreak her/his device.

    Currently the momentum is in the compile-your-own side of things as the best iOS emulators are mostly found through open source.

    Software Recommendations

    Whether you chose the jailbreak route or the non-jailbreak route, there are a few recommended utilities to make your life easier: iFile (an iOS file management system, available through Cydia) and iMazing (a Windows/Mac utility for device management) or iExplorer.


    iFile is routinely one of the most recommended iOS jailbreak utilities. I’m quite a fan and have written a few articles related to it. iDownloadblog wrote a post a few years back, “Why Every Jailbreak Power User Needs iFile”. Specifically pertaining to iOS, iFile allows users to copy, move, open, and delete files, create directories, and perform the usual file management tasks. iFile will be mentioned routinely.


    iMazing is a wonderful OS X and Windows utility that allows device management outside of iTunes, including full file system access for jailbroken devices. Non-jailbroken users can still use limited file management, which is still useful for ROM management. I picked this up as part of a bundle through macupdate.com, but it can be purchased for $35. Spendy, but it’s the best overall iOS management utility I’ve used.


    iExplorer is geared for one thing and one thing only: browsing your iPhone from your Mac or PC. The advantage over iMazing is that the var/mobile/Applications folders are put front and center. Normally, viewing through iFile, every application resides in a folder named with a random string of letters and numbers. iExplorer has a viewing option that lists the folders by the name of the application they contain, which makes life easier. Again, it rings in at $35. It also has one other killer feature: mounting your iPhone in the Finder.

    Each of these applications is worth buying, but you only need one. iMazing is more bang for your buck, but iExplorer is better strictly for transferring ROMs. While writing this article, I personally switched to using iExplorer strictly for ROM transfer.

    A rundown of the Emulators


    I personally downloaded every emulator I could find and tested them. For my sanity’s sake, I’ve kept my reviews short, and for your sanity’s sake, I’ve noted where the ROMs are stored. Note: locating the ROMs using iFile or iMazing will take some trial and error; I’m listing these by iExplorer’s locations.

    Jailbreak Emulators

    The following is a list of emulators that can be downloaded via Cydia.

    PPSSPP - A very impressive PSP emulator that allows for upscaling games beyond native resolutions, FSAA, and other graphical tweaks. On my iPhone 6 I ran games at 3x their original resolution with only an occasional hiccup. Full gamepad support to boot. Highly recommended.

    Default Rom location: /var/mobile/Documents/PSP/

    SNES9x EX+ - One of several SNES9x-based SNES emulators, with full gamepad support and support for larger-screen iOS devices. With the plethora of SNES emulation options, this is one of the best.

    Default Rom location: /private/var/mobile/media

    RetroArch - A powerful multi-console emulator for iOS built on multiple emulator cores, supporting the following: NES (NEStopia, QuickNES, FCEUmm), SNES (bSNES, SNES9x, SNES9x Next), Nintendo DS (DeSmuME), Sega Genesis/MegaDrive/Master System/Game Gear/Sega CD (Genesis Plus GX & Picodrive), Atari Lynx (Handy), Arcade (MAME 2003), Neo Geo Pocket (Mednafen NeoPop), PC Engine/TurboGrafx-16 (Mednafen PCE), Sony Playstation (Mednafen PSX, PCSX ReARMed), Virtual Boy (Mednafen VB), Nintendo 64 (Mupen64Plus), GameBoy Advance (VBA Next, VBA-M).

    The catch? The Cydia version is massively out of date; you’ll need to use Xcode to build your own. See RetroArch in the open source section, then target your iOS device and hit build.

    SNES A.D.+ - A classic iOS SNES emulator. Somewhat out of date; only supports Wiimotes and doesn’t include full screen support for wide iOS devices (iPhone 5 and beyond).

    Rom location: /private/var/mobile/Media/ROMs/SNES/

    Genesis AD+ - A classic iOS Genesis emulator. Somewhat out of date; only supports Wiimotes and doesn’t include full screen support for wide iOS devices (iPhone 5 and beyond).

    Rom location: /private/var/root/media/ROMs/GENESIS

    GBC AD+ - A GameBoy / GameBoy Color emulator. Somewhat out of date; only supports Wiimotes and doesn’t include full screen support for wide screen iOS devices.

    Rom location: /private/var/root/media/ROMs/GameBoy Color

    NES AD+ - A classic iOS NES emulator. Somewhat out of date; only supports Wiimotes and doesn’t include full screen support for wide screen iOS devices.

    Rom location: /private/var/root/media/ROMs/NES

    Nescaline - A classic iOS NES emulator. Still working even on iOS 8/9 but woefully out of date. No controller support.

    Rom location: /var/root/Media/ROMs/NES/

    Installable Cydia emulators that are incompatible with iOS 8/9: RockNES, NES4iPhone, SNES4iphone, Gameboy4iphone.

    Open Source Emulators

    The open source emulators are currently where the action is; most are still actively developed, and thus the emulators are of a slightly higher standard.

    The ROM locations are the application directories themselves unless stated otherwise.

    NDS4iOS - Nintendo DS emulator. Works well and supports gamepads, but due to the DS’s dual-screen layout, not the best for landscape gaming.

    PPSSPP - Same as the jailbreak version: impressive, and one of the best emulators on iOS.

    SNES4iOS (github.com/rileytestut/SNES4iOS) - A SNES emulator based off SNES HD, another SNES9x-core-based emulator. Freeze states are partially supported: iPhones can save them but not load them; iPads can both load and save freeze states. With almost no updates in 3 years, SNES4iOS is probably defunct.

    MeSNEmu - An SNES emulator using the SNES9x core, runs well, has minor smoothing, runs without dropping frames on my iPhone 6. Supports a number of controllers. ROMs can be copied directly to it through iTunes or popular utilities.

    Provenance - Multi-console emulator that supports the Sega family of consoles (Master System / Game Gear / Genesis / 32X) via Genesis Plus GX, plus SNES through SNES9x, NES, and GameBoy Advance. Supports gamepads; Genesis runs exceptionally well. Includes game box art.

    Gearboy -  GameBoy Color and GameBoy emulator.

    Gameplay Color - Another well written GameBoy and GameBoy Color emulator.

    GBA4iOS - The emulator that caused a stir online. It works nearly flawlessly, though it could use some blitter support. ROMs can be added directly through the application using its web browser support.

    Rom Location: var/mobile/Applications/GBA4iOS/Documents

    N64iOS - A work in progress that crashes shortly after launching. It’s sitting on GitHub and hasn’t seen a commit in over two years.

    Reicast - An Android Dreamcast emulator with the beginnings of an iOS port. The Reicast project currently contains iOS and OS X project files; you can even compile Reicast to a phone, read the debugger files, and upload the Dreamcast BIOS, but just don’t expect anything to happen. Seeing as it’s a functioning Dreamcast emu on Android, the iOS port may eventually catch up.

    RetroArch - The same powerful multi-console emulator described in the jailbreak section; the open source build is current and is the version you actually want. See the setup notes below.

    The guide can be found here, but it’s a bit overkill.

    Setting it all up


    A few of the emulators require setup beyond the normal steps described above.

    RetroArch

    RetroArch requires more work than the usual emulator to compile due to its complexity, and the jailbreak version is woefully out of date and doesn’t support gamepads. RetroArch is easily the most powerful emulator on iOS or Android due to its insanely broad support, and it is currently the only way to play PSX/N64 games under iOS.

    • Go to RetroArch’s GitHub project and download it.

    • Next go to RetroArch’s Assets github project and download it.

    • Rename retroarch-assets to “assets” and move it inside the main RetroArch project’s media folder.

    • Go to /pkg/apple/iOS within the project, and open RetroArch_iOS.xcodeproj.

    • In Xcode, give the Bundle Identifier a unique name, and assign the team to your Apple ID.
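    The rename-and-move step above can be sketched in the terminal as follows (the mkdir line merely stands in for the two unzipped downloads; directory names are from the steps above):

```shell
# Stand-ins for the two downloaded/unzipped folders:
mkdir -p RetroArch/media retroarch-assets
# Rename the assets checkout to "assets" and move it into the project's media folder
mv retroarch-assets RetroArch/media/assets
ls RetroArch/media
```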


    Gamepads are a must for emulation gaming. While all the emulators include variations on on-screen touch controls, they’re a far cry from a physical gamepad. There are a lot of gamepads on the iOS market, all of which use Bluetooth to connect to an iOS device. This inherently makes iOS gamepads more expensive than the cheapest USB options for Android. There have been a few roundups of gamepads for iOS, including Gizmodo’s and TechRadar’s, but the modern favorites are the well-known Mad Catz C.T.R.L.i and the lesser-known Moga Rebel, both featuring a simple phone clip.

    I opted for the Moga Rebel, partially for availability, as I made a spur-of-the-moment decision to buy a gamepad at Target before a cross-country flight. The Moga Rebel isn’t cheap; at $79.99 it’s nearly as expensive as a Nintendo DS, which makes even current generation console controllers seem like a good buy at $50-$60. The Moga sports a micro USB port, used exclusively for charging its rechargeable battery. While Moga doesn’t boast about battery life, it certainly lasts longer than the iPhone and then some.

    The other key feature is a simple fold-out clamp with rubberized grips that allows the gamepad to hold anything from an iPhone 3G to an iPhone 6s tightly. It works great too: during a very turbulent return flight, even when I heard a few items fall and had to pause to grip my controller, my iPhone 6 never came loose.

    The gamepad is clearly inspired by the Xbox 360’s, which to date is the best feeling console controller, complete with two analog sticks, two analog triggers, two bumper buttons, a d-pad, 4 buttons, and a single start button. It’s only a select button shy of covering all the bases.

    The pad is both lighter and wider than the Xbox 360 controller, but I didn’t notice either until I had the two side by side. The Moga Rebel feels “right”, an almost indefinable characteristic. Everyone who’s used a generic gamepad can describe how “off” a controller can feel. Does it feel like an $80 controller? I’m not sure, but it certainly feels like a premium product.

    Most games that support gamepads supported the Moga Rebel. I got a kick out of playing Metal Slug and Knights of the Old Republic, both of which played flawlessly and required zero config. I also realized a few games in my library were finally playable, such as Sonic CD. iOS gaming is still the wild west when it comes to gamepad support.

    Emulators boasting gamepad support magically worked with the Moga Rebel. As one might suspect, I ended up spending most of my gaming time in emulation, particularly with PSP games. At 3x their original resolution, and on an infinitely better screen than the original PSP’s, the PSP games I played (Little Big Planet, Lumines) played like native apps, perhaps even better. I even took a break to play the GBA remake of Super Dodge Ball and beat the game, and played a game of Tecmo Super Bowl (with a ROM with updated rosters) for the Sega Genesis.

    Other than the price and a missing select button, there’s nothing to complain about with the Moga Rebel. However, I can’t recommend the gamepad for one simple reason: support. Moga seems to be dead or mostly dead. Other than a Twitter account that fires off a flurry of tweets every month or two, their website appears un-updated, their support non-existent, and there’s no OS X support. Had I known, I’d have opted for the less expensive Mad Catz controller.

    The wrap up


    iOS emulation takes a bit of work. Builds.io makes it easy for the less technically inclined, but anyone willing to spend the time with Xcode and jailbreaks can unlock the full potential of their iPhone. To get up and running, you’re looking at a $50 investment for a Mad Catz Micro C.T.R.L.i. If you want to go further, add $10 for a Builds.io subscription and $35 for iExplorer, for a grand total of $95.

    I’ll try and keep this article up-to-date, as I’ve done with some of my other guides, and clean it up over time.
    • You’ll want to have a jailbroken device but it isn’t required.
    • Support for controllers varies wildly on a per emulator basis. Quality of emulators varies quite a bit too.
    • You will not enjoy iOS emulation very much without a gamepad.
    • You will need to either subscribe to a service like Builds.io or learn to operate Xcode to get many of the best emulators.
    • The more storage on your iPhone, the better. PSP games often clock in at a GB plus, and PSX games often around 600 MB. I am thankful for my 128 GB phone.


    12/11/15: Minor restructuring of setup section, corrections.

    12/10/15: Added original art, removed icons of emulators, article formatting. Wrap up.
