How to fix Far Cry 4 Common Glitches - Black Screen - Uplay stopped working - Save Game will not save

    scumbag ubisoft uplay

    The iconic “Scumbag Steve” hat is certainly overused but also eloquently summarizes Uplay.

    Common PC / Windows Far Cry 4 Fixes

    Gaming and Windows are two things I usually don’t blog about. That said, after spending several hours battling THREE separate problems with Uplay, I figured I’d spread the knowledge. Hopefully someone will find this useful. Ubisoft makes some great games, so it’s unfortunate to see them marred so badly by the problematic Uplay service.

    GameSpot has even written that users are complaining the game is marred with bugs and glitches. It looks as if Ubisoft is aware of plenty of other problems, with a new patch, 1.6, arriving on December 30th, 2014.

    For those experiencing random crashes, the 1.6 patch (weighing in at a hefty 577.4 MB) includes RAM-leak fixes among other tweaks. Make sure you’re running the latest version of Far Cry 4.

    Problem #1: I downloaded Far Cry 4 via Steam but I can’t launch the game

    Apparently downloading the game from Steam doesn’t install everything you need, nor does the game alert you that you need additional software. To remedy this problem, you’ll need to download and install Uplay from the Uplay website.

    Problem #2: Far Cry 4 stopped saving game / Far Cry 4 won’t save game.

    In my case, I was able to play Far Cry 4 for several hours before it stopped saving games. If I went to the pause menu and hit Save Game, the yellow save icon flashed normally, but my mouse pointer disappeared and my keyboard stopped working. The game acted like it was still running properly, as the menu sound effects continued to play and the menu background animations continued to run.

    I couldn’t find a straight answer for this, but I figured out a solution. It appears that the Uplay service (responsible for managing game saves) no longer has the correct permissions to save the game; essentially, it has become locked out of its own directory. To remedy this, do the following:

    1. Locate the Uplay folder; it should be under Program Files/Ubisoft/. Drag it to a safe location, such as the desktop (we’ll need it later).
    2. Download the Uplay installer and run it.
    3. Drag the Save Games folder from your old Uplay folder into the newly installed one.
    4. Open the old data folder and drag any missing files into the new Ubisoft/data folder. Do not overwrite any folders.
    5. Launch the Uplay application. It should take a few moments to sync up.

    I highly recommend preemptively following the Honest Canadian’s fix (see below) to prevent future problems.

    image

    Problem #3: Uplay launcher has stopped working. A problem caused the program to stop working correctly.

    Part 1 - The Honest Canadian’s fix

    Another studious blogger (the Honest Canadian) posted a fix guide here, complete with pictures. However, it didn’t quite work for me. Try the Honest Canadian’s fix first; if it doesn’t work, you won’t harm anything.

    1. Locate your Uplay Folder, under Program Files/Ubisoft/
    2. Right click the folder and choose Properties.
    3. Click the ‘Security’ Tab and click ‘Edit’. 
    4. Select your user profile (the one that reads YOUR\Users) and set the Allow column of check marks to ‘Full Control’. Click OK.
    5. Launch the Uplay application to verify it is working.

    However, this didn’t work for me. I attempted to take ownership of the folder, only to find the permissions had been set in such a way that none of the files and folders could be changed by me or any other user.

    You may find that you cannot alter any of the settings in the Uplay folder.

    Verify that you are signed in as an admin. If you aren’t, sign in as one and try the Honest Canadian’s fix first.

    Part 2 - The Honest Canadian’s fix didn’t work :(

    Assuming you are signed in as an admin and the Honest Canadian’s fix did not work, you’ll need to combine the double whammy of my save-game fix and the Honest Canadian’s fix.

    1. Locate the Uplay folder; it should be under Program Files/Ubisoft/. Drag it to a safe location, such as the desktop (we’ll need it later).
    2. Download the Uplay installer and run it.
    3. Drag the Save Games folder from your old Uplay folder into the newly installed one.
    4. Open the old data folder and drag any missing files into the new Ubisoft/data folder. Do not overwrite any folders.
    5. Launch the Uplay application. It should take a few moments to sync up.
    6. Right click the Uplay folder and choose Properties.
    7. Click the ‘Security’ tab and click ‘Edit’. Select your user profile (the one that reads YOUR\Users) and set the Allow column of check marks to ‘Full Control’. Click OK.
    8. Launch the Uplay application to verify it is working.
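    For those comfortable with the command line, steps 6–7 can also be done from an elevated Command Prompt. This is only a sketch: the install path is an assumption, so adjust it to wherever Uplay actually lives on your machine.

```shell
:: Run from a Command Prompt opened with "Run as administrator".
:: The path below is an assumption -- point it at your actual Uplay folder.
takeown /F "C:\Program Files (x86)\Ubisoft" /R /D Y
icacls "C:\Program Files (x86)\Ubisoft" /grant "%USERNAME%":(OI)(CI)F /T
```

    takeown claims ownership recursively, and icacls grants your account full control (the (OI)(CI) flags make the grant inherit to files and subfolders), which is the command-line equivalent of ticking ‘Full Control’ in the Security tab.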

    Problem #4:  MSVCR100.dll Missing Error

    I haven’t encountered this one, but I saw this fix while searching for answers. If you see this error, you need to install the redistributables:

    1. Go to <path-to-game>\Support\Software\
    2. Install all items in this folder.

    Problem #5:  Crashing at Start Screen 

    Again, this is an issue I haven’t experienced but other players report some success doing the following:

    1. Switch Uplay to offline mode and play the campaign offline.
    2. Reboot if the problem continues.

    Nerd Stuff:

    I probably won’t be able to help anyone troubleshoot their Far Cry problems, as I’m a Mac-using, unix-loving web developer. Windows is the operating system I boot to play games and to test Internet Explorer. I’m not a hardcore gamer, but I do enjoy games. Chances are, if you’ve been gaming for a half-decade on Windows, you know as much as I do, if not more.

    Windows users may not understand this, but the permissions issue frustrated me immensely without the ability to just jump into the terminal and type: sudo chmod -R 777 "/Program Files/Ubisoft/Uplay". However, permissions are permissions and can be a pain on any OS.

    Also, a shout out to the Honest Canadian, whoever you are, you magnificent bastard. Their post led me to pinpoint the save-game issue as a permissions issue.

    Thanks for reading! I’m all ears for feedback, and happy to respond.

    Edit: Dec 4th, 2014

    Added MSVCR100.dll + Start Screen fix.

    Edit: Dec 30th, 2014

    Added info about the Far Cry 4 1.6 patch


    Adding ringtones and text tones to iOS using iFile

    Before we get started…

    Disclaimer: This tutorial is for jailbroken iOS 7.x / iOS 8.x phones with iFile, available via Cydia. iFile is a utility that allows you to access your phone’s file structure. This will NOT work on non-jailbroken iOS devices, though you can still add custom ringtones and text tones to non-jailbroken devices by syncing through iTunes.

    image

    Image: iFile logo. iFile is easily one of the best jailbreak mods money can buy.

    I recently decided I’d like to add the classic Mac OS sounds from OS 7, OS 8, and OS 9 to my iPhone; however, I was at work and couldn’t sync my phone to my work computer. Besides, who doesn’t love Wild Eep, Sosumi, or Quack?

    You can download classic error sounds here. Edit: this URL no longer seems to work. The Mac OS classic sound pack even comes with .m4r versions of each file, iOS’s native ringtone/text tone format.

    Not one to be deterred so easily, I decided to do it manually. 

    Step 1:

    First, connect your iPhone and computer to the same WAP (Wi-Fi network).

    Step 2:

    Next, fire up iFile and tap the web server button (the globe icon).

    image

    Step 3:

    Plug the IP address into your web browser on your desktop computer (in this example it’s 192.168.0.103:10000).

    Step 4:

    Locate the Ringtones folder; it will be located in var -> stash -> _.somefilename -> Ringtones.

    Note: You will have several folders in /var/stash/, so you’ll have to locate the right one manually.

    image

    Upload your sounds into the Ringtones folder; they must be in the .m4r format. There are several utilities that can convert audio for you, or you can use iTunes.
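    If you’d rather not use iTunes and you have ffmpeg installed, a quick re-encode will also produce an .m4r (which is just AAC audio in an MP4 container). The filenames here are placeholders:

```shell
# Convert a source sound to AAC in an MP4/iPod container, named .m4r.
# "-f ipod" forces the container, since ffmpeg may not infer it from
# the .m4r extension. Filenames are placeholders -- use your own.
ffmpeg -i Sosumi.aiff -c:a aac -b:a 128k -f ipod Sosumi.m4r
```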

    Step 5:

    iOS 7 & 8 use a .plist to list your ringtones. In theory, you could manually alter it so your new sounds show up, but it’s much easier to use a Cydia tweak.

    Connect to Cydia and search for ToneEnabler. It’s free, tiny, and open source, and it allows iOS to find your new ringtones without using a plist. Once installed, any manually installed ringtones will show up in the Settings -> Sounds lists.

    Install the tweak (it’ll require you to restart SpringBoard).

    image

    Step 6

    Enable your sounds on your iPhone from Settings -> Sounds.

    image

    And that concludes manually adding custom ringtones and text tones to iOS via iFile.


    More iFile Tutorials: 

    Using iFile + Zeppelin to create custom carrier icons


    Speed up Modern.ie virtual machines in VMware

    If you haven’t been to Modern.ie and you’re a web developer, you really should visit. Microsoft provides free VMs (virtual machines) for pretty much every Internet Explorer configuration imaginable, for every major VM platform: Parallels, VMware, VirtualBox, etc.

    Strangely, the VMs come pre-configured with below-recommended settings for VMware. It only takes a minute or two to reconfigure the virtual machines properly.

    Step #1:

    Shut down your virtual machine

    Step #2:

    Right click your VM and click Settings.

    image

    Step #3:

    Under Settings, click “Processors & Memory”. First, you’ll want to give your VM access to at least 2 processor cores if you’re on an i5, i7, or Core 2 Quad.

    image

    Secondly, you’ll want to allow the VM access to at least the recommended amount of RAM. If you’re on a computer with an ample amount of RAM (16 GB), you may want to give it double the recommended amount for increased performance.

    You may note the “Enable hypervisor applications in this virtual machine” option, which may benefit speed for Windows XP, Windows Vista, and Windows 7 configurations; Parallels recommends it for speed.

    To quote StackOverflow poster David Pokluda

    The main difference is that Hyper-V doesn’t run on top of the OS but instead along with the system it runs on top of a thin layer called hypervisor. Hypervisor is a computer hardware platform virtualization software that allows multiple operating systems to run on a host computer concurrently.

    Sounds good, right? However, Windows 8 requires additional configuration, and so does Windows 8.1. You’ll want to enable the hypervisor for Windows 8 if you intend to use Visual Studio or the Windows Phone SDK. Note: if you do not configure it properly, it will decrease performance for Windows 8, since Windows 8 has drivers for the Intel VT engine built in.

    So should you or shouldn’t you use it? For Windows XP, Vista, and 7, yes; for Windows 8, check the links above. If you have any doubts, you can safely skip enabling the hypervisor, as it only provides modest gains in VM performance when running a single VM for browser testing.

    Step #4:

    Boot your VM. VMware may ask you to upgrade the virtual machine; click yes. Once booted, Windows will install new drivers and need to reboot. Reboot one last time, and enjoy.


    Gaming services are the new shovelware

    On Tuesday, November 25th, I posted fixes for the video game Far Cry 4. It was my most-read blog post in November, despite only being up for 5 days.

    While writing the post, I could barely contain the absolute disdain I have for the new gaming services / digital distribution and digital rights mechanisms.

    Gaming services are the new shovelware.

    In the past few years, after watching the rise of Valve’s Steam and Apple’s App Store, we’ve seen an explosion of gaming services built in an effort to monetize intellectual property… mostly forced upon users.

    The idea is simple: if you design a game that requires a service/distribution platform, the buyer is converted into that service’s user base. However, these same companies still need Steam in order to reach customers, and thus you have services that replicate (often poorly) features Steam has perfected over the course of a decade.

    The offenders are many: Rockstar Social Club. Uplay. Windows Live. Origin. Yet none of these services incentivize users to love them. Gone is buy -> install -> play, the wonderful three-step process that Valve introduced and Apple took and ran away with. Now, when you buy a game on Steam, you have to buy -> install -> register for a service -> verify registration -> sign into the service -> play.

    These would be minor annoyances if they always worked smoothly, but often they don’t.

    Some services are only quasi-evil: CryTek’s MyCrysis and WB Play are nagware at best. A very small handful of services are good: Gearbox’s SHiFT is completely optional and nag-free. You can choose to sign in and get some nice bonuses, and it doesn’t try to replicate the functions Steam performs. The WB gaming service that ships with Shadow of Mordor I’ve successfully avoided. I’m not sure what it does (it replicates some of Steam’s social functions), but I don’t care. I don’t play games online (besides Borderlands).

    In the past two years, I’ve had three separate games rendered unplayable entirely thanks to their services: Max Payne 3, Fable 3, and Far Cry 4. I am not a hardcore gamer. I don’t even call myself a gamer, nor do I play a lot of games; I play games occasionally in Windows on my Mac Pro. With Max Payne 3, I could not play the game until I signed into its service, and unfortunately it would not let me sign in under OS X. I had to download the 30 GB game in Windows, sign up in a web browser (sign-ups within the game were broken), then sign into the game, save my game, reboot, and copy my game save to Max Payne in OS X. Fable 3 defeated me. The save game corrupted, so I stopped playing. Not long after, Windows 7 borked and I had to reinstall (I kept all my data), and I forgot all about Windows Live. Fast forward months later: I was ready to try my hand at Fable 3 again. Sadly, since I don’t know what email address I registered the game to, I can’t sign in to play it despite having a legit key tied to my Steam account. Most recently, Far Cry 4 had not one but three separate glitches that broke the game, all entirely thanks to Uplay. After hours of work, I managed to fix it, and even documented how in a blog post. I’m a privileged class of user: I’ve been using computers since 1992, have had an internet connection since 1997, hold a technology degree, and work as a web developer. When I play games, I’m not out to replicate my job.

    We’ve seen some entirely terrible schemes: Ubisoft’s Uplay scheme that required an active internet connection AT ALL TIMES, and Windows Live’s geo-regioning (akin to a DVD or Blu-ray), which can render a game unplayable. Origin has been accused of spyware, so much so that its EULA violates German law. WB Play has yet to fully materialize, but I fear we’ll see something akin to UltraViolet. I’ve gone as far as to use hacks to remove said services from games I legally own.

    I was thinking I might eventually pick up Assassin’s Creed: Black Flag, but now I won’t. This is a plea that’ll land on deaf ears, but please stop using proprietary systems to manage games. If they offer enhanced functionality, let me choose to opt in. Gearbox’s SHiFT should be the gold standard: optionally enhance your game, and do not replicate functionality that Steam already provides (and does a better job of).

    Ubisoft (Uplay), Electronic Arts (Origin), and Microsoft (Windows Live): as a paying customer, please stop. Don’t turn me into a pirate. I love Steam and I hate your services.


    Tumblr likes vs Page Views

    On September 2, 2013, I wrote a post entitled “When learning to code always type it, do not copy and paste…“ It’s a clever anecdotal bit of advice that I once read on someone else’s blog. I’ve tried to track down the original source but have never been able to do so.

    As of this writing, it has been liked 18 times and reblogged 8 times. That’s just a tiny ripple on Tumblr, but it’s easily my biggest success in the Tumblr-verse.

    The Tumblr platform

    Tumblr’s biggest draw is its ability to easily reblog content from other sources. It is a novel idea, as it allows content to be easily reshared on someone’s own blog. The end result is usually a user’s collection of images and quotes that they find interesting or that fit a theme. Combined with the ability to follow other bloggers, Tumblr has led to the creation of a range of large and diverse communities.

    That said, I originally selected Tumblr as my blogging platform because it provided the right amount of customization, ease of use, and external management. Having set up more than a few WordPress blogs, I wanted to avoid maintenance as much as possible. The community aspect of Tumblr is a bonus, but I almost never participate in it and almost never reblog or reshare content.

    Statistically speaking, reblogging is an interesting proposition. Any time content is reblogged or found via the Tumblr hashtag search, it cannot be tracked by Google Analytics, as the GA code isn’t embedded within posts. However, I can track how many referrals I get from other blogs, as my posts are attributed to me, and reblogged posts may create enough agency to drive someone to visit my blog.

    How does this stack up against likes?

    To better understand how likes translate to page views, it’s important to understand what normal traffic looks like for my blog.

    image

    Weekly sessions (unique visitors) from September 2nd, 2013 to November 15th, 2014. 

    My blog gets roughly 600 sessions per week according to Google Analytics, meaning it gets roughly 600 separate visitors a week. A single session is defined by Google as a single visitor to the same website within a half-hour period. If a repeat visitor comes to my blog twice within a half hour, or clicks another page, it is counted as a single session; if a visitor returns 2 hours later, it is counted as a new session. Google Analytics also lets you track how many repeat visitors your web property receives.

    image

    Top 10 pages by Pageviews

    One might expect my most-liked post to show up in my top ten pages since September 2nd, 2013, but it is plainly absent and doesn’t even appear within the top 100 posts on my blog. One might also expect my top posts to be nearly as successful on Tumblr as my most reblogged post.

    My top post accounts for 22.56% of my traffic but has only 4 Tumblr likes and 2 Tumblr reblogs. In fact, none of the top five posts have more than 2 reblogs or 4 likes.

    image

    My most reblogged post has a grand total of 8 pageviews. Each spike in the above graphic represents a single page view.

    As we can see, there’s almost no correlation between likes/reblogs and pageviews on a post. One might instead assume that, rather than driving pageviews to the reblogged/liked article, Tumblr is driving more visitors to the first page of my blog; that isn’t the case either.

    image

    Social media makes up less than 1% of all traffic to my blog, and of that, Tumblr accounts for only 24.49%, meaning Tumblr accounts for less than 0.25% of my entire traffic: a grand total of 60 visitors.

    Processing the data…

    What we can infer from this is that Tumblr rarely results in traffic acquisition for my blog. While I may get additional exposure from reblogs, they rarely result in anyone visiting my blog directly. Google Analytics reveals that there’s little correlation, if any, between likes/reblogs and pageviews. Tumblr also obfuscates traffic through reblogging and its own internal “follow” news feed. While I can see how many followers I have, it’s impossible to know how many people are reading my posts through Tumblr. My hunch is that it isn’t many.

    Depending on the nature of the Tumblr blog, it is possible to be far more successful within the Tumblr-sphere than via web searches. People creating popular image macros/GIFs/photos that aren’t easily indexed by search engines may well see far more reblogs than pageviews.

    While my case study can only accurately reflect what is true of my own blog, I’d be willing to bet that technical blogs featuring long posts do not fare nearly as well on Tumblr. When gauging success, Google Analytics is the far better metric in this instance.


    64 Bit Google Chrome Available Today

    The Chrome team is delighted to announce the promotion of Chrome 39 to the stable channel for Windows, Mac and Linux. Chrome 39.0.2171.65 contains a number of fixes and improvements, including:

    • 64-bit support for Mac
    • A number of new apps/extension APIs 
    • Lots of under the hood changes for stability and performance 

    - Chrome Releases

    Now we can use 4GB+ for a single tab in Chrome.


    Install an OS from a .vmdk image in VMware Fusion, OS X vmdk

    VMware Fusion plays nicely with the Virtual Machine Disk format (.vmdk), seeing as it’s a spin-off of VMware’s enterprise software, like VMware Workstation.

    You can mount preinstalled .vmdk images in VMware, but you can also install from .vmdk images. In this example, I have an OS X 10.7 Lion installer disk .vmdk and VMware Fusion 7. These instructions should be OS agnostic, and to my knowledge the process hasn’t changed in VMware Fusion 8.

    These steps can be mimicked for other operating systems; in fact, only step 9 differs when using Ubuntu or Windows.

    First, MAKE A COPY OF YOUR VMDK, as VMware Fusion can break your install disk image. Once you have a copy to work from, launch VMware Fusion and select create new virtual machine.

    Now let’s begin…

    image

    1) Select More Options from the lower right hand corner.

    image

    2) Select create custom virtual machine

    image

    3) Select the version of the OS you are installing (I’m using OS X 10.7) and create the image.

    image

    4) Select “Use an existing virtual disk” and click “Choose virtual disk…” to locate your disk image. VMware Fusion will create a dummy virtual machine for you to save.

    image

    5) Once created, you should see your VM’s window. Select the configuration options (the wrench icon on the far left).

    image

    6) Select “Add Device” and then “New Hard Drive”

    image

    7) Configure your new hard drive disk image and confirm that it’s in your device list under “Removable Devices” in your settings panel.

    8) Boot the virtual machine.

    image

    9) You’ll need to partition your virtual machine’s space from the installer. In OS X, this is done in Disk Utility: click Erase and set up the volume as HFS+ (Journaled).

    For other operating systems, you’ll need to use the installer’s utility to format the blank volume to a format compatible with the operating system in question.

    image

    10) Now you can begin installing, as you have a volume to install to!

    image

    11) Wait… once installed, your VM will reboot, but it will relaunch into the installer CD. You’ll need to shut down the virtual machine.

    image

    12) Go to your settings for the VM, select Startup Disk, and choose your new volume.

    image

    13) You’ll need to remove the installer disk: click the hard disk, open the advanced options, and select “Remove Hard Disk”.

    Now you should be able to boot your Virtual Machine normally and use it.


    Resolving - because its extensions are not built. Try: gem pristine - for Sass

    You may see this error when compiling with Sass or Compass, in CodeKit, Grunt, or elsewhere. Usually it’s the result of a bad compile, and you’re seeing this instead of a helpful error log:

    Ignoring fast-stemmer-1.0.2 because its extensions are not built. Try: gem pristine fast-stemmer-1.0.2
    Ignoring hitimes-1.2.2 because its extensions are not built. Try: gem pristine hitimes-1.2.2
    Ignoring posix-spawn-0.3.9 because its extensions are not built. Try: gem pristine posix-spawn-0.3.9
    Ignoring redcarpet-3.1.2 because its extensions are not built. Try: gem pristine redcarpet-3.1.2
    Ignoring redcarpet-2.3.0 because its extensions are not built. Try: gem pristine redcarpet-2.3.0
    Ignoring yajl-ruby-1.1.0 because its extensions are not built. Try: gem pristine yajl-ruby-1.1.0

    Fire up a terminal window and try the following:

    compass version

    Likely you will see the same errors echoed as in your failed compile. You’ll need to resolve these with a simple gem pristine in the terminal.

    In your terminal, you’ll need to resolve each of the errors individually.

    Example:

    For the first error, you’d need to correct the install:

    Ignoring fast-stemmer-1.0.2 because its extensions are not built. Try: gem pristine fast-stemmer-1.0.2

    From the terminal  run:

    sudo gem pristine fast-stemmer

    This will force a rebuild of the gem install and resolve that particular error. You’ll need to repeat this for each individual instance of the gem pristine error.
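    Rather than retyping each gem name by hand, the cleanup can be scripted. Here’s a sketch that strips the warning lines down to bare gem names; the sample text is hard-coded for illustration, but in practice you’d pipe in the real output (e.g. compass version 2>&1), and the sed pattern assumes the exact wording shown above:

```shell
# Two of the warning lines from above, as a stand-in for real output.
warnings='Ignoring fast-stemmer-1.0.2 because its extensions are not built. Try: gem pristine fast-stemmer-1.0.2
Ignoring hitimes-1.2.2 because its extensions are not built. Try: gem pristine hitimes-1.2.2'

# Reduce each line to the bare gem name (version stripped), de-duplicated.
gems=$(printf '%s\n' "$warnings" \
  | sed -n 's/.*Try: gem pristine \(.*\)-[0-9][0-9.]*$/\1/p' \
  | sort -u)
printf '%s\n' "$gems"
```

    From there, for g in $gems; do sudo gem pristine "$g"; done rebuilds each one.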

    Each build should look something like the following:

    mycomputer:username$ sudo gem pristine hitimes

    Restoring gems to pristine condition...

    Building native extensions.  This could take a while...

    Restored hitimes-1.2.2

    Now you should be able to continue compiling Compass.


    Net Neutrality is free speech

    If you need a primer on what net neutrality is and how it affects you, the most eloquent explanation comes from none other than Matt Inman of The Oatmeal fame.

     The success and protection of the internet hinges on how successfully proponents of a free and open web can frame their argument. The argument can be boiled down, refined and distilled to:

    Net Neutrality is free speech.

    Opposing it is censorship.


    Ditching Helvetica for 10.10 Yosemite, switching back to Lucida Grande

    It seems I’m not the only one who isn’t fond of Helvetica Neue as an OS font.

    Despite its grand reputation, Helvetica can’t do everything. It works well in big sizes, but it can be really weak in small sizes. Shapes like ‘C’ and ‘S’ curl back into themselves, leaving tight “apertures”–the channels of white between a letter’s interior and exterior. So each shape halts the eye again and again, rather than ushering it along the line. The lowercase ‘e,’ the most common letter in English and many other languages, takes an especially unobliging form. These and other letters can be a pixel away from being some other letter, and we’re left to deal with flickers of doubt as we read.  - Tobias Frere-Jones

    And while we’re on the subject of screens, sure, type legibility across the board is going to get better with the advances in Retina displays. But until everyone has Retina displays, you’re going to have a lot of squinty, frustrated Apple users. Helvetica is just not fun to read at small sizes—in books, on posters, or on iPads—and it never will be. - Gizmodo

    Lucida Grande, despite the designer bitch-fest, was a workhorse font, akin to monospace fonts like Courier. While Helvetica has its place in design, it’s an inferior font at small sizes, Retina displays be damned.

    image

    Pictured: Yosemite search using Lucida Grande. Lucida Grande may not be the be-all and end-all of typography, but its legibility is superb at smaller sizes.

    Fortunately, for those of us addicted to Lucida Grande, or who at least feel it’s a superior option for onscreen legibility, there’s a fix for that: Lucida Grande.

    Simply download and install the zip from GitHub and run the application. Reboot, and you’re back to running Yosemite (tested on the public release) with Lucida Grande.


    Intermission.

    Even though I was playing lots of games, I still didn’t call myself a “gamer” because I had associated that term with the games I wasn’t playing — instead of all the ones I was playing. This was largely because I’d bought into the myth that to be a “real gamer,” you had to be playing testosterone-infused blockbuster franchises like Grand Theft Auto, God of War or Call of Duty.

    Anita Sarkeesian, It’s Game Over for ‘Gamers’

    I couldn’t agree more with her. I ran a very popular cheat code website, PS2cheats.com (no longer under my control), for 7 years. The website made me enough income that I never had to hold down another job during my long duration in college… and yet I distanced myself from gaming and the term “gamer” during that time, back in the early 2000s. Gamer culture embarrassed me to the degree that I wouldn’t even mention my popular website in casual conversation, despite having serious “gamer” credentials. I did not want to be associated with videogames, and I nearly stopped playing them. Even today, I don’t talk about videogames with anyone outside of a select few people. I avoid gaming as a topic for this blog despite owing my current job as a front-end web developer directly to gaming. PS2cheats is where I wrote my first JS script and my first line of CSS, and where I switched to writing HTML by hand.

    image

    Wayback Machine capture of version 4 of PS2cheats.com.

    When my Xbox 360 experienced the Red Ring of Death, I didn’t bother to replace it. I spent 3 years without a game console besides my iPhone and my Mac. The games I played were iOS games like Zen Bound, Plants vs. Zombies, Triazzle, and Spider; games ported to the Mac such as Portal 1 & 2 and Borderlands (with friends); and indie games like World of Goo, Crayon Physics, and Thomas Was Alone.

    For my birthday, my brother bought me a PS3, and I rediscovered gaming. Thatgamecompany’s Flower and Journey and Naughty Dog’s The Last of Us reminded me that there could still be a mass market for games willing to break the boundaries and narrow confines that appeal to a certain demographic. I still enjoy games, but I couldn’t give a royal damn about being a gamer. I even celebrate the death of the “gamer,” as it means gaming is no longer an activity owned by the testosterone-driven white male gamer. I live in an age where my mom plays more videogames on her iPad than I do.

    The worst part is, I suspect a good portion of the #gamergate community are ’80s children like myself. I grew up, and it’s time my games did too. Hats off to Anita Sarkeesian; keep fighting the good fight.

    Now back to our regular program of web development, Apple tidbits and digital audio…


    Walmart.com lists “Dead Children” for sale and other faux controversies

    I never thought I’d be defending Walmart or any other oligopsony in a blog post, but here I go…

    image

    Pictured:  Widely distributed screenshot used as evidence of Walmart’s “Fat Girl Costumes”.

    The Orlando Sentinel ran the following story.

     Despite growing complaints on Twitter, Walmart has yet to remove a section for “fat girl costumes” on its website.

    The plus-size Halloween costume section was first pointed out by Kristyn Washburn who tweeted the corporation with her complaint on Tuesday.

    Gawker-owned Jezebel.com added fuel to the fire with its own speculation.

    As of this morning, the Fat Girl section is still up; it features a lot of the same outfits as the Women’s Plus Size Adult section, which makes us think some web developer created the section as a “hilarious” joke and then neglected to change it.

    CNN reports Walmart even apologized.

    Walmart found itself sending apology tweet after apology tweet Monday after the Twitterverse raked it over the coals for a major goof on its website.

    For whatever inexplicable reason, the retail giant’s site featured a Halloween category, titled “Fat Girl Costumes.”

    You won’t find it there now, thank goodness, but it stayed there for a large part of the morning – and long enough for multiple screen grabs.

    The real story

    As much as I object to Walmart for ethical, moral, economic and a staggering number of other reasons, this story is too good to be true. This is one crime Walmart isn’t guilty of. Allow me to illustrate.

    Note the URL being presented in the article:

    http://www.walmart.com/c/kp/fat-girl-costumes

    “Fat-girl-costumes” is simply a search string. Many websites place queries as human-readable strings in URLs for search engine optimization (SEO) and for legibility. The screenshot above offers further evidence that this is a search query: no gender is selected on the left-hand side. When I tested this myself, I was presented with a mish-mash of adult costumes, plus and regular sized.
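To make this concrete, here’s a minimal sketch (purely illustrative, not Walmart’s actual code) of how a search phrase becomes one of these SEO-friendly URL slugs — the slug is just the user’s query, lowercased and hyphenated:

```python
import re

def slugify(phrase):
    """Lowercase a search phrase and join the words with hyphens,
    the way many storefronts build SEO-friendly search URLs."""
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    return "-".join(words)

def search_url(phrase):
    # Nobody curated this page; the path segment IS the search query.
    return "http://www.walmart.com/c/kp/" + slugify(phrase)

print(search_url("Fat Girl Costumes"))
# → http://www.walmart.com/c/kp/fat-girl-costumes
```

Any phrase you type produces an equally “official-looking” URL, which is the whole point of the demonstration below.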

    Armed with a decent level of technical proficiency, I knew I could manipulate the values in that URL. With that bit of knowledge, I easily created my own Walmart listing. For the sake of faux moral outrage, I decided to make my own morally reprehensible URL.

    http://www.walmart.com/c/kp/dead-children

    Clicking it will take you to a search page. Simply clear out the address bar, click a department, take a screenshot, and you now have evidence of Walmart’s support of Dead Children!

    image

    Pictured: An undoctored/unphotoshopped screenshot.

    Social media as evidence…

    The fun doesn’t stop there. You can even take this to Facebook as further proof. Using my faux controversy URL, http://www.walmart.com/c/kp/dead-children, watch what happens when I post a link to it on Facebook:

    image

    Pictured: Edited out last names, but otherwise undoctored.

    It’s complete with an auto-generated thumbnail of a pixie-goth girl(?), and a note “shop for dead children”, and as a bonus, my mom approves! Clearly Walmart supports selling dead children (and my family has no moral compass), right? 

    What’s happening with Facebook?

    Web pages contain various bits of metadata, known as OpenGraph tags, that popular social media sites use to build previews and pull other data.

    On massive websites like Walmart, Amazon and Best Buy (just to name a few), algorithmically determined data automatically populates these tags, as it would be unsustainable to create each one by hand. The data on my Facebook post was simply Walmart’s best guess for “Dead Children.” Computer algorithms have no morality.
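Here’s a hypothetical sketch of what that auto-generation looks like — the function and the tag wording are my own invention, not Walmart’s real pipeline, but the mechanism is the same: title-case the slug, template a description, grab an image from the first search hit:

```python
# Hypothetical sketch of how a large retailer might auto-generate
# OpenGraph tags for a search page -- not Walmart's actual code.
def og_tags(slug, first_result_image):
    # "dead-children" -> "Dead Children"; no human ever reads this title
    title = slug.replace("-", " ").title()
    return "\n".join([
        f'<meta property="og:title" content="{title}" />',
        f'<meta property="og:description" content="Shop for {title.lower()}" />',
        f'<meta property="og:image" content="{first_result_image}" />',
    ])

print(og_tags("dead-children", "http://example.com/first-hit.jpg"))
```

Facebook then dutifully renders whatever the template spat out — hence the “shop for dead children” blurb and the random thumbnail.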

    We can explicitly disallow combinations of words in a search field, like the famous, nearly decade-old list of 1,121 banned NFL jersey names (warning: adult language), but even then the algorithm is not making a moral call. It is simply checking the jersey name against a predefined list of words and rules that a human concocted. (Fun fact: my last name is on Nike+ ID’s banned list thanks to yuppie clothing line Gant.) It’s up to humans to build filters into the computer logic that encode our sense of morality (or sense of intellectual property law, in the case of my last name).
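A denylist check like the NFL’s is mechanically trivial — the hard (and human) part is curating the list itself. A sketch, with made-up stand-in entries rather than the real NFL or Nike lists:

```python
# Illustrative denylist check. The banned entries here are stand-ins;
# the real lists (NFL jerseys, Nike+ ID) are human-curated.
BANNED_WORDS = {"gant"}            # e.g. protected trademarks
BANNED_PHRASES = {"dead children"}

def is_allowed(name):
    lowered = name.lower()
    if lowered in BANNED_PHRASES:
        return False
    # block if any banned word appears as a word in the name
    return not any(word in lowered.split() for word in BANNED_WORDS)

print(is_allowed("Smith"))   # True
print(is_allowed("Gant"))    # False -- matches the curated list
```

Note the algorithm never “decides” anything moral; it just pattern-matches against whatever a human wrote down.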

    Walmart appears to have no such filter, and thus (much like Amazon) can be searched for any string of words you can conjure. You’ll be surprised: you’ll continue to get search results even with vulgar words. You’re probably thinking, “Why on earth would they allow this?” In Walmart’s (and especially Amazon’s) case, they carry massive amounts of books, music and movies, some of which have raunchy titles or cover explicit subject matter. Art often ventures into dark subject matter, be it sexism, racism, slavery, prostitution and so forth. The title “12 Years a Slave” may come off as crass without any context, and I can’t even begin to list how many great works of music have crass album titles. Censoring search terms could unintentionally obfuscate access to many movies and books, which in Walmart’s and Amazon’s case would equate to lost profits. Walmart isn’t a great patron of the arts (even if the Waltons own a gallery), but it loves money.

    While guilty of many things, Walmart isn’t guilty of creating “Fat Girl Costumes” as a bad joke, as Jezebel suggested above. Rather, its logic easily matched “girl” and “costume” and generated a webpage, complete with title, much akin to my “dead children” example. Walmart isn’t the first to be hit with controversial product listings: Amazon last year was swamped with outrage over a seller who auto-generated thousands of unclever variations of “Keep Calm” shirts. At least in Amazon’s case, the products were real. This one isn’t. There are plenty of reasons to dislike Walmart, but this isn’t one of them.


    CodeKit - Error - File to import not found or unreadable

    image

    Seeing a message that reads something like the following?:

    Error: File to import not found or unreadable: normalize-version.
    Load paths:
    /yourpath/to/yourstyles.scss
    Sass::Globbing::Importer
    on line 10 of /yourpath/to/yourstyles.scss
    Use --trace for backtrace.

    Usually you’ll encounter this with a gem-installed library. In my current example I’m using Compass Normalize.css, which requires Compass. There are quite a few libraries out there that require Compass to run and aren’t bundled as Bower components but rather as Ruby gems.

    First, make sure you have Compass installed on your CodeKit project; you can see this in your project list.

    image

    If you do not see Compass listed, right-click the project in question and select "Edit Project Settings". Scroll down to Frameworks and click “Install Compass,” making sure your paths are accurate before proceeding.

    image

    If you’re still seeing problems, you may need to swap your Sass compiler.

    Happy Coding!


    Using the iOS Simulator in Xcode 6 - 7 for web development

    I’m revisiting one of my more popular posts, Using Xcode iPhone Simulator to Develop Mobile websites, which is nice but a little out of date. 

    You’ll need to download and install Xcode for OS X to access the iOS simulator. Xcode is a free download from the Mac App store. 

    Updated: 11/22/15

    Xcode 7 has been out for awhile now, and these instructions are still valid but a new user may not realize that. I’ve since updated the instructions to reflect this. I added instructions on how to download different iOS versions.

    image

    How to launch the iOS simulator

    Launch Xcode (you may need to agree to Apple’s terms and conditions and wait as it installs components, if it’s your first launch).

    image

    Once Xcode is launched, either right-click its icon in the Dock or locate the simulator via Open Developer Tool under the Xcode menu.

    image

    Protip: Once launched, you can permanently position the iOS Simulator in your Dock and skip Xcode altogether. You can quit Xcode and the iOS Simulator will stay running.

    Copy and Paste!

    Pasting text into the iOS Simulator normally requires tediously hitting paste on the open simulator window, then tapping, holding and pasting within the simulator itself. However, if you hit command-V (paste) and then command-shift-V, you can paste text directly into text fields. It’s extremely handy for URLs.

    Inspect Element

    Launch Safari on your computer, open its preferences and go to “Advanced”.

    image

    Make sure “Show Develop menu in menu bar” is checked.

    Launch your iOS Simulator, fire up Safari, and go to the web page you’d like to inspect or use the JavaScript console on.

    Go back to desktop Safari, click the Develop menu, and select the iOS Simulator page you’d like to inspect.

    image

    Protip: you’ll likely need the latest version of Safari to pair with Xcode, so be sure you have all the latest OS X updates installed. If you still can’t connect, try quitting Safari and relaunching it, followed by the iOS Simulator. The connection can be finicky. If I’ve had my Mac Pro or MacBook Retina running for days on end, sometimes a full reboot is the only way to fix the connection.

    Also, the Simulator is 99% accurate, but on rare occasions I’ve had rendering issues on hardware that did not occur in the simulator. It pays to do actual hardware tests for UI interactions, as that’s where you’re most likely to see differences.

    Pinch and Zoom

    Hold the option key to bring up the finger pads. By default the zoom behavior will be on the center of the screen. Hold Shift-Option to recenter the zoom. Release the shift key to zoom.

    Launch iOS Simulator from the Terminal

    From the console, paste:

    open -a /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/Applications/iPhone\ Simulator.app

    Protip: you can create aliases to the simulator app. Right-clicking Xcode, choosing “Show Package Contents” and following the path outlined above will let you see the actual application. (Note that newer Xcode releases renamed and moved the app; in Xcode 7 it lives at Xcode.app/Contents/Developer/Applications/Simulator.app.) Also, due to El Capitan’s user restrictions, this will no longer work by default.

    Drag and Drop in Simulator

    image

    Like most things in OS X, you can drag and drop files directly to your iOS simulator (images, webpages, PDF, anything that Safari supports). If you’re looking to serve static assets off a local workspace, it works in a pinch. 

    Protip: You can save images to the Photos app in the iOS Simulator. Drag an image over, and then save from Safari. You can test file uploads this way.

    Download different iOS versions

    Under Xcode’s preferences, you can download alternate iOS versions to use with your iOS Simulator. As a general rule, Apple will let you download the previous iOS versions supported by the current version of Xcode.

    As a front-end web developer, the .x releases tend to be pretty indistinguishable. Usually you’ll only need one version of the previous OS, rather than each 8.x iteration.

    Keep Xcode 5 for iOS 6 testing

    Simply rename Xcode 5 before updating to keep your old version. Already updated and still need iOS 6? I wrote a post detailing how to download Xcode 5 if you already updated.

    Don’t forget Android…

    Do you wish you had the same awesome simulator for Android like iOS? See my post on GenyMotion, an alternative to using the dog-slow simulator bundled in the Android SDK.


    Recommended Mac Pro upgrades & hacks

    04/11/18




    IMPORTANT: Rather than re-edit this page, I've created a new guide, called "The Definitive Classic Mac Pro Upgrade Guide", which is a larger, more in-depth, easier-to-read, organized list of Mac Pro upgrades, aiming to encompass all the potential upgrades for the classic Mac Pros. I highly recommend reading it, as many upgrades (like NVMe storage, audio over HDMI, power supply mods, etc.) are simply not discussed here.

    For posterity, below you'll find the original guide that was maintained until roughly 04/17.




    I originally wrote this in August of 2013 and it still remains one of the most-read items on my blog (as of mid-April, I've finally started the significant task of editing out the out-of-date information and cleaning this post up). Like my Mac Pro, I’ve continually updated this page and will continue to do so until my Mac Pro retires.

    My opinion of the redesigned Mac Pro still remains negative: I used a heavily upgraded 2008 Mac Pro for 10 years, only upgrading to a Westmere Mac Pro in 2018, waiting for the mythical return to modular computing. That said, the fact that my Mac Pro remains a viable computer 10+ years later is a testament both to how well engineered the Mac Pro was and to the benefits of modular computer design.

    While Apple overthinks its new Mac Pro, we're stuck in limbo.

    Whether you have economic reasons or a disdain for the new, less upgradable/hackable Mac Pro, this guide seeks to illustrate feasible upgrades beyond your typical RAM and hard drive swaps.

    Please, contact me if there’s anything I’m missing.

    – Greg

    image

    Image Source: German tech blog Giga.de, lampooning the new Mac Pro

    The Guide:

    So you have a Mac Pro and perhaps the latest Mac Pro isn’t exactly what you’re hoping for… It wasn’t for me.

    image

    Screenshot from WWDC. The 2013 Mac Pro doesn’t leave much for upgrades.

    Thanks to the wonders of upgrades and hacks, any Mac Pro is still a viable box, some more so than others. You can always upgrade the RAM and the HDDs, but there’s plenty more interesting upgrades you can do, and most on the cheap. You’d be surprised at the performance a 2008 Mac Pro has in GTA V.

    Notably, RAM isn’t getting cheaper, and if you’re not running 16 GB+ of RAM, you’re missing out on one of the best features of the Mac Pro: the ability to have more than 16 GB of RAM. Apple has yet to make a laptop that supports more than 16 GB of RAM, and only very recently have the iMacs been able to support up to 32 GB.

    Upgrade #1: OS X 10.9+ 2006 Mac Pros and 10.13+ 2008 Mac Pros

    image

    Photo Credit: Apple Inc

    Cost: $0

    Effort Level: Moderate

    This is a “well, duh” upgrade, except for 2006-2008 Mac Pro owners, whose machines are officially unsupported — but when has “official” ever been a barrier? You can install 10.8+ on unsupported Macs (2006 Macs are capped lower than 2008 Mac Pros); for 2006 models, follow the guide here.

    Mavericks breathes new life into even the 2006 Mac Pros, thanks to much better memory management and Grand Central Dispatch. 2008 Mac Pros can install macOS High Sierra using the macOS High Sierra Patcher Tool for Unsupported Macs.

    Upgrade #2: USB 3.0 or 3.1c PCIe card

    image

    Photo Credit: amazon.com

    Cost: $20-40 for USB 3.0, $50-60 for USB 3.1c

    Effort Level: Minimal

    At the top of the list comes USB 3.0, as it’s cheap and effective. USB 3.1 is more "optional," as the plethora of devices out there still aren't USB-C.

    USB 3.0 isn’t the newest kid on the block, but you’ll never be able to add Thunderbolt via upgrades. USB 3.0 is nearly the speed of eSATA, and USB 3.1c is much faster. Sonnet's Allegro USB-C PCIe is a safe choice for USB 3.1 seekers. Sadly, there is not and will not be any Thunderbolt card for the Mac Pro, as the Alpine Ridge Thunderbolt cards are motherboard-specific.

    For as little as $18, you can add USB 3.0 to any Mac Pro (running 10.8.3). It’ll take some hackage, as you’ll need to provide Molex power to the PCI chamber. With the 2006/2008 Mac Pros, you can easily feed off the Molex power cable in the optical drive bay. 2010 Mac Pro owners will need to get a SATA-to-Molex power adapter.

    Note: Sonnet (a long-time Mac hardware/upgrade manufacturer) also sells USB 3.0 cards; they’re more spendy, but they carry the trust that comes with the Sonnet name. There are others on the market; I recommend checking the MacRumors forums.

    Upgrade #3 Make your own Fusion drive by installing an SSD

    image

    Cost: $100-$700 (SSD prices vary greatly based on make/size).

    Effort Level: Significant

    Yes, SSD is the future. Most modern SSDs are as reliable as HDDs (initially they weren’t).

    Installing an SSD might not be as straightforward as you think. Most SSDs are 2.5-inch, while the Mac Pro drive bays are 5.25-inch for optical drives and 3.5-inch for HDDs.

    In 2009+ Mac Pros, installing an SSD is a breeze. The same guide recommends using the two extra SATA ports on the 2006-2008 Mac Pros, although the much easier option is a replacement sled matched to your Mac Pro’s drive bays (assuming you have a bay free), an Icy Dock, or the OWC Multi-Mount (see upgrade #5).

    Anandtech liked the Fusion Drive but lamented its rather measly 128 GB. But did you know you can make your own Fusion Drive, and you can make it bigger? In fact, you can make one with any older Mac.

    Follow the guide here.

    Note: 10/14/14 update

    … Skipping the Fusion Drive is probably the more viable option, as 1 TB SSDs dipped below the $500 mark some time ago. I skipped the Fusion Drive for a 750 GB SSD last year. The Mac Pros are SATA2-only, so you won’t be making full use of the SSD without a SATA3 card; see upgrade #6 for more details.

    Upgrade #4: nVidia GeForce 700 through 1000 - (And any GTX 7xx & GTX-9xx & 10xx)

    image

    Photo Credit: Amazon.com

    Cost: Graphics card prices are in constant flux thanks to cryptocurrency mining

    Effort Level: Mild (Not difficult, but Mac Pros require special power adapters)

    Did you know that nVidia has OS X drivers for its cards? When I originally wrote this guide 4 years ago, this was a surprising fact for many users. I tested a GeForce 760 in a Hackintosh against my Mac Pro, and outlined the 2008 Mac Pro installation process and benchmarks here. It was quite capable of playing GTA V at mostly high settings (limited mostly by VRAM) at 2560 x 1440.

    Don't take my word for it: there are plenty of Amazon reviews of the 760 in Mac Pros, and YouTubers have jammed a 970 into a Mac Pro.

    I've since posted a guide for the GeForce GTX 1060 in Mac Pro 2010.

    The GeForce drivers now support the latest GPUs, but you'll need the latest OS X. With GPU prices in flux it'll be an expensive upgrade, but it's one of the best ways to extend your Mac's life. Also, to enable high resolutions @60 Hz and the HiDPI fix, you may need the Mac Pixel Clock patch.

    Upgrade #4: Flash an AMD (Formerly ATI) Radeon 6870 or 6970

    image

    Photo Credit: Amazon.com

    (I’m leaving this up for posterity and history's sake; the information is still valid, although the nVidia route is the superior option, and the latest ATI cards now support dual ROMs.)

    Cost: $?

    Effort Level: Moderate (Not difficult, but requires patience to locate an EFI ROM, and careful reading)

    Mac graphics cards are really expensive, but fortunately, with a little TLC you can flash Radeon 6870s and 6970s for your Mac Pro. I retired my GeForce 8800 GT a year ago and haven’t regretted it. Out of the box, the 6870 will output a picture after fully booting. However, you’ll hit some weird issues (QuickTime does not work, games may not work, and the Mac will not output video via the graphics card until fully booted). To correct this, you’ll need to flash the firmware with the Mac ROM for full EFI compatibility.

    Don’t worry, it’ll still work great in Windows. However, the Mac ROMs usually only support 3 video outputs, so be prepared for one of the DVI ports to be rendered unusable. You’ll be limited to driving 3 monitors instead of 4. :)

    What you’ll need:

    If you’re too hesitant and have deep pockets, there’s always the Mac EVGA Geforce GTX 680.

    Upgrade #5: Make use of your unused (semi-hidden) SATA Ports

    Cost: $15-50

    Effort Level: Moderate

    image

    Photo Credit: macsales.com

    The 2006-2008 Mac Pros have two hidden unused SATA ports, likely designed to supply the optical bays with SATA ports.

    image

    Photo Credit: macsales.com

    You can easily turn these into eSATA ports with simple PCI plate upgrades with a generic internal to external eSATA port or Newer Tech’s package, designed for the Mac Pro.

    image

    Photo Credit: macsales.com

    If eSATA isn’t your style, you can easily add two more drives to your free 5.25-inch drive bay using an OWC Multi-Mount.

    Photo Credit: macsales.com

    Upgrade #6: Turbo speed your storage: Use PCIe Storage or Upgrade to SATA3 for SSD

    image

    Photo Credit: macsales.com

    Cost $100-1000

    Effort Level: Minimal to Moderate depending on choice

    The latest MacBook Airs and the upcoming Mac Pro boast PCIe storage. It’s oodles faster than SATA, and every Mac Pro from 2006-2012 is limited to SATA2… but don’t think you’re out of options. There are a few vectors, and all involve a PCIe card.

    The fastest are PCIe cards that act as chassis for PCIe SSDs, housing the same storage format found in laptops. Some cards can house two SSDs together in a RAID configuration and thus deliver truly blistering performance. These are at the extreme end, delivering 2 GB per second according to the benches. There's also the single-slot PCIe Mercury Accelsior E2 by OWC, which also provides two eSATA ports.

    Next up is the considerably less expensive option: PCIe-to-SATA3 housing units. Sonnet makes the dual Sonnet Tempo SSD Pro Plus, which has two internal SSD mounts and two external eSATA III ports with SoftRAID support, reaching speeds in the 1 GB per second range; B&H carries the card for around $250 as of this writing in early 2018. The OWC Accelsior S PCIe adapter can house a single SSD drive, ranging from $50 to $90.

    The cheaper route is to get a bootable SATA3 card. Sadly, the 2006-2012 Mac Pros never saw internal SATA3, and cap at SATA2’s 300 MB/s. Also, see a friendly MacRumors poster’s guide for more card and mounting options.

    With the Samsung 850 models hovering around 500+ MB/s, going to SATA3 will nearly double the sustained reads/writes of many SSDs over using internal SATA2.
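The math here is simple bus-capping: a drive can only sustain the lesser of its own speed and the bus's ceiling. A quick sketch (the speeds are nominal figures; real-world overhead varies):

```python
def effective_speed(drive_mb_s, bus_mb_s):
    # A drive can never move data faster than the bus it hangs off of.
    return min(drive_mb_s, bus_mb_s)

ssd = 550                 # nominal sustained MB/s for an 850-class SSD
sata2, sata3 = 300, 600   # bus ceilings

on_sata2 = effective_speed(ssd, sata2)   # bus-limited to 300 MB/s
on_sata3 = effective_speed(ssd, sata3)   # drive-limited to 550 MB/s
print(on_sata3 / on_sata2)               # roughly 1.8x -- nearly double
```

Which is why the SATA3 card pays for itself with any modern SSD, while an older, slower drive would see no benefit at all.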

    Upgrade #7: Macs Fan Control

    image

    Effort Level: Minimal

    Cost: $0

    Why does Macs Fan Control count as an upgrade?

    My Mac Pro has always had a fan glitch where my Mac goes into leaf blower mode under any distress.

    Having used it to record two full-length LPs, this has been rather problematic. However, you can bypass the thermometer's default settings using Macs Fan Control. The Mac Pros have an impressive number of fan zones, and if your Mac is overly sensitive, you can easily reduce the fan RPMs using temperature ranges for different zones. As a major bonus, there’s even a Windows version that lets you turn down your fans when booting Windows 7-8.

    It’s been a lifesaver for me, although if you’re not careful you can damage your computer; just keep an eye on the temperatures and learn what’s normal. I’ve been using it for years now, and my Mac Pro no longer goes into leaf-blower mode while playing games in Windows. I can’t recommend it enough.

    Download it here.

    Upgrade #8: Upgrade your CPUs

    Effort Level: Significant

    Cost: $80-$400

    There are guides you can follow, like the Anandtech one linked here, and EveryMac has a lot of great info. Any old-school PC user probably isn’t afraid of a little thermal paste. It’s not the most difficult upgrade (swapping out a busted flat panel on a MacBook requires more nimble fingers), but it’s not for the faint of heart or the shallow of wallet.

    2006 Mac Pros can rock 8-core Clovertowns; you’ll just need Socket 771 Xeon CPUs, and if you find the right place, you can go from 4 to 8 cores for as little as $80 total. The 2006 Mac Pro makes for a very compelling upgrade, and there are plenty of guides for it.

    The upgrade options are a lot less compelling for the 2008 Mac Pro, whose base model was already largely octo-core. One could go from 2.8 GHz to 3.0 GHz, but there isn’t much room for upgrades: the best CPU for LGA771 is the X5492 (with the improved 1600 MHz FSB), and the few places that carry them charge a premium.

    NOTE: 2009 Mac Pros can accept the same CPUs as the 2010 models with a firmware hack; see below.

    The 2010/2012 Mac Pros have more flexibility, with configurations ranging up to 12 cores; note there is a model difference between the Nehalem and Westmere 2010 Mac Pros, and EveryMac has a nice guide. Jamming two X5690 CPUs into a Westmere 2010/2012 Mac can yield very impressive benchmarks. Some crazy bastards have overclocked them to 4 GHz. Even the 2009 Mac with the X5690 is a beast. Compare it to the 12-core 2013 Mac Pro: the X5690 aces it in multi-core and then some. It takes the iMac Pro of 2017 to best the multicore scores of the mighty X5690.

    Recommend Reading:

    EveryMac - 2008 Mac Pro CPU Upgrade

    EveryMac - 2009 Mac Pro CPU Upgrade

    EveryMac - 2010/2012 Mac Pro CPU Upgrade

    Mac Pro CPU by model compatibility list

    Upgrade #9: Go 4k!

    image

    Image Credit: wikipedia.com

    Effort level: Minimal, with a few caveats

    Cost: $300-1200

    While you won’t be able to attain the amazing 5k resolution of the iMac without breaking the bank, you can come relatively close with 4k on your aging Mac Pro.

    There are a few things to understand right out of the gate. First things first: you need a graphics card that’s 4k compatible.

    Second, you’ll need 10.9.3+ installed for UI scaling; otherwise, be prepared to squint.

    Third, you’ll need a 4k Display.

    Sounds easy, right? You need to consider your purchase carefully, as not all 4k displays are equal. Some older low-end 4k displays are 30 Hz only. This means you’ll get a maximum of 30 FPS, and you may need to install a hack to get 60 Hz on a Mac Pro.

    4k monitors can get quite large, but the trade-off is pixel density versus size...

    A 28-inch 4k monitor is 157.35 PPI (pixels per inch), which lands in between “retina” and “standard.” Previously, a higher-end 27-inch display would normally carry a 2560 x 1440 resolution, for a grand total of roughly 109 PPI. The iMac 5k has an incredible 218 PPI, nearly the same as the Retina MacBook Pros.
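These PPI figures are just geometry: divide the diagonal pixel count by the diagonal size in inches. A quick check:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 28), 2))   # 28" 4k  -> 157.35
print(round(ppi(2560, 1440, 27), 2))   # 27" QHD -> 108.79
print(round(ppi(5120, 2880, 27), 1))   # 27" 5k  -> 217.6
```

Plug in any panel you’re eyeing to see where it falls between “standard” and “retina” density.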

    Regardless of the screen size in your 4k display, it’ll be able to scale to plenty of relative sizes, based on your preferences.

    Upgrade #10: Go Wireless 802.11ac + Bluetooth 4.0!

    image

    Cost: <$100

    Effort Level: Moderate (Note: Installing Airport cards is exceptionally tedious due to the wireless connectors)

    Jealous of the latest WiFi speeds? Want Continuity, Handoff and AirDrop support? Wireless AC cards can be had for the Mac Pro, for either the Airport card slot or PCIe. (Notably, 2008 Mac Pros need this upgrade for native 10.12+ support.)

    2006-2008 Mac Pro users can nab Apple Broadcom Bcm94360cd - 802.11 A/B/G/N/AC + Bluetooth 4.0.

    There’s also the Apple Broadcom Bcm94360cd - 802.11 A/B/G/N/AC for 2009, 2010-2012 Mac Pros.

    I can vouch too, they work great, check out my installation guide. There’s a few tips and tricks that’ll make your life easier.

    Users looking for a simpler solution can nab the ASUS Dual-Band Wireless-AC1900 PCI-E Adapter for under $100, and it’s Mac compatible.


    Upgrade #11: 2009 firmware flash to enable Westmere Xeon CPUs and faster ram

    This is one of those times where a software upgrade makes all the difference.

    Ars Technica reported on the success of 2009 Mac Pros being flashed by Netkas forum members, the same tribe of digital badasses who figured out how to flash ATI graphics cards before nVidia did us a big favor and started loading its graphics cards with OS X-compatible BIOSes and writing its own drivers.

    Notably, this will not provide any improvement beyond enabling you to upgrade your CPUs and RAM.

    Cost: $0

    Upgrade #12: Hackintosh?

    Effort Level: Extreme

    Cost: $500-1500

    Hackintoshes are ever-present but make for a very shaky OS X experience. You’ll be wading through kext files and diving into the deep end of the pool. I highly recommend against it unless you’re a seasoned OS X user who’s not afraid of the terminal and doesn’t mind waiting for major OS updates.

    How to tell if you’re right for a Hackintosh.

    • Use Bit Torrent
    • Use or have used another OS than OS X
    • Knowingly voided a warranty on a computer
    • Successfully wired a motherboard’s LED lights to a motherboard
    • Modified a PC’s Bios
    • Successfully troubleshot a Mac suffering repeat kernel panics
    • Know what the phrase "Show Package Contents” in OS X means
    • Installed Kext files manually
    • sudo nanoed a plist or other OS file in OS X
    • Visited one of the websites: InsanelyMac, TonyMacX86, iATKOS, osx86project and already considered a Hackintosh prior to reading this list

    If you answered yes to all of the preceding questions, you just might be right for a Hackintosh.

    Having installed 10.5 on a Pentium D system years ago, trying it out made me realize that a Hackintosh wasn’t a viable option for me at the time. It worked… (mostly), but the lack of stability was unnerving. I was stuck waiting on compatible updates, and the lack of graphics support made for a rough experience.

    In the spring of 2014 I got the Hackintosh bug again and tried the Hackintosh route and mostly succeeded.

    Protip: Experienced Linux users will be better adapted to working through Hackintosh problems than the average Mac user.

    Recommend Reading:

    LifeHacker’s guide is probably the best jumping-off point, followed by InsanelyMac and TonyMacX86.



    A Better Android Simulator for Web development- GenyMotion

    The Android SDK leaves something to be desired for someone who simply wants to use the emulator. Even after installing Intel® Hardware Accelerated Execution Manager (HAXM) for hardware-assisted virtualization of Atom-based builds of Android (see the guide here), the performance is pretty “meh.” It’s a pain, and as strictly a web developer, the world of Eclipse is foreign.

    I discovered a gem that I’m surprised isn’t more widely touted… GenyMotion.

    image

    GenyMotion makes Android testing more than just tolerable

    So what is GenyMotion? GenyMotion is essentially a tricked-out version of VirtualBox designed for running multiple versions of Android, rather than running the emulator bundled in the SDK. This means full virtualization of the hardware, and thus the emulator is exponentially faster.

    Best of all, for the base version, its free (app devs will want the paid version).

    image

    Pictured: Screenshot of style guide from a site currently being developed running on GenyMotion

    Much akin to other virtual machines, you can install multiple versions of Android, with 2.3, 4.1, 4.2, 4.3 and 4.4 currently supported.

    Inspect Element on Android!

    It’s not immediately apparent that you can use an Android VM as a remote device for Chrome’s Inspect Element, but you can. It took a little digging, but here are the steps to enable it.

    To enable Inspect Element, you’ll need to install a recent version of Android, such as 4.4 (one that comes with Chrome preinstalled rather than the stock Browser).

    1. Go to Settings -> About Phone and scroll down.
    2. Locate the Build Number and tap it 7 times.
    3. Go back to Settings.
    4. Go to Developer Options and check USB Debugging.
    5. Now, on your host computer, open Chrome and type chrome://inspect into the URL bar.
    6. Locate your connected Android VM. Congrats, now you have Inspect Element, just like iOS (better late than never, right?)

    Note: Link to Google’s official instructions

    image

    Pictured: Screenshot of Chrome inspecting element of GenyMotion Android

    You’ll need to create an account to download GenyMotion, but the hassle is worth it.

    /edit

    Jan 29 15 update:

    Added link to Google’s instructions


    Installing a GeForce GTX760 (GeForce GTX770/GTX780) into a 2006-2008 Mac Pro

    Update March 29th, 2018

    A quick heads-up: I've added a new article on the GeForce GTX 1060 / 1070 / 1080 series. The information here is still true, but a bit dated. I recommend preinstalling the nVidia web drivers; TonyMacX86 keeps a running list of driver versions matched to OS versions: 10.13 High Sierra or 10.12 Sierra.

    Below are the original instructions for installing the GeForce 700 series in a Mac Pro.


    image

    One of the unsung beauties of the later nVidia cards is that they have a Mac BIOS built in. Gone are the days of having to buy a “Mac” version of a graphics card that came preloaded with the proper BIOS, or taking matters into your own hands by locating a Mac ROM, flashing your graphics card and installing the drivers (the first graphics card I flashed, a GeForce 3, required a PC to run NVflash.exe). Even as late as 2011, I flashed an AMD Radeon 6870 with Mac EFI firmware. Now you can buy cards off the shelf and jam them into your Mac Pro.

    I've noted the process in my "Recommended Mac Pro Upgrades" but I realize some users may want a more in-depth guide. Dropping a $200 graphics card into a 6 year old computer might seem crazy, but with a GTX760, Borderlands: The Pre-Sequel and Shadow of Mordor both run at 2560 x 1440 (maxed settings for Borderlands: The Pre-Sequel; nearly maxed for Shadow of Mordor, sans the 4 GB textures, since I opted for the 2 GB GTX760). I've attached benchmarks at the bottom, further evidence that the Mac Pro 2008 is still a viable computer for gaming. You can even find videos of YouTubers jamming 970s into Mac Pros, so the sky's the limit.

    This guide is also a blueprint for anyone looking to stick a GTX960 or GTX970 into a Mac Pro.

    Here's how to get your Mac Pro loaded up with a snazzy new graphics card in three easy steps!

    Be aware:

    1. The nVidia drivers currently require 10.9 Mavericks or above, although you may be able to track down older drivers.
    2. The nVidia cards will not display the EFI boot screen (the screen you see when you hold down the option key, with the Apple logo). You'll want a backup graphics card in another slot with a monitor connected to see video before the OS X login screen. I personally use the ATI Radeon HD 2600 XT that shipped with my computer, but any graphics card will do, flashed or factory, as long as it can display the Apple logo on boot. You can operate the computer without an EFI graphics card, but you'll have to manage booting using Startup Disk in OS X and the Boot Camp tools in Windows. At the very least, keep your old graphics card somewhere safe should you ever need to do troubleshooting.

    Step 1) 

    Purchase a graphics card, and extra mini-PCIe to standard PCIe power cable(s).

    The Mac Pros use a funky mini six-pin port, so you can't just buy any PCIe power cable. The Mac Pro ships with two power ports. I personally bought a PCIe PCI-e Dual 6 pin Power Cable for Mac G5 nVidia ATI Video Card (disregard the G5 mislabeling on Amazon; it's for a Mac Pro).

    Modern graphics cards are power hogs, especially compared to the later-generation Core series CPUs, which dropped power requirements with each generation. The number of power connectors reflects a card's power requirements.

    Chart of power delivery based on PCIe power connectors:

    75 Watts: no connector (slot power only)
    150 Watts: one six-pin connector
    225 Watts: two six-pin connectors (or one eight-pin)
    300 Watts: one eight-pin connector + one six-pin connector
    375 Watts: two eight-pin connectors
    450 Watts: two eight-pin connectors + one six-pin connector

    Since the 2006-2008 Mac Pros do not have eight-pin outs, you'll need a six-to-eight-pin adapter, which requires two sets of six-pin cables (hence where the Y cable comes into play). For most GTX 760s and 770s you'll need three six-pin power cables, meaning at least one Y cable like the one linked above.

    Next, any off-the-shelf GeForce GTX760, GTX770, or GTX780 will do. Personally I picked up an EVGA GeForce GTX760 SuperClocked as I'm only a moderate gamer; you certainly will see benefits by dropping more money on a higher-end card. I'd recommend more serious gamers get one of the 4 GB cards, as the newest games use the additional VRAM for high resolution textures. The 4 GB GTX770s run about $350 as of this writing: spendy, but incredibly powerful for the dollar.

    Some GeForce GTXes may require one or two eight-pin plugs. The graphics card I purchased required one eight-pin cable and came with a six-to-eight-pin adapter. This means I needed a grand total of three six-pin cables.

    The Mac Pro 2008 shipped with an incredibly beefy 980w PSU. According to Apple, even under full load the Mac Pro's CPUs only draw 318w. Even with all 4 bays filled with 3.5 inch HDs (under full load, 7200 RPM drives only consume roughly 10w each) and a full RAM loadout (10w a stick under max load), that would leave roughly 450w (if it were even feasible to trigger full load on the CPUs/RAM/HDDs simultaneously) for the motherboard, fans, and PCIe cards. Needless to say, the Mac Pro has plenty of headroom (short of trying to run SLI).

    Summary

    The TL;DR: You'll likely need additional power cables made specifically for the Mac Pro's mini six-pin PCIe power ports. You can get them here: PCIe PCI-e Dual 6 pin Power Cable for Mac G5 nVidia ATI Video Card (disregard the G5 mislabeling). Unless you're attempting an SLI configuration, your Mac Pro has plenty of wattage to spare.

    Please read, September 1 2015 update: I've received a few e-mails about cabling. You will need to utilize both PCIe power ports on your Mac Pro. This means you'll likely need to purchase the PCIe PCI-e Dual 6 pin Power Cable. One of the Mac Pro's six-pin ports should connect to the six-pin power input on the graphics card, and the other port should connect to a six-to-eight-pin adapter. Many graphics cards come with 6-to-8-pin Y adapters. A reader reports that trying to run a card's six-pin and eight-pin inputs off a single PCIe power port does not work. Use both ports.

    Step 2) 

    Plugging in your PCIe power cables.  


    image

    The PCIe power cables are located near the front of the PCI chamber beneath the forward fans, right next to the AirPort card (or AirPort card slot). It's a squeeze, but I was able to do it with my bare hands.

    Step 3)

    You'll want to have a secondary graphics card; otherwise this can be a pain. Install the new graphics card and move your old graphics card to another slot. Leave a monitor connected to your old graphics card (if you have two displays, connect one to each card) and boot up your Mac Pro. Download and install the drivers from here. The drivers will require a reboot. On reboot, plug your monitor into your new graphics card. If you have a secondary display, leave one monitor plugged into your old card if you'd like to keep the ability to see the EFI boot screens.

    If your new video card still isn't outputting video, go to System Preferences, open the nVidia Driver Manager, and make sure you're using nVidia's web drivers.

    image

    Installing on Windows requires the usual process of downloading and installing the drivers from nVidia's website.

    Benchmarks and Comparisons

    You're probably wondering if it's even worth sticking a GTX760 in a six-year-old computer versus a Hackintosh with a Core i7 3770k on a Quo motherboard. The results may be surprising. All the benchmarks were performed with the exact same install of OS X.

    OS X 10.9 (Mavericks) benchmarks:

    Settings:

    Unigine Heaven Benchmark 4.0 

    OpenGL 1600x900 8xAA windowed, Quality: Ultra, Tessellation: Extreme

    Mac Pro 2008 (Xeon E5462 2.8 GHz) + GeForce GTX 760 + 14 GB RAM + Samsung 840 750 GB SSD

    FPS: 32.7

    Score: 825

    Min FPS: 15.7

    Max FPS: 68.9

    Hackintosh (i7 3770k 3.5 GHz) + GeForce GTX 760 + 16 GB RAM + Samsung 840 750 GB SSD

    FPS: 35.7

    Score: 899

    Min FPS: 7.4

    Max FPS: 91.2

    Hackintosh (i7 3770k 3.5 GHz) + GeForce GTX 770 + 16 GB RAM + Samsung 840 750 GB SSD

    FPS: 43.0

    Score: 1082

    Min FPS: 10.3

    Max FPS: 103.6

    ——————————————————————————

    OpenGL 2560 x 1440 8xAA FullScreen Quality:Ultra Tessellation: Extreme

    Mac Pro 2008 (Xeon E5462 2.8 GHz) + GeForce GTX 760 + 14 GB RAM + Samsung 840 750 GB SSD

    FPS: 16.1

    Score: 405

    Min FPS: 5.8

    Max FPS: 37.4

    Hackintosh (i7 3770k 3.5 GHz) + GeForce GTX 760 + 16 GB RAM + Samsung 840 750 GB SSD

    FPS: 15.7

    Score: 396

    Min FPS: 6.9

    Max FPS: 37.3

     

    Hackintosh (i7 3770k 3.5 GHz) + GeForce GTX 770 + 16 GB RAM + Samsung 840 750 GB SSD

    FPS: 18.8

    Score: 474

    Min FPS: 7.6

    Max FPS: 47.5

    The Core i7 has an advantage, especially in brief maximums, but it shouldn't come as a huge surprise that the Mac Pro holds its own. At the max resolution, the Mac Pro actually bests the faster Core i7. Notably, the GTX770 makes a difference, but not quite as much as one might suspect under OS X. Surprising indeed. What's clear is that the GPU is a bigger bottleneck than the CPU.

    I'm left to purely speculate that this may be due in part to the Mac Pro having 12 MB of L2 cache per CPU versus 8 MB of shared cache, 32 KB of L1 cache per CPU versus a single 32 KB of L1, and a 6.4 GT/s bus versus the 5 GT/s found in the Z77 Express chipset. It could also be as simple as a driver update. The big takeaway is that a six-year-old computer was able to compete with a CPU released nearly five years later.

    The Core i7's beefier bus speed, newer instruction sets, higher clock speed, and Turbo Boost 2.0 (CPU auto-overclocking) all aid in higher max frame rates when the GPU isn't under full load. The Unigine benchmark is a nice stress test of OpenGL but may not paint as full a picture as Windows-only benchmarks like 3DMark. Notably, Unigine also produces better OpenGL numbers in Windows, and better still with DirectX.

    I'd love to see the 2014 5k iMac with the Radeon R9 M295X thrown into the mix. It's likely my Mac Pro 2008 would be only a few FPS shy of the iMac, based on GPU Boss.

    The short of it: if you want a gaming Mac, an older Mac Pro with a modern GPU is still your best bet. The new iMac 5k display is something to marvel at, but the GPU (even if absurdly powerful for a mobile chipset) is not, when pushing so many pixels. For about $600 you can score a 2008 Mac Pro, making it a bit of a steal.

    The Core i7 3770k was released on April 29, 2012, versus the E5462 on November 12, 2007.

    Update January 19th, 2015

    What About the GeForce GTX 9xxs / 10xx you may ask?

    If you're like me, you may be wondering if you can jam a GeForce GTX 980, 970, or 960, or a GTX 1080, 1070, 1060, etc. into your Mac Pro. The answer is: yes, yes you can. You gotta love the Hackintosh community and nVidia. You can even watch YouTubers jam 970s into Mac Pros.

    Update September 1st, 2015

    Added more information about the Y adapter and the GeForce GTX 9xx series. Also, the GeForce 760 was more than enough to play GTA V nearly maxed out at 2560 x 1440 with medium textures. To use the high resolution textures you need a 4 GB card, and the same goes for Shadow of Mordor. In hindsight I probably should have stuck with the 770 4GB edition.

    The Witcher 3 runs fantastically too. My 2013 Retina MacBook Pro with the dedicated GeForce GT 650m simply cannot play these games. The 2008 Mac Pro can. Enough said.

    Update January 3rd, 2016

    Noticed that the driver information listed Mavericks or Yosemite, changed it to “Mavericks or Above”. Added a little more clarification after a reader asked about EFI boot screens. Also, I’ll probably end up making a sequel article when it comes time to get a GeForce GTX 970.

    Update March 29th, 2018

    Minor typo fixes and new intro, added info about the 1000 series.


    The 4k buzzword and what it means for the web...

    Today I gave a presentation on the 4k web and what it means to our designers. Our designers are sharp, but I figured I'd share my notes since I spent so much time on them, creating charts.

    I wrote a blog post about the 4k web, go read it here

    What is 4k?

    • 4k is an umbrella term for multiple screen resolutions, most commonly 3840 × 2160.  
    • The resolution bump started in Hollywood for digital work flows, for scanned film negatives and digital video shot at a resolution that was considered “Film” grade.  Projectors and very large screen TVs soon followed, replacing the previous generation of “2k” digital projectors.
    • Outside of cinema, 4k video content is relatively scarce due to bandwidth constraints. Currently YouTube offers experimental 4k content and Sony Pictures offers a digital-only movie store.
    • Roughly a year and a half ago, the cheapest 4k displays cost roughly $2000, making them almost exclusively a high-end luxury item.
    • 4k displays recently dropped in price (and size) making them affordable to the common person. Computer monitors carrying a 4k resolution can be had for as little as $500.

    Retina For Desktop

    Time for a bunch of numbers

    • Previously, the standard resolution for a 21-24 inch monitor has been 1920 x 1080 aka 1080p (or 1920 x 1200 depending on aspect ratio).
    • Currently there are 24 inch - 30 inch 4k computer monitors on sale at different price points, typically carrying a resolution of 3840 × 2160: twice the linear resolution of 1080p, and 4x as many pixels. 
    • TVs range from 32-105 inches and typically carry the same 3840 × 2160 resolution and 16:9 aspect ratio. Extreme high-end "5k" ultra-widescreen TVs carrying a staggering 5120 x 2160 resolution have been demoed at several press events. 8k TVs are also in the works.

    image

    • 4k isn't different from current "retina" display devices like the MacBook Retina or mobile devices. It means that desktop displays will have roughly 1.5x - 2x as many pixels-per-inch (PPI) as the displays commonly available today. 
    • However… 4k displays range from as small as 21 inches to as large as 105 inches, meaning that 4k doesn't always mean retina. 
    • A 42 inch 4k monitor/TV has roughly 105 pixels-per-inch, the same pixel density as standard computer monitors like the 27" and 21" iMacs.
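    Those PPI figures come from simple geometry: the diagonal pixel count divided by the diagonal size in inches. A quick sketch in JavaScript:

    ```javascript
    // Pixels-per-inch from a display's resolution and diagonal size.
    // PPI = sqrt(width^2 + height^2) / diagonal inches
    function ppi(widthPx, heightPx, diagonalInches) {
      return Math.hypot(widthPx, heightPx) / diagonalInches;
    }

    console.log(Math.round(ppi(3840, 2160, 42)));  // 42" 4k TV: ~105 PPI
    console.log(Math.round(ppi(3840, 2160, 24)));  // 24" 4k monitor: ~184 PPI
    ```

    Same resolution, wildly different density: which is why a 4k TV is not "retina" but a 4k desktop monitor is.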

    Why is 4k special?

    Desktop computers have been the last roadblock to resolution independence strategies, which so far have centered on mobile devices (phones/tablets). 

    In the near future, almost all computers and most displays sold will feature high density displays, roughly 180+ PPI. This marks a drastic change, as monitors from 1990-2012 have almost exclusively been 80-110 PPI. 

    image

    image

    Too many screen sizes

    To address the massive increase in screen sizes and pixels-per-inch, designers have developed several strategies for resolution independence.

    They are as follows:

    1. Responsive design
    2. High resolution images / resolution independent formats
    3. Content delivery methods

    Designing for the Future

    And other clichés….

    • Some elements are naturally resolution independent: anything rendered by pure CSS (effects drawn by the browser such as shadows, gradients, and shapes), typography (fonts and icon fonts), and vector formats like SVG and AI (Adobe Illustrator).
    • Regardless of where these assets are displayed, they will render at the highest resolution possible, as the shapes are described by math instead of pixels.
    • Raster formats (things that contain pixels instead of math) are not inherently resolution independent. These are typically photographs; common formats include JPG, RAW, PNG, BMP, and GIF.

    Does this mean that formats like SVG are the future? 

    Not entirely. Simple graphics are easily represented by math. Photographs and complex images are not. For a full explanation of raster vs vector, go check out this article.

    • Anything that is a photograph or complex image needs a higher resolution version for high density screens.
    • Photos, complex images, and video (anything that is not vector based) are the Achilles' heel of resolution independence. 
    • Upscaling raster assets beyond their native resolution produces blurry, pixelated results on high density displays.
    • Serving high resolution assets to users without high density displays provides no benefit.
    • Larger images (and video/animation) mean much bigger asset file sizes, consuming more bandwidth and taxing users' connections. 
    • Not all screens have the same PPI (some screens are sharper than others).

    Designs today not only need to be responsive, they need to include high resolution assets to appear sharp on high density displays.
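    As a concrete example, a minimal CSS sketch (the .hero selector, file names, and dimensions here are placeholders) that serves a 2x background image only to high density displays:

    ```css
    /* Standard density displays get the 1x asset. */
    .hero {
      background-image: url("hero.png");
      background-size: 400px 300px;
    }

    /* High density (retina) displays get the 2x asset, drawn into the
       same 400x300 box so the layout doesn't change. */
    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
      .hero {
        background-image: url("hero@2x.png");
      }
    }
    ```

    Only one of the two images is downloaded, so standard density users don't pay the bandwidth cost of the 2x asset.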

    Strategies for resolution independence during design phase:

    • Use vector assets when possible (including SVGs or formats that can be converted into SVG). 
    • Make liberal use of Smart Objects in Photoshop, or switch to designing in vector programs like Sketch and Illustrator.
    • Smart Objects can contain much higher resolution photos, scaled down in the design while leaving the original image at full resolution. This results in larger PSD file sizes but makes it possible to extract high density images.

    Content Delivery Methods

    High resolution assets require more bandwidth and vector images require browser compatibility but fortunately we have many tools we can use!

    1) Ability to detect high resolution displays via CSS and Javascript; use polyfills and picture elements

    Scripts like retina.js are able to detect pixel density and serve up higher resolution versions of your assets. This means users will only see the retina versions of images if their device supports them.
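    The substitution itself can be sketched in a few lines of JavaScript. Note that retinaSrc below is a hypothetical helper for illustration, not retina.js's actual API, though the @2x file-naming convention is the one retina.js looks for:

    ```javascript
    // Given an image path and the device pixel ratio, return the @2x
    // variant for high density displays; otherwise leave it untouched.
    function retinaSrc(src, devicePixelRatio) {
      if (devicePixelRatio >= 2) {
        return src.replace(/(\.\w+)$/, "@2x$1");  // photo.jpg -> photo@2x.jpg
      }
      return src;
    }

    console.log(retinaSrc("hero.jpg", 2));  // "hero@2x.jpg"
    console.log(retinaSrc("hero.jpg", 1));  // "hero.jpg"
    ```

    In a browser you'd feed it window.devicePixelRatio and swap each image's src on page load, which is essentially what retina.js automates.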

    2) PNG fallbacks for old browsers that do not support SVG 

    Scripts like modernizr.js can detect whether a browser supports SVG and swap in PNGs for older browsers like IE8 and Android 2.2. SVGs can even support PNG fallbacks hardcoded into a page without JavaScript. Icon fonts also work in older browsers that lack SVG support, which covers simple icons.
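    For the no-JavaScript route, one common pattern (file names here are placeholders) wraps the bitmap fallback inside an object tag:

    ```html
    <!-- Browsers that can't render the SVG inside <object> display the
         <img> fallback instead; no script required. -->
    <object type="image/svg+xml" data="logo.svg">
      <img src="logo.png" alt="Company logo">
    </object>
    ```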

    3) Better compression

    Tools like ImageOptim and ImageAlpha allow us to further compress images, often without affecting image quality, or to reduce quality without much perceptual damage. High resolution assets can also be served with heavier compression.

    4) Content Delivery Networks (CDNs) 

    Content delivery networks can deliver multiple quality variants, based on compression and resolution, and serve the appropriate one for a given screen resolution, pixel density, and even bandwidth.

    5) Bandwidth Detection

    Emerging technologies allow us to detect whether the user is on a narrowband connection, such as EDGE, weak 3G, or slow DSL. High resolution assets can then be served only when the user has appropriate bandwidth.
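    A sketch of what bandwidth-tiered selection could look like in JavaScript. The connection labels follow the (still emerging) Network Information API's effectiveType values; the tier names are my own invention for illustration:

    ```javascript
    // Map a reported connection type to an asset quality tier.
    function assetQuality(effectiveType) {
      switch (effectiveType) {
        case "slow-2g":
        case "2g":
          return "low";     // EDGE-class: heavily compressed, 1x assets
        case "3g":
          return "medium";  // modest compression, 1x assets
        default:
          return "high";    // 4g or unknown: serve full retina assets
      }
    }

    console.log(assetQuality("2g"));  // "low"
    console.log(assetQuality("4g"));  // "high"
    ```

    In a browser this would read navigator.connection where available, falling back to "high" when the API isn't supported.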


    FontPrep - Drag and drop webfont generator - A Mini-Review

    image

    7/13/2015 Update

    I wrote this review in June 2013. Since then, FontPrep has gone open source. It no longer works under Mavericks or Yosemite. I still use it today, but inside a 10.7 virtual machine, when FontSquirrel doesn't suffice or I need to skirt around the copy-protection notices, as FontSquirrel has no way to verify one's license for a purchased font.

    It’s still an essential tool that I lean on a few times a year.


    Below is my original review:

    Intro

    Web fonts are a funny thing, as there isn't a universal format for all browsers. Most licensing is based on the honor system, with each foundry arbitrarily picking different conditions (page views, domains, average visitors) and expecting you to pay accordingly. It's a broken system, and it has really ushered in Google Fonts and Adobe Typekit, for better or worse.

    Fonts have zero copy protection. Essentially, an embedded font is a localized install. While OS X and Windows do not use WOFF or SVG fonts, they do use TTF. Simply load a webpage with an embedded font and you can locate the TTF file, download it, and install it. This is problematic, as high quality fonts take hundreds of hours to create, and foundries are left out in the cold.

    Like many things, the internet has been a gift and curse for typographers, as they can distribute their work easily but have it stolen even more easily.

    Now this isn't meant to be a lecture on web fonts, but it's important context when discussing FontPrep.

    I ran across this app the other day: a drag and drop webfont generator. Web designers probably already know about FontSquirrel, a web service / font distributor that lets you generate web font embeds. This is awesome. It's free to use and it works wonderfully… except when it doesn't.

    I had a client who insisted on using Papyrus on their website (yes, I know, Papyrus is the new Comic Sans). They purchased a license for it, which was very non-specific but seemed to include embedding privileges. However, I couldn't create an embeddable version using Font Squirrel, despite legally owning the font, because it was blacklisted. The digital license only included a TrueType version. Eventually I used FontLab to strip the font data and created a new dummy Papyrus font, effectively removing the blacklist, based on my liberal interpretation of the license, which stated the font could be modified. That put me in a much greyer legal zone than if Font Squirrel had simply let me embed the font.

    FontPrep

    image

    FontPrep functions almost to the letter (font puns!) like Font Squirrel, but with drag and drop functionality and (currently) no blacklisting. That will change, but for now any font can be converted to a webfont. It's a good and bad thing, but when so much falls into a legal grey area, it's nice to have a backup.

    So how does it work?

    FontPrep is essentially a bootstrapped Ruby project wrapped in an app container. It's far from the pinnacle of application design, as it has zero OS menu integration. However, drag OS X fonts in and it'll spit out a CSS file, a simple HTML preview, and SVG, TTF, WOFF, and EOT versions, although in my tests the EOT was zero bytes :/

    image

    Where FontPrep shines is creating subsets: disabling characters you won't use. You can simply check characters off and see the results in real time. Reducing your glyph set will drastically reduce the size of your font. However, if you're expecting the expert features of Font Squirrel, you may be a bit disappointed.

    Font Squirrel allows you to change the hinting, remove kerning, and change rendering options.

    imageimage

    Final Thoughts

    At $5, it's not too much to ask and worth it (if the no-blacklisting feature is honored). I have legally purchased licenses only to be locked out of converting the fonts. I'm sure Adobe would have a field day over this, but I'm also sure Adobe's legal team is waiting to pounce on anyone dumb enough to try to embed Myriad Pro without a license.

    FontPrep is easy, but it doesn't offer much to distance itself from Font Squirrel. I don't blame the developers for not selling in the Mac App Store, but I imagine some users would pay $1 more for the convenience of App Store updates and downloads.


    image

    This is what the innards of the app look like: CSS, JS, and Ruby.


    Susy - Undefined mixin ‘span-columns'

    You may come across the following error if you ever use the Susy math library:

    "Undefined mixin 'span-columns'." 

    Wondering how to fix this? If you're opening a project that kicks out this error in Grunt (or whatever task runner you might be using), it means your @import susy; is pulling in the most recent version of Susy.

    Simply change wherever your Sass calls @import susy; to @import susyone;
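    In Sass, the one-line fix looks like this. As I understand it, susyone is the module Susy 2.x ships for its old 1.x syntax; the .sidebar selector and column counts below are purely illustrative:

    ```scss
    // Before: pulls in Susy 2.x, where span-columns no longer exists.
    // @import "susy";

    // After: susyone provides the Susy 1.x API, including span-columns.
    @import "susyone";

    .sidebar {
      @include span-columns(3, 12);  // now resolves again
    }
    ```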

    Took me longer than I'd care to admit to solve that.