Blog Updates

    Not that I have regular readers, but in keeping with the tradition of announcing changes, like my stupid spam bot solution, complaining about AI, adding dark mode, or general tweaks, I finally removed jQuery. That's the JS payload down from 70k to 7k. The only reason it was still around was FitVids.js, which I converted to vanilla JS.

    Also, my slogan now rotates through well over 100 phrases; some are a bit spicy... well, only if you're the sort of low-acuity person who still thinks Elon is cool. It's okay, I'm just a mean guy who doesn't use Twitter. My opinions don't count.

    I'll probably look at integrating this blog into ActivityPub to continue supporting the open internet.


    Setting up a Synology VPN for your Mac

    I know why you're here. It's because you want to connect to your local network over the internet, but Synology's guides are incomplete and a bit out of date, right? I don't know why I'm asking, as you can't respond. Anyhow, let's get to it.

    Step 1: Setting up your Router

    Router Config

    In your router, you'll need to set up port forwarding (sometimes labeled "Virtual Server" or "NAT Settings") for the following:

    • UDP port 500 (IKE)
    • UDP port 1701 (L2TP)
    • UDP port 4500 (IPSec NAT-Traversal)

    These must point to your Synology's internal IP address, like 192.168.x.x or 10.10.x.x. You may also need to unblock these ports in your firewall.
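If your router happens to be Linux-based and exposes a shell, the equivalent rules look something like this sketch. The WAN interface name (eth0) and the Synology's address (192.168.50.10) are assumptions; substitute your own.

```shell
# Forward the three L2TP/IPSec ports to the NAS (assumed at 192.168.50.10;
# eth0 is a placeholder for your WAN interface).
iptables -t nat -A PREROUTING -i eth0 -p udp --dport 500  -j DNAT --to-destination 192.168.50.10
iptables -t nat -A PREROUTING -i eth0 -p udp --dport 1701 -j DNAT --to-destination 192.168.50.10
iptables -t nat -A PREROUTING -i eth0 -p udp --dport 4500 -j DNAT --to-destination 192.168.50.10

# Allow the forwarded traffic through the router's firewall.
iptables -A FORWARD -p udp -d 192.168.50.10 -m multiport --dports 500,1701,4500 -j ACCEPT
```

In a consumer router's web UI, the same three forwards are simply three rows in the port-forwarding table.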

    Step 2: Setting up the VPN Server

    Synology VPN Config

    If you haven't already installed the VPN Server package from Synology on your NAS, you'll need to do that first.

    You have three main options for setting up a VPN, only two of which are real options: OpenVPN and L2TP over IPSec. The path of least resistance is L2TP over IPSec. You'll only need to configure the IP address range for VPN clients, which should sit alongside your existing subnet. My network is 192.168.50.x; to avoid any IP conflicts, I just add one to the third octet, so my VPNed devices will be 192.168.51.x.

    The second piece is the pre-shared key. Set this to whatever you'd like, but keep it handy; you'll need it when configuring each client.

    Step 3: Setting up your Mac

    Locate VPN in System Settings; in modern macOS, search "VPN," and it'll appear in the sidebar. Click Add VPN Configuration, and select L2TP over IPSec.

    VPN Config in macOS for Synology

    You'll need to configure the following:

    • Display name: This can be anything
    • Server address: This is your network's external IP. I couldn't get the QuickConnect URL to work, but there is probably a way. The easiest way to determine your IP is a "What's my IP" search in a web browser while connected to the same network as your Synology.
    • Account name: This must be a Synology user on your NAS. You can create an account just for VPN or use an existing account.
    • Password: This is your password for the Synology user.
    • Shared Secret: This is the pre-shared key you set up in the Synology VPN server.

    To test the connection, you'll need to use an external network; if you have an iPhone, a quick and easy way is to connect your Mac to your iPhone as a hotspot and test the VPN connection.

    If you have issues connecting to anything on your network, open your VPN configuration, click Options, and select "Send all traffic over VPN connection". This is a brute-force method that does exactly what it says: your Mac essentially exists on the same network as the rest of your devices.
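For those who prefer the Terminal, macOS's built-in scutil can drive the connection once it's configured, and curl can fetch the external IP for the Server address field. This is a sketch: "Synology VPN" stands in for whatever display name you chose, and api.ipify.org is just one of many "what's my IP" services.

```shell
#!/bin/sh
# "Synology VPN" is a placeholder for the Display name you picked above.
VPN_NAME="Synology VPN"

# Look up your network's external IP (run this while on your home network).
command -v curl >/dev/null && curl -4 -s --max-time 5 https://api.ipify.org || true

# scutil ships with macOS and can manage VPN services from the shell.
if command -v scutil >/dev/null; then
  scutil --nc list                  # show configured VPN services
  scutil --nc start "$VPN_NAME"    # connect
  scutil --nc status "$VPN_NAME"   # check the connection state
  scutil --nc stop "$VPN_NAME"     # disconnect
fi
```

This is handy for scripting the hotspot test described above: start the VPN, ping a device on your home subnet, and stop it again.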


    The Definitive Guide to iOS Emulation

    As of April 2024, Apple has allowed emulators as long as they don't use JIT compilation. This has opened the floodgates to a technology that virtually every other modern platform allows. Emulation can be used for many things, but gaming is the most popular use case for average users; thus, this guide will focus entirely on gaming.

    This guide is a living guide and is in the process of being built out. The goal is to demystify iOS emulation and make it accessible.

    Also available in video!

    If you prefer video, I've made one that covers everything you need to get started with iOS emulation.

    8 Bits of Power

    Does this guide seem familiar? Perhaps you've seen my The Definitive Classic Mac Pro (2006-2012) Upgrade Guide, The Definitive Trashcan Mac Pro 6.1 (Late 2013) Upgrade Guide, or The Definitive Mac Pro 2019 7,1 Upgrade Guide. These are all labors of love: free of charge, advertisements, and annoying trackers. You can find me on YouTube and Patreon.




    Glossary

    Emulation has a lot of jargon that comes with it. As a quick refresher, here's a list of terms that will be used throughout this guide.

    • BIOS (Basic Input/Output System) – A small program essential for some emulators to replicate the original hardware's startup process and functionality. Required for systems like PlayStation and Game Boy Advance.
    • Core – A specific emulator module within a front-end system (like RetroArch) designed to emulate a particular console. It is an application within an application.
    • Emulator – A software or hardware system that mimics the behavior of another system, allowing one device or platform to run software designed for a different environment. It replicates the original system's functionality, including hardware and software interactions, without requiring the original hardware.
    • Firmware – Low-level software stored in a device's ROM or flash memory that controls hardware functions. Some consoles require firmware files for proper emulation.
    • Frame Skip – A feature that skips rendering frames to improve performance, reducing the number of frames displayed per second (FPS). This was a common technique on underpowered hardware; modern iOS devices almost never need to make this compromise. Generally, poor performance is due to other factors, like using too many game enhancements.
    • Front-End – A graphical user interface that simplifies the process of using multiple emulators and managing game libraries. RetroArch is a popular front-end for multiple emulators.
    • JIT (Just-In-Time compilation) – A method that dynamically compiles code during execution, improving emulation performance but restricted by Apple's policies.
    • ISO – A disc image file (named for the ISO 9660 file system standard) that contains a complete copy of a CD/DVD game, commonly used for PlayStation, Dreamcast, and other disc-based consoles.
    • Native Resolution – The original display resolution of the emulated console.
    • ROM (Read-Only Memory) – A type of non-volatile memory that stores firmware or software permanently and cannot be easily modified or erased. In emulation, a ROM refers to a digital copy of a game's software extracted from a physical cartridge or disc, allowing it to be played on an emulator.
    • Save State – A snapshot of a game's current state that can be saved and loaded at any time, allowing players to resume gameplay from that point.
    • Side Loaded – Side loading is the process of installing apps on iOS devices from sources other than the official App Store, typically by using tools like AltStore, which allows access to emulators that use JIT compilation and other features restricted by Apple's App Store policies.



    Getting Started / Requirements

    Requirements:

    • A device running iOS 17, iPadOS 17, or later
    • Free space on your device
    • Optional but recommended: A gamepad
    • A bit of patience

    Modern iPhones and iPads are powerful machines; in raw CPU compute, the iPhone 16 Pro bests an 8-core Mac Pro 2019. Any device capable of running modern iOS has enough processing power to emulate many different platforms; even in the late 1990s, an iMac G3 233 MHz could emulate an NES and do reasonably well at SNES emulation. The biggest impediment for most devices will be storage, as games for 32-bit-era consoles like the Sony PlayStation, Sega Saturn, or PSP can easily eat 600 MB each, and in the case of the PSP, over 1 GB.

    An emulator cannot understand interactions the console was not programmed for, such as touching menu items in an SNES game. While emulators feature touch controls, gamepads are highly recommended, as console games are designed specifically for controllers; all touch controls are simply mapped to button presses. Some inputs, such as analog triggers, are difficult or impossible to use via touch.




    Legal Considerations for ROMs and BIOS Files

    The video above tells the story of Connectix VGS and how it enshrined emulators as legal.

    The legal status of game ROMs and console BIOS files exists in a complex gray area that varies by country and jurisdiction. While emulators themselves are generally legal software, the content they run often raises copyright concerns.

    ROM Files

    ROMs (Read-Only Memory) are digital copies of game cartridges, discs, or other media. From a legal standpoint:

    • Personal Backups: In many jurisdictions, making personal backup copies of games you legitimately own is considered legal under fair use doctrines.
    • Downloaded ROMs: Downloading ROMs of games you don't physically own is generally considered copyright infringement, even if you previously owned the game but no longer do.
    • Time Limitations: There is a common misconception that games become "abandonware" after a certain period. However, copyright protection typically lasts for decades (in the US, copyright extends 95 years for corporate works), and most classic games are still under copyright protection.

    Of course, said files are often distributed openly on the internet and found via search engines.

    BIOS Files

    BIOS files are even more legally sensitive than ROMs:

    • Copyright Protection: Console BIOS files are protected by copyright and are generally not intended for distribution.
    • No Abandonment Provisions: Even for discontinued consoles, the BIOS copyright remains in effect.
    • Reverse-Engineered Alternatives: This is why many emulators (like those mentioned in this guide) offer reverse-engineered open-source BIOS alternatives that don't infringe on copyrights.

    Best Practices

    To stay on the safer side of the legal spectrum:

    1. Only create backups of games you legally own.
    2. Don't distribute ROMs or BIOS files to others.
    3. Support developers by purchasing games when they're available on modern platforms.
    4. Consider using legal alternatives like official re-releases or subscription services that offer classic games.

    Many companies now offer legal ways to play classic games:

    • Nintendo Switch Online (NES, SNES, N64, Game Boy games)
    • PlayStation Plus (PlayStation classics)
    • Virtual Console and classic collections
    • GOG.com and other digital stores that sell classic games

    This guide is not intended to encourage copyright infringement. The technical information provided is for educational purposes and for those who wish to play games they legally own on modern devices.

    Note: This section provides general information, not legal advice. Laws vary by location and interpretation. When in doubt, consult legal resources specific to your region.




    iOS vs Android

    This guide will likely never feature a comprehensive breakdown comparing iOS vs Android, but Android has a considerable advantage over iOS.

    Due to its more open nature, Android has a decisive advantage. While iOS emulation dates back to the jailbreaking era, Android app stores have officially allowed emulation from virtually the beginning, meaning there are many more mature emulators. Android also places fewer restrictions on emulation; thus, emulators exist for more modern consoles, like the Sony PlayStation 2, Sega Dreamcast, GameCube, Wii, and even the Switch.

    The diversity of the Android ecosystem has spawned full-blown console-like Android devices such as the Odin 2, a high-end device that features a built-in gamepad akin to a portable videogame console. Devices like the Odin 2 feature memory cards, allowing for a relatively inexpensive way to store game collections.

    Android is also more forgiving about third-party controller mapping, whereas iOS is far more limited. This gives Android an accessibility edge, as less-conventional layouts and input devices can be mapped to user preference.

    Mainstream iOS emulation, while relatively young by comparison, still offers a great experience. In my experience, Android doesn't make setup any easier; rather, it offers a lot more options. This guide will help you get the most out of your iOS or iPadOS device.

    Android is the superior option if emulation is your primary concern.




    File Management

    wifi

    iOS has a very locked-down file system, but it does provide multiple ways to transfer data to and from your device. The most common methods are:

    • USB File Transfer - USB File Transfer is the most reliable and recommended method, but it requires a computer.
    • iCloud Drive - iCloud Drive allows for dynamic file management but requires a subscription for more than 5 GB of storage.
    • AirDrop - AirDrop is the most convenient but is limited to Apple devices.

    Transferring via USB

    Transferring files via USB, as stated, is the preferred method due to speed, reliability, and accessibility.

    1. Connect your iPhone or iPad to your computer via cable. You may need to authorize the device on your computer and/or device.
    2. Open Finder on your Mac or File Explorer on your Windows PC.
    3. Your device should appear in the sidebar or as a drive in Finder. Click on it to bring up the device pane.
    4. Click on the Files tab to access the file system. You should see a list of installed applications that support file transfers, including your emulators. Due to Apple's limitations, you cannot browse the individual files inside an app's folders from your computer.
    5. Drag and drop files onto the application's icon to copy them to your device. You can drag and drop entire folders.

    Regardless of transfer method, file management is almost entirely handled on the device using the Files app. For detailed instructions, see Organize files and folders in Files on iPhone. Files can be shared between applications. This is very useful for sharing ROM libraries between emulators such as RetroArch and Delta, which can both emulate a subset of the same consoles (NES, SNES, Game Boy, Game Boy Advance, DS, and N64); they can share files rather than storing duplicate copies of the same game.

    To select all, tap a file, and then tap Select All in the lower-left corner.

    To move a folder or file, long-press it, and then select Move.

    Third Party File Management

    Due to the arbitrary limitations Apple places on file management, there is a cottage industry of phone-management applications, the most prominent being iMazing. These applications allow viewing and editing the contents of directories on an iPhone. Unfortunately, they cost money, but they are easier to use than Apple's Files app.

    Adding games to Emulators

    Once games have been transferred, adding games to the emulator in question is relatively easy.

    Every emulator follows the same pattern for adding ROMs to its library: click some sort of add (+) or Add Games button, then locate and select the files, or point a scan function at a directory. A few emulators, like PPSSPP, will auto-scan a default location. Only DolphiniOS requires ROMs to be located in an exact directory.




    Emulators

    Pick a Path

    Emulators on iOS exist in two camps: App Store and Sideloaded (see lists below). iOS's emulation selection is slim, but fortunately, pretty much all of the major consoles are covered up to the 32-bit era. Here is the list of consoles supported, all of which have RetroArch support. Emulators like Delta use the same cores that are found in RetroArch.

    • Amstrad - CPC
    • Arcade - MAME / NeoGeo / CPS 1-2-3
    • Atari - 2600, 5200, 7800, Jaguar, Lynx
    • Bandai - WonderSwan
    • ColecoVision
    • Commodore - C64, C128, Plus4, Vic20, Amiga
    • DOS - DOSBox
    • GCE - Vectrex
    • Magnavox - Odyssey 2 / Philips Videopac+ (O2EM)
    • Microsoft - MSX+
    • NEC - PC Engine / TurboGrafx-16 / CD, PC-98, PC-FX
    • Nintendo - NES, SNES, N64, DS, Game Boy, Game Boy Color, Game Boy Advance, Virtual Boy
    • Palm OS
    • Sega - Master System, Game Gear, SG-1000, Genesis / MegaDrive, Saturn
    • Sharp - X68000
    • Sinclair - ZX 81, ZX Spectrum
    • SNK - NeoGeo Pocket / Color
    • Sony - PlayStation, PSP
    • 3DO
    • Thomson - MO/TO
    • Uzebox

    App Store Emulators

    There are several emulators available on the App Store that are sanctioned by Apple. These emulators are limited in scope and are generally focused on older consoles.

    Side Loaded Emulators

    There are several emulators that exist outside of the Apple App Store due to the policies Apple imposes, chiefly around the usage of JIT runtimes. The author responsible for porting DolphiniOS has a short blog post that explains the state of JIT on iOS in more depth.

    The current roster of non-App Store iOS apps is:




    Delta

    Links: App Store, Official website

    Delta

    Delta is a multi-emulator that supports a wide range of consoles, including the Nintendo Entertainment System (NES), Super Nintendo Entertainment System (SNES), Nintendo 64 (N64), Nintendo DS, Game Boy, Game Boy Color, and Game Boy Advance, and it provides a very native iOS UI, making it the easiest-to-use emulator in this list. It is free on the App Store. It's a minimalist emulator that favors ease of use over customization and seems entirely focused on Nintendo consoles.

    Adding ROMs

    Transfer ROMs to your iPhone; from the main screen, tap the + menu and locate your ROMs. Delta will automatically sort them between the supported consoles.

    Save States

    In Delta, save states are accessed by tapping the menu button during gameplay, which brings up the load state and save state menus. When saving a state, tap Save State and then tap the + button to create a new state. Loading is even easier: tap Load State from the menu, locate the save you'd like to load, and tap it. Your game will resume in the exact place depicted in the screenshot.

    Optional Nintendo DS BIOS

    Delta uses the melonDS core, which by default uses a reverse-engineered open-source BIOS and does not require Nintendo DS BIOS files to function. However, there may be extreme edge cases where this causes minor issues (likely with saving games). You can provide your own BIOS files by transferring the three required files to your iPhone. In Settings, locate Core Settings and tap Nintendo DS. You'll need to provide a bios7.bin, a bios9.bin, and a firmware.bin. Tap and link them.

    Nintendo (NES) Games not loading

    If you have a problem loading NES games, quit the emulator and relaunch it. Swipe to your NES collection, then long-press a game to bring up a preview; you should see the game running in the preview. From there, tap Open. NES games should then be playable.




    Gamma

    Links: App Store, Official website

    Gamma

    Gamma is a semi-paid emulator that offers an advertisement-supported free tier. It has received criticism as it collects some personal data (almost certainly for the advertisements). It is a reasonable one-time $5 purchase and is continually being improved.

    Adding ROMs

    Transfer ROMs to your iPhone; from the main screen, tap the + menu and locate your ROMs. Gamma supports bin/cue files and ISOs. The paid version supports compressed files but appears to only auto-decompress them.

    Save States

    In Gamma, save states are accessed by tapping the menu button during gameplay, which brings up the load state and save state menus. When saving a state, tap Save State and then tap the + button to create a new state. Loading is even easier: tap Load State from the menu, locate the save you'd like to load, and tap it. Your game will resume in the exact place depicted in the screenshot.

    Optional BIOS

    Gamma uses the PCSX-ReARMed core, which by default uses a reverse-engineered open-source BIOS and does not require a BIOS to function. However, there may be extreme edge cases where this causes minor issues (likely with saving games). You can provide your own BIOS file by transferring it to your iPhone. In Settings, locate Core Settings and tap PlayStation, then tap and link your BIOS file (it does not need to be named bios.bin).




    PPSSPP

    Links: App Store, App Store (Gold), Official website

    PPSSPP

    PPSSPP is the gold standard for emulation experience, as it can not only play PSP games but also enhance them beyond the original console's ability. Modern iPhones have enough horsepower to greatly enhance the graphics output of PSP titles.

    Adding ROMs

    Transfer ROMs to your iPhone; from the main screen, click the refresh menu and locate your ROMs. PPSSPP supports .iso, .cso, .pbp, and .chd.
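Of the supported formats, .chd is worth knowing about: it's a compressed disc-image format that can shrink disc dumps considerably. As a hedged sketch, assuming you have MAME's chdman tool installed and a legally dumped game.iso of your own (both names here are placeholders):

```shell
#!/bin/sh
# chdman ships with MAME; recent versions include a createdvd subcommand
# suitable for DVD-style images such as PSP ISOs.
if command -v chdman >/dev/null; then
  # --hunksize 2048 is commonly recommended for PSP compatibility.
  chdman createdvd --hunksize 2048 -i game.iso -o game.chd
fi
```

Convert before transferring; the smaller .chd is what you copy to the device.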

    Save States

    In PPSSPP, save states are accessed via the menu button, or by tapping the screen to bring up the touch controls; from there you can open the load state and save state menus.

    Graphics Options

    Rendering Mode

    • Backend: Determines the graphics API used for rendering; different backends may offer varying performance and compatibility on your device.
    • Rendering Resolutions: Controls the internal resolution at which games are rendered; higher values provide sharper visuals but require more processing power.
    • Software Rendering (Slow accurate): Uses CPU instead of GPU for rendering, providing better compatibility for problematic games at the cost of significantly lower performance.
    • Display Layout and Effects: Controls how the game is displayed on the screen and allows the application of visual effects like shaders or filters.

    Framerate Control

    • Frame skipping: Skips rendering certain frames to maintain game speed on slower hardware; helps performance but may cause visual stuttering.
    • Frame Skipping Type: Determines the method used when skipping frames, affecting how smoothly animations appear during performance optimization.
    • Auto frameskip: Automatically adjusts frame skipping based on current performance to maintain target speed; provides dynamic optimization without manual adjustment.
    • Alternative speed (in %, 0 - unlimited): Allows custom adjustment of emulation speed as a percentage of normal; useful for speed runs or slowing down challenging sections.
    • Alternative speed 2 (in %, 0 - unlimited): Provides a second preset speed option that can be quickly toggled; convenient for switching between different emulation speeds.

    Speed Hacks (Can cause rendering errors!)

    • Skip buffer effects: Bypasses certain visual effects that require buffer operations; improves performance but may cause visual glitches in some games.
    • Disable Culling: Turns off the removal of non-visible polygons; can fix broken visuals in some games but may reduce performance.
    • Skip GPU readbacks: Prevents the GPU from sending data back to the CPU; can significantly improve performance but may cause graphical issues or crashes.
    • Lazy texture caching (speedup): Delays texture loading until absolutely necessary; reduces memory usage and improves performance but may cause brief texture pop-in.
    • Spline/Bezier curves quality: Controls the precision of curved surface rendering; lower values improve performance, while higher values enhance the visual quality of curved objects.

    Performance

    • Render duplicate frames to 60 Hz: Repeats frames as needed to maintain smooth 60Hz output even when the game runs at lower framerates; reduces stutter but doesn't affect actual game speed.
    • Buffer graphics commands (faster, input lag): Queues up graphics commands for batch processing to improve performance; may introduce slight input delay as a trade-off.
    • Hardware Transform: Uses GPU acceleration for geometry transformations instead of CPU; typically provides significant performance gains but may cause visual glitches in some games.
    • Hardware Tessellation (greyed out): Would use GPU to generate additional geometric detail on surfaces; currently unavailable on iOS devices.

    Texture Scaling

    • Upscale Type: Determines the algorithm used to enhance texture quality; different methods offer varying balances between quality and performance.
    • Upscale Level: Controls how much textures are enhanced in resolution; higher values provide sharper textures but require more processing power and memory.
    • Deposterize: Reduces the banding effect in textures that have limited color gradients; smooths out color transitions but may slightly blur sharp edges.
    • Texture Shader (Greyed out): Would apply custom effects to textures; currently unavailable on iOS devices.

    Hack Settings (May cause glitches)

    • Lower resolution for effects (reduces artifacts): Renders certain visual effects at a lower resolution than the main game, which can improve performance and reduce visual artifacts in some games.



    RetroArch

    Links: App Store, Official website

    RetroArch

    RetroArch is a powerful front-end for emulation cores. The best way to think of it is as a bunch of separate emulators that a GUI glues together. The advantage of this is that emulators can improve and focus entirely on emulation and not user interfaces. It also means that certain aspects of configuration can be shared, such as user inputs, save-state preferences, and so on.

    Adding ROMs

    RetroArch can add individual games via Load Content, but it can also scan entire directories, multiple folders deep, from the add menu. However, it will not monitor a directory; a rescan is required if games are added or removed.

    Changing the user interface

    Change UI

    Out of the box, RetroArch is configured to use the GLUI, a user interface that is touch-friendly. If you intend to use a controller, I highly recommend changing the user interface.

    Tap "Settings" then "User Interface," scroll to the bottom of the screen, and tap "Menu." Select XMB. You'll need to quit RetroArch and relaunch it before the changes take effect.

    iOS RetroArch running XMB

    After the change, you should see the more visually appealing XMB interface. For a full menu map, see docs.libretro.com: XMB menu map, which lists where all the menu options live.

    This section is a work in progress!




    BIOS Files

    One Up

    The consoles that currently require a BIOS for operation in RetroArch on iOS are:

    • Game Boy Advance - gba_bios.bin
    • NeoGeo - neogeo.zip (placed in the ROMs folder as well)
    • Nintendo DS - bios7.bin, bios9.bin, firmware.bin
    • Sega CD - bios_CD_E.bin, bios_CD_J.bin, bios_CD_U.bin
    • PlayStation - scph5501.bin (many other BIOS revisions also work)
    • TurboGrafx-CD - syscard1.pce, syscard2.pce, syscard3.pce
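Before transferring, it can save a round trip to check that your local BIOS folder actually contains everything on the list above. A small sketch; the folder path is an assumption, so point BIOS_DIR wherever you keep your files:

```shell
#!/bin/sh
# Report which of the BIOS files named above are present in a local folder.
# BIOS_DIR defaults to the current directory -- an assumption, not a convention.
BIOS_DIR="${BIOS_DIR:-.}"
for f in gba_bios.bin neogeo.zip bios7.bin bios9.bin firmware.bin \
         bios_CD_E.bin bios_CD_J.bin bios_CD_U.bin scph5501.bin \
         syscard1.pce syscard2.pce syscard3.pce; do
  if [ -f "$BIOS_DIR/$f" ]; then
    echo "found:   $f"
  else
    echo "missing: $f"
  fi
done
```

Run it as, say, `BIOS_DIR=~/bios sh check-bios.sh` and transfer only once nothing you need is reported missing.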

    RetroArch maintains a list for every platform it supports and the BIOS required.




    Recommended Cores & Configurations

    Select a Core

    RetroArch often has multiple cores performing the same function. Where these exist, I've tried to narrow them down to what I've found to be the "best," in my opinion. Do experiment and read up on the core differences, as your priorities might differ: some cores are designed for accuracy, some for speed, and some for enhancements.

    • Game Boy / Game Boy Color - Gambatte
    • Game Boy Advance - mGBA
    • NEC PC Engine / TurboGrafx-16 / PC Engine-CD / TurboGrafx-CD - Beetle PCE
    • NES - Nestopia
    • Nintendo DS - melonDS
    • Sega Master System / Genesis (Mega Drive) / Sega CD - Genesis Plus GX / Genesis Plus GX Widescreen (may not work with all games)
    • Sega Saturn - Yabause
    • SNES - BSNES / BSNES HD Beta (for Mode 7 Games)
    • PlayStation - Beetle PSX HW

    Select A Core and Core Configuration

    Select a Core

    When you first select a ROM, you'll be presented with a variety of options, and you'll likely need to select a core. You can always change this later using "Set Core Association" or "Reset Core Association".

    Core Options

    Core Options in RetroArch are game system-specific settings that allow you to customize how each emulation core functions. Unlike global RetroArch settings that apply to all systems, Core Options are tailored to the specific emulated console or computer system you're using. Every core has its own set of options. From the menu when running a game, go to Core Options.

    Core Options

    Some emulator options exist outside of Core Options; for example, BSNES's HD Mode 7 options live in a separate menu, as do some settings for Beetle PSX HW.

    Each core has its own unique settings, which may include:
    • Graphics enhancements (resolution scaling, texture filtering)
    • Performance adjustments (speed hacks, frameskipping)
    • Audio quality settings
    • System-specific features (PGXP for PlayStation, HD Mode 7 for SNES)
    • Region options (NTSC/PAL)
    • Input latency adjustments

    Recommended Core Options for BSNES and BSNES HD Beta

    Mode 7 Enhancement

    BSNES is a cycle-accurate SNES emulator that aims to provide the most accurate emulation possible, but it also offers the ability to enhance Mode 7 games. Mode 7 titles include Pilotwings, Mario Kart, and F-Zero; for a complete list, see wikipedia.org: Mode 7. Both BSNES and BSNES HD Beta can enhance Mode 7 rendering, with the HD Beta adding the ability to render games in widescreen during Mode 7 sequences.

    Core Options

    HD Mode 7 enhancement menu

    As of this writing, the HD Mode 7 settings for BSNES live in different places depending on the version you are using: in BSNES, they're listed in the core menu as HD Mode 7, whereas in BSNES HD Beta, they're in the core options. HD Beta is much more CPU-intensive, so you may need to lower the scaling depending on your device.

    • Scale - 3x (720p) for HD Beta and older phones, 5x (1200p) for BSNES on newer phones
    • Perspective Correction - On
    • Super Sampling - On

    Recommended Core Options for Beetle PSX HW

    Hardware comparison

    Beetle PSX HW is an emulation core that offers quite a few options to enhance PlayStation graphics, such as increased resolution, better polygon processing, anti-aliasing, and more. Here are some recommended settings:

    Core Options

    • Internal GPU resolution - 2x or 4x
    • Texture filtering - SABR
    • MSAA - 2x or 4x or 8x

    Emulation Hacks

    Beetle PSX HW can manipulate the way the PlayStation renders polygons to help improve visual fidelity. Its PGXP (Parallel/Precision Geometry Transform Pipeline) capabilities correct the polygon jitter and texture warping that occur in PlayStation games due to the console's lack of floating-point precision. This feature enhances 3D rendering by maintaining proper polygon alignment and improving texture-mapping accuracy, resulting in a much more stable and visually pleasing image without the "wobbling" effect seen on original hardware. PGXP can be enabled in the core options and offers various levels of correction, from basic coordinate precision to advanced perspective-correct texturing.

    Go to the Emulation Hacks menu and select PGXP. Enable the following:

    • PGXP Operation Mode - Memory Only
    • PGXP Primitive Culling - On
    • PGXP Perspective Correct Texturing - On



    Shaders

    Shader comparison

    Shaders in RetroArch are visual filters that can transform the appearance of games by applying post-processing effects, such as CRT scanlines to mimic old TVs, smooth scaling to reduce pixelation, or various color adjustments to enhance visuals. They allow players to either recreate the authentic look of original hardware or dramatically improve and modernize the appearance of retro games without affecting the actual gameplay. They can be chained together.

    Shaders are applied post-emulation, when a single frame enters the frame buffer. This means they are agnostic of, or unaware of, the emulator that produced the frame. A good way to think of shaders in RetroArch is as filters, similar to those you can apply to a photograph after it's taken. You can't adjust the shutter speed, the aperture, or what's in frame or in focus, but you can simulate some of these things with a filter. The same goes for shaders: they cannot fundamentally change how a console is emulated to increase the resolution it outputs, but they can take the image it outputs and manipulate it.

    Users on popular forums and places like Reddit will often post their preferred settings, which can yield incredible results; see Reddit: Shaders are game changing for retro games and emulation for a good example.

    1. While playing a game, bring up the RetroArch menu
    2. Navigate to Quick Menu -> Shaders
    3. Enable the Shaders
    4. Select Load Shader Preset to browse available shader presets
    5. Choose a shader from the list (e.g., CRT shaders are in the "crt" folder)
    6. Once loaded, select Apply Changes to see the effect immediately

    Chaining Multiple Shaders

    1. From the Shaders menu, select Prepend Shader or Append Shader
    2. To save your chain, select Save Shader Preset from the Shaders menu

    Order matters! Place scaling shaders before visual effect shaders. Common chains include an upscaler (like "xbrz") followed by a CRT effect. Some popular combinations:

    • Scale2x + CRT-Royale (good balance of sharpness and authenticity)
    • SABR + Lottes (excellent for 16-bit era games)
    • xBRZ + Dot Matrix (perfect for handheld console emulation)
    • ScaleFX + CRT-Geom-Deluxe (for a more realistic CRT look)
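    If you'd rather build or tweak a chain by hand, a saved preset is just a plain-text `.slangp` file. Below is a rough sketch of what a two-pass chain might look like; the shader paths are examples and will depend on where your shader packs are installed, so treat the exact paths as placeholders:

```
shaders = "2"

shader0 = "shaders_slang/edge-smoothing/xbrz/shaders/xbrz-freescale.slang"
filter_linear0 = "false"
scale_type0 = "source"
scale0 = "2.0"

shader1 = "shaders_slang/crt/shaders/crt-geom.slang"
filter_linear1 = "true"
```

    Pass 0 (the upscaler) runs first, and its output feeds pass 1 (the CRT effect), which matches the "scaling before visual effects" ordering above.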

    Shaders can be configured on a per-console and per-title basis. Generally, shaders are more popular on non-polygon-based or 16-bit titles, as many modern emulators offer more opportunity to enhance visuals at the emulation level as opposed to post-processing the image.




    Save states

    While playing a game, bring up the RetroArch menu by tapping the screen with two fingers simultaneously (or using your configured menu button). In the Quick Menu, scroll down to find "Save States" and tap to enter this section. RetroArch provides 10 save slots (0-9) for each game. In the Save States menu, select "Save Slot" and choose your preferred slot number. To load a save state, select Load State and choose the slot you'd like to use.

    Auto Saves

    Auto Saves are a feature that will automatically save your game when you exit it, akin to how an iOS game can resume where you last were when tabbing between apps. To enable auto saves, go to Settings and then Saving, and toggle both Auto Save State and Auto Load State.
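    If you prefer editing the config file directly, these appear to be the corresponding keys in retroarch.cfg (key names are my best understanding; verify against your RetroArch version):

```
savestate_auto_save = "true"
savestate_auto_load = "true"
```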




    Hide overlay when Control is connected

    Retro Arch iOS hide overlay

    A setting that most users will want to enable is auto-hiding the on-screen overlay when a gamepad is connected.

    Tap Settings, select "User Interface," and then "On-Screen Overlay." Locate "Hide Overlay When Controller is Connected" and toggle it on. You can still access the touchscreen overlay by tapping the screen during gameplay.
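    This toggle also lives in retroarch.cfg; as best I can tell, the key is the one below (name may vary slightly between versions, so confirm against your config):

```
input_overlay_hide_when_gamepad_connected = "true"
```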

    Controllers and iOS

    Zike Controller in Cottonwood Canyon

    Out of the box, iOS supports various styles of controllers, such as the Backbone controller or popular repackaged options like the Zike Z331 / EasySMX M15 and the ATUTEN Phone Gaming Controller.

    Supported console controllers include:

    • PlayStation DualShock 4 Wireless Controller
    • PlayStation 5 DualSense Wireless Controller
    • PlayStation 5 DualSense Edge Wireless Controller
    • Xbox Wireless Controller with Bluetooth (Model 1708)
    • Xbox Wireless Controller Series S
    • Xbox Wireless Controller Series X
    • Xbox Elite Wireless Controller Series 2
    • Xbox Adaptive Controller
    • Nintendo Joy-Con Controllers*

    For more information, see: Apple.com: Connect a PlayStation wireless game controller to your Apple device and Apple.com: Connect an Xbox wireless game controller to your Apple device. *The Joycons do not seem to be officially supported, but they work. See TheVerge.com: iOS 16 supports Nintendo's Switch Pro and Joy-Con controllers.

    iOS, unfortunately, is limited in its ability to map buttons, and thus, not every button on a controller may work.

    Controller clips

    If you already have a game console controller, one of the least expensive ways to add a hardware controller to your iPhone is via clip mechanisms, like the OIVO PS4 Controller Phone Mount Clip or the OIVO PS5 Controller Phone Mount Clip. There are a lot of these on Amazon, like the Orzero PS5 Magnetic Controller Phone Mount Clip, which makes use of MagSafe, or the Orzero Magnetic Controller Phone Mount Clip for Xbox Series X/S and Xbox One/One S/One X. There are many to pick from; I suggest searching "Mobile Gaming Clip" and appending whatever parameters from there.

    Creating your own ISOs from PlayStation 1 games

    Creating PSX backups is pretty simple if you are terminal savvy. All you need is a Mac with a disc drive; making disc images can be done with Disk Utility or, more quickly, via the terminal. This process is often referred to as "ripping".

    Disk Utility

    Disk Utility is an application located in Applications -> Utilities on every Mac.

    1. Insert the PSX game disc into your Mac's DVD drive.
    2. Open Disk Utility (located in Applications -> Utilities).
    3. Click on the PSX game disc in the left-hand column of Disk Utility.
    4. Click the "New Image" button in the toolbar.
    5. Choose "DVD/CD Master" as the Image Format and "None" as the encryption.
    6. Click "Save" and wait for the disc to be copied to your Mac.
    7. Once the disc has been copied, you can rename the .cdr file to .iso.
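    Step 7 can also be done in Terminal. This sketch renames the image in place using shell parameter expansion; the path is an example, so adjust it to wherever you saved the image:

```shell
# Rename Disk Utility's .cdr output to .iso (example path; adjust to yours)
SRC="$HOME/Desktop/PS1_Backup.cdr"
if [ -f "$SRC" ]; then
  mv "$SRC" "${SRC%.cdr}.iso"   # strip the .cdr suffix and append .iso
fi
```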

    Terminal

    Using the terminal is a bit quicker and more flexible than Disk Utility. The following command will create an ISO file from a PSX game disc:

    hdiutil makehybrid -iso -joliet -o ~/Desktop/PS1_Backup.bin /Volumes/PLAYSTATION

    Explanation of the command:

    • hdiutil makehybrid: The macOS utility for creating disc images
    • -iso: Creates a standard ISO9660 filesystem (required for PS1 games)
    • -joliet: Adds Joliet extensions for longer filenames
    • -o ~/Desktop/PS1_Backup.bin: Specifies the output file (note the .bin extension commonly used for PS1 games; .iso should also be supported by most emulators)
    • /Volumes/PLAYSTATION: The input directory (mounted PS1 CD)

    You will need to alter the paths listed. I highly recommend using AI to demystify terminal commands. While AI isn't always reliable, it excels at debugging terminal commands. Both Claude and ChatGPT, even at the free tiers, are great for pasting error messages into or asking for help when modifying commands.

    Version History

    cart

    • 03/22/2025 - Added to-do; thanks to Reddit feedback, I didn't know Provenance was on the App Store. Added ISO section and Mac emulation links. Plenty to do!
    • 03/21/2025 - Minor Edits
    • 03/20/2025 - Flycast info, started recommended settings for Retro Arch, added Yabause recommendation
    • 03/19/2025 - Expanding content (added controllers, more images, more on RetroArch)
    • 03/18/2025 - Initial Draft

    To do list:

    • More RetroArch explainers
    • More on shaders
    • AltStore + JIT Streaming
    • Xcode App Signing
    • DolphiniOS / Provenance / MeloNX coverage
    • Animated Gifs where it makes sense to illustrate behaviors or settings
    • Eventually make a Mac version of this

    Mac Guides

    I've made quite a bit of emulation-related content for macOS as well.


    WordPress Auth0 for All Users Without WP Accounts: A plugin

    Sometimes my blog actually covers web development, which is ostensibly what this blog is about, instead of my random adventures in Mac geekery.

    Recently, I needed to work on a portal site and integrate it into Auth0. The ask seemed simple: Users are required to use Auth0 to view a website. That's it. The problem was that the official Auth0 plugin assumes a few things:

    1. Users have a WordPress account
    2. Auth0 is only needed for `wp-admin`.

    I am by no means any sort of security expert; thus, with a lot of googling and some AI-assisted development, in about two partial days I was able to create a simple solution in `functions.php` and a must-use plugin. It forced authentication on the front end, had a simple Auth0 callback, displayed a login screen if the user hadn't authenticated to our domain, and redirected to the correct page upon successful sign-in.

    Then, in the truest 2025 fashion, I copied my mess of code, pasted it into Claude, and after some fine-tuning created a plugin, wp-auth0-for-all.

    1. Install and configure the Login by Auth0 plugin
    2. Download and install wp-auth0-for-all in your plugins.
    3. Activate wp-auth0-for-all
    4. Configure it in your settings
    Auth0 for All

    This also includes some bonus features, like wildcard excluded domains, optional auto-redirect, and the ability to use custom Auth0 domains.

    That's it. It's a simple plugin that forces Auth0 for all users without a need for WP accounts. The plugin page has more information in the ReadMe.


    Running macOS using Docker... on a Synology NAS

    Docker lets you run macOS in the most unusual places, like a Synology NAS, a computer that lacks a dedicated HDMI output and uses an AMD chipset.

    Legal Disclaimer: According to Apple's EULA, virtualizing macOS is only permitted on genuine Apple computers. This guide is for educational purposes only.


    Why Docker and Not Synology's VM Manager?

    Synology provides its own virtual machine software, but it doesn't natively support macOS for two reasons:

    1. The legal restrictions mentioned above
    2. Until recently, it emulated an outdated chipset lacking USB 3.0 and PCIe support

    While Synology has updated their virtualization software to support the newer Q35 chipset for QEMU, I have not heard of anyone successfully installing macOS to run via its virtual machine software.

    What is Docker?

    For those unfamiliar, Docker is a containerization platform that uses packages and small containers to run services. It's a lightweight form of virtualization popular among developers. The beauty of this approach is that someone has already created a Docker container for macOS, making our experiment possible.

    Setting Up macOS Ventura on Synology

    Prerequisites

    • A Synology NAS (I'm using a DS923+ with 32GB RAM)
    • Container Manager (Docker) installed on your Synology
    • At least 8GB of RAM to allocate to the virtual machine

    Installation Steps

    1. Open Container Manager on your Synology
    2. Click on Registry and search for "macOS"
    3. Locate "docker/macos" and download the image
    4. Click Project, then Create
    5. Select your storage path
    6. Name your project (e.g., "ventura" or "sequoia")
    7. Create a Docker compose YAML file with the following configuration:
    version: '3'
    services:
      macos:
        container_name: ventura  # Or "sequoia" if using macOS Sequoia
        image: docker/macos
        environment:
          - MACOS_VERSION=ventura  # Change to "sequoia" for macOS Sequoia
        mem_limit: 8G  # Allocates 8GB RAM; the default is 4GB if not specified
        ports:
          - "5999:5999"  # VNC port for accessing the virtual machine

    Once configured, access your virtualized macOS through a web browser using the IP address of your Synology followed by the port number specified in your YAML file (e.g., http://192.168.1.100:5999).

    Performance and Usability

    The DS923+ with its dual-core AMD Ryzen R1600 isn't a performance monster, but it's surprisingly capable when running macOS Ventura:

    • It performed more than twice as fast as a Mac Mini 2010 in Geekbench 6 tests
    • The Weather app, which typically has graphical glitches on unsupported hardware, renders properly
    • Apple Maps, however, appears almost completely blank
    • Web browsing works well with Firefox (Safari experienced rendering issues)

    In my testing, the About This Mac page reported the system as an iMac Pro with a 2.6 GHz i3 processor, 7MB of VRAM, and 8GB of RAM (matching my allocation).

    Installation Note: During installation, the macOS installer may cause your Synology to reboot. Don't worry—just sign back in, restart the Container Manager, and continue where you left off. This might happen multiple times during the process.

    macOS Sequoia (macOS 15) Status

    As of this writing, macOS 15 Sequoia support is still problematic in this setup:

    • The installer background doesn't load properly
    • Performance is painfully slow
    • For now, older macOS versions like Ventura offer a better experience

    Running on Ubuntu

    The same approach works on Ubuntu (and potentially other Linux distributions) with a few additional terminal commands:

    # Install Docker
    sudo apt update
    sudo apt install docker.io
    
    # Start Docker
    sudo systemctl start docker
    
    # Add your user to the Docker group (takes effect on next login;
    # run "newgrp docker" to apply it in the current shell)
    sudo usermod -aG docker $USER
    
    # Navigate to your Docker compose file directory
    cd ~/Documents/docker-macos
    
    # Run the container
    docker-compose up

    When running on more powerful hardware (like a Mini PC with an i9-12900H), performance can rival that of a Mac Mini M1, even without GPU acceleration.

    Conclusion

    While macOS is usable through this Docker method on a Synology NAS, it's still a bit of a novelty. For serious VM work on NAS hardware, a Linux distro would likely be a better choice. That said, it's an impressive technical achievement and demonstrates the flexibility of both Docker and modern NAS systems.

    As dockerized macOS continues to develop, we may see improved performance and compatibility with newer macOS versions. For now, it's a fun experiment for the technically curious.

    Note: Docker containers can consume significant storage space. Remember to clean up unused images and containers when you want to free up space.


    Transform Your Apple Silicon Mac into a Steam Deck with Asahi Linux, A Tutorial

    Pre-Requisites and Warnings

    Asahi Linux is for Apple Silicon Macs. Intel Mac users can dual boot into Windows or popular distributions of Linux, which is significantly easier. If you're looking for an easier method of playing PC games on your Mac, I'd suggest my Install Windows Steam games on Apple Silicon Macs Using Whisky (A free GPTK Front-End) tutorial. Also, Crossover offers an even greater range of compatibility than Whisky, although it is paid software.


    Warning: This is still fairly experimental; be sure to back up all your data, as you can "brick" your Mac if you are not careful. This may leave the Mac unbootable and require a DFU restore.

    I need to reiterate that this is an experimental process, is constantly changing, and may not work as expected. For support questions, I'd recommend communities like r/AsahiLinux.

    • Documentation: Check the official Asahi Linux docs for up-to-date support information (currently, only M1 and M2 series are supported; M3 series support is TBA).
    • Storage: Ensure you have a significant amount of free SSD space (e.g., at least 100 GB free). As of now, installation to an external drive is not supported.
    • Data Safety: This process can potentially render your Mac unbootable and may require a DFU recovery (which needs a secondary Mac). Back up all your data before proceeding.

    Step-by-Step Installation Guide

    Step 1: Download and Run the Installer

    1. Visit the official Asahi Linux website and copy the provided curl command.
    2. Open Terminal on your Mac and paste the command. This command downloads the installer script (typically alx.sh) and pipes it to sh for execution.
    3. The guided installer will launch. Choose to resize an existing partition to create space for Linux.

    Step 2: Partition Your SSD

    1. The installer will display your SSD capacity and free space. Decide on a partition size. Note that the value you enter represents the space to be kept for macOS, with the remainder allocated for Linux.
    2. For example, on a 2 TB drive, if you want to allocate 500 GB to Linux, enter 1500 GB for macOS.
    3. Wait patiently while the partitioning process completes (this may take up to 2 hours).
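    The arithmetic above is simply the total disk size minus the space you want for Linux. A quick shell check, using the numbers from the example (these values are placeholders for your own drive):

```shell
# Given total disk size and desired Linux allocation (GB),
# print the value to enter for macOS in the Asahi installer
TOTAL_GB=2000
LINUX_GB=500
echo "Enter $((TOTAL_GB - LINUX_GB)) GB for macOS"   # prints: Enter 1500 GB for macOS
```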

    Step 3: Install Asahi Linux

    1. Once partitioning is complete, select the option to install Linux into the free space.
    2. Choose your preferred desktop environment. Options typically include KDE Plasma (for customization) or GNOME (for simplicity).
    3. Decide how much of the free space to use for Linux (usually 100%) and assign a name to the new OS.
    4. Proceed with the installation by following the on-screen prompts.

    Step 4: Boot into Asahi Linux

    1. After installation, fully shut down your Mac. This should be done via the terminal as the installer prompt will ask you to hit enter to shut down.
    2. Hold down the power button while turning on your computer, release it when the boot menu appears, and choose Asahi Linux.
    3. Boot into recovery mode where a Terminal window will open. Enter your password and agree to add a custom boot object (this sets your security to permissive mode for Linux).
    4. Create your username and password if prompted.

    Step 5: Set Up the Linux Environment

    Congrats! You should now be in Fedora Asahi Remix! Proceed through the basic installation prompts (language, user account, timezone, etc.). If you've made it here, you're past the difficult part. Now it's time to get things ready.

    1. Open Terminal in Asahi Linux.
    2. Update the system by running:
      sudo dnf upgrade
      This command updates all packages—including the latest Vulkan drivers.
    3. Install Steam by running:
      sudo dnf install steam
      The installation may take some time, and Steam should launch automatically once complete.

    Step 6: Configure Steam and Enable Proton

    You'll need to sign into your Steam account as you normally would and enable Steam Play.

    1. Log into Steam once it launches.
    2. To enable Proton (which allows Windows games to run), go to Steam Settings > Compatibility and check “Enable Steam Play.” Restart Steam or your computer if prompted.
    3. Before installing games, visit ProtonDB to verify compatibility. If a game isn’t listed as working, it will not function correctly under this configuration.

    Additional Notes

    • The first launch of a game may take longer due to shader compilation.
    • You can force different Proton versions per game. Right-click a game in Steam, select Properties > Compatibility, and choose your preferred Proton version.
    • Many games may still have issues due to the experimental nature of the setup.

    I need to reiterate that this is the early stages of support, as it's only recently that the Vulkan drivers have matured enough to support Proton. Asahi Linux still has a long way to go. It's a viable OS but has some beta bugs, such as not sleeping properly. I found my M1 Max was warm when I pulled it out of my backpack, and the battery was mostly drained, despite having put it to sleep.


    How to play PlayStation 2 Games on your Mac (PS2 emulation)

    This guide was originally written in 2022, and since then, it has been updated to reflect the latest changes in the PS2 emulation scene on macOS. The original guide was written when PCSX2 and AetherSX2 (An ARM native port of PCSX2) were both functioning and viable choices. AetherSX2 is no longer; thus, this is a new simplified guide with a new simplified video. I've also updated the guide in 2025 to include instructions on how to make your own game backups from macOS.

    For years, Windows has had a huge lead in emulating the Sony PlayStation 2 thanks to PCSX2, but as of roughly three years ago, there's been traction on updating the Mac port of PCSX2, now complete with Metal (Apple's Graphics API) support. This recent development makes performance better than ever for Mac users. This is an additional supplement to the video below, which covers the PS2 setup in greater detail.


    Advantages over a real PS2

    • Better visual fidelity: the ability to play in high definition, 4K and beyond!
    • Faster load times
    • Freeze states allow games to be resumed instantaneously
    • Ability to have virtual memory cards and download save states
    • Ability to load high-resolution texture packs for games

    This guide will cover the basics of playing PS2 games on your Mac. The things you will need:

    1. A (semi) modern Mac. Playstation 2 emulation doesn't require bleeding-edge hardware, but the faster the computer, the better the results.
    2. A game controller (preferably a PS4 or PS5 controller or Xbox series controller)
    3. Playstation 2 Games

    That's it. Pre-owned Playstation controllers are easy to come by, just like used games. This is a relatively cheap endeavor as there's a good chance you already have a controller and a USB cable to connect it to your Mac and even possibly the games.

    Downloading the Emulator

    For the first step, you'll need to download the correct emulator for your Mac. Since Metal is a recent addition to these emulators, we'll want the bleeding-edge versions. Intel Mac users will download the nightly build of PCSX2.

    Once downloaded, decompress the emulator. To open it for the first time on macOS 11 Big Sur or later, you must right-click it and select Open to allow the application to open.

    macOS 15 users will need to do the following:

    1. Double-click the application and cancel the dialog
    2. Go to System Settings
    3. Go to Privacy & Security
    4. Scroll down to the "Security" section
    5. Below "Allow Applications" there should be a message that PCSX2 was blocked to protect your Mac; click "Open Anyway" to whitelist it

    First Launch!

    PCSX2 now has a getting-started quick launch that assists with the initial setup. This remarkably improves the emulator setup. You will need the following:

    1. A controller connected to your Mac
    2. A copy of the PS2 Bios
    3. Games

    PlayStation 2 Bios

    Next, you will need to obtain the PlayStation 2 BIOS. The BIOS is the firmware for the PlayStation 2, which also contains its basic operating system. Downloading it is legally grey at best, so I will not directly link it. However, it can easily be found using popular search engines with phrases like "Archive.org PS2 Bios", or dumped manually from a physical PS2 for those who want to be 100% legal.

    PS2 Emulation - Controllers setup

    The PlayStation BIOS either needs to be placed in a default location or a location of your choosing. This is done in the "Getting Started" sequence, but it can always be updated later; go to Preferences -> Bios and point the emulator's BIOS directory to your PS2 BIOS. The BIOS should appear in the emulator's list if the versions are correct.

    Games

    Physical PS2 games can be dumped to ISO or even inserted into a DVD drive and played on your Mac (if you have a DVD drive). Games dumped as ISOs can be placed into a folder and displayed in a list format for easy browsing. From the Preferences, select the games list and add your folder to the games directory. If your folder has folders inside it, allow it to scan recursively.

    PS2 Emulation - Games list

    Creating ISOs using your Mac is pretty easy. If you have a DVD-equipped Mac or a SuperDrive, you can create an ISO by doing the following:

    1. Launch Disk Utility on your Mac (it's located in Applications -> Utilities)
    2. Insert the game disk into your DVD drive
    3. Right-click the disc in the left-hand column and choose the "Create Image from..." option
    4. Set the "Image Format" to "DVD/CD Master" and click the "Save" button. This will likely take several minutes
    5. Go to the location where you saved the disk image, and change the file name suffix from ".cdr" to ".iso".

    Please do not ask me for games. Yes, they exist on the internet; yes, you can download them, but I will not respond to requests about where to download them.

    Controllers

    While you can play PS2 games with a mouse and keyboard, the best way to enjoy PS2 games is to use a controller, preferably a Sony Playstation controller. The Sony PlayStation 3, 4, and 5 controllers are all great candidates, as they can be directly plugged into your Mac via USB or Bluetooth. During the getting started sequence, it will prompt you for a controller. Once plugged in or connected, select Automatic Binding in the upper right-hand corner and find your controller from the list. The emulator will automatically map the controller buttons.

    PS2 Emulation - Controllers setup

    Graphics

    Easily one of the best features of the PS2 emulator is the ability to enjoy old titles in HD. 3D games (games using polygons) will render natively, even up to 4k (or beyond), resulting in much sharper and clearer graphics. From the settings menu, select graphics. Make sure the emulator is using the Metal renderer.

    PS2 Emulation - Graphics setup

    By default, the renderer is set to Automatic, although you can force Metal from this tab. The default settings largely do not need to be touched. However, you'll want to click the Rendering tab, as this is where the bulk of the visual fidelity tweaks exist.

    The Rendering tab contains a plethora of options, but the two of most interest are Internal Resolution and Anisotropic Filtering. The internal resolution defines what resolution you are playing games at. Anisotropic filtering is a less noticeable but well-loved feature that affects how textures are rendered at extreme angles: the higher the filtering, the sharper textures will be when viewed from extreme angles. Both features come at a significant performance cost. It's best to play with the settings to find out what works for your Mac. Modern Macs with beefier CPUs and GPUs will be able to produce better results.

    PS2 Emulation - Rendering setup

    Tuning the graphics fidelity options is a trial-and-error process, as not all games will perform the same, and different portions of games may perform differently. I suggest successfully playing a few games for a bit first, then experimenting.

    PS2 Emulation - post-processing setup

    One somewhat new addition is the post-processing options. While they're largely gimmicky, many users will want to enable FXAA (Fast Approximate Anti-Aliasing), as it helps eliminate harsh edges on polygon graphics, aka "jaggies".

    Memory cards

    The original PS2 shipped with 8 MB cards and supported up to 64 MB cards. For a modern computer these are trivial amounts of space, and memory cards can be created and managed in the emulator's Memory Card section in the settings. Virtual memory cards can be downloaded from various sites with preloaded save states.

    To manage saves on memory cards, boot into the PS2's BIOS. This will allow you to manage the memory cards like a regular PlayStation 2.

    Creating your own ISOs from discs

    Creating ISOs from your own PS2 games is a great way to preserve your collection. It's also a great way to play games without needing to swap discs. To create an ISO from a PS2 game, you'll need a Mac with a DVD drive. Here's how to do it via Disk Utility:

    1. Insert the PS2 game disc into your Mac's DVD drive.
    2. Open Disk Utility (located in Applications -> Utilities).
    3. Click on the PS2 game disc in the left-hand column of Disk Utility.
    4. Click the "New Image" button in the toolbar.
    5. Choose "DVD/CD Master" as the Image Format and "None" as the encryption.
    6. Click "Save" and wait for the disc to be copied to your Mac.
    7. Once the disc has been copied, you can rename the .cdr file to .iso.

    You can also do this from the CLI using hdiutil, which is my preferred method.

    • hdiutil makehybrid: The macOS utility for creating disc images
    • -iso: Creates a standard ISO9660 filesystem (required for compatibility)
    • -joliet: Adds Joliet extensions for longer filenames (important for PS2 games)
    • -udf: Adds UDF support (critical for PS2 DVDs as they use UDF format)
    • -o ~/Desktop/PS2_Backup.iso: Specifies the output file path
    • /Volumes/PS2_GAME: The input directory (mounted PS2 DVD)

    Here's the command to create an ISO from a PS2 game disc. You may want to change the default volume name, and you will need to alter the paths to match.

    hdiutil makehybrid -iso -joliet -udf -default-volume-name "PS2GAME" -o ~/Desktop/PS2_Backup.iso /Volumes/PS2_GAME

    Complete 1:1 backup

    hdiutil makehybrid -iso -joliet -udf -verbose -all-files -o ~/Desktop/PS2_Backup.iso /Volumes/PS2_GAME

    While AI is often unreliable, it is extremely useful for debugging and explaining terminal commands. Both ChatGPT and Claude, even on the free tiers, are very useful.

    I've also made videos on emulating other PlayStation consoles.

    Playstation 3


    Playstation 1

    iOS Emulation (covers PPSSPP and PlayStation on iOS)


    Fixing Phoronix Test Suite for Apple Silicon

    So..... This is the sort of post that won't mean much to most people, but I can think of about three people who will be thankful. Right now, Phoronix Test Suite seems to be hard-coded for Intel. This means many dependencies run as x86 binaries, requiring Homebrew to be installed for x86 as well as arm64. The jump-off point for me was Phoronix Test Suite #110993.

    For a quick summary, brew installs to /usr/local/ on Intel Macs and /opt/homebrew/ on Apple Silicon. However, if you run, say, pts/webp on an ARM64 Mac, regardless of whether you have already installed libjpeg, it will complain:

    % phoronix-test-suite debug-run webp       
    
     Evaluating External Test Dependencies ...........................
    
    The following dependencies are needed and will be installed: 
    
    - jpeg
    
    This process may take several minutes.
    Warning: jpeg 9f is already installed and up-to-date.
    To reinstall 9f, run:
     brew reinstall jpeg
    
    There are dependencies still missing from the system:
    - JPEG Library
    
    1: Ignore missing dependencies and proceed with installation.
    2: Skip installing the tests with missing dependencies.
    3: Re-attempt to install the missing dependencies.
    4: Quit the current Phoronix Test Suite process.
    Missing dependencies action: 3
    
    The following dependencies are needed and will be installed: 
    
    - jpeg
     

    This error message gives you the option to reinstall the missing dependencies, which seems great until you try to use it. If you select option 3 and already have the dependency installed, it'll report it as installed:

     This process may take several minutes.
    Warning: jpeg 9f is already installed and up-to-date.
    To reinstall 9f, run:
     brew reinstall jpeg
    
    Phoronix Test Suite v10.8.4
    
     Installed:     pts/webp-1.4.0
     

    After the test attempts to run, we can see that the jpeg dependency is reported as not compiled, and thus the JPEG test file cannot be read.

    Test Run Directory: /Users/greg/.phoronix-test-suite/installed-tests/pts/webp-1.4.0/
    
    Test Run Command: ./webp -v -mt
    
    JPEG support not compiled. Please install the libjpeg development package before building.
    Error! Could not process file sample-photo-6000x4000.JPG
    Error! Cannot read input picture file 'sample-photo-6000x4000.JPG'

    You'll need to go into /opt/homebrew/opt/phoronix-test-suite/, then the version number (e.g., 10.8.4), then open share/phoronix-test-suite/pts-core/objects/client/pts_external_dependencies.php. Locate the $possible_paths declarations (there should be two in all) and add the following to each array: '/opt/homebrew/opt/', '/opt/homebrew/include/', '/opt/homebrew/'.

    This will now enable Phoronix to search for the ARM64 binaries.
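    For reference, after the edit, one of the $possible_paths declarations might look roughly like this. This is a sketch: the pre-existing entries vary by Phoronix Test Suite version, so keep whatever your copy already lists and append the three Homebrew prefixes.

```php
// Sketch of an edited $possible_paths array in pts_external_dependencies.php.
// The first entries are illustrative stand-ins for what already exists.
$possible_paths = array(
    '/usr/local/opt/',       // existing x86 Homebrew prefix (illustrative)
    '/usr/local/include/',   // existing x86 include path (illustrative)
    '/opt/homebrew/opt/',    // added: ARM64 Homebrew opt prefix
    '/opt/homebrew/include/',// added: ARM64 Homebrew headers
    '/opt/homebrew/',        // added: ARM64 Homebrew root
);
```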

    You may still encounter issues; I found that I needed to set the following in my terminal session so the webp test could find libjpeg:

    export LDFLAGS="-L/opt/homebrew/opt/jpeg/lib"
    export CPPFLAGS="-I/opt/homebrew/opt/jpeg/include"
    export PKG_CONFIG_PATH="/opt/homebrew/opt/jpeg/lib/pkgconfig"

    Apple Watch 5 to 10 Impressions

    Years ago, I purchased my first and only Apple Watch, the Series 5, featuring the always-on display. I picked it up once COVID shut down the world, as a way to track my physical activity now that I was no longer biking to work and could no longer go to the gym. It largely worked; I set my workout goal at 820 calories a day and have since hit it about 80% of the time.

    Apple Watch 5 and 10

    I am not a watch power user; the only notifications I allow are text messages/phone calls, exercise reminders, and navigation cues. Otherwise, my watch stays silent; in fact, it's almost always in silent mode. My Apple Watch 5 did what I wanted and needed well enough that I didn't bother considering any of the later models. Since that first always-on display, there have certainly been improvements, but few of interest beyond the blood oxygen monitor (more on that later).

    What finally pushed me over the edge to upgrade was battery life. Apple, of course, tactically prices its battery replacements in such a way that you might as well upgrade. Did I really want to spend $79 to replace the battery? I'd been living with a rapidly degrading battery for nearly a year, making it work by buying an extra charger for my car to top up before or after a hike, and I became used to mid-day charging. Finally, I decided it was time to either replace the battery or upgrade. I chose to upgrade.

    The Upgrade

    Over the five years since the release of the Apple Watch 5, there's been a steady stream of upgrades. Apple has tacked on faster SoCs, moving from the 7nm dual-core Tempest to the newer 4nm dual-core Sawtooth with more L1d and L2 cache, and added a more powerful Neural Engine. It also reduced the bezels, improved the screen brightness and viewing angles, improved the optical heart rate sensor, improved the accelerometer, included U1, added better Wi-Fi, removed Force Touch, added faster charging, improved the screen hardness, made the speaker louder, changed the sizing fractionally (thinner and larger), and added a blood oxygen sensor that has since been disabled due to legal disputes around the patent.

    Year over year, the Apple Watch generational change is not terribly meaningful, but stacked over a five-year window, they add up. This is the new normal as the space for meaningful upgrades closes. That said, do any of these matter?

    Yes. Overall, I've been rather impressed with how much better the screen is and how much smoother the animations are. In broad daylight, the screen is simply more visible. Also, the 42mm vs the old small 40mm is noticeable. As an active person, I don't like big watches, which is why I feel like the Apple Watch Ultra misses the mark. When hiking, exercising, biking, skateboarding, skiing, and the rare times I'm in a kayak, I want to be free of as many burdens as possible, and that includes my wrist. As an average-sized man, I could get the bigger watch, but I don't want the watch to be bigger. That said, I do appreciate the extra line of text it now can present. Apple tweaked the UI to display just a bit more information, which makes a world of difference.

    I've yet to really notice much difference in the heart rate monitoring, but I'm also not obsessively checking it. I look at my activity points and check my heart rate during a run or hike, and that's about it, but there is comfort in that it's more accurate.

    The speaker is noticeably better, now sounding like a cheap laptop speaker from 20 years ago instead of a buzzing pest. Is it amazing? Absolutely not, but it's now viable to take a call on your wrist in moderately noisy surroundings without smashing your ear to your watch.

    Is it worth the upgrade from the Apple Watch 5 to the 10? Mostly. If none of the features sound terribly exciting, just replace the battery and hold out for another year. Perhaps Apple will improve the battery life, which has been the Achilles' heel of the Watch, or deliver another nice, small quality-of-life upgrade.

    Therein lies the issue with the Apple Watch. If you've used one, there's not much to say; it does the Apple Watch things, like unlocking your computer, answering phone calls, sleep tracking, exercise monitoring, environmental noise monitoring, ECGs, functioning as a remote, finding your misplaced phone, and so on. These are all things my previous Apple Watch did and did well. It's just that everything now has a bit more polish. I suppose that's the sign of a very mature product. I like it, but unless Apple has a breakthrough in health tracking, such as blood sugar monitoring and/or body temperature monitoring, there won't be a compelling reason to upgrade for years to come, and that's okay.


    Play Dune 2, Command and Conquer: Tiberian Dawn, Dune 2000, and Command and Conquer: Red Alert on macOS

    Updated for 2024!

    Every now and again, I get a hankering for retro gaming, and it ends up on this blog. I never played Dune II: The Building of a Dynasty on a PC, only the Sega Genesis port, Dune: The Battle for Arrakis, so it was news to me that you could play Dune II on macOS. I assume anyone reading this probably knows the place Dune II holds in gaming history: it's largely considered the title that defined the real-time strategy (RTS) genre, or even the first RTS (even if that's not entirely correct). Also, SimAnt is the real first RTS ;)


    I can't say I have a special affinity for the genre, as pretty much the only other RTSes I've played are the original Command and Conquer and Warcraft 2, but I always liked Dune: The Battle for Arrakis and have revisited it via emulation a few times. I hoped Dune II or Dune 2000 would end up on a service like GOG.com, but sadly, neither has. Thanks to open source, both Dune II: The Building of a Dynasty and its successor, Dune 2000, can be played on macOS, natively and with some modern improvements.

    Disclaimer: By the letter of the law, abandonware isn't 100% legal, but there's no real legal vector to obtain these games, both over two decades old. I don't see a moral quandary here, but you can always obtain the original game disks if you see fit.

    Dune II using Dune Legacy

    Dune Legacy on macOS 10.14

    Dune Legacy gives a nice modern twist to the original Dune II, addressing its shortcomings with better AI, head-to-head play, the ability to group-select units, more hotkeys, modern resolutions, HD graphics, and so on.

    1. Search "Dune II Abandonware" or use archive.org. Download it.
    2. Download Dune Legacy
    3. Open the DMG and drag the Dune Legacy app to your Applications folder. Also, decompress the PC copy of the abandonware Dune II.
    4. Right-click the Dune Legacy app and click Show Package Contents. Within the app, open Contents -> Resources.
    5. Drag all the .PAK files from the decompressed Dune II into Dune Legacy -> Contents -> Resources.
    6. Double-click to start. You will most likely see a security warning; if so, right-click the application and select Open. You'll then see a dialog with an "Open" option; select it. On older macOS versions, go to System Preferences -> Security & Privacy.

    The security message relates to code signing: the developers of Dune Legacy do not pay for an Apple developer license; thus, the code is unsigned.

    Dune 2000 and/or Command and Conquer using Open RA

    OpenRA Dune 2000 on macOS 10.14

    OpenRA stands for "Open Red Alert," but it also includes Dune 2000 and Tiberian Dawn support, with modern touches much like Dune Legacy's: modern screen resolutions and minor tweaks. Unlike some of the other ports, OpenRA focuses on delivering a re-creation rather than improvements, plus online play. OpenRA has been progressively improving over the years; it used to require installing Mono, an open-source framework for .NET functionality, but no longer requires installing it independently. Also, at some point in the future, it'll support Tiberian Sun.

    1. Optional: Nab the Dune 2000 ISO for Windows from a site like myabandonware or archive.org.
    2. Download OpenRA. Each game comes as an independent app; install to your Applications folder.
    3. Launch the game and either select "Quick Install" or, if you have the original game discs, mount the ISO and select the advanced install.
    4. Also, see d2kplus for mods; some are supported in OpenRA.

    Game assets are installed in /Library/Application Support/OpenRA. If you delete OpenRA games, be sure to delete this folder as well to completely delete OpenRA. (This may also delete your game saves).


    Winning the war on Spam bots through stupidity

    It finally happened: a spambot defeated my super simple email obfuscation. I'd been using a simple JavaScript function to encode my email address for a month. It's not bot-proof, but anything that has to render the page out in full and perform an interaction faces a barrier, making spamming far more costly.

    Here's the email that defeated my spam protection:

    Hi Greggant Team,

    I trust this message finds you in good spirits. Your finance platform stood out to me – your content is both informative and engaging.

    Given our mutual interest in finance, I believe there's potential for collaboration between our platforms. Would you be interested in discussing this further?

    Looking forward to your response.

    -- Sara Evans
    Creative Writer
    itsevanssaraaaaa@gmail.com

    These emails are nothing new to anyone who runs a blog, usually some sort of backlinking scheme or scam to improve SEO. Obviously, this is pretty low effort as my blog is not related to finance in any way. It's a spray-and-pray approach. It must sometimes work as I used to get several a week, and it's been happening for years.

    My stupid solution

    I've added a simple question to the contact page. It's a simple math problem that requires a human to solve, adding up 3 + 5.

    That's it! Well.... actually, it's a little more complicated: I'm using crypto-js to obfuscate the email to make it more costly for a would-be spammer. The logic of my code looks like the following:

        // The encrypted email and the key live in the page source
        // (placeholder values shown here).
        const encryptedEmail = "...";
        const secretKey = "...";

        // Decrypt the email with crypto-js AES
        function decryptEmail(encryptedEmail, key) {
            const bytes = CryptoJS.AES.decrypt(encryptedEmail, key);
            return bytes.toString(CryptoJS.enc.Utf8);
        }
        // Prompt the user with a challenge
        const userAnswer = prompt("To reveal the email, please solve: 3 + 5");
        if (userAnswer === "8") {
            const decryptedEmail = decryptEmail(encryptedEmail, secretKey);
            const emailElement = document.getElementById("email");
            emailElement.href = "mailto:" + decryptedEmail;
            emailElement.textContent = decryptedEmail;
            emailElement.classList.remove('not-active');
        } else {
            alert("Incorrect answer. Please try again.");
        }

    The hilarious part is that I have both the key and the encrypted email in the source code. It's not secure, but it's not meant to be. It's meant to be a barrier that requires a human to interact, or at the very least a bot to load the entire DOM, including crypto-js. From my observation, this has been more effective than using reCAPTCHA.

    The basic principle of rolling your own email obfuscation is to require a sophisticated bot that consumes resources, a proof-of-work of sorts. It's trivial for a single user but costly for a mass operation. The issue with popular off-the-shelf solutions is that they can be specifically targeted. This isn't a forever solution, as eventually the bots will improve and the cost will fall further, but I imagine it will be workable for quite some time.


    Apple Maps web beta doesn't support Firefox, but you can still use it

    Firefox is not supported in Apple Maps

    It's been a long time in the making, but Apple Maps finally has a web version. This is significant, as Apple has for years allowed iOS app devs to use MapKit for free. For those unaware, Google charges for its Google Maps API, meaning that for larger or more complex web apps, developers are on the hook for the maps bill, generally to the tune of $5 per 1,000 requests, and depending on the interactions, it can be more.

    I don't think I need to explain why this is a big deal. The downside has always been that there isn't a web analog for Apple Maps, meaning apps that extend to web versions would require entirely different ecosystems. Apple hasn't said if or when it'll offer web integrations or at what price point, but any competition is healthy.... except when it doesn't support the open web.

    Perhaps this will change, but according to Apple, the only supported browsers are Chromium- and WebKit-based. If you visit in Firefox, you'll see the above message. The bigger irony is that Apple Maps works in Firefox: simply fake the User-Agent as Safari, and Firefox can access Apple Maps.

    I'm guessing this will change in the future, but it's still strange to see Firefox blacklisted. The irony is that because Firefox's and Chrome's release channels aren't tied to operating system updates, they generally remain more up-to-date than Safari.


    Blogging in the age of AI

    There's an air of futility in writing blog posts in the age of "Artificial Intelligence," as anything you write can and will be stolen without recourse. There's absolutely nothing I can do to stop billion-dollar corporations from hoovering up over a decade's worth of blog posts made in good faith to provide information freely to the open internet. Estimates are that worldwide traffic will fall roughly 30% as features like Google's A.I. Overviews cobble together broken synopses of information.

    Videos aren't safe either, as YouTube's transcriptions are easily stolen for A.I. data. Everything is a race to the bottom.... or is it? It's pretty easy to go full doomer in the face of A.I. but there are a few things worth calling out.

    Probably the biggest roadblock working against our current large language models is "good" data. All data pre-2021 can be assumed to be non-LLM-generated, and we're running out of it. To use Multiplicity as a reference point: "You know how when you make a copy of a copy, it's not as sharp as... well... the original." Well, we're fast entering the age of the copy-of-a-copy. We've moved well past the enshittening to the dead internet. Bots on bots.

    The other great hope is the cost of A.I. Right now, it's assumed OpenAI is losing a staggering $700,000 a day running ChatGPT. Make no mistake, this cost can and will come down, and local large language models can be pared down via quantization to lower bit depths, making models palatable for personal computing. But for now, we may be at the limits of LLMs, and the solution seems to be more LLMs, which isn't bringing down the cost of compute.

    Finally, there's the legislative and legal angle, which I hold less hope for. As much as ChatGPT has reduced the friction of my job, I'd trade it in a second for stability.

    If I were to shake a Magic 8-Ball, it'd read "uncertain, ask again," but here are a few of my predictions:

    • A.I. will continue to lower the barrier even further for low-effort spam content and content farms like Apple Daily and iLounge.
    • A cat-and-mouse game will arise from Google vs Dead Internet content farms and zombie sites.
    • A new value will be placed on social-proof content, such as YouTubers who show their faces and demonstrate they are indeed human, as it'll be a long time before purely A.I.-generated video can convincingly recreate the difficulties of long-form video without errors, especially in changing/complex environments. For the written word, Substacks from authors whose established presences extend to the real world will function as the social proof. Musicians have live performances. Graphic artists have physical media. If you're purely digital, expect a diminished return in the future.
    • We are fast approaching the law of diminishing returns. GPT-3 was the great leap forward, but the differences between 3.5/4 and 3 are much less mind-blowing. Other models, like Claude, are impressive, but none have been game-changing.
    • Future breakthroughs are likely to be task-specific. We've seen voice, text/coding, music, images, and video; now come more particular applications. We're likely to see, say, in music software, a scored section of MIDI translated more accurately to a string section, mimicking how a musician might actually play the score. We may see LLMs and machine learning applied to spreadsheet management. There are almost certainly companies looking at these two examples.
    • Snow Crash, Daemon/Freedom 2.0, and the hyper-derivative young-adult lesser work Ready Player One all had the idea of AR/VR wrong. While metaverses have existed, do exist, and will exist into the future, the backlash is happening: schools are starting to experiment with banning cellphones, states are pushing back against social media, and the federal government continues to flirt with banning TikTok. Instead, we'll see divisions. People may consume A.I.-tailored bullshit entertainment for cheap hits of dopamine, but we will also see pressure for the measurably human, akin to the DIY and right-to-repair movements.

    Now the fun part: seeing if I'm totally off base in roughly 2-3 years' time....


    Housekeeping a blog, a decade later

    This blog is fairly static by design, minimalistic, and simple, but it's not immune to changes that affect the end user: I've added printing, dark mode, topics, and so on.

    Changing Taglines

    Pictured: Examples of the random taglines


    • My contacts page uses cryptography to encode the email address, and a simple problem now reveals it. This has already resulted in a dramatic reduction in spam. It's not bot-proof, as anything that fully renders the page and interacts with it could defeat my solution, but requiring a full render and an interaction makes it far more costly.
    • I've updated to jQuery Slim. Sadly, the fitVid.js used for YouTube embeds requires jQuery. At some point, I'll rewrite it to pure JS and completely abandon jQuery. jQuery Slim shaves off about 20k of a page load.
    • The tagline now changes randomly. It's brain-dead simple JavaScript, but it should inject a bit more whimsy into the site. There are about 50 slogans in total. My long-standing tagline, "Adventures of a Front End Architect," was something I punched in nearly a decade ago. At the time, it seemed fine, but I grew to find it a bit "cringe," as the kids say.
    • My last several posts have been adapted from YouTube videos. This trend is sure to continue, as it makes sense to double-dip on my content. People can engage with my content however they prefer, whether via video or blog.
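    The tagline rotation can be sketched in a few lines; the phrases and element id below are placeholders for illustration, not the site's actual list or markup:

```javascript
// Minimal random-tagline picker; the phrases and the element id are
// placeholders, not the blog's real slogans.
const taglines = [
  "Adventures of a Front End Architect",
  "Now with more whimsy",
  "Still self-hosted",
];

// Pick one tagline uniformly at random.
function pickTagline(list) {
  return list[Math.floor(Math.random() * list.length)];
}

// In the browser this would run on page load, e.g.:
// document.getElementById("tagline").textContent = pickTagline(taglines);
console.log(pickTagline(taglines));
```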

    I missed an important milestone: my blog crossed a remarkable anniversary in early 2023, ten years. I backed off posting in late 2022, realizing how my content would only be mined, stolen, and regurgitated by AI without attribution. Ironically, YouTube is what brought me back, after realizing a small but not insignificant amount of my YouTube traffic could be attributed to my blog. Google might be eating its own tail with AI content, but it's still directing people here. Plus, this blog is an expression of and love letter to the internet as it once was. By 2013, before I even typed my first character into this blog, the internet had already been aggregated into social media, as laypeople could now participate in content creation.

    When I first started this blog, I was living in a different city, working for a failing company, and realized that I didn't have a social media presence. I started this blog with the ambition of writing about web dev, to create a footprint that potential employers could find. It sort of worked; I ended up being laid off, selling my house, and packing up for a much better job at a cool hipster agency in PDX (RIP Emerge). This blog took an unexpected turn, first documenting my experiences (for the first time) in an empty city during the pandemic and later launching a YouTube presence.

    Originally, I launched this blog on Tumblr, of all places, misunderstanding it as a platform for actual blogging. I quickly learned that Tumblr was Imgur with a feed for sad teenagers, not grown-ass adults blogging about development. I went to great lengths to create my own minimalist theme (removing many Tumblr interactions) and was one of only a handful of people with a custom domain for their Tumblr blog. This blog has been a throwback since its inception, and in 2016, after the final straw, I abandoned Tumblr. While there is more friction in self-hosting, I've been happier with the results. Even if YouTube implodes tomorrow, the tutorials I've created will live on.


    Running OS X 10.6 Snow Leopard in 2024

    This blog post is an adapted script from a YouTube video I wrote in 2023, one of my more popular videos. The video is the superior version; it outlines the entire process of downgrading a Mac Pro 2008 to run Snow Leopard and demonstrates running 10.6. This is a companion piece that serves as a general outline as opposed to a comprehensive overview; think of it as the TL;DR or Cliff's Notes. I highly recommend checking out the video.

    OS X Snow Leopard remains to this day the most loved version of macOS. I made a video about which version of macOS is the "best," and I felt I may have been a bit harsh as, like most people, I absolutely adored Snow Leopard.

    The question is, can you use OS X 10.6, released in 2009, in 2024? The answer is... yes, but with a lot of asterisks.

    • Snow Leopard only supports Intel Macs from 2005 - 2010.
    • Modern software will not work on it; software releases generally dropped support in the early 2010s.
    • Upgraded Macs like a Mac Pro 3,1/4,1/5,1 may need to be downgraded to hardware that was originally supported.
    • Resolutions beyond 1440p likely are not supported, and 10.6 does not have resolution scaling.

    For example, in the video, I had to install the original GPU on my Mac Pro 2008, downgrading from a GeForce 760 to an ATI Radeon 2600 XT. I also could not use Wi-Fi, as I'd upgraded the AirPort card to 802.11 AC/Bluetooth 4.x.

    Performance

    Snow Leopard was loved for feeling snappy, and it does live up to the hype, although this shouldn't come as much of a surprise running it on a Mac Pro 8-core 2.8 GHz 2008 with an SSD and 8 GB of RAM, which was well specced for 2009. Ironically, back in 2009, my Mac Pro 2008 had more RAM, sitting at 12 GB.

    Snow Leopard's speed, however, is a bit skin-deep: CNET in 2011 found that Lion actually had a mild performance uplift over Snow Leopard. If you're looking for speed, most of it likely came from the simpler software of the era rather than any magic elixir, and 10.6 lacks the modern memory management introduced in Mavericks, such as virtual memory compression. 10.7 also has more modern browser support.

    Internet

    The biggest barrier to using Snow Leopard is the internet. Apple's high release cadence and constant API/library changes mean there isn't a lot of long-tail support. More modern CSS3 and especially JavaScript ES6 are not supported in Safari 5, which also lacks TLS 1.2 and TLS 1.3, the Fetch API, WebSockets, IndexedDB, Content Security Policy (CSP), and Subresource Integrity (SRI). For the truly nerdy, Safari's JavaScript engine was still using "SquirrelFish" rather than the current JavaScriptCore. This means a large portion of the internet is not accessible out of the box with Safari 5.
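    A quick way to see what a Safari 5-era browser is missing is simple feature detection. A minimal sketch (these are standard web API names; the specific set of checks is my illustration, not from the original post):

```javascript
// Feature-detection sketch: every one of these checks comes back false in a
// Safari 5-era browser, which is a big part of why the modern web breaks.
// typeof on an undeclared global is safe, so this runs in any JS engine.
const checks = {
  fetch: typeof fetch !== "undefined",        // Fetch API
  promises: typeof Promise !== "undefined",   // ES6 Promises
  websockets: typeof WebSocket !== "undefined",
  indexedDB: typeof indexedDB !== "undefined",
};

console.log(checks);
```

    Sites could use checks like these to serve fallbacks, but in practice most modern sites simply assume the features exist.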

    The last officially supported browser for 10.6 was Firefox 45 ESR from August 2016, which is now eight years old, an eternity in internet years, making for limited capability. While it supports many more features than Safari 5, trying to surf the internet with it is a very broken experience. The web is semi-usable, but viewing websites like Apple's homepage is a mess.

    However, there are some much more modern browsers. They are as follows:

    Whatever "newness" Roccat 8 had didn't extend to better support. Unless future releases radically improve, it's best avoided. The others were much more interesting.

    SpiderWeb vs InnerWeb vs ArcticFox

    Historically, Firefox Legacy was the legacy browser of choice, but it was sunsetted years ago. Fortunately, a new crop of browsers has risen up, although none can be considered cutting-edge. Each is a Firefox hack, and all are fairly similar in ability, with active development ending around 2022/23. This means they're mostly able to surf the modern web... for now.

    SpiderWeb is a bit janky. It requires a polyfill XPI, Palefill, which is included with the browser but must be manually installed. (A polyfill is a small JavaScript snippet or library that allows modern web features and APIs to work in older browsers that lack native support; an XPI is Firefox's plugin format.)

    Innerweb is a simple double-click experience that doesn't require manual installation or hacking.

    ArcticFox is yet another Firefox spin-off, with the distinction that it's still being actively developed. Installing it takes a bit more work; there weren't any instructions included, so it took me a minute to figure it out. When you download ArcticFox for 10.6, you also need to download its lib files, in the form of two libc++ dylib files. These must be installed manually into /usr/lib. To do this, you first need to enable invisible files, which requires a terminal command, and then restart the Finder.

    None of these browsers are truly modern; they're hacks at best, relying on stacks of workarounds like polyfills and shims to extend functionality. Snow Leopard is a 15-year-old operating system, and thus few users (if any) are daily-driving it.

    Creative Software

    While I did not test CS6, Adobe CS5.5 works great in Snow Leopard, though this places it massively behind. I could connect my iPhone 14 Pro to my Mac, use Image Capture to import a DNG (RAW) image, and edit it in Photoshop. However, DNG is an established format, unlike "RAW," which varies on a per-camera-maker basis. Modern cameras shooting in the various manufacturer RAW variants probably will not work.

    This sort of behavior extends to all creative software. It's possible to do real creative work; however, you'll be locked to the tools of the 2010s. Editing video in Final Cut Pro is certainly possible, but the lack of modern conveniences and newer codec support like AVC means shooting in supported formats like MPEG-4 and ProRes, and the hardware of that era is generally ill-suited for 4K. This shouldn't come as a surprise, but a Mac Mini M1 with only 8 GB of RAM running Final Cut Pro X would dog-walk a Mac Pro 5,1 with 64 GB of RAM running Final Cut Pro 7 in Snow Leopard.

    Networking

    I haven't spent much time with 10.6 Snow Leopard in the domain of networking beyond the absence of support for my newer AirPort card, but I did notice a quirk: it was not able to connect to my Synology NAS via SMB. If I get around to exploring this, I'll certainly update this section. Networking with other Macs worked without hitches. I was also able to connect to Snow Leopard via Apple's Screen Sharing and operate the computer from other, modern Macs.

    Legacy Support and Rosetta

    Snow Leopard is the last version of macOS that supports Rosetta for PowerPC emulation. Early OS X games are unlikely to work, or work well, though many do run under Rosetta; general software has a greater chance of working.

    CNET has an article on Rosetta's compatibility, and the supported applications are fairly mixed. It's a great reminder of how smooth the transition to Apple Silicon has been compared to the PowerPC-to-x86 transition.

    Should you run Snow Leopard?

    No, you should absolutely not run 10.6 as a daily driver. It's woefully out of date for security. However, if you're looking for a bit of nostalgia, it's entertaining.


    How Memory Works in macOS (why Apple can get away with shipping computers with 8 GB of RAM)

    This blog post is adapted from a YouTube video script. The video can be found below.


    When Apple Silicon first launched, you'd hear goofy statements from Apple and various publications regarding RAM and Apple Silicon. One of the most common assertions Apple made was that 8 GB of RAM on Apple Silicon was equivalent to 16 GB of RAM on an Intel Mac. I'd argue that today, the majority of users understand 8 GB of RAM is not 16 GB, regardless of the processor type. What makes 8 GB still usable in 2024, even if not ideal, is the memory management in macOS. This will be a high-level overview of how macOS manages memory so you can better understand your own Mac.

    Understanding Memory Usage in Activity Monitor

    Activity monitor in macOS 13


    First, open Activity Monitor on your Mac and click on the Memory tab. Here, you'll see a list of all the applications and processes currently running.

    To quote Apple:
    "The Memory pane displays how much memory your Mac is using, how often it is swapping memory between RAM and your startup disk, the amount of memory provided for an app, and how much of it is compressed memory."
    Apple Support

    The Memory Pressure Graph

    Memory Pressure


    The most important thing to understand is the memory pressure graph at its most basic:

    • Green: Your Mac is using memory efficiently.
    • Yellow: You might need to free up RAM as performance could be reduced.
    • Red: Your Mac needs more RAM, and performance is suffering.

    This is also reflected in the graph itself. Freeing up RAM is generally accomplished by quitting applications and processes or by rebooting.

    Memory Usage Columns

    To the right of Memory Pressure are two columns that provide an overview of your Mac's memory usage. Starting with the first column:

    • Physical Memory: This shows how much RAM is installed in your system. Note that on Apple Silicon Macs, this cannot be upgraded.
    • Memory Used: This details how much RAM is currently being used, broken down into several categories:

    Breaking Down Memory Categories

    • App Memory: The amount of RAM being used by applications.
    • Wired Memory: The RAM required by the operating system to function, which cannot be compressed or paged out.
    • Compressed Memory: RAM that has been compressed to free up space for other processes.
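
To get a feel for why compressing idle pages is such a win, here's a toy sketch using Python's zlib. This is purely illustrative: macOS uses its own fast in-kernel compressor, not zlib, but the principle is the same, since inactive pages often hold repetitive data that shrinks dramatically.

```python
import zlib

# Toy illustration of memory compression. macOS uses its own fast
# in-kernel compressor (not zlib); zlib here just demonstrates the
# principle: idle pages often contain repetitive data that compresses well.
page = b"A" * 2048 + b"\x00" * 2048   # a hypothetical 4 KB memory page
compressed = zlib.compress(page)

ratio = len(page) / len(compressed)
print(f"original: {len(page)} bytes, compressed: {len(compressed)} bytes")
print(f"compression ratio: ~{ratio:.0f}x")
```

A compressed page like this occupies a fraction of its original footprint, which is why macOS prefers compressing memory before resorting to swapping it out to disk.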

    The Role of Cached Files and Swap Used

    Cached Files: Recently used files are kept in otherwise unused memory to speed up performance. In modern macOS, unused RAM is wasted RAM, so it's uncommon to see a Mac with a lot of free memory. This is important for long-time Mac users who remember gauging system performance by the amount of unused memory; that rule of thumb no longer applies to macOS.

    Swap Used: This indicates the space used on your startup drive for memory page outs, functioning as a memory extension when the physical RAM is fully utilized.

    Virtual Memory Management

    macOS uses a technique called virtual memory management. Here, each application thinks it has access to a large block of memory, which is actually a combination of physical RAM and swap space managed dynamically by the OS.
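
As a mental model, here's a tiny Python sketch of that mapping, using a FIFO page-out policy as a stand-in for the real thing. The actual XNU kernel uses far more sophisticated policies (plus compression), and these page and frame numbers are made up for illustration.

```python
# Toy model of virtual memory: each process sees a flat address space,
# and the OS maps each virtual page to physical RAM or to swap on disk.
# FIFO eviction is a simplification of what the XNU kernel actually does.
RAM_FRAMES = 4                # pretend we only have 4 physical page frames

page_table = {}               # virtual page -> ("ram", frame) or ("swap",)
ram_in_use = []               # resident virtual pages, oldest first

def touch(page):
    """Access a virtual page, paging out the oldest resident page if RAM is full."""
    if page_table.get(page, ("swap",))[0] != "ram":
        if len(ram_in_use) == RAM_FRAMES:     # RAM full: evict to swap
            victim = ram_in_use.pop(0)
            page_table[victim] = ("swap",)
        ram_in_use.append(page)
        page_table[page] = ("ram", len(ram_in_use) - 1)

for p in [0, 1, 2, 3, 4, 0]:  # the fifth and sixth accesses force page-outs
    touch(p)

print(page_table)             # page 1 has been paged out to swap
```

The "Swap Used" figure in Activity Monitor corresponds to the pages that, in this toy model, have ended up with a ("swap",) entry.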

    If you have less physical RAM, your system will rely more on swap space. Modern SSDs are fast enough that this process is generally transparent to the user. However, heavy reliance on swap does cause wear on the SSD over time. A larger SSD has more memory cells to rotate writes across and thus a longer life; terabytes written (TBW), a common metric for advertising the longevity of an SSD, typically increases radically with drive size. Apple gets a lot of well-deserved criticism for selling RAM-starved computers with small SSDs that aren't user-serviceable. While SSDs are generally considered more reliable than their spinning-disk counterparts, they have a finite lifespan.
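
Here's some back-of-envelope endurance math. The TBW ratings below are illustrative assumptions, not specs for any particular Apple drive; real ratings vary by model, but they scale roughly with capacity.

```python
# Back-of-envelope SSD lifespan estimate. The TBW figures used here are
# hypothetical, chosen only to show how endurance scales with capacity.
def years_of_life(tbw_rating_tb, daily_writes_gb):
    """Estimate drive lifespan in years from a TBW endurance rating."""
    total_writes_gb = tbw_rating_tb * 1000        # TB -> GB (decimal)
    return total_writes_gb / daily_writes_gb / 365

# Suppose a heavily swapping 8 GB Mac writes ~50 GB/day to swap.
print(f"256 GB drive, 150 TBW:  {years_of_life(150, 50):.1f} years")
print(f"2 TB drive,  1200 TBW: {years_of_life(1200, 50):.1f} years")
```

Even under the same write load, the larger drive lasts proportionally longer, which is why small-SSD, low-RAM configurations are the worst combination for longevity.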

    Unified Memory in Apple Silicon

    Apple Silicon uses unified memory for both its CPU and GPU. This means both can access the same memory, reducing redundancy and improving efficiency. Intel's iGPUs also shared system memory, but Apple's implementation is a significant upgrade over dedicated GPUs, which required separate memory pools. The downside, compared to a dedicated GPU, is that the RAM pulls double duty as both RAM and VRAM: data such as frame buffers, shaders, textures, vertex data, geometry buffers, and render targets are no longer stored in dedicated VRAM but in system RAM. For lower-memory configurations, this can tax the RAM further. At more extreme configurations, like a Mac Studio with 192 GB of RAM, it means access to far more video memory than a traditional GPU would have.

    Advanced Memory Management Features

    • Virtual Memory Compression: Introduced in macOS Mavericks (10.9), this feature compresses the memory of inactive processes to free up more RAM.
    • App Nap: This reduces the priority of inactive applications, saving memory and battery life.
    • Application Save States: Allows apps to be quit and relaunched to their previous state, freeing up memory.

    App Nap


    Some of Apple's energy-saving and performance techniques also affect memory management, such as App Nap. App Nap works by detecting inactive applications and reducing their priority to minimize the resources they consume. If an app meets certain criteria, such as not being visible to the user, not playing audio, and not performing a service like downloading a file, it can be put to sleep. This has memory implications, as napped applications are generally prioritized for memory swaps and memory compression.

    If we go to the CPU section in Activity Monitor, we can add the App Nap column and see the apps that are actively in a nap state.

    Command Line Utilities

    For those who want to get geekier, macOS offers several CLI utilities out of the box. These can be accessed from the terminal by running the following commands:

    • vm_stat: Shows virtual memory stats.
    • memory_pressure: Provides detailed memory pressure information.
    • top: A terminal-based activity monitor. If you'd like an even more powerful activity monitor, consider using htop via Homebrew for a more user-friendly experience.
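
The counters vm_stat prints are in pages, not bytes, which trips people up. Here's a short Python sketch that converts them; the sample text is a trimmed, hypothetical snapshot, and on a real Mac you'd feed in the live output instead (e.g. via subprocess).

```python
import re

# Parse vm_stat-style output into bytes. The sample below is a trimmed,
# hypothetical snapshot; on a real Mac you would use the actual output,
# e.g. subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
sample = """\
Mach Virtual Memory Statistics: (page size of 16384 bytes)
Pages free:                              102400.
Pages active:                            524288.
Pages wired down:                        131072.
"""

page_size = int(re.search(r"page size of (\d+) bytes", sample).group(1))
stats = {
    key: int(count) * page_size   # counters are in pages; scale to bytes
    for key, count in re.findall(r"^(Pages [^:]+):\s+(\d+)\.", sample, re.M)
}

for key, num_bytes in stats.items():
    print(f"{key}: {num_bytes / 1024**3:.2f} GiB")
```

Note the page size in the header: Apple Silicon Macs use 16 KB pages, while Intel Macs use 4 KB, so hardcoding a page size will give wrong numbers on one or the other.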

    Final Thoughts

    Switching from Mac OS 9 to OS X brought many improvements, including protected memory, which enhances security and reliability. While modern macOS has made significant strides in memory management, it's always a good idea to keep your system optimized and understand how it uses resources.

    For a deep dive, check out the Developer Documentation and this Informit Article.


    How to play Sony PlayStation Games (PSX / PSone) on your Mac

    Emulating the original PlayStation, also known as the PS1, PSOne, or PSX, is straightforward and can give you a better experience than the native hardware, with features like widescreen support, 4K rendering, and texture enhancements. If you've used PCSX2 or RPCS3, it will feel very similar. This tutorial is adapted from the video below. If you're interested in PlayStation 2 emulation, you can check out my guide here.


    There are multiple options to emulate the PlayStation on a Mac, including even on PowerPC Macs with Connectix Virtual Game Station.

    While emulators like OpenEmu do a reasonable job of emulating the PlayStation, they don't have the advanced graphical features of DuckStation. You can experience PlayStation at 4k in widescreen, with texture smoothing and faster load times. This tutorial explores the many features of DuckStation.

    Step 1: Download the Emulator

    First, go to the official website duckstation.org. The Mac port is listed under "Other Platforms"; alternatively, click here to go to the latest releases. This will take you to the GitHub page for the latest release. Scroll to the bottom and grab the Mac release. If for some reason you're experiencing issues, try downloading a different release.

    Decompress the .zip file (if it does not do this automatically), then drag the application into your "Applications" folder.

    Step 2: BIOS

    right click Duck station

    If you double-click the app, you'll probably see a warning that "DuckStation" cannot be opened because the developer cannot be verified. This is common for open-source software, as the developer has not paid for an Apple Developer account. Instead, right-click the application and click "Open" to whitelist the application.

    You'll be presented with language and theme options. Make your selections and click "Next."

    Duckstation bios

    The next screen is for the BIOS. The PlayStation BIOS is firmware built into the console that initializes and manages hardware components and provides runtime services for games and programs. BIOS files are copyrighted, so I won't be linking them directly; use your own moral judgment here. I personally own a PlayStation, so that's my justification. Places like Archive.org are a good place to look for them. DuckStation requires multiple BIOS files: Japan, US/Canada, and Europe/Australia. This enables compatibility with all regions.

    You can store these files wherever you'd like, but if you move them later, you will need to relink them. In DuckStation, click "Browse" and navigate to where you placed your BIOS files.

    Click "Next."

    Step 3: Adding games

    Games can be distributed in multiple formats, and it is possible to rip your own games into ISOs. Games are often ripped in bin/cue, .ecm, and iso formats, and all are compatible with DuckStation. Distributing games over the internet isn't legal, so I will not be linking any sources.

    I have a folder with a collection of games in it, so I will add it to my game library directory. Since this directory has folders inside it, I want to say "Yes" to scan recursively.

    Step 4: Setting up a controller

    Duckstation controller setup screen

    The next thing we'll want to do is set up a controller. The PlayStation had several controllers; since mine is a PlayStation 4 controller, I will be using the Analog controller type. Ventura and later make gamepads a little easier to manage. There are tutorials on how to connect a controller wirelessly, but I'm using the easiest method, which is USB: plug it in, and it works.

    Duckstation controller setup screen

    Using the automatic mapping, you can automatically bind the controller buttons. If, for some reason, you cannot use automatic mapping or wish to change a button layout, you can manually map the controller buttons by double-clicking the setting and then pressing the correlating button. It is recommended to use a DualShock-style controller, as some later games, such as Ape Escape, require one.

    Step 5: Improving the Graphics

    Rendering options

    At this point, DuckStation is ready to play games. However, some additional configuration is worthwhile. Next, let's go back to the preferences to configure our graphics. Before we get started, go to "Interact" and make sure you have "Apply per-game settings" enabled; that way, you can tweak graphics settings on a game-by-game basis. One of the advantages of the DuckStation emulator is the ability to play games at MUCH higher resolutions. Your mileage will vary; newer Macs with dedicated GPUs or Apple Silicon-era Macs should be able to handle higher resolutions. The video version demonstrates the effects of changing these graphical options.

    • Internal Resolution: Determines the internal resolution of the rendered image. Higher values increase quality but require more processing power.
    • Down Sampling: Downscales the rendered image to fit the screen resolution, reducing aliasing. "Disabled" means no down-sampling is applied. Primarily useful for 2D games.
    • Texture Filtering: Enhances the appearance of textures by smoothing them. xBR is a specific filter that improves quality but is computationally expensive.
    • Aspect Ratio: Adjusts the width to height ratio of the display. 16:9 is widescreen format.
    • Deinterlacing: Reduces flickering and artifacts in interlaced video. Adaptive FastMAD is a specific method of deinterlacing. This will be ignored if Disable Interlacing is enabled.
    • Crop: Crops the image to remove the overscan area, which is the part of the picture that might be outside the viewable area of older TVs.
    • Scaling: The method used to scale the image to the desired resolution.
    • True Color Rendering: Enables rendering in true color, providing better color accuracy.
    • PGXP Geometry Correction: Corrects geometry errors in PlayStation games. Checking this will unlock advanced PGXP correction options. Recommended if seeking to enhance visual quality. It is not compatible with all games.
    • Force 4:3 For FMVs: Forces full-motion video sequences to display in a 4:3 aspect ratio.
    • Disable Interlacing: Disables interlacing to reduce flickering in some games. Recommended.
    • Widescreen Rendering: Forces games to render in widescreen, potentially expanding the field of view. Compatible with most games.
    • PGXP Depth Buffer (Low Compatibility): Improves depth perception in games but may have compatibility issues.
    • FMV Chroma Smoothing: Smooths the chroma (color) in full-motion videos to reduce blockiness.
    • Force NTSC Timings: Forces the game to use NTSC video timings, which can affect the speed and synchronization of video playback.

    PGXP options

    The PGXP tab also adds several options of interest, providing a more authentic and visually pleasing experience. Below is a description of the options.

    • Geometry Tolerance: Sets the tolerance for geometry corrections. Lower values can increase accuracy but may reduce performance.
    • Depth Clear Threshold: Threshold for clearing the depth buffer to improve rendering accuracy and reduce artifacts.
    • Perspective Correct Textures: Ensures textures are rendered with correct perspective, improving visual fidelity.
    • Culling Correction: Corrects issues with object culling, ensuring that objects are not improperly hidden.
    • Perspective Correct Colors: Ensures colors are rendered correctly with respect to perspective, improving visual quality.
    • Preserve Projection Precision: Maintains higher precision in projection calculations, improving the accuracy of the rendered scene.
    • Vertex Cache: Utilizes a vertex cache to improve performance by reusing vertex data.
    • CPU Mode: Forces the emulator to use the CPU for certain graphical calculations, which might be slower but can improve compatibility with some games.

    DuckStation also provides various post-processing options, found in the Post-Processing tab, that allow you to emulate effects such as simulating a CRT display. You can mix and match effects.

    Step 6: Memory cards

    Open up the Memory cards in the preferences. The default option is a separate card per game title, and this is the recommended setting, as you'll never need to think about memory cards. When you launch a game for the first time, you'll need to initialize it. This will not delete game saves for other titles.

    Step 7: Emulation settings

    You can change the emulation speed, and it does exactly what you'd expect—games play speeds can be altered to the user's preference. The other interesting thing here is Vsync, which prevents screen tearing.

    Step 8: Save States

    Save states are one of the best things about emulation, as you can save a game at any exact moment. Simply go to Save State, and it'll save the current game state. Now you can resume from that point without having to boot the game.

    More Emulation Stuff

    I've made more than a few blog posts and videos on emulation related to semi-recent game consoles, such as my Sony PlayStation 2 guide.



    And finally, my Xbox emulation guide which has a written and video version.


    Half-Life and its failed OS 9 port


    Half-Life should have existed on the Mac. To be fair, it does, and it did, but it could have much sooner. This is a script adapted into a blog post; below is the original video. This written version includes entire quotes, whereas the video version includes more interview clips and actual captured gameplay footage of Half-Life.

    Half-Life has a strange relationship with OS X and macOS. During the heyday of Mac gaming, popular franchises were routinely ported to the Mac, and of course, Half-Life should have been one of those games. There are few PC games as critically acclaimed as Half-Life. While it may not have been revolutionary, it certainly represented an evolution, exhibiting a level of polish rare for games of its era: cutscenes were largely integrated into the game through scripted segments, environmental storytelling and subtle cues enhanced immersion, and characters were fully voiced. It even considered pacing, featuring puzzle breaks between action sequences. While it wasn't the first story-driven first-person shooter or the first cinematic game, it encapsulated the best game design of 1998.

    It was natural for the game to be ported to Mac OS and OS X, as other high-profile first-person shooters from the era, like the Doom series, Hexen series, Quake series, Dark Forces, Deus Ex, Duke Nukem, and Unreal series, were all ported to the Mac.

    Announcement and Cancellation

    In April 1999, Logicware, under Sierra Studios, announced that a Mac OS version was in the works, but by October it was completely canceled. The official reason the port was axed was given by Gabe Newell, president of Valve, who cited the lack of Team Fortress Classic, the lack of multiplayer with PC users, and fear of releasing an inferior product.

    Gabe said the following:

    There's been a lot of speculation about Half-Life for the Macintosh - its feature set, its compatibility with the PC version, and so on. Andrew Meggs at Logicware has been doing a good job on the port, and it's mostly done. At this point we've spent a bunch of money on the Mac product and have spent a lot of time thinking about what we need to do to make sure Macintosh users are happy with it when it ships.

    Which is why we are canceling the Macintosh version of Half-Life.

    When we started Mac Half-Life, there was a lot of optimism about the opportunity for Macintosh games. As someone who worked on Macintosh software starting in 1983 before the 128K Mac had shipped, it was pretty exciting to think that there was going to be a resurgence in the Mac gaming market.

    However, as we got closer to shipping the product and reality set in, it was increasingly obvious that in order for us to break even on the Mac version, much less be profitable, we were going to have to cut some corners. OK - I guess we won't have Team Fortress Classic available at shipment. Maybe people will accept it if we update them with TFC later. OK - I guess I understand why we don't have an automatic update facility. Maybe people will accept that they have to manually update. OK - I guess I understand why we might carve out a separate multiplayer space for Mac users from PC users because of the on-going interoperability issues. Maybe that won't be the disaster I think it will be.

    But the more I thought about it, the more I felt that this was nonsense. Our existing Half-Life customers are really happy with us. They were happy with the original game, they were happy when we released TFC, they were happy with our on-going investment in Half-Life, and there's even more coming for them in the next couple of months. They are happy because we do our best for them, and that's what they expect from us in the future. Given the realities of the Mac gaming market, our Mac customers were always going to be mad at us. They were always going to be second-class customers where we couldn't invest to the same degree in the Mac version as we did elsewhere. I don't want to be in that business. I would much rather we just eat the money we've spent so far than take money from Mac customers and short-change them.

    It's disappointing to me on a personal basis that we won't ship Half-Life for the Mac. Everyone here, and I'm sure the people at Logicware are disappointed. The Mac gamers who were looking forward to Half-Life are undoubtedly disappointed as well. However that's a lot less disappointment than what would have happened if we had tried to get Mac gamers to accept second-class treatment on an on-going basis.

    Source: Mac Half-Life Cancelled!

    Logicware did shed some light on the situation the next day. I understand that previous names can be a sensitive topic, but to avoid confusion with the sources, I need to clarify that the quotes attributed to Bill are Rebecca's. Rebecca of Logicware briefly spoke on the issue, releasing the following statement:

    Sigh. Yes, Half-Life for MacOS is cancelled. I'm very disappointed that all the work that was done will not see the light of day or the Mac communities screen across the globe.

    Sierra was a pleasure to work with. They have been very helpful and supportive through the entire project and I look forward to doing work for them in the near future.

    I still have a glimmer of hope that Half-Life will eventually be on Macs, but for today that hope does not exist.
    Please don't ask us for a copy of Half-Life. Please don't ask us to "finish" it. The game belongs to Sierra, not us.
    I want to thank Jeff Pobst at Sierra for all the work he did in this project, and Andrew Meggs for all the tireless hours he put into this project to make Half-Life a true Mac experience.

    We are still on track for Aliens vs. Predator and this does not affect the project in any way.

    And after that:

    Then I'll say it -- the game was nearly done. Sierra had labeled the most recent build as beta. Single player had been done for some time. We played on a PC server some weeks ago, and had been playing on a Mac server (with both Mac and PC clients) for the last week. The only things left to do were to add some UI screens in the launcher, get the memory usage under control so it could play on a 32MB iMac, and fix outstanding bugs.
    Obviously, I can't release the code. It belongs to Sierra and to Valve. If you want to get together a petition to send to them, that's your business, but knowing the full situation I think they would respond mostly with annoyance.

    Source: Logicware Staff on Half-Life Cancellation, Part II, Inside Mac Games

    The Real Reason for Cancellation

    For years, this was the accepted narrative. The port was nearly complete but didn't live up to Valve's high standards... that is, until recently, when Rebecca Heineman spoke on the Retro Tea Breaks podcast, covering the ill-fated original port of Half-Life. Below is a transcript of Rebecca speaking about Half-Life.

    Apple pissed off Valve. That's the long story short. Because we did such a great job on Quake II, Sierra approached us. Valve was interested in porting Half-Life to the Mac because they had a conversation with someone at Apple, a games evangelist, who said they would sell 500,000 copies on the Mac. Valve thought it was a great market opportunity and decided to commission the port.

    They came to us, we looked at the code, gave them a price, and they agreed. They even threw in an early completion bonus: if we finished the game by a certain date, we'd get an extra 20K. So, I dedicated three people to the project. We were all excited about working on one of the top franchises ever and getting it onto the Mac.

    Then, three weeks from shipping, when the game was done and we were just fixing bugs, I got a phone call from Sierra. They told me they were canceling Half-Life for the Mac. I was shocked and wanted to know why. They said they couldn't sell the rights at any price but appreciated our work and would pay us in full, including the early completion bonus, on one condition: our silence.


    I recommend watching the entire interview with Rebecca, as she was formerly the lead developer for Interplay and worked on games such as Wasteland, The Bard's Tale, Out of This World, and the Wolfenstein 3D ports for the Mac, 3DO, and even the Apple IIGS. She was also responsible for the Linux port of Doom Legacy and the Apple IIGS port of SimCity.

    Valve didn't want the bad publicity and preferred letting people think the port was bad rather than revealing the real reason. The truth was that an Apple representative had initially told Valve they would sell 500,000 copies. But as the game neared completion, the actual pre-order numbers from retailers were only 50,000. The Mac gaming market wasn't healthy, and 50,000 copies were considered good.

    Valve felt misled by Apple, especially since the original representative had moved to another company. The new Apple rep denied ever quoting potential sales numbers. This angered Valve, leading to an internal policy that no Valve title would ever be ported to the Mac.

    We didn't know about this policy and neither did other Mac game companies like MacPlay and Aspyr. They tried to negotiate with Valve, but Valve demanded nothing less than a million dollars, effectively pricing their games out of the Mac market.

    We archived everything, and there's a disc in my archives labeled "verboten." If someone finds it, they'll see familiar files and an executable for the 1999 version of Half-Life for Mac. Maybe one day it will see the light of day.

    So, if you take Rebecca at her word, Valve canceled Half-Life on the Mac over sales figures quoted by Apple, a misrepresentation by a factor of 10. You can watch the entire interview above; it's very much worth the watch.

    Other Canceled Ports and Later Developments

    Interestingly, it was not the only canceled port of Half-Life, as the Sega Dreamcast version faced similar treatment, except, unlike the Mac version, it leaked online.

    Dreamcast Half-Life

    The reason given for its cancellation was changing market conditions, but it had already been delayed more than once. The near-complete versions of the port featured inconsistent frame rates and long load times.

    It featured a sub-campaign, Blue Shift, that would be folded into future releases of Half-Life.

    However, Half-Life wouldn't stay away from the Macintosh platform forever, as in 2013, Valve finally released Half-Life for the Mac.

    Valve and Apple: A Rocky Relationship

    A few years later, Valve again soured on Apple. Famously, Valve originally intended to release Proton for macOS. For those who aren't familiar with Proton, it is a compatibility layer that translates Microsoft's DirectX graphics library to Vulkan instructions, allowing Windows games to be played on Linux. It powers the Steam Deck and has ushered in a new era of Linux gaming.

    Andrew Tsai has an entire video on the subject. Apple and Valve went as far as to feature SteamVR in the WWDC 2017 keynote, but then the relationship soured again due to Apple's moving goalposts: Apple dropped OpenGL and 32-bit support and never adopted the Vulkan graphics API. Valve wasn't alone in this complaint, as Apple has never been able to amass a library of games due to constant breaking changes in OS X and macOS.


    Playing Half-Life on the Mac Today

    You can experience Half-Life on the Mac today using Mac Source Ports' Xash3D FWGS. However, it requires a copy of the "valve" folder from a PC install of Half-Life to be placed into "~/Library/Application Support/Xash3D". This works on modern versions of macOS.

    • Intel Mac owners running 10.9 - 10.14 can install the official port of Half-Life on Steam.
    • Intel Mac owners running 10.5 - 10.8 can install the legacy Xash3d port on Macintosh Garden. However, this port will not work under modern macOS.
    • PowerPC Mac users can install the Xash3D alpha on 10.4 - 10.5. It has a few asterisks, as there are some texture issues, and it requires an OpenGL 2.0-compatible card.

    Mac Half-Life port guide

    Xash3D isn't the only way to experience Half-Life on the Mac. With CrossOver, you can play Half-Life, and it's less picky about which version you use.

    Conclusion

    It's unlikely we will ever see the official port of Half-Life that was done by Rebecca and her team, which is sad. Due to the relationship between Apple and Valve, I wouldn't count on Valve making official Half-Life ports again.

    If you're into retro Mac gaming, I've made a video about a cursed port of Grand Theft Auto 3 for PowerPC Macs and one about the history of Connectix Virtual Game Station, embedded below.

    Additional Mac Gaming stories




    What the Hell is a Neural Engine?

    The following article is an adapted script from my YouTube video: "What the Hell is a Neural Engine?"


    If you've purchased an iPhone or iPad after 2017 or an Apple Silicon Mac, it has the Apple Neural Engine. The short answer to my rhetorical question is that the ANE was initially designed for machine learning features like FaceID and Memoji on iOS and debuted on the iPhone X with the A11 chipset.

    Machine learning uses algorithms and statistical models that enable computers to perform tasks without explicit instructions. A model learns to make predictions or decisions based on data, a process known as training. Training generally involves feeding large amounts of data into the algorithm, allowing it to learn and improve its accuracy over time. It can take many forms, such as supervised learning on tagged data, unsupervised learning, or neural networks. For example, large language models use a mixture of unsupervised learning and supervised fine-tuning and, later, human reinforcement when stealing the collective works of humanity.

    Machine learning is used in mundane tasks like email filtering to catch spam and in more exciting things like computer vision, such as the ability to identify objects in photos. With the AI choo-choo express hype train, many machine learning and neural network features are being rebranded as AI.

    Machine learning requires a lot of computing power, and CPUs are not the most efficient at training or executing models. GPUs, by contrast, are parallel processors that can execute millions of certain math operations in a single clock cycle; thus, they are much better suited to the needs of machine learning.

    Apple designed the Apple Neural Engine (ANE) to supplement certain types of machine learning tasks, both in training and executing, using CoreML.

    It's essential to understand that Core ML, Apple's machine learning API, doesn't exclusively utilize the ANE; it leverages the CPU and GPU and, if present, the ANE. To quote Apple:

    Apple's Cores for ML

    "Core ML then seamlessly blends CPU, GPU, and ANE (if available) to create the most effective hybrid execution plan exploiting all available engines on a given device. It lets a wide range of implementations of the same model architecture benefit from the ANE even if the entire execution cannot take place there due to idiosyncrasies of different implementations." Apple.com - Deploying Transformers on the Apple Neural Engine

    This means that when using Core ML, it will automagically use all the tools it has available. The advantage of this approach is that developers do not have to worry about programming for various hardware configurations. If you use Core ML, you're likely getting the best performance, regardless of the device the tasks are being executed on.

    Unlike, say, a GPU, there is no public framework for directly programming the ANE. There are some esoteric projects designed to measure Neural Engine performance, and some not-so-esoteric ones like Geekbench ML, which does not seem to properly isolate the Neural Engine.

    Apple has provided some graphs and has stated that the M1's Neural Engine could perform up to 11 trillion FP16 operations per second, the M2 and M3 neural engine process up to 15.8 trillion operations per second, and the M4 can do 38 trillion operations per second.

    The ANE isn't just an accelerator for floating-point math; it's better thought of as a low-power-consumption optimizer that can be leveraged for certain types of ML tasks. It's fast and uses much less memory and power, allowing for on-device execution of machine learning tasks.

    NPUs

    The ANE is not unique to Apple; it is generally categorized as a neural processing unit (NPU), or AI accelerator. Neural processors can be found in the AI Engine of Qualcomm Snapdragons, the NPU of Samsung's Exynos, and the Da Vinci NPU of Huawei's Kirin. There's a common thread many readers probably noticed with the aforementioned chipsets: they are all ARM-based. The lack of NPUs for x86 has to do with several factors, the first of which is that x86 hasn't been found in extremely low-power devices like phones and wearables, where every watt counts. The second is the existence of exceptionally powerful dedicated GPUs in high-end computers. GPUs can perform the same operations as an NPU, and more of them, making them more useful for both training and executing machine learning tasks at the cost of a higher TDP. The M4 ANE delivers 38 trillion operations per second, but a high-end Nvidia GPU can hit 1,300 trillion.
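
To put those throughput figures side by side, here's a quick comparison using the numbers quoted in this article. "TOPS" counts trillions of operations per second (typically INT8 or FP16), and these are theoretical peaks, not sustained real-world performance.

```python
# Peak throughput figures quoted in this article, compared directly.
# These are theoretical peaks for low-precision operations, not
# sustained real-world performance.
tops = {
    "M1 ANE": 11,
    "M2/M3 ANE": 15.8,
    "M4 ANE": 38,
    "high-end Nvidia GPU": 1300,  # approximate figure from the text
}

gpu_vs_m4 = tops["high-end Nvidia GPU"] / tops["M4 ANE"]
print(f"A high-end GPU peaks at roughly {gpu_vs_m4:.0f}x the M4 ANE")
```

The gap is why NPUs are pitched on efficiency rather than raw throughput: the ANE wins on operations per watt, not operations per second.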

    Another reason NPUs aren't typically found on x86 is the type of task NPUs excel at, like facial recognition and computational photography, which doesn't really have a desktop equivalent. Lastly, for serious AI tasks like model training, buying expensive GPUs or leasing compute time on cloud services with hardware acceleration is more effective than designing NPUs for x86.

    However, we're seeing a shift in the role of machine learning on desktops with the rise of "AI" and increasing demand for the raw compute power it requires. Windows 11's questionable Copilot+ requires an NPU capable of 40 trillion operations per second.

    What is an NPU exactly used for?

    Let's use a real-world example. Core ML is the foundation for Apple's computational photography. As everyone hopefully is aware today, when one snaps a photo, there is no longer any such thing as "no filters": billions of operations are performed to process the image, including everything from face detection to color balancing, noise reduction, Smart HDR, video stabilization, emulating depth of focus in Cinematic mode, and scene analysis. All of this has to happen in real time, or near instantaneously. Rather than send the matrices of floating-point operations to the CPU and GPU, the Neural Engine can take on the heavy lifting.

    These are incredibly dense operations. Scene analysis might sound simple, but Apple has developed an entire ecosystem called the Apple Neural Scene Analyzer, or ANSA. This is the backbone of many features like the Photos app's Memories, where images are tagged, aesthetics are evaluated, duplicates or near-duplicates are detected, objects are identified, and locations are grouped. This is all done on-device using another principle Apple calls differential privacy, where Photos learns about significant people, places, and events to create Memories while protecting the anonymity of the user. Exploring how Apple's Memories work probably should be an article in itself. While this feature makes extensive use of machine learning, it's not dependent on the ANE alone; instead, the ANE assists in performing the analysis.

    However, it's hard to evaluate how much of this chain occurs on the ANE, due to how little information Apple has published. One can find frustrated developers complaining about the lack of info. One of the main sources of information is The Neural Engine — what do we know about it?

    The TL;DR is that the Neural Engine is an on-device neural processing unit, part of Apple Silicon, that is leveraged for machine learning along with the CPU and GPU. It's very good at certain math operations and is partially a power-saving mechanism designed to assist low-power computing rather than utilizing a more power-hungry GPU.

    Screenshot of Apple Watch Webpage

    This is especially the case with the Apple Watch, which needs to be ultra-efficient. Since the Series 4, the Apple Watch line has included a stripped-down Neural Engine to assist with faster on-device processing of inputs. In Apple's marketing material for the Series 9 Apple Watch, Apple suggests that the Neural Engine is even used for the double tap gesture.

    It will be interesting to see how Apple leverages it in the future. It seems increasingly likely that Apple will be doing some of its AI using cloud services. Also, AI functions are very RAM-intensive. In a recent video, I demonstrated the limitations of 8 GB of RAM when an M1 Mac mini was bested by a 2013 Mac Pro. Apple may regret shipping low-RAM configurations.


    This year's WWDC was very focused on Apple Intelligence, Apple's branding for AI, a term that gets more obfuscated by the day. Apple plans to bring AI to multiple fronts, running local AI models and escalating requests to the cloud when local isn't enough. There are a lot of questions to be answered about how well this strategy will work, and perhaps by the time you read this, many of them will be answered. One minor reveal is that, as of recording, only M-series Macs and the A17 Pro are confirmed to support Apple's AI strategy.

    There are plenty of posts and videos breaking down the features of Apple Intelligence. Still, just as a refresher, they include generative text editing; generative AI for uninspired images and emoji, with one truly dystopian example on the iPad where a stylish sketch is turned into a soulless rendering; some very impressive natural-language interactions; and personalized notifications. It's very unclear which interactions happen on-device and when, but on-device services likely include dictation, personal context, and some of the text generation; by that, I mean Siri responses. This, of course, will be revealed in the upcoming months. If executed well, it will be the most cohesive and useful AI strategy we've seen from any major company for everyday people, but I expect growing pains.

    We should fully expect more emphasis on NPUs moving forward, but companies haven't managed to effectively communicate to consumers what NPUs do or why they matter, and are often cagey even towards developers. This is certainly not the first time a coprocessor has been nebulous to its potential buyers, be it early GPUs, math coprocessors, or, if anyone remembers, the failed attempt at selling physics processing units for gaming.

    Training and FP16

    On Apple's AI page, the Neural Engine isn't mentioned as part of the chain used for on-device training. This is likely because the ANE is primarily optimized for the execution (inference) of machine learning models. This is evidenced by it only supporting FP16; GPUs and CPUs can execute FP32, which is higher precision, and that precision is needed for the many small adjustments from the gradients calculated during backpropagation. CPUs and GPUs can do mixed-precision training, where FP16 data is converted to FP32 when more precision is needed.
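    Why FP16 struggles with training is easy to demonstrate: half precision has only about three decimal digits of precision near 1.0, so a small gradient update simply vanishes. A sketch using Python's standard `struct` module, whose 'e' format is IEEE 754 half precision; the weight and update values are made up for illustration:

```python
import struct

def to_fp16(x: float) -> float:
    # Round-trip a value through IEEE 754 half precision (struct format 'e').
    return struct.unpack('<e', struct.pack('<e', x))[0]

weight, update = 1.0, 1e-4         # a hypothetical weight and gradient step
print(to_fp16(weight + update))    # 1.0 -- the update is lost in FP16
print(weight + update)             # 1.0001 survives at higher precision
```

This is exactly the situation mixed-precision training avoids by accumulating in FP32.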

    To translate that back to human: NPUs in consumer devices are targeted at running existing models, as opposed to creating new ones. The ANE is not for developers to create AI models.

    None of this should be a surprise. As I stated earlier in this article, if one were performing serious ML training, one would have a very expensive GPU setup or lease cloud compute time.

    Without going too deep into computer science: 1 bit can store two values, 2 bits can store 4 values, 3 bits can store 8, and so on. 16 bits can store 65,536 values, and 32 bits can store 4,294,967,296.
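    Those counts are just powers of two, which a couple of lines of Python can confirm:

```python
# Each extra bit doubles the number of representable values.
for bits in (1, 2, 3, 16, 32):
    print(f"{bits:2d} bits -> {2 ** bits:,} values")
```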

    For non-whole numbers, such as those with decimal points, one also needs to express where the decimal point is. For example, the digits 12345678 could mean 12.345678 or 123456.78. A floating-point format handles this by specifying the decimal's position. This involves components like the mantissa and exponent, but in essence, it allows the decimal point to 'float' to where it is needed.
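    You can see this mantissa-and-exponent split directly with Python's standard `math.frexp`, which decomposes a float into x = m * 2**e (binary rather than decimal, but the same "floating" idea):

```python
import math

# frexp returns (m, e) such that x == m * 2**e, with 0.5 <= |m| < 1.
for x in (12.345678, 123456.78):
    m, e = math.frexp(x)
    print(f"{x} = {m} * 2**{e}")

# ldexp reassembles the original value exactly.
assert math.ldexp(*math.frexp(123456.78)) == 123456.78
```

Same digits, different exponent: that exponent is what lets the point "float."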

    In machine learning, different bit depths are used, and 16-bit floating point (FP16) is popular because it offers a reasonable balance of accuracy, memory usage, and processing power. Models can be quantized from 32-bit to 16-bit, trading some accuracy for performance. The process is closer to downsampling a 24-bit image to 8-bit than to simple rounding.
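    The accuracy trade-off is visible without any ML libraries: Python's standard `struct` module can store a value in 2 bytes of half precision and read it back. The weight value here is just a stand-in:

```python
import struct

def quantize_fp16(x: float) -> float:
    # Store x in 2 bytes of IEEE 754 half precision and read it back.
    return struct.unpack('<e', struct.pack('<e', x))[0]

weight = 0.1                   # a hypothetical model weight
print(quantize_fp16(weight))   # 0.0999755859375 -- close, but not 0.1
```

Multiply that tiny error across billions of weights and you get the small quality loss quantized models are known for.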

    Apple now provides developers with the App Intents framework, which opens up applications for interactions performed by Siri using the personal context awareness and action capabilities of Apple Intelligence. This allows developers to integrate features based on predefined trained models without having to create their own. How useful and widely adopted this is remains to be seen.


    Become an art legend with your Mac, iPad or iPhone

    This blog post is adapted from a video version of my Draw Things tutorial. Draw Things, available on the Mac, iPhone, and iPad via the App Store, is a frontend, or graphical user interface, for Stable Diffusion. You can follow this tutorial using any of these devices.

    AI art generation is pretty CPU- and GPU-intensive, so for anyone using older devices, this may or may not work. The screen captures are from an M2 iPad, but the Mac version looks exceptionally similar.

    download model

    When you first launch the application, you will need to download what is known as a model. A model in machine learning, such as Stable Diffusion, is a trained neural network that has learned to generate images by analyzing extensive sets of images and text. To translate this back into human speak: each model uses different sources for its images and text, which radically changes the sort of images it can generate from prompts. Some people create different models based on art styles or content. Some are really good at people, some are general-purpose, some are really good at meme-making or different styles of illustration, and some are photorealistic. They also vary quite a bit in quality.

    download model list

    If you click on the model, you'll see quite a few models available, but we want to start with SDXL Refiner. SDXL Refiner is by Stability AI, the people who created Stable Diffusion, and it's very general-purpose and generally pretty high quality.

    Draw Things Screenshot

    The interface looks kind of confusing, but we can see that there's a default prompt. The text on the screen, the prompt, is what we'd like to see. Right now, the default prompt says, "A samurai walking towards a mountain, 4K, highly detailed, sharp focus, grayscale." If I click generate, I will see the following:

    Draw Things Screenshot

    We could continue to use this same prompt and generate more images that would be similar in nature. If we were to change the prompt, it would change the content of the images that it's rendering. To reiterate that, if you type something in and hit generate, it'll spit out an image. That's pretty simple, so let's do something a lot more advanced. Since we are just talking about models, let's download a different one. This is done by tapping the model name to bring up the menu.

    Draw Things Screenshot

    I can switch the model and download a new one. For my example, I'll use "Realistic Vision version 3." Realistic Vision focuses on rendering images of humans. Then I'll generate an image with the same prompt again.

    Draw Things Screenshot

    The results again are fairly interesting.

    Now, let's try altering the prompt and adding a negative prompt. Prompts are the results you want to see, and negative prompts are the results you don't want to see.

    Inpainting lets us fill in missing pieces of an image. However, you need an inpainting model, so let's download one. I hope they change this in future versions of Draw Things, but right now, the grid layout is worthless: I can't read the full file names. Let's switch to list view and search for inpainting. This is still not perfect, but at least now we can see 1.5 versus 2.0. By the time you read this, some of these version numbers may have changed, so keep that in mind. Usually, the newest version is the best, so I'm going to download the 2.0 version.

    Draw Things has a few basic image editing options, and one of them is the erase tool. Click on it and let's erase this guy's face. Now that I've completely erased his face, it's time to adjust the strength. I could put this at 100%, and it'd probably do a reasonable job of filling in this guy's head. I am going to adjust this to 90% because I want it to look somewhat like our samurai. One last step, I'm going to adjust the text guidance. The on-screen description is a very good explanation. The higher the text guidance value, the more it will try to follow your prompt. If you want to know more about it, there's a fantastic article on GetIMG.

    It can't always do a perfect job. Some models have inpainting versions, so you can download the matching inpainting version and have much better results. In this next section, we're going to try to incorporate real-world photos that we've already taken. I have a picture of my cat, Jeff, in Pixelmator on my iPad, and I'm going to remove some unwanted stuff from this image. Now that I've saved it, let's import it into Draw Things.

    First, let's click the new page to create a new canvas, then click the camera icon so we can import Jeff's photo from our photo library. I'm going to speed through switching the model, typing up a description, resizing the image, and then setting the strength to 70%. Now it's time to finally talk about steps.

    Step after step, Stable Diffusion improves the image, but there is a law of diminishing returns. GetIMG has an interactive graphic that explains this better than I can with words, and they recommend roughly 25 steps as a good balance between quality and time. Each step is essentially a refinement of the image: the more steps, the more refined it becomes. However, after a certain point, the improvements become less noticeable, and additional steps may not significantly enhance the image. It's important to experiment with the number of steps to find the sweet spot for your specific image and desired outcome.
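    Steps and strength interact, too. As a rough sketch (the exact formula varies by implementation, but image-to-image pipelines commonly skip the early part of the schedule in proportion to strength):

```python
# Hypothetical numbers: 28 total steps at 70% strength.
total_steps = 28
strength = 0.70

# When starting from an imported image, roughly strength * total_steps
# denoising steps actually run; the skipped early steps are the noisiest
# ones, which the source image effectively stands in for.
steps_run = int(total_steps * strength)
print(steps_run)  # 19 of the 28 steps actually refine the image
```

So a low strength not only preserves more of the source image, it also finishes faster.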

    I'm going to lower mine to 28. Let's speed through a few more last-second updates, and now we're ready to generate. While it's not Jeff, it definitely took some inspiration from him. Now, I like this image, but I wish it just wasn't square. Well, we can fix that. If you notice, when we zoom out, there's still the carpet background from the previous image. I could use the eraser tool, but it's just easier if I save this image and start a new canvas.

    So, let's re-import the generated image. As we previously explored with the samurai, the inpainting will paint in the missing pieces of the image. Let's resize the image and move back to inpaint as our model. On second thought, I think I'm going to change the image size just a bit. I decided to off-center this image slightly so it would render more to the left of the cat. The final step is to change the strength back to 100%, which might seem counterintuitive, but watch the results. The results are actually pretty good, except for there's a line in both the images. I've zoomed in so you can really see it.

    Since I'm rendering all my images to my iPad, I can go into a program like Pixelmator, open up the image, and start touching it up. In Pixelmator, I can use tools like the heal tool to remove the line, the sharpen tool to give the face a little more detail, and then even go through with the warp tool and give the cat a bit of a tummy tuck, since this is kind of a weird render. The end result is impressive, although the cat only has three toes now.

    Let's revisit models once again, but this time, we're going to download models that aren't included in Draw Things off the internet. If we click the models, we can then click manage. Realistic Vision 3.0, I know for a fact, has a later version. I'm going to delete the Realistic Vision 3.0 by clicking the trash can. There are a lot of options when importing models into Draw Things, and that's unfortunately just a quirk of Stable Diffusion. I'm not going to go into all of these because it gets really complicated, but just be aware, some models do require tweaking these settings. The thing we're most interested in is "Downloaded File." If we click it, we can enter a URL to a model.

    Let's cancel out of this and go to Safari. Probably the most popular host of AI models is Hugging Face (I'm not sure how it's pronounced). From this website, we will search for Realistic Vision. This particular model is currently at version 6.0 beta 1, and if you notice, it also has an inpainting version, but for this tutorial, we're just going to download version 6.

    We will do this by clicking the download icon and then copying the link. For the Mac users, you can just download this file. Now, let's switch back to Draw Things. We will click "Enter URL," paste the URL, click continue, and it'll start downloading. Depending on the model, Draw Things might download supporting files. Realistic Vision is pretty good for people, so let's just do a quick one of a woman reading a book in a coffee shop at night, and it's peaceful. The results are pretty good until you look at the cursed AI hands; it's just a thing with AI.

    Speaking of cursed, let me draw a not-so-good picture of my cat, Jeff, very quickly on my iPad. In my defense, I purposely wanted a bad drawing for this demo. If I go back to Draw Things, I can import this image, use one of the models I already have downloaded, and then use this as a source image to generate a new image. After refining my prompt just a little and changing the strength to 80%, I got a pretty good result. There are some striking similarities here because it mimics the pose on a flat background.

    iOS, macOS, and iPadOS all have the ability to lift objects out of photos. I've already imported my cat Jeff, and we're using the inpainting model. Apple's object lifting is impressive, but it's not perfect; it leaves some weird edges. So, to correct for this, I'm going to do something that might seem counterintuitive: erase the entire image, then scroll down and set the strength to about 25%. The Inpainting 2.0 model isn't that great, but we could always download a different one and see if we could find one that produces better results. Maybe we could put the cat in a better sci-fi background, and we'd probably tweak the strength to be even lower.

    Now for our final main topic: LoRA, or Low-Rank Adaptation. Think of these as expansion packs for Stable Diffusion. LoRAs can contain quite a bit of different stuff, like characters, poses, or visual effects. Also, generally speaking, they are a lot smaller than models.

    I know of one that's really cool that works with Realistic Vision, and we already have version 6.0 downloaded. The process is pretty much exactly the same as importing a model. We are going to go to the website CivitAI, search for Vector illustration, and go to that LoRA. I'm going to copy the URL, then go back to Draw Things and import it by clicking manage next to the LoRA. Mac users again get off easy; they just click download and can then import the downloaded file. Once it has been downloaded, we will click import.

    Often LoRAs need trigger words. This one's is vector illustration, but if for some reason we forget it, we can go back to the website. If I click the info button, I can see the prompts used to generate this really sweet monster truck. Note the use of the words "vector illustration." I'd like to see a vector illustration of a palm tree, so we're going to use the prompts "vector illustration" and "palm tree."

    Note that I have the LoRA weight set to 100%. When using LoRAs, you have to pay attention to the documentation. Some will recommend settings of around 60% for the best results. Some have multiple trigger words to produce different types of images. Some were designed to work really well with a certain model, like this one, which was trained on Realistic Vision. The results are pretty cool.
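    For the curious, the "low-rank" part is just linear algebra: instead of shipping a whole new weight matrix, a LoRA ships two skinny matrices whose product is added onto the frozen model weights, scaled by that LoRA weight slider. A toy sketch in pure Python; the matrices and numbers are made up, and real ones are vastly larger, which is where the size savings come from:

```python
def matmul(X, Y):
    # Plain-Python matrix multiply, enough for a toy example.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

W = [[1.0, 0.0],
     [0.0, 1.0]]        # frozen base weight (2x2)
A = [[0.5], [1.0]]      # 2x1 low-rank factor (rank 1)
B = [[2.0, 0.0]]        # 1x2 low-rank factor
scale = 1.0             # the LoRA weight slider at 100%

delta = matmul(A, B)    # rank-1 update, same shape as W
W_adapted = [[w + scale * d for w, d in zip(w_row, d_row)]
             for w_row, d_row in zip(W, delta)]
print(W_adapted)        # [[2.0, 0.0], [2.0, 1.0]]
```

Dialing the slider down to 60% just shrinks `scale`, blending less of the LoRA's influence into the base model.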

    There is a lot of trial and error. I tried to generate pictures of the Oregon coast in the vector art style; some came out as vector art, and some still came out as photos. There are some topics I didn't touch on, like samplers; there's an amazing article all about samplers and how they affect image quality at Stable Diffusion Art. If you haven't put it together by now, a lot of those scammy apps on the Mac, iOS, and iPadOS App Stores that let you generate AI images are just using Stable Diffusion. I'm debating making a full-fledged course on this, but it wouldn't fit under this channel. I do have my Patreon, and I could also put it on Udemy or something. That way, people could really dive in with me, because there are a lot of topics I didn't cover, like ControlNets or even making your own LoRAs and training, because this program goes deep. If that's of interest, just let me know in the comments. And I think that's it.