the death of linux at (my) home
why linux
In 2002, I prepared to head off to college. I had just finished AP Computer Science, and was excited to really understand the newest tools of logic. I remember wandering around Best Buy with my dad and starting to do the math on my first semi-custom desktop. A mostly standard gray box from Gateway, a dual-VGA radeon graphics card, and Windows XP. I don't recall any of the other specs, just dual monitors and windows. I was surprised how well it worked; everything was a lot simpler then. After settling into my first CS classes, I wanted to get more familiar with the UNIX environment I was working in regularly. I played around with partitioning the hard drive to make space for a second OS, and then... started burning CDs? I had a CD-R drive in the desktop, and a stack of 50 blanks from Costco was somewhat affordable.
I played with a variety of distributions (distros, in common parlance), and settled on Fedora for a while. I never did get both monitors working on that setup, but I tried! There was a non-zero amount of Xorg.conf crafting in that time of my life. I didn't have a printer, and I didn't have wifi, so I avoided some common pits of despair. I loved the consistency and extensibility of the tools. Despite some bloat in UI systems over the years, the ecosystem became consistently, incrementally, better.
The computer science program at Gonzaga was taught around C++, and had an unofficial cross-platform flair to it. The computers in the lab were all Windows machines; however, coursework was submitted through an HP-UX mainframe (named Grace, after Grace Hopper). Your home folder was available on both platforms, so most folks wrote their code in the lab in Visual Studio 6, then ran final tests on Grace before submitting source code. Some upper-level classes worked with other stacks; a web-development class was mostly HTML and vanilla JavaScript, and there was a Java-specific class separate from the Object-Oriented course. Object-Oriented Programming was particularly bizarre -- we used Visual C++ .NET, a sordid mixture of managed and unmanaged code which no one really wanted to exist. It unnecessarily complicated the subject matter, and it was the professor's first use of the tech as well -- this was .NET 1.0, and we were all learning together. I still remember the final project: a playable version of Othello, with extra credit for whoever won the bot-vs-bot tournament. That was the first of many times that I spent way too much time in the lab working on a programming project instead of studying for <insert liberal arts class name here>.
My final semester, one of my professors sponsored my independent study in AI and distributed systems. I created small clusters of machines and tucked them wherever I could find space and a network port, with a control system in an empty office I had claimed. The clusters ran Gentoo with openMosix kernel extensions, and the control server was running some RHEL version (probably 3, based on the timing: early 2006). That was my first foray into system administration and monitoring; with ~50 EOL machines scattered across campus, it was an adventure keeping things online.
After school, I had a brief period of time where I heated my apartment with CPUs -- I found myself flush with cash after a random windfall: Sun Microsystems (later acquired by Oracle) was creating a grid computing platform, and hosted a Cool Apps competition. It's still unclear to me how many competitors there were, but I walked away with a tidy prize, and spent (half of) it on the exact type of things you'd expect a 22-year-old to spend it on:
A Taylor 314-CE
A 2nd-gen iPod Nano, my first Apple product
5 eMachines T5212s, on sale at Best Buy (whose clerk thought I was nuts)
The last of those I used initially as another openMosix cluster, continuing the work I had started during my independent study. Over the years those machines acted as network gateways, media servers, raw storage and compute, and a variety of other roles. Disks, network cards, RAM, and power supplies were added/replaced/maxed-out, and eventually the machines fell to cannibalism. The final survivor was retired unceremoniously in 2013, when its role as a firewall/gateway (running Smoothwall) was given to a WRT54GL running dd-wrt. I personally marked it as the end of an era, whereas others in my household marked it as the end of "those ugly gray boxes stacked everywhere".
Post-graduation, most of my jobs had me working in a *nix environment. I did some work with a contract firm on a unix-based wireless utility metering system, then at SRI, where I worked on a mixture of research code on RHEL2 and consumer-friendly Win32 COM ports. At Siri, I remember a massive Dell XPS 720 workstation (with dual, dual-head graphics cards) running Fedora 6, and my personal Gateway MX8738 was running openSUSE 10.3. These were the golden years.
At Amazon, I had a RHEL3 desktop and an Ubuntu ThinkPad (which was uncommon; I had to PXE-boot the blessed image from the corporate network). At that point, my MX8738 was showing its age (it had mostly been relegated to wireless-bridge duty for the Xbox), and I decided to replace it with a ThinkPad (having had good experiences with them so far). In a fit of anti-frugality, Amazon paid for a docking station at home, and I loved the simplicity of swapping the work and personal computers. I had a mirrored dual-monitor setup at work (the second monitor snagged from a departing intern), so transitions were seamless. I still prioritize those transitions today.
While this covers the how, the why is much more straightforward: I wanted my development environment to match the production environment as closely as possible. I was deploying linux services and web applications, and keeping character encodings, path separators, and other details compatible between dev and even the build machines was just unnecessary friction. Folks have solved this problem in other ways over the years (developing in VMs or containers, cloud IDEs, etc.), but I enjoyed my tidy solution. There was still always some drift -- RHEL rarely had new enough dependencies to function as a daily driver (especially on laptops), so I'd pick the consumer flavor -- Fedora over RHEL, Ubuntu over Debian. As my work was primarily in languages with managed runtimes (Java, JavaScript, Python), I rarely needed to worry about anything but a sufficiently-close version of the runtime.
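To make that friction concrete: most of it disappears if you lean on the runtime's portability layers instead of the host's defaults. A minimal Python sketch of the habit I mean (the app name and file path here are just placeholders):

```python
from pathlib import Path

# Build paths from parts instead of hard-coding "/" or "\";
# pathlib uses the right separator for whatever OS runs the code.
config_path = Path.home() / ".config" / "myapp" / "settings.json"

# Don't trust the platform's default character encoding -- a RHEL build
# host and an Ubuntu laptop won't always agree. Be explicit.
text = config_path.read_text(encoding="utf-8") if config_path.exists() else "{}"

print(f"loaded {len(text)} characters from {config_path}")
```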
I also really liked that I could take things I had learned in industry back home easily -- until their last breath, my old laptops act as servers (of various flavors). I have multiple laptops (and Raspberry Pis) running either piku or dokku, and can easily host little home automation projects. My most recently retired laptop, a ThinkPad X1 Extreme Gen 2, is (at the moment) hosting an inference server for my LLM projects. It's a little sluggish, but it helps offload work from other components of the stack that are running on Pis.
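For a sense of what those boxes actually run: mostly tiny HTTP services pushed over git. Here's a minimal, dependency-free sketch of one -- the "garage-door" endpoint and the port are invented for illustration, and since piku and dokku both deploy Heroku-style, in practice a short Procfile entry would point at a script like this:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class StatusHandler(BaseHTTPRequestHandler):
    """Answers GET requests with a tiny JSON status blob a script can poll."""

    def do_GET(self):
        body = json.dumps({"service": "garage-door", "state": "closed"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Port 8000 is arbitrary; the platform would normally hand one in via $PORT.
    HTTPServer(("0.0.0.0", 8000), StatusHandler).serve_forever()
```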
the last 5 years
The proliferation of web-based platforms (for mail, messaging, music, and more), along with portable runtimes for developing and deploying those tools, made living in linux a real possibility. If you do everything in a browser, and the best browsers are available everywhere, you're pretty much set. As a bonus, the (IMO) best development tools -- the JetBrains stack -- all had native desktop linux support. Outside of developing native apps for Apple OSes, I was able to plot a career compatible with desktop linux.
As I've worked at more and more companies with significant corporate security, the trend has been towards Apple devices. The device management tooling found a sweet spot of full disk encryption plus shared user/administrator access that linux/LUKS never polished enough to justify the investment. It's certainly possible to have a secure setup, but not one that's also easy for a corporate IT team to manage, especially for off-boarding. Most everything else, excluding Apple-ecosystem developer tooling, worked smoothly. At both Snap and Hulu, I used linux machines as my daily driver. I had to work with IT at both, and the process at Snap was almost two years long.
At the same time, the bulk of the issues I've had with Apple computers has been around not just the lack of keyboard-only-friendly UX, but the lack of customizability built into the OS. The community has addressed the majority of these concerns -- I use Rectangle for window tiling, iTerm2 is a very solid terminal emulator, and Raycast is a great (recent) alternative to Spotlight. Homebrew still fills the gap of a command-line package manager, the absence of which continues to raise eyebrows.
During this period, I was able to coerce my employer-owned Macs into feeling familiar to someone coming from Gnome, my window manager of choice for the last decade. It's by no means perfect, but there is a world where everything except the ⌘ behavior is the same, and switching between the two on my external keyboard results in minimal frustration. I've recently been dusting off my AI excitement, and that, combined with the latest developments in Apple Silicon, led to what now appears to be the inevitable.
the switch
In May of 2023, I made a backup of my development directory on my ThinkPad, and migrated over to a refurbished 16-inch MacBook Pro with the M1 Max processor. This is the same model that Stripe issued to me in 2022, and I have to admit, I was impressed. I had a lot of reasons for the switch, but the biggest was the ability to iterate quickly on LLMs locally. I investigated a number of options for home GPU rigs (notably System76 and Lambda Labs), and the results were pretty clear -- the M architecture, with its single shared pool of memory, won out on cost for training over any laptop or desktop with dedicated GPUs (and over the unbounded amount I might spend on a cloud solution), when solely considering personal use. Combine that with the known reliability of the hardware and the exceptional power usage (directly translating to battery life, long a problem with linux laptops), and the M1 Max starts to look like a steal.
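For what "iterate locally" looks like in practice, here's a minimal PyTorch sketch using the MPS backend on Apple Silicon -- the layer sizes and the single training step are arbitrary, just enough to show that the model, inputs, and gradients all live in that one shared memory pool:

```python
import torch
from torch import nn

# Unified memory means weights and activations share one pool, so
# "does it fit?" is mostly a question of total RAM, not VRAM.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = nn.Sequential(
    nn.Linear(4096, 4096), nn.GELU(), nn.Linear(4096, 4096)
).to(device)
x = torch.randn(8, 4096, device=device)

loss = model(x).square().mean()
loss.backward()  # gradients land in the same shared memory pool
print(f"ran a training step on: {device}")
```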
To be clear, I picked my new machine because it was the most bang-for-my-buck tool for the work I wanted to do -- it's definitely not the right choice for everyone, and not a cheap one. I also have not given up on linux: I'm following Asahi Linux's progress, and I look forward to when they have things like "speakers" and "GPU" working. Snark aside, their progress is amazing, and I wish I had something (technical) to offer the community other than vibes.
Shortly before this switch, I walked away from the android ecosystem as well. I've been getting the Pixel a-series phones for the last few years (somewhat for cost, but mostly because of size), and had become increasingly frustrated with them. Specifically, they had a nasty habit of overheating and hard-rebooting when the battery was below 80%, especially when I needed them. On multiple occasions, I found myself without a usable device when trying to launch navigation, call for a ride, or open the Uber app. I don't go out often, so after a 3-month-old Pixel 6a started exhibiting the same behavior I had seen on my 3 and 4a, I made the switch. This was, as they say in the theater, the gun on the mantel in the first act.
the concessions
I hadn't owned a personal Mac until this year. Those that I used were provided by an employer, and frequently crippled in some way. Around 2010, Amazon was giving out MacBooks with a heavily tuned OS image, which was surprisingly dated and disallowed installing any useful utilities. Snap went a step further, and had instrumented the kernel in ways I wasn't aware were possible. Stripe has now done the same. At Amazon, laptops were primarily remote-access devices -- not intended for development, just an email and RDP client. At other employers, the laptop was the primary development environment, and there was a constant tension between "your laptop has access to user data and must be secure" and "yes, I would like the JDK to be able to listen on this port when launched by the terminal in my IDE." Having the freedom to tweak what I want or need has made a big difference.
That being said, there are a lot of options in this space, and it feels a bit like the wild west. At a bare minimum, I need a tiling manager (Rectangle), a browser (Chrome), and Homebrew. From there, I've cobbled together a number of native apps for reading, writing, and productivity. I'm actually surprised by the number of tools that skip a native experience entirely, packaging their web applications in ElectronJS and calling it something new. I found that both the Notion and Linear desktop apps provide a notably worse experience than their web versions, both from a latency perspective (which is unexpected) and in the ability to quickly share links. In fact, linking culture is something I've found extremely lacking in this new paradigm -- from tools that make it difficult to copy a link to the current view to your clipboard, to links with unnecessarily opaque identifiers for everything, making higher-level tools (like GoLinks) difficult to adopt. Native deep links (originally via intent-interception of URLs) have struggled to succeed across platforms, and this proliferation of apps without standards seems like it's pulling in the wrong direction.
Lastly, the two things that have really bothered me about macOS (or OS X, or its other prior monikers) remain -- mouse focus (sometimes referred to as click-through focus) adds extra clicks or keypresses to the day, and dual external displays cannot reliably retain window positions. While I have keyboard shortcuts to speed up the cleanup, every time the monitors go to sleep, windows collect on one of them. No set of magic incantations appears to resolve the problem completely.
the future
Around 2015, I started working on a mostly-offline voice assistant, whose core components were adopted by Mycroft AI; I worked on the first versions of many of those pieces. My original goal was a platform that could run completely self-contained on a Raspberry Pi, and the combination of improvements in local computing power and increases in LLM efficiency brings that closer to reality every day. I've started reading about local-first, the long-needed catchy name for "not somebody else's computer", and I see a lot of alignment there. When I acquire a new piece of hardware, I expect to gain more control of it over time, and to squeeze every last bit of value out of it that I can. Short of predicting the future of computing, I can at least promise that.