Photos for Mac: Deleting Items From Within an Album

As part of my workflow for archiving downloaded videos from TikTok, I put TikTok videos into a manually created album in the Photos app so that I can drag them in one step into the Finder.

But after I was done, I couldn’t delete the items from my photo library: the option didn’t appear when I right-clicked the items, and the Delete key only removed them from that album.

I searched the web for this, thinking maybe I’d find an iOS shortcut that does the trick. Glenn Fleishman had an article on Macworld outlining a way to do it that involved tags and a smart album, but that seemed really convoluted.

But there’s actually a really simple solution: you drag the items to Recently Deleted.

[Animation: dragging items into the Recently Deleted folder]

Sometimes the best solutions are the simplest ones.

It wouldn’t kill Apple to add a context menu item for deleting, though.


Use Codespaces with Mutagen and keep your local dev tools

At GitHub, we recently went all in on Codespaces for developing GitHub itself. If you were a VS Code user and did all your Git stuff either in VS Code or on the command line, it was great. But for the curmudgeons like me who are big fans of local Mac development tools like TextMate and Tower, it was less great, and I was initially pretty frustrated by the forced change in tooling.

But we’re in luck! I recently learned about a utility called Mutagen, which offers a great general-purpose solution: you use a Codespace for its compute and environment, but also maintain a working copy on your local machine for code editing and source control.

How to use it

At the heart of Mutagen is a lightweight agent that watches for filesystem changes on your computer and your codespace and syncs them bidirectionally.

If you’ve got a repo already set up with Codespaces, you can be up and running with Mutagen with just a few steps.

First, you need to install Mutagen on your computer (this assumes you have Homebrew installed):

brew install mutagen-io/mutagen/mutagen

If you haven’t already, install gh too:

brew install gh

You’ll want to set up gh by running gh auth login. Once that’s done, you should be able to SSH into your codespaces by typing gh cs ssh. GitHub’s docs have more info.

Next, we’ll SSH into the codespace, but we’ll be passing gh an option to also open up an SSH tunnel. This tunnel will let you ssh into localhost at the specified port and be connected to your codespace:

gh cs ssh --server-port 1234

You’ll be prompted to pick the codespace to connect to.

You can use any available port you like. When you exit the shell it opens, the tunnel closes too.

Once a connection is established, gh will provide you with a connection string and shell.

Next, open up ~/.ssh/config and add a configuration entry for your codespace. We need to ensure that the NoHostAuthenticationForLocalhost option is always set when connecting to the codespace (Mutagen doesn’t provide a way to pass this option through its CLI). We’ll create a config entry for a host called codespace that automatically uses the correct port and options.

Host codespace
  HostName localhost
  Port 1234
  User root
  NoHostAuthenticationForLocalhost yes

Swap port 1234 for whichever port you actually chose earlier.

Also, I recommend setting up a ~/.mutagen.yml file with some sensible defaults. Setting vcs: true under the ignore defaults is essential here; your codespace’s .git folder contents will be totally different from your own machine’s.

sync:
  defaults:
    ignore:
      vcs: true
      paths:
        - "/vendor/*"
        - "/tmp/*"

More on setting up ignores in the docs.

Now, the fun part! Make sure that your codespace and your local working directory have the same branch and SHA checked out (this is important; more on that shortly), and set up the sync profile:

mutagen sync create --name=codespace path/to/project/folder codespace:/path/to/codespaces/working/directory

This creates a synchronization session and initiates it. You can check on the state of this session with mutagen sync list.
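If you juggle more than one project, a tiny wrapper that assembles the command above can save you from retyping the flags. This is purely a hypothetical convenience helper; the project name and paths are placeholders, and `codespace` is the SSH host alias configured earlier:

```shell
#!/bin/sh
# Hypothetical helper: build the `mutagen sync create` invocation for a project.
# "codespace" is the host alias from the ~/.ssh/config entry above.
make_sync_cmd() {
  name="$1"; local_path="$2"; remote_path="$3"
  printf 'mutagen sync create --name=%s %s codespace:%s\n' \
    "$name" "$local_path" "$remote_path"
}

# Example (paths are made up): print the command for a project named "myproject".
make_sync_cmd myproject ~/src/myproject /workspaces/myproject
```

You could pipe the output to `sh` once you’ve eyeballed it, or just copy and paste.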

Once you complete this, you will have bidirectional sync between your local working directory and the working directory on your Codespace. Having the sync be bidirectional is handy; it allows you to, for instance, run your linters while ssh’ed into your codespace and have those file changes synced right back up.

Once this sync session is created, you don’t need to create it again; if the Mutagen daemon is running, it will handle disconnects and reconnects just fine. However, sync sessions are immutable, so if you need to change settings, delete the sync session:

mutagen sync terminate codespace

This approach isn’t without its faults. This is a fairly tightly coupled arrangement and it is a bit more fiddly than a purely local dev environment. Because codespaces come with their own Git working directory, you do have to take special care to make sure both working directories are on the same branch/SHA before you start your tunnel, and then you’ll need to reset your working directory on the Codespace after you disconnect the tunnel (or vice versa if you plan to perform your Git operations on the Codespace).
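That pre-flight check is easy to script. Here’s a minimal sketch; the guard function itself is hypothetical, and the commented commands assume the `codespace` SSH alias from earlier plus a made-up remote path:

```shell
#!/bin/sh
# Hypothetical pre-flight check: refuse to proceed if two commit SHAs differ.
# In practice you'd feed it the HEAD of each working copy, e.g.:
#   local_sha=$(git rev-parse HEAD)
#   remote_sha=$(ssh codespace 'git -C /workspaces/myproject rev-parse HEAD')
#   check_shas_match "$local_sha" "$remote_sha"
check_shas_match() {
  if [ "$1" = "$2" ]; then
    echo "working copies are in sync"
  else
    echo "MISMATCH: align both checkouts before creating the sync session" >&2
    return 1
  fi
}
```

Running something like this before `mutagen sync create` catches the most common way this setup goes sideways.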

But I’m really quite fond of it; for a little bit of extra fiddling around, I get to fully enjoy codespaces, but I retain full flexibility and freedom to use the local tools I enjoy. I can keep using Tower and GitHub Desktop to write my commits or do rebases. I can use Kaleidoscope to make sense of complex diffs and resolve merge conflicts. And I can keep using my beloved TextMate until someone pries it out of my cold dead hands.


Apple Laptops: The icanthascheezburger buyer’s guide

I am absolutely delighted at the new pro laptops Apple announced last week. I’m not going to make this post about Apple’s overall history of laptops over the last decade (others can explain it better), but this new release shows Apple is back on track making excellent Mac laptops for its pro customers, after a couple years of neglect leading up to 2016 and then years of stubbornness after 2016.

Other sites actually have real reviews, which I guarantee will amount to “they’re incredible and super fast,” but I want to instead focus on a question that I get a ton from friends: “what Mac should I buy?” Now that Apple’s got a full lineup of laptops powered by its custom silicon, I can answer this question for most friends without having to tell them to wait just a little longer.

Non-Pro users: MacBook Air

If you’re not a pro user, then the MacBook Air with M1 processor is great. These were first announced about a year ago, so they might be due for a refresh in the not-too-distant future (which might come with a more substantial redesign), but I don’t think one is imminent, and either way, the M1 MacBook Air is a solid all-around computer that will serve you really well for years.

RAM: 16 gigs
Do not buy a Mac with less than 16 gigs of memory.

I’m serious, don’t.

Even if all you do is “lightweight” stuff like browsing the web.

Fun fact: modern web browsers use a lot of memory to do their thing; web sites are very complicated!

SSD: at least 1 TB. If you know you need more, get more! But if your budget is tight, I’d rather see you skimp on the SSD than on the memory.

You’ll be asked which processor you want. It doesn’t really matter; you’re choosing between a 7-core and an 8-core GPU, and most non-pro users won’t see much of a difference. The reality is that some chips come off the line with only 7 working GPU cores, and instead of throwing those out, Apple disables the defective core and sells the chip for a little less money.

Pro Users: MacBook Pro 14″ or 16″

A pro user can be a pro user for a lot of different reasons, but my recommendations here are pretty general:

14″ or 16″: It’s up to you! You can spec up either system however you like; the 14″ model isn’t less capable just because it’s smaller. I believe a High Power Mode is offered on the 16″ model only, but that’s strictly for sustained heavy workloads. Otherwise, the 14-inch and 16-inch systems are identical in how they perform, so it really does just come down to what size you like.

Memory: Pick at least 32 gigs. If budget allows, get 64 gigs (you have to pick the M1 Max processor for that to be an option).

CPU/GPU: The biggest difference you’ll see is the number of GPU cores, otherwise you’re just picking between 8 and 10 CPU cores.

Going from 8 to 10 cores will give you a nice boost on highly parallel workloads, like encoding video or compiling code. Adding two cores won’t make your computer feel 25% faster overall, but certain long-running tasks might finish 20–25% sooner.

There’s also a growing number of tasks where more GPU cores will be beneficial. We don’t use GPUs just to put graphics on your screen; a lot of software uses them for general compute. Pixelmator, for instance, has done this heavily, and I believe Photoshop does this now too. Adding more GPU cores will make work like this noticeably faster, and you can crank all the way up to 32 GPU cores if your budget allows. Depending on what kind of work you do, this might be worth it. I’m a software developer, and most of my work is CPU-bound, so I never feel much need for a super beefy GPU.

SSD: Again, at least 1 TB but if you’re dealing in large files you might want to crank this up. It does get expensive very fast, though.

If you don’t really feel confident making a decision here, pick the base M1 Max machine, equip it with 64 gigs of RAM, and an SSD of 2 TB or more. Done.

Desktop users: you might want to wait

Prefer a desktop? I’m the same way, as you might recall! If you’re not a pro user, the M1 iMac is great and it’s spec’ed the same as the M1 Airs; just follow my non-pro guidelines and you’ll be fine.

Same applies to the Mac Minis.

If you have more demanding needs, wait for Apple to release new desktops with the M1 Pro and M1 Max chips. I’m particularly excited to see a Mac Mini with an M1 Pro that will likely become my new home server.

And I am positively giddy thinking about the high-end pro Macs in the pipeline; if the M1 Pro and M1 Max processors are any indication of Apple’s strategy, then we might very well see high end pro machines with 20 and 40 CPU cores, and god knows how many GPU cores. In the meantime, I continue to love my Mac Pro.

If you’re an iMac user who wants more power than the existing iMacs offer, just sit tight; I suspect that in the coming months we’ll see iMac models that incorporate the M1 Pro and M1 Max chips, at which point my pro laptop guidelines will apply.

AppleCare: Yay or Nay?


I always buy AppleCare for my computers, and it has always paid off: a machine only needs one out-of-warranty service for the policy to pay for itself. And anecdotally, I find that AppleCare customers can get better treatment in certain scenarios than non-AppleCare customers, particularly when the rules need to be bent, like getting a machine replaced after many repair attempts, an issue that’s just a wee bit out of warranty, or an accessory that isn’t covered.

And now you can even buy AppleCare coverage that covers accidental damage, and you can extend AppleCare past the three year mark, which is great news in a world where people keep computers for longer.

What Not To Buy

  • Any machine with 8 gigs of memory. Seriously if that’s the only advice you take from reading this post, I’ll be content.
  • Any M1 Pro or Max machine with only 16 gigs of memory – come on, bump it up to at least 32!
  • The 13" M1 MacBook Pro – it’s a weird in-between product that isn’t really Pro, doesn’t offer any meaningful difference in real-world performance over the Air, has the Touch Bar we all agree is not useful, and is mostly just there to be a slightly-more-expensive-than-the-Air product.
  • Any Intel Mac unless it’s a high-end model and you know for a fact you have good reasons to buy one (like if you know you need to virtualize the x86 version of Windows or something)

We are entering a golden age for Macs and I’m super excited about it.


Meta update (not the Facebook kind of meta though)

Hey readers! It’s been a little quiet here lately.

I’ve got a Drafts app full of half-baked drafts, none of which I’ve been quite ready to pull the trigger on. I’ll keep finishing those up and drip them out over the next couple weeks.

I’m no longer expecting to hit my 100 posts in 2021 goal, though.

I’m not mad about it. Despite great optimism about 2021, the last few months have been a particularly difficult time in my life, and my mental energy has been spent chewing on these difficulties instead of thinking up new stuff to write.

Instead, my writing energy went largely into private journal entries, and that was where I needed to be directing my writing energy.

I do remain highly opinionated and I continue to love firing off opinions into the ether. I’ll post as many good posts as I can for the rest of 2021, and try not to feel too bad that there will be fewer than 100 of them.


About that Ted Wheeler recall campaign…

The campaign to recall Portland mayor Ted Wheeler hit a bit of a snag after the campaign failed to collect enough signatures to get the ball rolling.

But Total Recall PDX isn’t going to let itself be inhibited by such minor details as “the people of Portland clearly don’t want to recall the mayor”. Instead, they are now going to sue the city because the city didn’t grant the campaign an extension to collect more signatures.

[Image: Homer Simpson saying “this is everybody’s fault but mine”]

But that’s thematically consistent for Total Recall PDX. The campaign was never really about giving Portland a better mayor, as evidenced by its complete lack of effort to actually find a better candidate than Ted Wheeler (which really shouldn’t be that much of a challenge). Instead, it’s a misguided fringe effort whose main output is noise, and now, instead of just wasting its own time, it’s wasting the city’s resources, which must now be spent defending against this stupid lawsuit.


Bullshit reporting on sex work

Sex work reporting in the US is broken.

Case in point: the headline for this USA Today article:

Largest human trafficking sting in Ohio history nets 161, including city councilman

Okay, that sounds pretty serious! Let’s take a look.

Most of those arrested were charged with engaging in prostitution, a first-degree misdemeanor.

“We want to send a message to everybody in the country: Don’t buy sex in Ohio,” [OH AG Dave Yost] said.

For every arrest made, officers contacted up to eight “johns” whose actions didn’t meet the elements of any crime

So, in other words, this was less of a “human trafficking sting” and more of a “dragnet that mostly caught people looking to engage with sex workers.”

Of course, that sounds a lot less heroic, so law enforcement partnered with local reporter Dean Narciso, who committed journalistic malpractice by running a story whose headline says it’s about human trafficking but whose content is overwhelmingly about people seeking otherwise consensual sex work.

There were some welcome outcomes from this sting. The article indicated that a couple of people were interested in paying for sex with minors, that there were some drug- and firearm-related charges, and that ten missing minors were found during the operation. Those are good things, but they paint a dramatically different picture than finding “161, including city councilman” involved in sex trafficking. There is no detail about the minors or where they were from, and no indication that anybody at all was charged with any crime related to human trafficking (and I assume that if there were, it would have been prominently featured). The entire article is very hand-wavy with the details and appears to rely on statistics cherry-picked by law enforcement.

The icanthascheezburger stance on sex work is simple, and the stance is backed by things I consistently hear from actual current and former sex workers:

  • Sex work is work. Sex workers sell their body in exchange for money and benefits, which describes literally every other job.
  • Doing sex work isn’t morally problematic (and being a sex worker doesn’t make you a “victim” any more than anyone else is a victim of economic inequality). Neither is being a customer of a sex worker. Consent (ideally enthusiastic consent) is key.
  • The reason prostitution is illegal in the US isn’t because it’s in the public interest for it to be; it’s because religious zealots pushed for it.
  • People are wising up to the fact that sex work should be legal, so law enforcement and others against sex work now lie about it and conflate it with human trafficking
  • Human trafficking of sex workers is largely only as big an issue as it is because sex work is illegal; if the trade weren’t forced off the books, the very abuses anti-sex-work crusaders rant about would be structurally far less likely to happen.
  • Human trafficking as a general issue is not accurately reported on, and facts get widely distorted. There’s an excellent You’re Wrong About podcast episode that goes into more detail about this.
  • Most of the issues sex workers struggle with are a direct result of the work being illegal; illegality exposes them to unnecessary risk
  • Journalists need to stop serving as a mouthpiece for law enforcement and actually start holding them up to scrutiny, instead of repeating their talking points. Law enforcement does not automatically have credibility.

Demand better news coverage of sex work and law enforcement.


Remembering Steve Jobs a decade later

Steve Jobs standing by an image of a street sign representing the intersection of technology and liberal arts
10 years ago I posted a few quick memories about Steve when I was saddened to learn of his passing.

I’d like to again share a passage from his Stanford commencement:

No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

I’ve grown older and wiser over the past decade, and while my appreciation for the fruits of Steve’s work remains as strong as ever, I’ve grown less okay with how he built these products. The truth is, he could be a real dick, and although a lot of people who worked with him forgave him for it, he inspired a lot of other people in tech to be total assholes in how they ran their teams, and those teams didn’t make better products for it. Perhaps the luckiest thing about Steve’s leadership is that we got such incredible products despite how abrasive and quick to judgment he could be.

I’ve watched this industry mature in the last ten years and I’m really glad for it. Teams are a lot more caring. Work/life balance is increasingly valued and we are learning that the hustle/grind culture of startups is empty and often ineffective. We’re more mindful of structural issues like sexism and the lack of diversity in tech companies, especially the higher you look up the org chart.

We still have company leaders that are Steve Jobs-esque in their temperament (cough Elon Musk), but we came a long way as an industry, and I’m reminded of Steve’s own insight that even he would someday no longer be the new, and it would be his time to go and make way for the new.

This commemorative post isn’t to shit on Steve Jobs now that I’ve gained new perspective on some of his behaviors, though; it’s just that I now have a more refined appreciation of his contribution to the world, and importantly, a lot of my appreciation now is less about what Steve as an individual was able to do, but rather what his teams were so often able to do.

And people seem to understand that better now than ever before. In the late 90s we would see household appliances being manufactured in some knockoff of Bondi Blue because people thought that’s what made the iMac so iconic, but today entire companies are now more deeply driven by great design as well as the understanding that design goes deeper than the color and material on the case of your product.

People understand better than ever before that technology is more than just gadgets and apps. To make great technology products requires deeply understanding humans too, and that’s why Steve’s Apple always proudly stated that they stood at the intersection of technology and liberal arts.

That’s a hell of a legacy.


The Pull Request Is More Than Just a Pull Request

Editor’s note: This is a programming-related blog post; if you don’t program or you don’t know what a pull request is, you probably won’t get much value out of reading it.

Disclaimer: I currently work for GitHub, on the pull requests team. I’m writing these thoughts about pull requests in my personal capacity, and I don’t speak for GitHub (although I bet GitHub agrees with every word I say here).

The pull request’s most obvious value is as a tool to ensure that you are merging in high quality changes to your software. And that’s probably what you’re primarily using them for. You’ll post a PR with your changes, and a story behind those code changes, and you’ll spur a conversation that ideally ends with you improving and merging the code (or sometimes closing the PR without merging because of stuff you hadn’t considered, and that’s okay too!).

But that’s not the only thing a pull request is good for!

When someone reviews your pull request, the PR takes on a second, equally important purpose: spreading knowledge! Now that others have read your PR, you’re not the only person who knows how the change works. It’s less risky for your company when more than one person knows how something works.

And if that’s not enough, a PR is a gift that has the potential to keep on giving, even long after it gets merged.

A merged PR serves as an excellent artifact. It is an immutable record of the actual changes to the code, but also it’s a glimpse into the developer’s state of mind from when they wrote it.

A while back, I was preparing some queries for an upcoming change in our database setup, and I noticed a piece of code that would conditionally take either a list of IDs or a scope representing a set of records. A comment in the code told me that taking the scope instead of the IDs was faster, but it didn’t say much else.

I was about to revert that change, so I was interested in knowing more.

GitHub ProTip™: If you’re looking at a commit page in GitHub, it’ll show you the pull request where it got merged, which is super handy:

[Screenshot: a GitHub commit page, showing a link to the pull request in which the commit was merged to the main branch]

And in my case, I was able to look at the fuller context behind that optimization, and I learned two important things: 1) the PR was over six years old, and the decision was based on very old assumptions about our database size and setup, and 2) the actual performance improvement at the time was not huge. Armed with that knowledge, I deployed my change with greater confidence (still using Scientist, of course), and I was tickled to learn that indeed, my approach came with a modest performance improvement in most cases.

Pull requests aren’t just good for developers looking for extra context behind an old code change. While your individual commit messages might be full of gems like WIP and fix that linter error for real this time and this really should work now, the PR itself becomes a place where you get to craft the story the way you want to.

If you have a docs team or a team that constructs release notes, PR descriptions are probably a lifeline for them, serving as an initial draft of what will become the user-facing release notes.

And if you release something and discover a bug that a PR introduced, the PR is even more helpful, because it’s an opportunity to see if the bug or bad behavior was expected or considered.

The PR is an important cultural contribution to how software development teams work together. Pull requests might have been introduced as a way to make merging changes easier, but secretly they were a Trojan horse that got software developers to unknowingly produce useful, human-generated documentation instead of skipping that step, because now it’s part of the software development process.


The “Recall Ted Wheeler” Boondoggle

There’s been a campaign in Portland to recall our mayor, Ted Wheeler.

Ted Wheeler is a milquetoast husk of a mayor, true, and I look forward to seeing him get replaced with a better mayor.

But here’s the thing: we didn’t need to recall Ted Wheeler; he was up for getting replaced already.

Last year. Twice!

He failed to hit the 50% threshold in the earlier election, leading to a runoff in November. The progressive candidate who was the best alternative was picking up some serious steam. And we failed to elect the viable alternative to Wheeler by a margin that can be directly attributed to a write-in campaign for another candidate.

If we couldn’t pull together during a full election cycle to find a viable alternative candidate and get them across the finish line, this halfhearted fringe effort a year later sure as hell isn’t going to move the needle.

And indeed, the Recall Ted Wheeler effort has so far failed to even generate enough interest to gather the signatures needed to kick off the process. The campaign raised money to hire people to collect more signatures, just to keep the effort alive. It had until a few days ago to collect enough signatures and does not appear to have gotten them.

And even if this had somehow picked up steam, we had no viable candidate to replace Wheeler. If progressives couldn’t replace Wheeler during an actual campaign season last year, how the hell are they going to find a viable candidate and mobilize voters to support that candidate when they can’t even gather the signatures needed to start the recall process?

Every hour of effort and every dollar spent on this pointless campaign could have been better spent on just about any other initiative. The whole thing feels less like an actual effort to accomplish something politically and more like a personal vendetta meant to make Ted Wheeler feel bad whenever he travels around town and realizes how little Portlanders care for him.

I’m deeply sick of progressive movements being a synonym for ineffectiveness.