Atoms

Multimedia particles in a style similar to tweets, also serving as a changelog for this site. Cross-posted to an Atom feed and a Spring '83 board. Frequently off topic.

Go 1.20 is released. From the notes:

Work specifically targeting compilation times led to build improvements by up to 10%. This brings build speeds back in line with Go 1.17.

Excellent. New features are nice, but it’s the boring fundamentals like build times that really make a difference for everyday users. The addition of generics in Go 1.18 had an impact on build times, but those losses have been recouped. Generics are good. Fast builds are good. We can have both.


HBO’s The Last of Us television adaptation is our latest cultural zeitgeist, and I’m watching it like everybody else. Last night’s episode (ep 3) was the best so far, although it’s notable what they changed to appeal to a broader audience.

— SPOILERS —

Without going into the specifics, suffice it to say that Bill and Frank’s life in the game version is more antagonistic and stark. You can spoil yourself here, but the last piece in Bill’s arc is grim, and it’s not because he dies or is injured. He’s also more abrasive, and exchanges some great one-liners with Ellie as they insult each other. The TV version portrays a far more idyllic life and a very gentle redemption arc. It plays on your emotions well and I liked it better overall, but it does somewhat detract from driving home the brutal edge of The Last of Us’s world.

By the way, I love how they kept the game’s military-style angle-head flashlights looped onto their backpack straps (see second image below). You can actually buy these things.


See: PEP 703 – Making the Global Interpreter Lock Optional in CPython.

Fascinating stuff, and the detail and attention that goes into these PEPs is just incredible. The GIL (global interpreter lock) has been the age-old Achilles’ heel of Python and Ruby, and given the advanced age of these languages, I’d assumed it was a problem that was never going to be solved, as an increasingly intricate implementation makes fundamental changes harder to tackle every year. But an increasingly parallel world of computing seems to have created enough demand to attract people with the necessary gumption to pull this off.

Ruby gave up on removing its GIL in favor of Ractors, which offer many parallel execution environments, each with its own lock. They might’ve worked, but they were extremely backward incompatible, and two years later, nothing supports them.

Python’s GIL-free mode will be opt-in via a compiler flag, but it provides a path for the ecosystem to migrate incrementally, so it could become the default in a not-so-distant future.

A couple of things stood out to me. The implementation will move to a stop-the-world GC to freeze necessary state:

When running without the GIL, the implementation needs a way to ensure that reference counts remain stable during cycle detection. Threads running Python code must be paused to ensure that references and reference counts remain stable. Once the cycles are identified, other threads are resumed.

And because collection becomes stop-the-world, the generational GC is disabled:

The existing Python garbage collector uses three generations. When compiling without the GIL, the garbage collector will only use a single generation (i.e., non-generational). The primary reason for this change is to reduce the impact of the stop-the-world pauses in multithreaded applications.

Single-threaded performance takes a 10% hit, but there are already suggestions for changes that would recoup some of that. Over time, most of it will likely get optimized away.


Published sequence 039, on the site of the bird site.


Two years ago in the depths of San Francisco Covid-mania, I wrote up some predictions on what would happen to the city over the medium term. Today I published Revisiting my two-year SF predictions, and by my count, got 7 out of 9 right.

Some of these seem obvious in retrospect, but I wrote them down because they were starkly contrarian compared to claims made by the city’s commentariat. A popular opinion at the time was that tech needed San Francisco, so the longest and hardest lockdown in the nation would be perfectly okay as a closed downtown would spring right back to life on a mere word from London Breed, like a dog waking on command. Policy apologists guffawed with dramatic bemusement at the very idea that any of these departures could be permanent. Today, we see that almost every one of them was.

Network effects in cities are powerful, and that has indeed been the story of SF and the Bay Area writ large over the past decade, as companies swooped in to soak up innovative energy, capital, and talent, which further compounded and cascaded. My prediction today is that the city will continue to see a reverse network effect as more companies give up expensive leases with continually lessening ROI.


Published sequence 038. You look nice today.


Posted today: Stripe Sets One-Year Timetable to Decide on Going Public. Along with this classic Stripe press “leak”, the company also sent a simultaneous email to alumni, which, honestly, was nice of them.

Predictably, there was a lot of catharsis from ex-employees, with even more chatter than during the 14% layoff a few months ago, and a hundredfold above ambient Slack levels. With the first batch of RSUs set to expire in early 2024, there’d been a lot of consternation over the last couple of years, to say the least, so it was a big event.

It’s unambiguously a good thing, but I couldn’t help but notice that even with this BEST NEWS EVER message, the company is still tacking into its usual non-committalism. Instead of unambiguously planning an IPO, it’s “either an IPO or private market transaction”, leaving huge error bars and uncertainty. Maybe something cynical, or maybe just unavoidable given the unpredictable market conditions of 2023. Hopefully it’s elite 4-D chess in pursuit of profitable ends that lowly grunts like myself aren’t privy to, and couldn’t possibly understand.


Is this a food picture blog now? I hope not. I’ll try not to make a habit of it.

My curiosity was piqued when I found out there’s a Mensho subsidiary on the ground floor of the Twitter building. Mensho’s a ramen company out of Japan that opened a shop called Mensho Tokyo a few years ago in San Francisco, notable for having the best ramen on this continent, and for an omnipresent 50+ person lineup out the door and down the street, something I thought would clear with time, but never did. In 2021, they expanded into Twitter’s ground floor with a new installment – Jikasei Mensho.

Unlike the original, Jikasei Mensho’s more of a fast lunch stop sort of deal, and as far as I can tell, not great? You walk up to a counter, order from a computer McDonald’s-style, and your minimum tip option is 15% on a $22 bowl of fast food ramen for the privilege. It arrives in a plastic bowl. The chashu was good, the eggs were good, but the noodles and broth were subpar, and some of the garnishes questionable, like sliced lemons. Maybe I happened across it on an off day, or maybe ramen tastes worse out of plastic.

Post-Elon return-to-office Twitter is a bit more lively. There were quite a few people down in the cafeteria area where Jikasei Mensho is, where only a few months ago it all sat empty.


Published sequence 037 on the stairs at Oyster Point.


Witnessed a laptop snatching this morning. Woman is sitting next to window in a crowded cafe. Thief comes crashing in, grabs woman’s computer, and bolts. Five people run out after him, but as with a lot of criminal activity in San Francisco these days, it was organized, and a getaway car was waiting at the corner. Thief slams car door shut and rides off into the sunset (not literally; this was 11 AM).

It sounds blasé described post-hoc, but even “just” a theft with nobody hurt (luckily – it could easily have gone the other way) is hectic in real time. Mentally downshifted into computing mode, I took a good ten seconds to process what had even happened. People yelling. Tables knocked over. Coffee cups in pieces on the floor.

Whenever something like this comes up, San Francisco apologists are quick to make platitudinous statements like “report it to the police”, as if this were some kind of deep insight that only an enlightened California pro(re)gressive could have imagined. The situation was a perfect microcosm of why most people have stopped bothering. The SFPD was called immediately, and I waited around an hour without a single officer showing up, despite being within six blocks of a station, a cruiser rolling by every few minutes, and a situation that easily could’ve ended with someone seriously hurt. The crime had dozens of witnesses and clear HD footage from two separate cameras, but even if the PD had eventually appeared, no one would be caught, let alone see a day in prison.

I started chatting with the guy next to me, who, having just moved to the city a week previous, was somewhat surprised. (But not that surprised, having gotten an accurate taste of the city’s culture from various viral videos.)

I told him that this wasn’t unknown, but wasn’t common either. San Francisco, intent on preserving its reputation as a deteriorating municipal hell stemming from the worst mismanagement of wealth in a thousand years, made sure to prove me wrong. Five minutes later, a woman walks in, grabs a handful of bills from the tip jar, and before anyone can react, leisurely strolls back out.


Published sequence 036 on 510 Townsend St.


I finally finished God of War Ragnarök this weekend. It’s a great game, with creative interpretations of Thor and Odin that turned out very well. Thor’s a lumbering giant who’s highly able, even if overly loyal to his father. Odin’s got mob boss vibes, which is unconventional, but a gamble that paid off. The game is huge, and I was surprised multiple times thinking I’d gotten to the end only to find a whole new world to explore.

I 100%’ed it, unlike the 2018 God of War, which meant that I wasted a ton of time following YouTube guides to find artifacts and ravens. I downgraded the difficulty to “give me grace” for the last two optional bosses and don’t feel bad about it. They spam attacks like there’s no tomorrow, have a criminal number of unblockables, and the target lock system kind of sucks, especially in multi-enemy fights.

It’s notable that although very good, it’s for all intents and purposes the same game as the one released in 2018. There’s new story and new areas (although with some reuse), but the game engine, game mechanics, upgrades/skills system, and combat are all practically identical. Development on the 2018 game started in 2014, which means that back then it took five years to build a full game and a whole new engine, whereas Ragnarök took six for a game plus some minor updates. I’m sure some of that was lockdown delay, but I kind of suspect the game industry is hitting a plateau for large projects in the same way NASA did for space missions, Lockheed Martin for fighter jets, or Oracle for databases. Horizon Forbidden West (another triple-A title) was the same – great game, but practically indistinguishable from the original.


Exercises in verbosity: Loading data in Go is a heck of a lot of code

I’ve been writing Go professionally for something like a year and a half now, and compared to my previous daily driver Ruby, almost everything is better. Readability, speed of runtime, speed of tests, speed of refactoring, IDE insight, tooling – I could go on all day.

But when building out a larger-scale app with a non-trivial amount of domain logic and objects, it’s got holes. Some are as wide as Jupiter, and I can’t believe how little I see about them online.

The biggest hole I’m looking for an answer to is data loading, or more specifically, how to do data loading without thousands of lines of extraneous boilerplate. We solved our SQL-in-Go problem by moving to sqlc, but that in itself isn’t enough. I still write code like this daily:

team, err := queries.TeamGetByID(ctx, uuid.UUID(req.TeamID))
if err != nil {
    return nil, xerrors.Errorf("error getting team: %w", err)
}

if team.OrganizationID.Valid {
    org, err := queries.OrganizationGetByID(ctx, team.OrganizationID.UUID)
    if err != nil {
        return nil, xerrors.Errorf("error getting organization: %w", err)
    }

    if org.MarketplaceID.Valid {
        marketplace, err := queries.MarketplaceGetByID(ctx, org.MarketplaceID.UUID)
        if err != nil {
            return nil, xerrors.Errorf("error getting marketplace: %w", err)
        }

        return nil, apierror.NewBadRequestErrorf(ctx,
            errMessageTeamDeleteMarketplace, marketplace.DisplayName)
    }
}

In Ruby (or any language with some dynamism and a good ORM), that whole block compacts comfortably into a single line of code:

raise ... if Team[req.team_id].organization.marketplace

And the Go version is only as short as it is because our foreign keys mean that we can skip handling some types of user-facing errors, i.e. we don’t have to worry about translating a missing organization to a 404 because a foreign key guarantees its existence. Sqlc also saves hundreds of lines – before, the Go code for every one of those queries like TeamGetByID had to be written by a human and tested. Now we write the SQL and let sqlc do the work, whereas with Go’s built-in database/sql you’d still be writing all of it by hand.
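
For reference, this is roughly the shape of what sqlc generates per query – a sketch of its database/sql-flavored output, with hypothetical columns on teams:

const teamGetByID = `-- name: TeamGetByID :one
SELECT id, organization_id, name
FROM teams
WHERE id = $1
`

// TeamGetByID runs the query above, scanning the row into a generated
// Team struct with one field per selected column.
func (q *Queries) TeamGetByID(ctx context.Context, id uuid.UUID) (Team, error) {
    row := q.db.QueryRowContext(ctx, teamGetByID, id)
    var t Team
    err := row.Scan(&t.ID, &t.OrganizationID, &t.Name)
    return t, err
}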

Another bad-but-unavoidable Go pattern is preloading objects to avoid N + 1 queries, but then having to manually map them into structures that your code can actually use:

teams, err := queries.TeamGetByIDMany(ctx, sliceutil.Map(unsentInvoices,
    func(i dbsqlc.Invoice) uuid.UUID { return i.TeamID }))
if err != nil {
    return xerrors.Errorf("error getting teams: %w", err)
}

teamsMap := sliceutil.KeyBy(teams,
    func(t dbsqlc.Team) uuid.UUID { return t.ID })

Again, with an ORM this is:

Invoice.load(..., eager: [:team]).each { |i| i.team.name }

And the Go code was >2x longer just one release of Go ago. Notice the functions sliceutil.Map and sliceutil.KeyBy, which were impossible without generics. Before, these were an initialization and a for loop – 4-5 lines each.
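
Neither helper is anything fancy. A minimal sketch of the pair (sliceutil is our own internal package, so consider this an approximation rather than its actual source):

// Map returns a new slice containing f applied to each element of s.
func Map[T, U any](s []T, f func(T) U) []U {
    out := make([]U, len(s))
    for i, v := range s {
        out[i] = f(v)
    }
    return out
}

// KeyBy builds a map of the elements of s keyed by f. Later elements
// win on key collision.
func KeyBy[T any, K comparable](s []T, f func(T) K) map[K]T {
    out := make(map[K]T, len(s))
    for _, v := range s {
        out[f(v)] = v
    }
    return out
}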

I’ve experimented with custom data loading frameworks that sit a layer above sqlc to reduce boilerplate:

err = dbload.New(tx).
    Add(dbload.Loader(&loadBundle.Cluster, req.ClusterID)).
    Add(dbload.LoaderCustomID(&loadBundle.Provider,
        req.ProviderID)).
    Add(dbload.LoaderCustomID(&loadBundle.Plan,
        dbsqlc.ProviderAndPlan(req.ProviderID, req.PlanID))).
    Add(dbload.LoaderCustomID(&loadBundle.Region,
        dbsqlc.ProviderAndRegion(req.ProviderID, req.RegionID))).

    // must be loaded after cluster
    Add(dbload.LoaderFunc(&loadBundle.PostgresVersion, func() *uuid.UUID {
        return &loadBundle.Cluster.PostgresVersionID
    })).
    Add(dbload.LoaderFunc(&loadBundle.Team, func() *uuid.UUID {
        return &loadBundle.Cluster.TeamID
    })).
    Load(ctx)
if err != nil {
    return nil, err
}

But so far it’s nowhere near enough – this one makes point loading by IDs a lot more succinct, but doesn’t handle anything less trivial, like associated objects or one-to-many relationships.
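
To sketch what’s missing, a hypothetical LoaderOneToMany (no such helper exists yet – this is just the API shape I’d want) might read:

err = dbload.New(tx).
    Add(dbload.Loader(&loadBundle.Team, req.TeamID)).

    // hypothetical: load all of the team's clusters in one query,
    // keyed off the already-loaded team
    Add(dbload.LoaderOneToMany(&loadBundle.Clusters, func() uuid.UUID {
        return loadBundle.Team.ID
    })).
    Load(ctx)
if err != nil {
    return nil, err
}

And even that leaves nested associations and loading across multiple parent records unsolved.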

I’m fairly sure that there’s nothing approaching an answer to this problem in the Go ecosystem. But aren’t there millions of lines of production Go out there by now? How aren’t more people running into this?


An article from 37 Signals on cloud spend.

They’re concerned enough about their AWS spend that they’re moving to their own hardware, a path paved before them by the likes of Dropbox and GitHub. It’s also always interesting when companies are transparent about their cloud spend. Hey (their email service), for example, costs $89k/month, or $1.1M/year, in AWS, with the biggest component being RDS, which eats a quarter of that.

The AWS bill is a creeping concern that tends to trend slowly upward over time without anyone really noticing, then suddenly jumps out of a bush to knife you in the jugular. In the beginning it’s “wow, look at all this stuff we’re getting for a few bucks a month”. Then a few years later it’s “what?! how many f* millions?? holy shit!!” It’s amazing how much money can be spent at $0.023 a gigabyte. At Heroku and Stripe we were eventually forced to engage in major AWS cost reduction projects, and in both cases the engineers involved ended up paying their own salaries many times over.


[3 months since day zero. The day Elon Musk bought Twitter, and the world ended.]

Know anybody who loudly quit “the bird site” in anger to go to Mastodon, and has stuck with it?

Me neither.


Archive ⭢