Sunday, March 13, 2022

Computers Are Now Fast Enough

I've been using Macs since 1992, and whenever I bought a new one, the speed felt exhilarating. Returning to my old computer, I couldn't understand how I ever tolerated the slowness.

In 2020 I bought a refurbished 2018 Mac Mini. The speed felt exhilarating, and I couldn't understand how I ever tolerated the slowness. But, since then, I've tried newer computers - including much faster ones - and they don't seem the least bit faster. I also added another 8GB of memory to my Mac Mini, and...nada. It's the strangest thing. It feels like I'm stuck in mud.

My first theory was that I'm no longer taxing my computer. I used to produce audio and video, processing immense log files and performing multiple search and replace macros on whopping book manuscripts. Now I'm just a-surfin' and a-bloggin'. Wussy stuff!

But a programmer friend, who uses his computer for gnarly tasks, recently bought a brand new MacBook Pro with the Apple-designed M1 chip, and pronounced it "meh". So it's dawning on me that computers are finally fast enough. After 30 years of relativism ("what once felt fast will soon feel slow"), we've hit the wall of the absolute. Pretty much everything we need to do on personal computers happens pretty much instantly. It's over. The field has finally matured.

Of course, there are still those who'd benefit from further gains in processing speed: video editors and large-scale number crunchers (scientists, game designers, etc.). But that's a very narrow slice.

Unless you understand this, you will process the "meh" outcome of your expensive processor and memory upgrades as disappointing, when the reality is that there's simply no noticeable improvement left to be had. We no longer need computers to get better. Time to reframe!



All of this is interesting to me. But more interesting still is that no one's taken note. Have you seen this point made anywhere? Any writers pronouncing the arrival of the "maturity" stage of personal computing? Any reviews of new computers noting that "it's much faster, but 99% of users won't need, and won't even notice, the extra headroom"?

From childhood right up until I became a journalist and writer (and discovered first-hand how lazy, trendy, and timid most writers are), I assumed, like most people do, that things are efficiently noticed and reported. It's futile to try to make fresh discoveries because it's all been thoroughly covered. There are libraries full of books! Anything you could come up with is probably already in there somewhere!

If some deli were cooking amazing tacos, the newspaper food writer - who makes swaggering declarations like "Best Pizza in Bushwick!", and who therefore must be all-seeing and all-knowing - would be sure to let me know. It's all covered. No need to seek.

The Chowhound community embarrassed the bejesus out of those palate potentates, finding megatons of deliciousness never previously noted in print (as the NY Times' Eric Asimov graciously noted last week, it flipped the whole process, and experts began slavishly looking to us peons).

This is true of internal as well as external phenomena. The concepts I explore here on the Slog have been brewing in my head forever, but I always assumed I was working out notions that smarter people previously covered. I never imagined I might have a fresh insight (to this day, I'm still intensely skeptical). I never imagined there could even be any fresh insights, period! Every notion - and every taco - detectable by the likes of shmucky me would surely be nothing but a restatement of the obvious.

So if personal computers ever matured to the point where Moore's Law became a strictly academic observation, we'd have heard about it, right? Yet it happened! Years ago! And no one said so!

The real credo behind Chowhound was "Go out and discover for yourself! The 'experts' are lazy-asses who don't come close to covering things thoroughly!" This credo extends into every aspect of life. And, oh Jesus, take me now, I once again, whoops, find myself encouraging and amplifying America's most insidious problem. I'm stuck in a loop. Send help.


Note 1: This will hurt Apple. Their business is built upon the presumption of obsolescence. Of course, computers will still break and need to be replaced, and new features will be conjured up to entice upgrades. But the maturity of personal computers changes the dynamic fundamentally. It becomes a whole other business.

Note 2: Apple is currently discounting refurbished 2020 iMacs with 27" displays more steeply than I've ever seen them discount anything. You can buy a good one for under $1500. Meanwhile, they're selling a new 27" Studio Display for $1600-1900, way above most people's budgets. And there is exactly one other 5K 27" display on the market, made by LG, and some Mac users love it while others hate it.

So if you want a reasonably priced awesome 5K 27" screen, you can either spend crazy cash, or wait a few years for prices to drop, or buy a refurbished iMac quickly, as I just did. The computer itself doesn't use Apple's fantastic M1 chip, but that chip appears to be a red herring. We don't need it. We won't notice it. Except....

Note 3: Chowhound "Limster" points out that when machine learning becomes a useful thing that we run locally on our computers (because there's less delay that way than via cloud computing), we'll start re-appreciating greater power and faster speed. I'm betting I have at least one computer lifecycle before that starts to happen, so it'd be nuts to up-spend for an M1 machine right now.

