This Quadra 650 I'm typing on is almost too fast. Somehow, it feels zippier than
the 630 it replaced, but given the near-identical specs, that's probably just my
imagination. The video card is objectively better though; I can turn up the
resolution to 832x624 and everything seems small, distant, so much space to work
with, and all of it clear and sharp.

I've been spending a lot of time thinking about old displays lately. Recently, I
replaced my Dell LCD monitor with a Sun Microsystems CRT, capable of about the
same vertical resolution, but cropped horizontally. I like the way CRTs display
color, so much richer and more vibrant than anything but top-of-the-line OLEDs.
Probably more accurate too, though I haven't bothered to calibrate the Sun yet.
Faster response times, deeper contrast: there are plenty of reasons to use a
CRT in 2023, though few people do.

When most people think about technology, they imagine a line that moves forward,
constantly. Even the complaints about it take this as a natural assumption: it
moves too quickly, or it bypasses the vulnerable. But technology moves at
different rates across different axes, and when it advances across one of them,
it obscures its shortcomings on another. Technologists, or, more accurately, the
bosses who direct technologists, make decisions about prioritizing certain
characteristics of a technology, whether that's ease of use, ease of
manufacture, portability, fidelity, or any number of other concerns. When it
comes to displays, by the mid-2000s it had become possible to produce LCDs
that were smaller and lighter than most CRTs. People bought these displays in
droves despite the fact that, even compared to low-end modern LCDs, they kind
of looked like trash. But they were more convenient to carry around, brighter,
and required less fuss.

Every morning, when I start my computer, I take a minute or two to calibrate
convergence. The way a CRT works is that, inside the tube, there are three
electron beams, one each for red, green, and blue. These beams need to be
steered so they “converge” on a single point in order to get the color right;
if they're out of alignment even slightly, the image looks oddly blurry,
smudged, in a way that's difficult to put your finger on. Calibrating this,
with a white crosshatched grid, is a
little ritual that tells me the monitor is displaying things just as well as it
possibly can for the day. It's a way to get me and the machine started on the
same page.
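
The grid itself is nothing exotic: thin white lines on a black field, so that
any misconvergence shows up as red, green, or blue fringing along the edges.
Here's a rough sketch of one in Python; the resolution, line spacing, and the
Pillow dependency are just my own choices, so adjust to taste:

```python
# A minimal convergence test pattern: a white crosshatch on black.
# Resolution and spacing below are arbitrary; match them to your monitor.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1600, 1200   # hypothetical CRT resolution
SPACING = 40                 # pixels between grid lines

img = Image.new("RGB", (WIDTH, HEIGHT), "black")
draw = ImageDraw.Draw(img)

for x in range(0, WIDTH, SPACING):    # vertical lines
    draw.line([(x, 0), (x, HEIGHT - 1)], fill="white")
for y in range(0, HEIGHT, SPACING):   # horizontal lines
    draw.line([(0, y), (WIDTH - 1, y)], fill="white")

img.save("crosshatch.png")
```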

When Tim Berners-Lee famously circulated his proposal for the Web in 1990,
hypertext was already well into its twenties. ZOG had been running aboard an
aircraft carrier since the early 1980s, HyperCard was a successful Apple
product, and Ted Nelson was selling Xanadu
to anyone who'd listen. But now, when we think of hypertext, we think of links
embedded in a page's text, authored by a single writer (a webmaster, to use 90s
terminology). Intermedia, by contrast, a hypertext system running on Apple's
Unix distribution, A/UX, stored links outside the text and allowed each user
to author their own links by marking up the text with annotations. These links
were bidirectional, meaning a link could be traversed from either end, and a
single anchor could point to multiple destinations, chosen from a simple menu.
Editing and permissions were also built into the system, features that even
experienced web developers today find difficult to reason about.
Berners-Lee's system won out not because it was advancing the state of the art,
but because it was practical and accessible.

Thirty years on, we're all supposed to marvel at it, as if it were a massive
leap forward, when, in truth, it was just a compromise that caught on.

Back to displays. Showing anything on a CRT is an analog process: a VGA cable
carries separate signals for each of the colors to be displayed, and the
computer and the monitor settle on a rate at which to transmit them. To get
the smoothest motion possible and avoid headaches, that rate should be set as
high as the hardware supports. In the early 2000s, there were CRTs that could
display resolutions higher than today's HD panels at refresh rates nearing
90Hz. A standard LCD these days refreshes at a rough equivalent of 60Hz;
gaming panels can go up to 120 or beyond. Last week, I fought with my series of
adapters to try and push the refresh rate and resolution as high as possible,
but due to strange behaviors in either the adapter or the modern graphics card
on my computer, I fell far short of what the monitor was capable of. There was
a point in 2004 when the average computer could display content on that monitor
in a way that was noticeably better than what I can do today.
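
The arithmetic behind that fight is simple enough: the pixel clock a mode
needs is roughly the total pixels per line, times the total lines per frame,
times the refresh rate, where “total” includes the blanking intervals the beam
needs in order to retrace. A back-of-the-envelope sketch, with the caveat that
the blanking fractions below are rough guesses rather than my monitor's actual
timings:

```python
# Back-of-the-envelope pixel clock estimate for an analog video mode.
# The blanking fractions are rough guesses, not real CVT/GTF timings.
def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank_frac=0.30, v_blank_frac=0.05):
    """Approximate pixel clock in MHz, including blanking overhead."""
    h_total = width * (1 + h_blank_frac)   # visible pixels + horizontal blanking
    v_total = height * (1 + v_blank_frac)  # visible lines + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

# A high-end CRT mode versus an ordinary flat-panel mode.
print(f"2048x1536@85: ~{pixel_clock_mhz(2048, 1536, 85):.0f} MHz")  # ~365 MHz
print(f"1920x1080@60: ~{pixel_clock_mhz(1920, 1080, 60):.0f} MHz")  # ~170 MHz
```

Cheap digital-to-analog adapters seem to top out somewhere around that second
figure, which would at least be consistent with the wall I kept hitting.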

As time marches on, decisions are made, opportunities are foreclosed upon, and
technology moves ever forward. But the technology we have today reeks of decay
and stagnation: Twitter is dying under the rule of an inept petty tyrant;
Facebook tried in vain to make the Metaverse happen; TV manufacturers pushed 3D
and failed; each successive generation of mobile phones is more boring than the
last. Pair that with layoffs, bank runs, and tightening controls on labor, and
it's clear: the old world is dying.

I don't know what comes next, but I've seen enough to say that this iteration
of tech is a failure. If the road ahead of us is inevitably full of VCs and
their lackeys hawking the next NFT or cryptocurrency, we could do worse than look
behind us and see what got left by the wayside as we marched headlong into this
dead end. There might be some interesting things that, in our hurry, we left
scattered on the side of the road.