I think I've written about this a couple of times now.
Every time I start an old PC, like my Pentium 133 from 1997, this effect becomes
immediately obvious: if that computer is running period-appropriate software,
the UI doesn't feel all that much slower than it does on modern PCs.
Sure, individual tasks like image processing will be objectively slower. But
not the UI. Not as much as you'd expect from a machine that's 20-30 years old.
Here's an example: Compare a word processor, say StarOffice 3.1 running on OS/2
Warp 4 on that P133, with a modern web-based "document writer". Open a file.
You will notice that this takes a perceptible amount of time in both cases:
text rendering, text layout, all that. Opening dialogs and new windows takes time.
You don't *have to* reach for web-based programs. Just compare StarOffice 3.1
with LibreOffice 25 (StarOffice is the ancestor of LibreOffice). I feel like
this should be a night-and-day difference in terms of speed, but it isn't.
It is as if the speed and responsiveness of UIs were stuck at a certain level,
even though the underlying hardware has improved dramatically.
For contrast, run something like MenuetOS on a modern PC. Even if it
is just running in a virtual machine, you'll notice just how fast this thing is.
You click on something and, boom, it's there. So being "slow" is not some weird,
magical, fundamental property of UIs. They can be fast.
We just don't write programs that are *as fast as they could be*.
We instead make them *as slow as tolerable* and/or *as fast as barely needed*.
Is there a term for this? The closest thing I found is the "Doherty Threshold"
(though I have to note that there is no Wikipedia page for it, so take it with a
grain of salt): the idea that there is a response-time threshold, commonly cited
as roughly 400 ms, below which users no longer perceive a program as "sluggish".
And I claim that we only ever put in the minimum amount of effort to cross that
threshold, and then we call it a day.
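To make that concrete, here is what "crossing the threshold" looks like once it
is encoded as a check. This is a minimal sketch assuming the roughly 400 ms
figure usually attached to the Doherty Threshold; `openDocument` is a
hypothetical placeholder for whatever UI action you're measuring:

```typescript
// Minimal latency-budget check (runs in a browser or modern Node).
// The 400 ms budget reflects the figure commonly cited for the
// Doherty Threshold; `openDocument` below is a hypothetical placeholder.
const LATENCY_BUDGET_MS = 400;

async function timed<T>(label: string, action: () => Promise<T>): Promise<T> {
  const start = performance.now();
  const result = await action();
  const elapsed = performance.now() - start;
  const verdict = elapsed <= LATENCY_BUDGET_MS ? "within budget" : "OVER BUDGET";
  console.log(`${label}: ${elapsed.toFixed(0)} ms (${verdict})`);
  return result;
}

// Usage, e.g.: await timed("open file", () => openDocument("report.odt"));
```

Notice how a check like this literally encodes *as slow as tolerable*: it passes
the moment you scrape in under the budget, and nothing ever pushes you to do
better than that.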
Add to this the fact that developers' machines are often much more powerful
than those of the average user, so what feels snappy in development can crawl
on typical hardware, and you have an explanation for why many modern programs
are, in fact, really, really slow and sluggish.
I propose that we use Retro Computing as a tool in education. Put young
developers in front of very old machines, show them the UIs of that era. Make
them realize what these systems were capable of (not just in terms of speed)
before those devs were even born. Make it clear to them that, if StarOffice 3.1
ran at an acceptable speed on a Pentium 133, their new program on hardware that
is orders of magnitude faster should be *blazingly fast*.
(I realize that if you want to make something web-based, like virtually everyone
does nowadays, you inevitably add network latency to the equation. I'd respond
with: Maybe don't make it web-based unless there's a really compelling reason to
do so. Or find ways to mitigate that latency; one such mitigation is sketched
below. The main point is that you should be *aware* of this and of how your
application suffers from it.)
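For illustration, here is a hedged sketch of one such mitigation, an optimistic
local update: the UI applies the edit immediately and syncs over the network in
the background. The `/api/save` endpoint and the `Doc` shape are made up for
this example:

```typescript
// Optimistic update: the UI never waits for the network round-trip.
// The /api/save endpoint and the Doc shape are hypothetical.
interface Doc {
  text: string;
}

async function saveToServer(doc: Doc): Promise<void> {
  const response = await fetch("/api/save", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(doc),
  });
  if (!response.ok) {
    throw new Error(`save failed with HTTP ${response.status}`);
  }
}

function applyEdit(doc: Doc, newText: string, onSyncError: (e: unknown) => void): Doc {
  const updated: Doc = { ...doc, text: newText };
  // Fire off the save, but don't block rendering on it.
  saveToServer(updated).catch(onSyncError);
  return updated; // the caller can render this immediately
}
```

This doesn't remove the latency, it just moves it off the path the user is
waiting on, which is often the difference between "sluggish" and "fine".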