TIMEWARP TOOLSET
I stick to my old computers, mainly this average mid-90s desktop
and an average early 2000s laptop. They can do everything I want,
so (as examined here before in vastly more words) why upgrade?
Well, one reason is that the rest of the world doesn't agree and
finds niggling ways to make it difficult to keep using a PC from
the 90s or 2000s with files produced by the computers that other
people use today. With my 'Internet Client' system on its low-power
modern(ish) SBC, I cheat by running software like Firefox on it
remotely using X over the network. But also I can use it to convert
documents into formats that are more suitable for old computers to
handle directly.
Converting formats is generally pretty basic stuff, but picking the
right formats and settings to suit old computer hardware/software
might be just about obscure enough now to be worth documenting, so
here are some notes on doing so from a modern Linux environment:
VIDEO
I convert this with ffmpeg, mainly videos downloaded from YouTube
or via EDonkey2000. The latter can often only be found in HD, so as
well as converting to an older video/audio/container format, the
video resolution needs to be scaled down to something that won't
cause an old computer to run out of RAM. Storing multi-GB files
also won't work so well, so I drop the bitrate to something I deem
sensible as well. These days I also convert the audio from stereo
to mono to reduce its bitrate further. If an old computer/device
pauses or loses A/V sync during video playback of some videos, even
though its CPU cycles aren't fully used, the bitrate might be
higher than the speed it can read from the storage device.
Formats are a bit of a minefield. DIVX/XVID was a common early
standard that isn't too bad on compression. It's MPEG4 video and
MP3 audio in an AVI container. I use the following command to
convert to it with FFmpeg:
ffmpeg -i "video.mkv" -r 24 -acodec libmp3lame -b:a 64k -ar 44100 -ac 1 -vcodec mpeg4 -f avi -b:v 270k -vtag xvid "video.avi"
I first use "ffprobe" to check the input video resolution and, if
it's too high, add the "-s" option to resize it. I try to keep the
width somewhere around 640 pixels and scale the height down by an
equal fraction of the original height to preserve the aspect ratio.
A common example is 1280x720 becoming "-s 640x360".
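Putting that together, the full command for a 1280x720 input might
look like this (the filenames are just placeholders):

```shell
# Convert and downscale HD video to XviD-tagged MPEG4/MP3 in AVI
ffmpeg -i "video.mkv" -r 24 -s 640x360 \
    -acodec libmp3lame -b:a 64k -ar 44100 -ac 1 \
    -vcodec mpeg4 -b:v 270k -vtag xvid -f avi "video.avi"
```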
AUDIO
Very old audio software predates compression and therefore often
only works with WAV files. At modern sample rates these become
massive and too much to deal with on very old computers so you need
to drop the sample rate way down. How far you can go without just
getting noise depends on the audio, and you may want to experiment
with different settings to find the minimum practical sample rate.
Since I think Gopher deserves old-fashioned formats, I converted
the Darkplace "welcome jingle" down to a 22KHz 8bit mono WAV in
Microsoft PCM format:
gopher://aussies.space/s/~freet/DARKPLACE/welcome.wav
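FFmpeg can produce that sort of file too; something like this
should give 22.05kHz 8-bit unsigned mono PCM (the filenames are
just placeholders):

```shell
# Downsample to an old-fashioned 22.05kHz 8-bit mono WAV
ffmpeg -i "track.mp3" -ar 22050 -ac 1 -acodec pcm_u8 "track.wav"
```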
Even DOS has MP3 players available though (eg. Open Cubic Player
which is also great for playing tracker modules), so generally MP3
is the best target since it makes storing long tracks practical so
long as the computer is fast enough to decode it. But again
bitrates and sample frequencies may need to be pulled back from
modern standards to avoid pauses or slow-down on low-powered
computers (from the mid 1990s or earlier). FFmpeg can be used for
audio conversion with the "-a*" option flags as used above for
video.
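As a rough example, something like this should produce a modest
mono MP3 that a mid-90s PC has a chance of decoding (the filenames
and rates are only my guesses to tune per machine):

```shell
# Re-encode to a low-bitrate, low-sample-rate mono MP3
ffmpeg -i "track.wav" -acodec libmp3lame -b:a 64k -ar 22050 -ac 1 "track.mp3"
```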
IMAGES
Image formats have been wonderfully stuck in the past for years,
with JPEG ruling supreme and PNG as the new kid on the block
replacing GIF for the last couple of decades. But now WEBP and AVIF
are replacing them, so sometimes a little Image Magick is in order
to kick things back to the 1990s:
convert photo.webp photo.jpg
More of an issue really is photo resolutions. Images many thousands
of pixels wide, as are common today, will easily fill up the
typical RAM of an early to mid 1990s PC before it's even had the
chance to spend minutes scaling it all down to something you can
see on the limited resolution of its own display. To have a hope, I
recommend a maximum image width/height of 800 pixels. Image Magick
'convert' conveniently supports automatically adjusting the other
dimension to retain the aspect ratio, so with a big image you just
need to use "convert -scale 800", eg.
convert -scale 800 bigphoto.jpg photo.jpg
If you need to cram them onto a small storage device, experiment
with the "-quality" setting too, where I usually find the minimum
acceptable value somewhere between 50 and 75. Document scans might
be converted to greyscale or B/W before saving as PNG or GIF to
reduce their size significantly without introducing artifacts by
saving as JPEG. This is best done after adjusting gamma and
contrast in a graphical image editor (eg. mtPaint) to reduce noise.
I play about with this a lot to try and get the most
space-efficient storage of scans in the History Snippets section:
gopher://aussies.space/1/~freet/historysnip
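As a rough sketch of that scan treatment done from the command line
instead (the "-level" values here are only a starting point to tune
per scan):

```shell
# Greyscale a scan, clip the contrast range, and save as GIF
convert scan.jpg -colorspace Gray -level 10%,90% scan.gif
```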
DOCUMENTS
TEXT
Document formats tend to be a pain. Good old plain-text is ideal,
although perhaps still needing a pass through 'unix2dos' or
'dos2unix' in order to swap the newline characters. UTF-8 can be
converted to old-fashioned ASCII on Linux using iconv:
iconv -f utf-8 -t ASCII//TRANSLIT newtext.txt > oldtext.txt
Most characters aren't able to be substituted though and will end
up replaced with '?'.
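If unix2dos isn't installed, GNU sed can do the newline swap itself
(a minimal sketch; the filenames are just placeholders):

```shell
# Add carriage returns to turn Unix LF endings into DOS CRLF
sed 's/$/\r/' unixtext.txt > dostext.txt
```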
DOCX
Some people will insist on sending these atrocities to you
regardless of how much you beg them to use a format that isn't tied
to an over-priced M$ product that magically manages to make a
simple page of text take up more than you could fit on a floppy
disk back in the 1990s. Pandoc is a useful tool for turning DOCX
into something sane. Although it might be sacrilege to admit here
on Gopher, I really like HTML for this purpose. It loads easily in
old/lightweight web browsers and images can be embedded inside it
(although again watch out for high resolutions; really big images
don't seem to work at all, and Pandoc skips them anyway):
pandoc -s --embed-resources --ascii -o document.htm document.docx
XLSX
Unless you're as mad as I am - doing business tasks on 20+ year old
computers - spreadsheets probably won't be so much of an issue. But
even on modern systems some might prefer the ease of converting
XLSX to TSV format using the little sc-im spreadsheet program
instead of rousing the graphical juggernaut which is LibreOffice:
sc-im --export_tab --nocurses --quit_afterload spreadsheet.xlsx > spreadsheet.tsv
PDF
PDF has evolved over time and old reader software often can't cope.
Some documents contain high-resolution photos and can therefore
carry the same RAM and CPU overheads from high-res images.
Converting to Postscript is one option, but this sometimes produces
massive files up to and over 100MB out of relatively
innocent-looking PDFs. Another option is to convert to a folder
full of per-page images. Ghostscript can accomplish either of these
tasks, as can some command-line tools that come with Xpdf (or the
Poppler fork):
Convert to PNG images with an Xpdf tool and Image Magick:
mkdir document
pdftoppm document.pdf document
cd document
for i in document-*.ppm; do convert $i ${i%.ppm}.png; done
rm *.ppm
Convert to PNG images with Ghostscript
mkdir document
cd document
gs -r150 -dNOPAUSE -dBATCH -sDEVICE=png16m -sOutputFile=document-%06d.png ../document.pdf
Convert to Postscript with an Xpdf tool:
pdftops document.pdf
Convert to Postscript with Ghostscript's pdf2ps script:
pdf2ps document.pdf
Or directly, without the helper script:
gs -dNOPAUSE -dBATCH -sDEVICE=ps2write -sOutputFile=document.ps document.pdf
An interesting thing is that when converting to Postscript, Xpdf's
'pdftops' tends to generate much smaller Postscript files than
Ghostscript. For example one PDF which converts to a 73.8MB
Postscript file with Ghostscript makes only an 18.8MB file when
converted using pdftops, yet both seem equally readable in a
Postscript viewer. There's no such difference between them when
converting to images though.
COMPRESSION
Zip is king for compatibility, though tar.gz and tar.bz2 are
equivalent if only dealing with retro *NIX. You probably know how
to deal with these already.
TRANSFERRING FILES
Sometimes there's nothing that bad about hooking up a good old
floppy drive and waiting while it clunks about. Playing with Zip
disks is another option if small floppy capacities are hitting you
too hard. CDs are magically big data sinks on old computers, but
generally read-only. If you've already got your old PC on the 'net,
the modern method of network file transfer is probably more
convenient:
NFS
For transferring between Linux systems, NFS offers surprisingly good
compatibility. Still, it takes a bit of setting up and there are
compatibility glitches which are hard to debug and fix since it's
part of the Linux kernel.
FTP
For compatibility between OSs from different eras, and ease of
set-up, FTP is king. Server and client options abound for most OSs,
and compatibility usually isn't a concern unless you're doing
something really weird like my streaming video in/out of FFmpeg
via FTP in order to convert it remotely from my old laptop. Most
graphical FTP clients have support for recursion built in, so you
can shift files and directories about much like on the HDD. Some
file managers, such as Midnight Commander on Linux, even support
FTP. On very limited systems or in scripts, the 'ftpput' and
'ftpget' commands in Busybox (or the original ncftpput/get versions
from NcFTP) can be handy. I also quite like using a plain old
command-line FTP client like the one in GNU Inetutils.
FTPFS
This is really the best of both NFS and FTP combined, since it
allows mounting an FTP server directory on Linux so you can access
it like any file system, transparent to applications including file
managers. The first implementation was a Linux kernel driver, but
the FUSE-based CurlFTPfs is easier to set up and works with newer
(relatively) Linux kernels. You can use the 'curlftpfs' command
like 'mount' (but with a lot of extra '-o' options listed by
"curlftpfs --help"), but long-term it's easier to add entries in
/etc/fstab, like this for the FTP server running on my 'icli'
Internet Client system:
# Internet Client FTPFS
curlftpfs#ftp://[user]:[password]@icli//mnt/onboard_flash/DATA /mnt/idata fuse users,ftp_method=singlecwd,kernel_cache,gid=50,uid=1001 0 0
Note that CurlFTPfs tends to open silly numbers of simultaneous
connections to the FTP server when accessing large files. In
practice this doesn't really matter over LAN, but could get you
blocked from public FTP sites and may cause problems if the default
configuration for the FTP server you set up limits the number of
connections allowed from a single IP address.
CONCLUSION
Of course playing with old computers should also be taken as a
prime opportunity to explore old formats that aren't used much
today: ASCII Art, ANSI Art, Tracker Modules, GIFs from old
BBSs/shareware CDs, and more. But with a few conversion tools
(maybe even on a Tilde shell account where many of them will
probably be installed already), old computers can fulfill many
tasks of a new one as well and continue to serve a practical role
just like they did decades ago.
- The Free Thinker