I've recently been visiting a site and, as usual, the browsers I use
provide noticeably different user experiences. The site is coded
poorly, but I like it anyway as a resource. The in-site search engine
has GET vs. POST errors. There are also errors in images that,
judging by the naming, are generated per request, which probably
wastes a ton of electricity and processor power.

When you use a site and drive up traffic or introduce "strange"
traffic, sometimes the webmaster's response is to blacklist you,
thinking you're part of a botnet or something. Sometimes they
blacklist the UA (User-Agent) string. I simply change the UA string
and everything works again (usually), which is the reason for the
title.
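For anyone curious, here is a minimal sketch of the idea in Python
using the requests library. The URL and the UA string are just
placeholders for illustration, not the actual site or the exact
string I use:

    import requests

    # Pretend to be a mainstream desktop browser instead of a
    # script or TUI client. Any plausible UA string will do; this
    # one is only an example.
    headers = {
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:115.0) "
                      "Gecko/20100101 Firefox/115.0"
    }

    resp = requests.get("https://example.com/some-article",
                        headers=headers)
    print(resp.status_code)

The same trick works in a browser via its developer tools or a UA
switcher extension; the server only sees whatever string you send.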

In my experience, this is most often employed by news sites. The
reason? Their sites are perfect for TUI/JS-less consumption, and that's
somewhat antithetical to online adware. The most ridiculous part is
that most of the time they block off articles that are more than 24
hours old, even though anyone can view them within that tiny time
frame.

It's just sad when a legitimate user has to trick and lie just to view
content freely available on the Web. Web 1.0 or 1.5 was much better
than the massive dump truck of bytes and dynamic fat code the Web has
become.