2020-11-19 No Mozilla, We Don't Want You To Break The Web
========================================================================
I recently saw this announcement from Mozilla:
https://blog.mozilla.org/security/2020/11/17/firefox-83-introduces-https-only-mode/
From TFA:
> When you enable HTTPS-Only Mode:
>
> Firefox attempts to establish fully secure connections to every
> website, and Firefox asks for your permission before connecting to
> a website that doesn't support secure connections.
To me this is mostly reasonable. It's reasonable for two reasons:
1. It has to be enabled. I see benefits in this being manageable via
group policy or similar
2. It warns (but doesn't stop) you that content doesn't support HTTPS
connections
Just because a connection is over HTTPS doesn't mean it's secure. Just
because a connection is over HTTP doesn't mean it's insecure. The
wording implies HTTP connections are insecure, but I understand that
there's only so much space for a warning pop-up.
The part of Mozilla's piece that caught my interest was this:
> The future of the web is HTTPS-Only
>
> Once HTTPS becomes even more widely supported by websites than it is
> today, we expect it will be possible for web browsers to deprecate
> HTTP connections and require HTTPS for all websites. In summary,
> HTTPS-Only Mode is the future of web browsing!
I posted a toot on Mastodon:
https://mastodon.social/@stevelord/105230641520893790
A few people didn't quite understand why Mozilla's desire to end HTTP
support is wrong. So here's an explanation of why Mozilla's desire to
remove HTTP support is a bad move and we should oppose rather than
celebrate it.
## It Breaks Parts Of The Web
There's been a strong push towards HTTPS over the past few years. It has
its trade-offs. On the one hand, the push for HTTPS helps encrypt
connections where encryption is preferable. On the other hand, Google
penalizing HTTP-only sites in search results is problematic when it comes
to information better served over HTTP - e.g. for systems that don't
generally support current HTTPS.
It also introduces a moving target for websites to be indexed and
accessible. HTTP is always HTTP. HTTPS will one day be an as-yet
unreleased version of TLS over QUIC, and woe betide anyone who hasn't
upgraded to whatever Google tells you is acceptable HTTPS.
And it will be Google telling you what's acceptable, not anyone else.
They own the search visibility and they own the web browser most people
use. This is already happening. It was them driving HTTPS in the first
place.
In the case of upgrading traffic to HTTPS, this is good for most users in
most situations. There are scenarios in which it is not - e.g. where the
HTTP and HTTPS sites don't match and an HSTS policy stops HTTP from
working - but thankfully such edge cases should be extremely rare.
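To make that HSTS edge case concrete, here's a rough sketch of how to check whether a site has told browsers to insist on HTTPS. It uses the third-party requests library, and example.com is just a stand-in host:

```python
# Sketch: check whether a host sends a Strict-Transport-Security header.
# Once a browser has seen this header for a host, it will insist on HTTPS
# for that host until max-age expires - which is exactly the edge case above.
import requests

resp = requests.get("https://example.com/", timeout=10)
hsts = resp.headers.get("Strict-Transport-Security")
if hsts:
    print("HSTS is set:", hsts)  # e.g. 'max-age=31536000; includeSubDomains'
else:
    print("No HSTS header; browsers won't force HTTPS for this host")
```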
We should acknowledge that making the change introduces breaking edge
cases that didn't previously exist. At the same time, the trade-off is an
overall net positive for situations where the connection could otherwise
be insecure.
Before this push, HTTP was commonplace. Even when the push to HTTPS
started there was a lag before HTTPS links became commonplace. Older
articles on news sites like the BBC or the Guardian still link to HTTP
URLs, which are then redirected to HTTPS by the destination server before
being served. Removing support for HTTP breaks these links.
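As a quick illustration, this is the redirect dance an old HTTP link relies on today. The requests library and the BBC URL are just examples; whether a given site redirects depends on its current configuration:

```python
# Sketch: follow an old http:// link and show the server-side redirect chain
# that quietly upgrades it to HTTPS.
import requests

resp = requests.get("http://www.bbc.co.uk/", timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("landed on:", resp.url)
```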
There are two ways that this can be resolved:
1. On encountering an HTTP link, try HTTPS first and fall back to HTTP
with a warning if HTTPS is unavailable
2. Stop parsing HTTP at all and tell people to change their links to HTTPS
Mozilla are rolling out Option 1 with an opt-in. I'm sure over time,
Firefox will try to make this opt-out.
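For the avoidance of doubt, here's a rough sketch (mine, not Mozilla's code) of what Option 1 boils down to. The fetch_preferring_https helper is hypothetical; it upgrades http:// links and only falls back, with a warning, when the secure connection fails:

```python
# Sketch of Option 1: try the HTTPS version of an http:// link first, and only
# fall back to plain HTTP, with a warning, if the secure connection fails.
import urllib.error
import urllib.request


def fetch_preferring_https(url: str) -> bytes:
    if url.startswith("http://"):
        upgraded = "https://" + url[len("http://"):]
        try:
            return urllib.request.urlopen(upgraded, timeout=10).read()
        except (urllib.error.URLError, OSError):
            print(f"warning: {upgraded} unavailable, falling back to {url}")
    # Either the link was already HTTPS, or the upgrade failed.
    return urllib.request.urlopen(url, timeout=10).read()
```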
Option 2 just isn't practical. That doesn't mean it won't be tried. It
just won't succeed in a way that gets a user clicking on a link to the
content they requested. At this point, the browser is working *against
the user's direct request*. So I'm just going to say this once:
If you're going to go against the direct will of the user you're
travelling in the wrong direction.
Mozilla might consider it a success if they never re-enable HTTP, but
people will just use a different browser to access HTTP content.
There is an idea that somehow TLS is the only way to encrypt things and
that TLS == Secure and everything else is not.
In reality things are more complicated. This is how DoH came to be
championed by browser and web devs with a facile understanding of local
networking, and vilified by network protocol designers like Paul Vixie.
The reality is that security is riddled with trade-offs that need to be
weighed against more than a binary secure-or-not verdict.
## HTTP Doesn't Mean Insecure
I don't run HTTPS exclusively on my home network. Why? Because my home
network is sufficiently secured against MITM threats for my own threat
model. If I need encrypted access to a service at home I can tunnel over
SSH. TLS only adds overhead where a bearer network is already
appropriately encrypted.
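For the curious, this is roughly what I mean by tunnelling over SSH. The host alias homeserver and the ports are stand-ins for whatever you actually run; the snippet just shells out to the ordinary ssh client:

```python
# Sketch: forward localhost:8080 on this machine to port 80 on the home
# server, so the plain-HTTP service is reached over an SSH-encrypted hop.
# Equivalent to running `ssh -N -L 8080:localhost:80 homeserver` by hand;
# browse to http://localhost:8080/ while it runs.
import subprocess

subprocess.run(["ssh", "-N", "-L", "8080:localhost:80", "homeserver"])
```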
> But what if someone breaks into your wifi network?
Then I have bigger problems than TLS, but I'll know when a new device
joins my network. If they work harder to get in without me picking it up,
I'll know I have serious problems that TLS will not address.
HTTP is insecure when something between you and the target site messes
with your traffic. In the US, some ISPs routinely mess with customer
traffic. I resolved this problem by using an ISP that doesn't. I
appreciate that not everyone has that luxury. In places where I don't
have control over possible ISP interference I use VPNs. I have my own
VPNs for exit locations not covered by my Mullvad subscription.
I accept that different people have different setups with different
threat models; I'm just writing about mine here.
So when we say HTTP is insecure, it's important to say against what. The
most common things injected into HTTP traffic, in my experience of
attacking wifi hotspots, are login pages and ads. Sometimes these ads
serve malware, which brings me on to my next point.
## HTTPS Doesn't Mean A Site Is Secure
If you read and trust what Mozilla, Google and summer children may tell
you, you might think HTTPS is secure and HTTP isn't, end of. Forgive me
for saying this is naive, and that years of security research have shown
HTTPS security to be, shall we say... complex?
TLS in the browser relies on a root store: a collection of certificate
authority certificates that your browser considers trusted. Most people
blindly accept all of these certificates. If you review them you'll see
that many of these certificate authorities operate in spaces you'll never
visit. You don't know these guys. You don't know under what circumstances
they might screw you over.
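If you want to see this for yourself, here's a rough sketch that dumps the subject of every root in the CA bundle shipped by Python's certifi package. Your browser's root store will differ, but the point stands. It needs the third-party certifi and cryptography libraries:

```python
# Sketch: list every root certificate in the certifi CA bundle.
import certifi
from cryptography import x509

BEGIN = b"-----BEGIN CERTIFICATE-----"
END = b"-----END CERTIFICATE-----"

with open(certifi.where(), "rb") as f:
    bundle = f.read()

count = 0
start = 0
while (b := bundle.find(BEGIN, start)) != -1:
    end = bundle.find(END, b) + len(END)
    cert = x509.load_pem_x509_certificate(bundle[b:end])
    print(cert.subject.rfc4514_string())
    count += 1
    start = end

print(count, "trusted roots - how many do you recognise?")
```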
You might look at the China Financial Certification Authority and think
they're not the kind of organisation you want to allow to issue
certificates that could be used to man-in-the-middle you.
But then you'd miss DigiCert, who bought Symantec's certificate business
after Symantec abused certificate issuance so badly that Google moved to
distrust their certificates in Chrome.
It's fair to say that one issuer abusing certificates doesn't weaken
HTTPS as a whole. But it's perfectly possible to misconfigure HTTPS in
ways that make interception or other attacks possible. Cipher and
downgrade settings can leave sites and browsers wide open. TLS is an
incredibly complicated protocol with complex parsers in complex
libraries, open to complex memory corruption and configuration bugs.
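As a tiny illustration of the configuration side, here's a sketch that asks a server which protocol version and cipher suite it actually negotiates. example.com is a placeholder; dedicated tools like testssl.sh or SSL Labs go much further:

```python
# Sketch: report the TLS version and cipher suite negotiated with a server.
import socket
import ssl

host = "example.com"
ctx = ssl.create_default_context()

with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print("protocol:", tls.version())  # e.g. 'TLSv1.3'
        print("cipher:", tls.cipher())     # (name, protocol, secret bits)
```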
Historically you would go to a website, make a request to the website,
get some HTML and then send separate requests to the same site for some
graphics and maybe a MIDI file.
Nowadays you make a request to the website and are redirected to the
nearest Content Delivery Network (CDN) server. Your browser starts
downloading HTML and is redirected to other servers for stylesheets,
JavaScript and images.
The content you download often has some user-generated content on the
page. Thinking about what is possible, if the objective is to steal or
modify content, an attacker can target:
* The CDN infrastructure or any of the servers along the request chain
* The TLS configuration of any servers in the request chain
* The ability to upload user content (e.g. uploading malicious content
for access later)
* Any server-side applications involved in the request chain
* Any project or supply chain dependencies on systems or applications
involved in the request chain
* Other applications or user content uploads hosted on any servers along
the request chain
That is a *lot* to secure. I'm not saying HTTPS should be avoided. What
I'm saying here is that HTTPS is one part of a much broader information
assurance requirement. It is a tool in the box. It is not the box.
## Complexity And Insecurity
There are two common approaches to managing insecurity I'd like to
highlight. I'll refer to these as the Microsoft approach and the
OpenBSD approach:
1. The Microsoft Approach: Mitigating exploit classes through increased
complexity is a net good as it increases exploitation effort and cost
2. The OpenBSD Approach: Complexity increases attack surface.
There's a computing aphorism I feel is relevant here, Greenspun's Tenth
Rule:
> Any sufficiently complicated C or Fortran program contains an ad hoc,
> informally-specified, bug-ridden, slow implementation of half of
> Common Lisp.
If you switch off browser scripting and use HTTP, an attacker's
exploitable (for values of remote code execution) surface area drops
through the floor compared to the same request over HTTPS with browser
scripting enabled.
Gwern as usual has fascinating comments on accidentally Turing Complete
systems:
https://www.gwern.net/Turing-complete
But short of a vulnerability in file formats or HTML parsing, with
scripting turned off the impact of MITM changes substantially. The web
was designed to display markup, not execute code. Now people who make a
lot of money from executing code want to tell you what the web should
and shouldn't do.
Security is a complicated and highly nuanced field and every decision we
make has a security tradeoff. It's easy to get distracted and convince
ourselves that we're doing good when we're doing harm. We can shout down
legitimate concerns over the consequences of our actions in the name of
security and progress theatre.
We could put effort into securing scripting languages and web extensions,
but instead we're focusing on the crypto as a way to avoid tackling the
harder problem.
As the web browser has slowly gained the ability to rewrite the OS in
its own image, that *half of Common Lisp* is growing into a quarter of
Smalltalk. Nobody asked for it, but Google won. Mozilla exists solely so
Google can claim it doesn't have a browser monopoly, nothing more.
Cutting out HTTP means cutting out the web. The day that happens is the
day I cut out Mozilla.
Drew DeVault wrote some thoughts I share on the web here:
https://drewdevault.com/2020/08/13/Web-browsers-need-to-stop.html
Drew's Gemini presence is down right now, but when it's back up I think
his piece will be here:
gemini://drewdevault.com/2020/08/13/Web-browsers-need-to-stop.gmi