The Tyrannies of the Browser

Published: (Draft)

Updated: 2022-04-21

Tags: #rant #security #tech #web #browsers

Amazingly Stoopid

Having spent a decade developing web sites and applications, and most recently working specifically on web application security, I've been left with my head in my hands, wondering:

Is the HTTP/HTML Browser fundamentally broken?

I firstly want to say that modern browsers are amazing pieces of software, and the WWW has revolutionised the world. However, browsers have some stoopid (and dangerous) behaviour driven in part by:

* The evolution of software and specifications
* The dedication towards backward compatibility; and
* Commercial interests overriding user interests.

Unfortunately:

Amazing + Stoopid = Amazingly Stoopid

What's this about Tyranny?

There really is no good alternative to the HTTP/HTML Browser. I know you're probably reading this on a Gemini browser (if not, go download one now) but it really doesn't do a fraction of the things Chrome, Firefox, Safari, or Opera do (this is, of course, by design).

So if we wish to develop a shiny, interactive, just-in-time, internet-delivered application then we have no choice but to submit ourselves to the authority, jurisdiction, and absolute rule of the HTTP/HTML Browser. We must subject ourselves to its whims, impulses, vagaries, and caprice. We are under its tyranny.

As developers we have to work hard to understand all the complexities and security pitfalls that are thrown our way.

As users we always have to worry that clicking the wrong link will allow a hacker to empty our bank accounts.

But why use the plural 'Tyrannies'?

Fundamentally I think browsers suffer from a dedication towards backwards compatibility. It's an honourable and ambitious goal to be sure; who wants to be responsible for breaking the web?

New features get suggested, trialled, and become permanent parts of the browser. It's very rare for a relied-upon feature to get removed. As a consequence, browser behaviour emerges from a complex web of features and APIs. And as features multiply, so does the effort needed to understand how they all interact. The evolution of the browser has only one direction: increasing complexity.

Some features can be considered dangerous and oppressive just by themselves; think of cookies.

Other dangerous behaviour comes from the interaction of browser features, for example automatic redirects combined with the Same-Origin security model.
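
To make that interaction concrete, here's a minimal TypeScript sketch (the URLs and endpoint are made up for illustration) of how fetch() silently follows redirects, so a request that starts on your own origin can end up somewhere else entirely unless the calling code opts out or checks where it actually landed.

```typescript
// Minimal sketch: noticing an off-origin redirect after the fact.
async function loadProfile(): Promise<unknown> {
  // By default fetch follows redirects transparently ("follow").
  const response = await fetch("https://example.com/api/profile", {
    redirect: "follow",
  });

  // response.url is the final URL after any redirects were followed,
  // so this is the only place the caller can notice it left the origin.
  if (new URL(response.url).origin !== "https://example.com") {
    throw new Error(`request was redirected off-origin to ${response.url}`);
  }
  return response.json();
}

// The stricter alternative: fetch(url, { redirect: "error" }) rejects as
// soon as any redirect occurs, at the cost of breaking legitimate ones.
```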

I refer to each of these behaviours and features as a separate tyranny. They often put us in harm's way and make us less safe.

What are some examples?

The examples that follow are still largely unpolished. Most could probably have their own gemlog entry and many drafts and revisions, but hopefully they serve to give sufficient food for thought:

  1. Cookies: They're the elephant in the room and have enabled the surveillance economy. And when they're not used to track us, developers still manage to use them in ways that make their applications insecure. Let's store content in a cookie, you say; what could go wrong? Initially designed to allow 'HTTP State Management' over the stateless HTTP protocol, the design has turned every browser into a row in a global distributed database. (There's a small cookie sketch after this list.)

  2. IFrames: They just result in nested security contexts that are too easy to misconfigure. By default they're unsandboxed. Sure, getting rid of them would eliminate a class of web application composition, but maybe that's a good thing. Without them, clickjacking would be harder (but not impossible). Without them, the tools for trackers to follow us would be fewer. (See the iframe sketch after this list.)

  3. Defaults are insecure: See the point above for an example. The content security policy should be as strict as possible by default, not as open as possible. But we don't do this because it breaks the web. (See the CSP sketch after this list.)

  4. Same-Origin security model: It's better than nothing, but it basically binds us to DNS as an identity provider. Setting up CORS is apparently frustrating enough that plenty of developers just wildcard things. On top of that, it doesn't cater for domains that host multiple users: one cloud provider used to host its cloud storage under its main domain, giving random static sites the ability to access cookies and make queries to the cloud APIs. (See the CORS sketch after this list.)

  5. The <script> tag: Why did we ever mix content and code? So many client-side hacks come from inserting code into content and watching it execute. The DOM was basically turned into a shell in the browser as soon as this was added. (See the injection sketch after this list.)

  6. Everything's enabled by default: So unless you do a lot of work, your simple text page still has the ability to run a script. This is madness. Allow-listing is widely accepted as the better safety model; scripting should have to be enabled, not the other way round. Another example is how unconfigured service workers can turn an XSS into a persistent, stored XSS, but this feature was just added and turned on by default. This means that when SWs were added, many sites that had a small security problem suddenly had a much bigger one. (Again, see the CSP sketch after this list.)

  7. HTTP mixes agent configuration and transport configuration: Why is the CSP header in the same data block as the Host header? HTTP is a network protocol and headers are mutable so that proxies can work their magic. How is a server to ensure that the security configuration it sends a client arrives unaltered? How does the server know transport headers weren't manipulated by the client?

  8. Meta tags duplicate header functionality: ...

  9. HTML and the DOM is the only game in town: ...

  10. ...
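
A minimal sketch of the cookie point, using Node's built-in http module in TypeScript; the cookie names and values are invented for illustration. The same Set-Cookie mechanism carries both the tracking identifier and the session credential, and only the attributes the developer remembers to add separate a hardened cookie from an exposed one.

```typescript
import { createServer } from "node:http";

// Sketch only: a server hands out two cookies on every visit.
const server = createServer((req, res) => {
  res.setHeader("Set-Cookie", [
    // A long-lived identifier like this is all the surveillance economy
    // needs: the browser will dutifully send it back with every request.
    "visitor_id=b1946ac9; Max-Age=31536000; Path=/",
    // A session cookie done carefully: not readable from script (HttpOnly),
    // never sent over plain HTTP (Secure), not attached to cross-site
    // requests (SameSite=Strict). Forget any of these and the browser
    // happily exposes it.
    "session=opaque-token; HttpOnly; Secure; SameSite=Strict; Path=/",
  ]);
  res.end("hello\n");
});

server.listen(8080);
```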
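For the iframe point, a sketch of what it takes before a frame stops being a clickjacking and tracking vehicle; the embedded URL and element usage are placeholders. Everything here is opt-in, which is exactly the complaint.

```typescript
// Embedding side: an iframe is unsandboxed unless you say otherwise.
const frame = document.createElement("iframe");
frame.src = "https://third-party.example/widget"; // placeholder URL
// An empty sandbox value disables scripts, forms, popups, and
// same-origin access inside the frame; each capability has to be
// added back by name, e.g. "allow-scripts".
frame.setAttribute("sandbox", "");
document.body.appendChild(frame);

// The embedded site, for its part, only avoids being framed (and so
// clickjacked) if it remembers to send a header like:
//   Content-Security-Policy: frame-ancestors 'none'
// (or the older X-Frame-Options: DENY). Again: opt-in, not the default.
```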
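For the points about insecure defaults and everything being enabled, this is roughly what opting out looks like today: a Content-Security-Policy that starts from deny-everything and adds back only what the page needs. The directive list is illustrative, not a recommendation; the real complaint is that every site has to remember to send this, rather than it being the browser's starting position.

```typescript
import { createServer } from "node:http";

const server = createServer((req, res) => {
  // Start from "nothing is allowed" and enumerate what the page may do.
  res.setHeader(
    "Content-Security-Policy",
    [
      "default-src 'none'",     // deny everything not listed below
      "img-src 'self'",         // images only from this origin
      "style-src 'self'",       // stylesheets only from this origin
      "frame-ancestors 'none'", // refuse to be embedded in iframes
      "base-uri 'none'",
      "form-action 'self'",
    ].join("; ")
  );
  res.setHeader("Content-Type", "text/html; charset=utf-8");
  res.end("<h1>A simple text page that cannot run scripts</h1>\n");
});

server.listen(8080);
```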
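For the Same-Origin/CORS point, a sketch of the shortcut the model practically invites: once the per-origin configuration gets frustrating, the wildcard is always one line away. The allowed origin is made up; the contrast between the commented-out shortcut and the allow-list is the point.

```typescript
import { createServer } from "node:http";

// Origins we actually intend to trust (illustrative value).
const allowedOrigins = new Set(["https://app.example.com"]);

const server = createServer((req, res) => {
  const origin =
    typeof req.headers.origin === "string" ? req.headers.origin : undefined;

  // The tempting shortcut once CORS errors get annoying:
  //   res.setHeader("Access-Control-Allow-Origin", "*");
  // which tells every site on the internet it may read the response.

  // The intended, more laborious version: echo the origin back only
  // if it is explicitly on the allow list.
  if (origin && allowedOrigins.has(origin)) {
    res.setHeader("Access-Control-Allow-Origin", origin);
    res.setHeader("Vary", "Origin");
  }
  res.end(JSON.stringify({ ok: true }));
});

server.listen(8080);
```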
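And for the <script> point, the smallest demonstration I can think of that content and code are the same thing in the DOM: the same string is inert when treated as text and executes when treated as markup. The element id is assumed and the payload is a textbook example, nothing clever.

```typescript
// A string that arrived as "content" - say, a username from a form.
const userSupplied = '<img src=x onerror="alert(\'owned\')">';

// Assumed to exist somewhere in the page for the sake of the sketch.
const el = document.getElementById("profile-name")!;

// Treated as text: the angle brackets are just characters on the page.
el.textContent = userSupplied;

// Treated as markup: the browser parses it, the image fails to load,
// and the onerror handler runs attacker-chosen code in our origin.
el.innerHTML = userSupplied;
```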
