Single page apps

Now THAT'S What I Call Service Worker!

Weekly Timber is a client of mine that provides logging services in central Wisconsin. For them, a fast website is vital. Their business is located in Waushara County, and like many rural stretches in the United States, network quality and reliability aren’t great.

[…]

Wisconsin has farmland for days, but it also has plenty of forests. When you need a company that cuts logs, Google is probably your first stop. How fast a given logging company’s website is might be enough to get you looking elsewhere if you’re left waiting too long on a crappy network connection.

I initially didn’t believe a Service Worker was necessary for Weekly Timber’s website. After all, if things were plenty fast to start with, why complicate things? On the other hand, knowing that my client services not just Waushara County, but much of central Wisconsin, even a barebones Service Worker could be the kind of progressive enhancement that adds resilience in the places it might be needed most.

The first Service Worker I wrote for my client’s website—which I’ll refer to henceforth as the “standard” Service Worker—used three well-documented caching strategies:

  1. Precache CSS and JavaScript assets for all pages when the Service Worker is installed (the worker itself is registered once the window’s load event fires).
  2. Serve static assets out of CacheStorage if available. If a static asset isn’t in CacheStorage, retrieve it from the network, then cache it for future visits.
  3. For HTML assets, hit the network first and place the HTML response into CacheStorage. If the network is unavailable the next time the visitor arrives, serve the cached markup from CacheStorage.

These are neither new nor special strategies, but they provide two benefits:

  • Offline capability, which is handy when network conditions are spotty.
  • A performance boost for loading static assets.
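
Sketched as a bare-bones Service Worker, those three strategies might look something like the following. This is a rough illustration rather than the worker actually shipped for the site; the cache name and precached asset URLs are placeholders.

Code language: JavaScript

// sw.js — a minimal sketch of the three strategies above.
// The cache name and precache list are hypothetical placeholders.
const CACHE_NAME = "static-v1";
const PRECACHE_URLS = ["/css/global.css", "/js/global.js"];

// 1. Precache CSS and JavaScript assets when the Service Worker installs.
self.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

self.addEventListener("fetch", (event) => {
  const { request } = event;

  // 3. HTML navigations: network first, falling back to CacheStorage when offline.
  if (request.mode === "navigate") {
    event.respondWith(
      fetch(request)
        .then((response) => {
          const copy = response.clone();
          caches.open(CACHE_NAME).then((cache) => cache.put(request, copy));
          return response;
        })
        .catch(() => caches.match(request))
    );
    return;
  }

  // 2. Static assets: serve from CacheStorage if available; otherwise fetch
  // from the network and cache the response for future visits.
  event.respondWith(
    caches.match(request).then(
      (cached) =>
        cached ||
        fetch(request).then((response) => {
          const copy = response.clone();
          caches.open(CACHE_NAME).then((cache) => cache.put(request, copy));
          return response;
        })
    )
  );
});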

[…]

A better, faster Service Worker

The web loves itself some “innovation,” which is a word we equally love to throw around. To me, true innovation isn’t when we create new frameworks or patterns solely for the benefit of developers, but whether those inventions benefit people who end up using whatever it is we slap up on the web. The priority of constituencies is a thing we ought to respect. Users above all else, always.

[…]

There are certainly other challenges, but it’ll be up to you to weigh the user-facing benefits versus the development costs. In my opinion, this approach has broad applicability in applications such as blogs, marketing websites, news websites, ecommerce, and other typical use cases.

All in all, though, it’s akin to the performance improvements and efficiency gains that you’d get from an SPA. The only difference is that you’re not replacing time-tested navigation mechanisms and grappling with all the messiness that entails, but enhancing them. That’s the part I think is really important to consider in a world where client-side routing is all the rage.

Spoiler: server-rendered HTML can work offline

The architecture astronauts who, for the past decade, have been selling us on the necessity of React, Redux, and megabytes of JS, cannot comprehend the possibility of building an email app in 2020 with server-rendered HTML 😴

[…]

The effects are truly toxic. Last decade’s obsession with SPAs has poisoned the minds of even the brightest teachers in our industry.

Like, there’s no way this stuff can work offline, right?!

Briefly read up on how HEY is implemented. Is this a correct summary of the pros and cons of its [server]-centric approach?
– Pro: Works fast on older devices.
– Con: Can’t be used offline.

Hey now

Progressive enhancement is at the heart of everything I do on the web. It’s the bedrock of my speaking and writing too. Whether I’m writing about JavaScript, Ajax, HTML, or service workers, it’s always through the lens of progressive enhancement. Sometimes I explicitly bang the drum, like with Resilient Web Design. Other times I don’t mention it by name at all, and instead talk only about its benefits.

I sometimes get asked to name some examples of sites that still offer their core functionality even when JavaScript fails. I usually mention Amazon.com, although that has other issues. But quite often I find that a lot of the examples I might mention are dismissed as not being “web apps” (whatever that means).

The pushback I get usually takes the form of “Well, that approach is fine for websites, but it wouldn’t work for something like Gmail.”

It’s always Gmail. Which is odd. Because if you really wanted to flummox me with a product or service that defies progressive enhancement, I’d have a hard time with something like, say, a game (although it would be pretty cool to build a text adventure that’s progressively enhanced into a first-person shooter). But an email client? That would work.

[…]

Can you build something that works just like Gmail without using any JavaScript? No. But that’s not what progressive enhancement is about. It’s about providing the core functionality (reading and writing emails) with the simplest possible technology (HTML) and then enhancing using more powerful technologies (like JavaScript).

Progressive enhancement isn’t about making a choice between using simpler more robust technologies or using more advanced features; it’s about using simpler more robust technologies and then using more advanced features. Have your cake and eat it.

Fortunately I no longer need to run this thought experiment to imagine what it would be like if something like Gmail were built with a progressive enhancement approach. That’s what HEY is.

Sam Stephenson describes the approach they took:

HEY’s UI is 100% HTML over the wire. We render plain-old HTML pages on the server and send them to your browser encoded as text/html. No JSON APIs, no GraphQL, no React—just form submissions and links.

If you think that sounds like the web of 25 years ago, you’re right! Except the HEY front-end stack progressively enhances the “classic web” to work like the “2020 web,” with all the fidelity you’d expect from a well-built SPA.

See? It’s not either resilient or modern—it’s resilient and modern. Have your cake and eat it.

And yet this supremely sensible approach is not considered “modern” web development:

The architecture astronauts who, for the past decade, have been selling us on the necessity of React, Redux, and megabytes of JS, cannot comprehend the possibility of building an email app in 2020 with server-rendered HTML.

[…]

Their focus is very much on people above technology. They’ve taken a human-centric approach to their product and a human-centric approach to web development …because ultimately, that’s what progressive enhancement is.

Always bet on HTML

In September of 2019, Zach Leatherman tweeted:

Which has a better First Meaningful Paint time?

  1. a raw 8.5MB HTML file with the full text of every single one of my 27,506 tweets
  2. a client rendered React site with exactly one tweet on it

(Spoiler: @____lighthouse reports 8.5MB of HTML wins by about 200ms)

Take a moment to wrap your head around that.

It’s perceivably faster to load 8.5 megabytes of HTML than it is to load a single tweet with a client-side React app.

[…]

Real companies can and do build apps with HTML first

The folks at Basecamp just released a new email product, Hey, that tries to address a lot of the stuff that people find frustrating about email.

Neither product is really my cup of tea, but what I find super interesting is how Hey is built.

Its core is server-rendered HTML. Basecamp is a Ruby on Rails shop (their CTO created Rails). Almost every view in the app is created on a server.

Then, they sprinkle just a little vanilla JS on top to turn things up to 11.

Basecamp uses a project they open sourced called Turbolinks. This JavaScript plugin intercepts link clicks and progressively enhances a server-side app into a single-page app (or SPA) by fetching additional pages with Ajax and only replacing the stuff that needs updating.
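
Stripped way down, the pattern Turbolinks implements looks roughly like this. To be clear, this is my own illustrative sketch of “intercept the click, fetch the next page’s HTML, swap in what changed,” not the actual Turbolinks source, which also handles caching, scroll position, history, and script execution.

Code language: JavaScript

// A bare-bones illustration of Turbolinks-style navigation, not the real library.
document.addEventListener("click", async (event) => {
  const link = event.target.closest("a");

  // Only intercept same-origin links; let everything else behave normally.
  if (!link || link.origin !== location.origin) return;
  event.preventDefault();

  // Fetch the next page as regular server-rendered HTML.
  const response = await fetch(link.href);
  const html = await response.text();
  const nextPage = new DOMParser().parseFromString(html, "text/html");

  // Replace only what needs updating, then update the address bar.
  document.title = nextPage.title;
  document.body.replaceWith(nextPage.body);
  history.pushState({}, "", link.href);
});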

By using this approach, if the JS fails or isn’t supported, the app still loads and works and gives people the full experience. It also means you don’t have to wait for the full JS package to load before you can start using the app.

You still get the benefits of faster page loading that SPAs sometimes give you, but you don’t have to maintain two code bases or do complicated server-to-client hand offs (“rehydration” as they call it in the React world).

Don't go single-page-app too soon, or how GitHub reimplementing navigation in JavaScript loses streaming capability

A few weeks ago I was at Heathrow airport getting a bit of work done before a flight, and I noticed something odd about the performance of GitHub: It was quicker to open links in a new window than simply click them.

[…]

When you load a page, the browser takes a network stream and pipes it to the HTML parser, and the HTML parser is piped to the document. This means the page can render progressively as it’s downloading. The page may be 100k, but it can render useful content after only 20k is received.

This is a great, ancient browser feature, but as developers we often engineer it away. Most load-time performance advice boils down to “show them what you got” - don’t hold back, don’t wait until you have everything before showing the user anything.

GitHub cares about performance so they server-render their pages. However, when navigating within the same tab, navigation is entirely reimplemented using JavaScript. Something like…

Code language: JavaScript

// …lots of code to reimplement browser navigation…
const response = await fetch('page-data.inc');
const html = await response.text();
document.querySelector('.content').innerHTML = html;
// …loads more code to reimplement browser navigation…

This breaks the rule, as all of page-data.inc is downloaded before anything is done with it. The server-rendered version doesn’t hoard content this way; it streams, making it faster. For GitHub’s client-side render, a lot of JavaScript was written to make this slow.

I’m just using GitHub as an example here - this anti-pattern is used by almost every single-page-app.

Switching content in the page can have some benefits, especially if you have some heavy scripts, as you can update content without re-evaluating all that JS. But can we do that without losing streaming?

[…]

Newline-delimited JSON

A lot of sites deliver their dynamic updates as JSON. Unfortunately JSON isn’t a streaming-friendly format. There are streaming JSON parsers out there, but they aren’t easy to use.

So instead of delivering a chunk of JSON:

Code language: JSON

{
  "Comments": [
    {"author": "Alex", "body": "…"},
    {"author": "Jake", "body": "…"}
  ]
}

…deliver each JSON object on a new line:

Code language: JSON

{"author": "Alex", "body": "…"}
{"author": "Jake", "body": "…"}

This is called “newline-delimited JSON” and there’s a sort-of standard for it. Writing a parser for the above is much simpler. In 2017 we’ll be able to express this as a series of composable transform streams:

Code language: JavaScript

const response = await fetch('comments.ndjson');
const comments = response.body
  // From bytes to text:
  .pipeThrough(new TextDecoderStream())
  // Buffer until newlines:
  .pipeThrough(splitStream('\n'))
  // Parse chunks as JSON:
  .pipeThrough(parseJSON());
 
for await (const comment of comments) {
  // Process each comment and add it to the page:
  // (via whatever template or VDOM you're using)
  addCommentToPage(comment);
}

…where splitStream and parseJSON are reusable transform streams. But in the meantime, for maximum browser compatibility we can hack it on top of XHR.
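
Neither splitStream nor parseJSON is a built-in; here’s a minimal sketch of how they could be written on top of TransformStream (my own illustration, not necessarily how the demo implements them):

Code language: JavaScript

// Buffer incoming text and emit one chunk per occurrence of `splitOn`.
function splitStream(splitOn) {
  let buffer = "";

  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk;
      const parts = buffer.split(splitOn);
      // Everything before the last delimiter is complete; emit it.
      parts.slice(0, -1).forEach((part) => controller.enqueue(part));
      // Keep the trailing partial line around for the next chunk.
      buffer = parts[parts.length - 1];
    },
    flush(controller) {
      if (buffer) controller.enqueue(buffer);
    },
  });
}

// Parse each newline-delimited chunk as a JSON value.
function parseJSON() {
  return new TransformStream({
    transform(chunk, controller) {
      // Skip blank lines rather than tripping up JSON.parse.
      if (chunk.trim()) controller.enqueue(JSON.parse(chunk));
    },
  });
}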

Again, I’ve built a little demo where you can compare the two; here are the 3G results:

Versus normal JSON, ND-JSON gets content on screen 1.5 seconds sooner, although it isn’t quite as fast as the iframe solution. It has to wait for a complete JSON object before it can create elements, so you may run into a lack-of-streaming problem if your JSON objects are huge.

Don’t go single-page-app too soon

As I mentioned above, GitHub wrote a lot of code to create this performance problem. Reimplementing navigations on the client is hard, and if you’re changing large parts of the page it might not be worth it.

[…]

[A] simple no-JavaScript browser navigation to a server-rendered page is roughly as fast. The test page is really simple aside from the comments list; your mileage may vary if you have a lot of complex content repeated between pages (basically, I mean horrible ad scripts), but always test! You might be writing a lot of code for very little benefit, or even making it slower.