Enhancing Optimistically

Every so often, we come across ways to improve our more well-trodden core progressive enhancement patterns. Sometimes we’ll utilize a new web standard to address problems we’d previously approached in a less-optimized manner, while other times we’ll make adjustments to handle browser or network conditions in more fault-tolerant ways. Recently, I came across an example of the latter, and this post will document a small but meaningful way I worked to accommodate it.

Serving Condiments


For quite a while now, we’ve been progressively enhancing sites using a pattern that these days many of us refer to as “Cutting the Mustard,” per Tom Maslen’s great metaphor. As the pattern goes, we run a series of feature tests relevant to enhancements we’d like to make to the page, and if the browser “cuts the mustard” we enable and load additional code that utilizes those features to improve the user experience.

The features we test vary from project to project, but the JavaScript code that performs the tests invariably sits inline in the head of the HTML page, so that it can run immediately when the HTML arrives. It tends to look something like this:

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
}

…in which we’ve tested the browser’s ability to query the DOM using CSS selectors, and its ability to add event listeners.

Tests at this stage are not fine-grained, but they serve as a decent diagnostic of the level of user agent we’re dealing with. Based on these sorts of tests (and sometimes a few more), we can assume that we’re dealing with a browser that is advanced enough to handle additional scripting and UI enhancements, and then apply our enhancements using additional fine-grained feature tests when necessary.
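
For example, inside that enhanced code we might still check for a specific API before relying on it. Here’s a minimal sketch of what such a fine-grained test could look like, assuming a hypothetical carousel enhancement that depends on matchMedia (the carousel bits are illustrative, not part of our actual code):

// a fine-grained test inside the enhanced scripting (carousel bits are hypothetical)
if( "matchMedia" in window && window.matchMedia( "(min-width: 40em)" ).matches ){
  // wide viewport and matchMedia supported: safe to build the enhancement here
  // e.g. buildCarousel( window.document.querySelector( ".carousel" ) );
}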

Once a browser passes the test, our first step in enhancing the page further is to add a class to the html element called enhanced, which can then be used to style elements in the page to match the enhanced functionality that they will receive when we apply additional behavior to them—say… converting a list of images into a smooth-scrolling carousel, or a heading/content pair into a toggleable progressive disclosure.

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
window.document.documentElement.className += " enhanced";
}

With this class in play, we can scope the presentation for our enhanced UI to that class using a descendant selector, like this:

.foo {
  /* basic styles for .foo go here */
}
.enhanced .foo {
  /* enhanced styles for .foo go here */
}

…all of this so far is fairly common and well-known, I should note. But this pattern can be considered optimistic, perhaps too much so. That’s because often, the CSS that we apply using this class hinges on the successful loading and execution of some JavaScript that we begin to fetch (asynchronously) as soon as we add the class. Like this:

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
window.document.documentElement.className += " enhanced";

// load the enhanced scripting
loadJS( "/path/to/enhancements.js" );
}

Above, just after the enhanced class addition, I’ve requested a JavaScript file using our loadJS utility. loadJS fetches files asynchronously (without blocking page rendering), so when the enhancement pattern is used this way, the user interface could briefly appear more enhanced than it functionally is… that is, until the JavaScript loads and executes properly.
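
If loadJS is unfamiliar, it’s a tiny helper of ours for exactly this job: roughly speaking, it injects a script element so the browser fetches the file in the background, and it hands back a reference to that element. A simplified sketch of the idea (not the full utility) looks something like this:

// simplified sketch of an async script loader in the spirit of loadJS
function loadJS( src ){
  var ref = window.document.getElementsByTagName( "script" )[ 0 ];
  var script = window.document.createElement( "script" );
  script.src = src;
  script.async = true;
  // insert the new script before the first script in the page so it fetches without blocking rendering
  ref.parentNode.insertBefore( script, ref );
  // return the element so callers can bind load handlers to it
  return script;
}

That returned script element will come in handy a little later in this post.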

Now, this optimism is risky, but it’s useful for an important reason: by applying the CSS for the enhanced version of our UI right away, we avoid the possibility of the UI first rendering in its more basic presentation before flipping into a more enhanced state when the script finishes loading, which tends to look pretty janky. Also, rendering the enhanced UI as soon as we can allows the user to start visually digesting the page content sooner.

Of course, sometimes optimism can come back to bite you. A quick tangent, then back to the code…

A Hostile Medium


As a web user, these sorts of look-before-you-leap coding patterns have saved me many times, but admittedly, they tend to most benefit folks who are browsing in less-than-ideal conditions, which we tech-privileged web designers probably do less often than the typical web user. We talk about the real potential for file requests to hang, get blocked, or simply fail to load or apply, but more often than not, things work well for us and it’s easy to forget to properly plan for fault tolerance when things aren’t working as perfectly as we’d like.

Interestingly, ad and content blockers have brought these failure cases much closer to home, as their very purpose is to give users control over the sorts of code that can be transferred to, or executed on, their devices: blocking fonts, ads, or even JavaScript itself with the flip of a switch. For a while now, I’ve been browsing with the Purify content blocker enabled on my iPhone. I installed Purify because it was the most popular app (not just the most popular content blocker, but app in general!) in the Apple App Store at the time, and I wanted to see how it would improve or degrade my browsing experience, hopefully speeding up performance and minimizing time spent looking at loading spinners. I also wanted to install a content blocker because, as a web developer, these tools represent an increasingly popular use case that I knew I’d need to consider in my development practices. Maybe I would learn that I could be building more robustly than I already was. So I flipped all the main switches on: block all the things… in this case, ads, fonts, and JavaScript.

With Purify enabled, I expected to find a mixed bag of functional and broken pages on the wider web, and I sure did. When encountering a broken site, I could then decide to open Purify’s settings and “whitelist” that site, or just browse elsewhere (if I really liked the site, I’d do the former). Interestingly, a site that fails to load due to content blocking looks like it’s just plain “down” or busted… and I have a hunch the average content-blocker user may not know enough to discern the difference and do the work of whitelisting the site. Developers, take note!

Anyway, among the sites that didn’t quite work as they should was… gasp, Filament Group’s own website! In our case, some collapsible menus—including our primary navigation on interior pages—were displaying a mixed state: enhanced in appearance, but not functional because the JavaScript that controls them had failed to load due to a blocker. How could that be, I thought, when we’re so careful about these things? Indeed, our enhancement pattern appeared to be a bit too optimistic.

Weighing Options


Fortunately for Filament’s site, the problem didn’t render the entire site unusable, but it did require the user to navigate to our homepage before choosing another section of the site to browse—very much not ideal. Of course, in sites that are more highly functional than Filament’s, the problem could be much more severe. A fix was definitely needed.

One obvious option was to ditch the early .enhanced class addition and simply let the external JavaScript file apply that class whenever it loads. This would fix the problem, but as I mentioned earlier, it would create a new one as well: users would see the page in its basic presentation first, then the page would re-render in its enhanced state when the script finished loading. If this took any significant amount of time, it would be pretty undesirable.
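
For illustration, that first option would look something like this, with the class addition moved into the enhanced file itself (a sketch of the approach we decided against):

// in the inline head script: just load the file, no class added up front
loadJS( "/path/to/enhancements.js" );

// ...and at the top of enhancements.js: enhance only once the script has actually arrived
window.document.documentElement.className += " enhanced";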

Another option was to stick with our current pattern and apply some fault tolerance, or dare I say, graceful degradation to accommodate this case should it ever arise.

We ended up taking the second approach. Here’s how it works.

Cutting the Mustard, with takebacksies


Most of the time, our pattern worked great, so we wanted to keep it as-is. But we also wanted to handle the case where things go wrong in a tolerant manner, so that the page would at least be usable. Unfortunately (amazingly!), we don’t have a reliable, cross-browser way to detect that the file failed to load, as we would with, say, an onerror event. But by applying a load event listener to our JavaScript file, we can at least keep tabs on whether that file ever does successfully load. As it turns out, combining a load listener with a reasonable timeout should be enough to handle this responsibly. Here we go…

Again, here is our pattern as it stands:

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
window.document.documentElement.className += " enhanced";

// load the enhanced scripting
loadJS( "/path/to/enhancements.js" );
}

Now let’s walk through some small changes to make it more bulletproof.

First, we want to decide on a reasonable amount of loading time after which we can give up on the enhancements and just show a usable, basic experience. In our case, to degrade the UI all we have to do is remove that enhanced class from the html element. In debating a timeout, we considered that our site tends to be usable in under 2 seconds on a 3G or better connection, so we landed on a generous timeout of 8 seconds, after which we’d assume that something went wrong loading the file. Using setTimeout, we can do just that:

setTimeout( function(){
  // remove the enhanced class
  window.document.documentElement.className = window.document.documentElement.className.replace( " enhanced", "" );
}, 8000 );

…and now, with that added to our full example:

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
window.document.documentElement.className += " enhanced";

// load the enhanced scripting
loadJS( "/path/to/enhancements.js" );

// set a timeout to degrade the ui after 8 seconds
setTimeout( function(){
// remove the enhanced class
window.document.documentElement.className = window.document.documentElement.className.replace( " enhanced", "" );
}, 8000 );
}

Nice, that works great. But we’re not done yet. After all, that JavaScript file could still end up loading successfully, and if that happens, we’ll run into an entirely different mixed UI problem, where the behavior is applied but the presentation is not.

As I’d noted before, we can’t cancel that script request, but we can listen to find out whether it does load. By applying a load event listener, we can re-enhance the UI if that script ever does end up loading, long after we’ve already left it for dead.

The loadJS function returns a reference to the script element it creates, so we can assign that return value to a variable and bind event listeners to it afterward.

// load the enhanced scripting
var script = loadJS( "/path/to/enhancements.js" );

// when the script loads, make sure that class is still present
script.onload = function(){
  // add this class, just in case it was removed already (we can't cancel this request so it might arrive any time)
  window.document.documentElement.className += " enhanced";
};

…and again with the full example:

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
window.document.documentElement.className += " enhanced";

// load the enhanced scripting
var script = loadJS( "/path/to/enhancements.js" );

// set a timeout to degrade the ui after 8 seconds
setTimeout( function(){
// remove the enhanced class
window.document.documentElement.className = window.document.documentElement.className.replace( " enhanced", "" );
}, 8000 );

// when the script loads, make sure that class is present still
script.onload = function(){
// add this class, just in case it was removed already (we can't cancel this request so it might arrive any time)
window.document.documentElement.className += " enhanced";
};
}

Whew… lotta logic for a small bit of code. Alas, we’re not quite done. That’s because in most cases, that JavaScript file will load very quickly—much sooner than our 8-second limit—but the timer is still running, and after 8 seconds the UI will be degraded to a mixed, unusable state! So lastly, our script’s load handler needs to cancel that timer so that its callback never runs. We can do that by assigning the timer’s ID to a variable, which allows us to clear it whenever we want.

Here’s that fallback variable in play, both in setting the timer, and clearing it if the script loads:

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
window.document.documentElement.className += " enhanced";

// load the enhanced scripting
var script = loadJS( "/path/to/enhancements.js" );

// if script hasn't loaded after 8 seconds, remove the enhanced class
var fallback = setTimeout( function(){
// remove the enhanced class
window.document.documentElement.className = window.document.documentElement.className.replace( " enhanced", "" );
}, 8000 );

// when the script loads, clear the timer out and add the class again just in case
script.onload = function(){
// clear the fallback timer
clearTimeout( fallback );
// add this class, just in case it was removed already (we can't cancel this request so it might arrive any time)
window.document.documentElement.className += " enhanced";
};
}

And that’s it! Of course, the script above could be refactored a little bit to improve its ease of maintenance. How’s this instead?

if( "querySelector" in window.document && "addEventListener" in window ){
// This is a capable browser, let's improve the UI further!
var docElem = window.document.documentElement;

// the class we'll use to enhance the UI
var enhancedClass = "enhanced";
var enhancedScriptPath = "/path/to/enhancements.js";

// add enhanced class
function addClass(){
docElem.className += " " + enhancedClass;
}

// remove enhanced class
function removeClass(){
docElem.className = docElem.className.replace( enhancedClass, " " );
}

// Let's enhance optimistically...
addClass();

// load enhanced JS file
var script = loadJS( enhancedScriptPath );

// if script hasn't loaded after 8 seconds, remove the enhanced class
var fallback = setTimeout( removeClass, 8000 );

// when the script loads, clear the timer out and add the class again just in case
script.onload = function(){
// clear the fallback timer
clearTimeout( fallback );
// add this class, just in case it was removed already (we can't cancel this request so it might arrive any time)
addClass();
};
}

Now THAT cuts the mustard! (Sorry…)

Anyway, visit any Filament Group interior page and bathe in the resilience! And of course, I should note that this approach is not purely about accommodating content blockers, but about better handling any case where a script fails to load (ever used the wifi on an Amtrak?). We’d love it if you did the same on your site (seriously, I’m trying to use that thing! ;) ).

Thanks for reading, and do hit us up on Twitter with any feedback or questions you may have.
